The Converse API supports three processing modes, controlled by the config.agent_model parameter. Each mode offers a different balance between capability and speed.

Agentic mode

Full agent with multi-step planning, reasoning, and tool execution. This is the default.
| agent_model | Description |
| --- | --- |
| magpie-2.0 | Default. Agentic model with proactive planning and reasoning. |
| magpie-2.5 | Beta. Our latest agentic model: faster, more adaptable, and built to handle a broader range of real-world tasks. |
| magpie-1.1 | Previous-generation agentic model. |
All tools are available in Agentic mode. Structured outputs are supported via text.format (see Structured outputs).
response = datagrid.converse(
    prompt="Analyze my sales data and create a summary report",
    config={
        "agent_model": "magpie-2.0",
        "tools": ["data_analysis", "semantic_search", "create_dataset"]
    }
)
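To combine Agentic mode with structured outputs, attach a JSON Schema under text.format on the request. The sketch below shows one plausible payload shape; the exact keys inside text.format (here assumed to be type, name, and schema) are an assumption, so consult the Structured outputs page for the authoritative fields. The schema name and properties are hypothetical.

```python
# Sketch of a Converse request combining Agentic mode with structured
# outputs. The text.format field names below are assumptions; see the
# Structured outputs page for the exact shape.
request = {
    "prompt": "Analyze my sales data and create a summary report",
    "config": {
        "agent_model": "magpie-2.0",
        "tools": ["data_analysis", "semantic_search", "create_dataset"],
    },
    "text": {
        "format": {
            "type": "json_schema",
            "name": "sales_summary",  # hypothetical schema name
            "schema": {
                "type": "object",
                "properties": {
                    "total_revenue": {"type": "number"},
                    "top_products": {
                        "type": "array",
                        "items": {"type": "string"},
                    },
                },
                "required": ["total_revenue", "top_products"],
            },
        }
    },
}
```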

Ask mode

Lightweight, single-turn Q&A optimized for retrieval-augmented generation (RAG) use cases, with lower latency than Agentic mode.
| agent_model | Description |
| --- | --- |
| magpie-1.1-flash | Fast model that only supports the semantic_search tool. |
Only the semantic_search tool is supported in Ask mode. Requests specifying other tools will be rejected. Structured outputs are not supported.
response = datagrid.converse(
    prompt="What is our refund policy?",
    config={
        "agent_model": "magpie-1.1-flash",
        "tools": ["semantic_search"]
    }
)
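Because Ask mode rejects any tool other than semantic_search, it can be convenient to fail fast on the client before sending the request. This helper is purely illustrative (the rejection itself happens server-side), and its name is not part of the SDK:

```python
# The only tool Ask mode (magpie-1.1-flash) supports.
ASK_MODE_TOOLS = {"semantic_search"}

def validate_ask_tools(tools):
    """Raise ValueError if a tool outside Ask mode's allowed set is requested.

    Illustrative client-side pre-check; the API enforces this server-side.
    """
    disallowed = set(tools) - ASK_MODE_TOOLS
    if disallowed:
        raise ValueError(
            "Ask mode (magpie-1.1-flash) only supports semantic_search; "
            f"got unsupported tools: {sorted(disallowed)}"
        )
    return list(tools)
```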

Fastest mode

Direct LLM response with no planning or tool execution. Lowest latency, best for simple conversational queries that don’t require data retrieval or actions.
| agent_model | Description |
| --- | --- |
| llm-only | Direct LLM conversation with no tool calls. |
No tools are executed in Fastest mode. Requests specifying tools will be rejected. Structured output is supported the same way as in agentic mode: include text.format with your JSON Schema on the Converse request (see Structured outputs).
response = datagrid.converse(
    prompt="Summarize the key differences between GAAP and IFRS",
    config={
        "agent_model": "llm-only"
    }
)

Choosing a mode

| Mode | Use when | Latency | Tools | Structured outputs |
| --- | --- | --- | --- | --- |
| Agentic | You need multi-step reasoning, tool calls, or data analysis | Higher | All | Yes (text.format) |
| Ask | You need fast answers from knowledge bases (RAG) | Medium | semantic_search only | No |
| Fastest | You need quick conversational responses without tools | Lowest | None | Yes (text.format) |
When config.agent_model is omitted, the API defaults to magpie-2.0 (Agentic mode).
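The decision table above can be encoded as a small helper that picks an agent_model from two requirements. This is a sketch of this page's guidance, not an official SDK function; the model names come from the tables above, and the selection logic is an assumption about how you might apply them:

```python
def pick_agent_model(needs_tools=(), needs_structured_output=False):
    """Pick an agent_model per the mode comparison table.

    Illustrative helper, not part of the SDK: Fastest when no tools are
    needed, Ask for pure semantic_search RAG without structured outputs,
    otherwise the Agentic default.
    """
    tools = set(needs_tools)
    if not tools:
        return "llm-only"  # Fastest mode: no tool execution
    if tools == {"semantic_search"} and not needs_structured_output:
        return "magpie-1.1-flash"  # Ask mode: RAG-only, lower latency
    # Agentic default; same model the API uses when agent_model is omitted.
    return "magpie-2.0"
```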