

In the Datagrid web app, chat uses three modes (Ask, Extended, and Execute) that correspond to the Converse chat_mode field (llm_router, light_agent, full_agent). This page documents config.agent_model, which selects the model implementation and which tools are allowed.

Structured outputs: for any agent_model, you can request JSON that matches a schema by passing text.format (JSON Schema) on the Converse request. The same mechanism applies across modes; the tool limits below still apply.

Important: the Ask mode in the web app is the llm_router chat_mode, not magpie-1.1-flash. The magpie-1.1-flash model corresponds to Extended in the app (a search-backed, lighter agent). Use both chat_mode and agent_model when reproducing in-app behavior from the API.
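As a concrete illustration of combining chat_mode and agent_model with a text.format schema, here is a minimal sketch of a Converse request body. The field names (chat_mode, config.agent_model, text.format) come from this page; the exact payload nesting and the "json_schema" wrapper shape are assumptions, so check them against the Converse API reference.

```python
# Sketch: reproducing the web app's Extended mode from the API by setting
# BOTH chat_mode and agent_model, plus a structured-output schema via
# text.format. Payload shape and the "json_schema" wrapper are assumptions.
request = {
    "prompt": "List open invoices more than 30 days past due",
    "chat_mode": "light_agent",               # Extended in the web app
    "config": {"agent_model": "magpie-1.1-flash"},
    "text": {
        "format": {                           # JSON Schema for the response
            "type": "json_schema",            # assumed wrapper shape
            "schema": {
                "type": "object",
                "properties": {
                    "invoices": {"type": "array", "items": {"type": "string"}}
                },
                "required": ["invoices"],
            },
        }
    },
}
```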

Ask (in-app; chat_mode: llm_router)

Ask is a product / routing mode, not a value of config.agent_model. In the web app it is selected with chat_mode: llm_router.

What it is good for: fast, LLM-forward answers when the product has already chosen an agent for instructions and context. The server resolves routing (including agent context), then follows an LLM-focused path rather than full Execute-style multi-tool planning.

How it differs from llm-only: llm-only is an explicit agent_model on the Converse API; you get the direct LLM tier with no tool execution and predictable behavior from that string alone. llm_router instead means "use the in-app Ask routing rules" (agent resolution, mode refinement, default models) on top of whatever agent_model and agent_id you pass.

To mirror Ask from the API, set chat_mode to llm_router and pass the same agent identifiers you use in the app. Do not assume agent_model: llm-only alone reproduces Ask, and do not confuse llm_router with Extended (light_agent / magpie-1.1-flash). Structured outputs for turns that use llm_router still use text.format like any other Converse call.
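The Ask mirroring described above can be sketched as a request body: pass chat_mode llm_router with your in-app agent identifiers and no agent_model override, so the server's routing rules apply. The agent_id value and the payload shape are hypothetical.

```python
# Sketch: mirroring the web app's Ask mode from the API. Ask is a chat_mode
# (llm_router), not an agent_model, so pass the same agent identifiers the
# app uses and let the server resolve routing. agent_id is hypothetical.
ask_request = {
    "prompt": "Which regions missed quota last quarter?",
    "chat_mode": "llm_router",   # Ask in the web app
    "agent_id": "agent_123",     # hypothetical; use your in-app agent's id
    # No agent_model override: llm_router applies the routing defaults.
}
```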

Execute (full agent models)

Full agent with multi-step planning, reasoning, and broad tool execution. Matches Execute in the Datagrid web app (chat_mode: full_agent). This is the default when agent_model is omitted.
| agent_model | Description |
| --- | --- |
| magpie-2.0 | Default. Full agent model with proactive planning and reasoning. |
| magpie-2.5 | Beta. Latest full-agent model: faster, more adaptable, and built to handle a broader range of real-world tasks. |
| magpie-1.1 | Previous-generation full agent model. |
All tools listed for agents are available for Execute-tier models. Structured outputs use text.format (see Structured outputs).
```python
response = datagrid.converse(
    prompt="Analyze my sales data and create a summary report",
    config={
        "agent_model": "magpie-2.0",
        "tools": ["data_analysis", "semantic_search", "create_dataset"]
    }
)
```

Extended (search-focused agent)

Lightweight turns optimized for RAG (retrieval-augmented generation). Matches Extended in the Datagrid web app (chat_mode: light_agent). Lower latency than Execute-tier models.
| agent_model | Description |
| --- | --- |
| magpie-1.1-flash | Fast model that only supports the semantic_search tool. |
Only the semantic_search tool is supported for magpie-1.1-flash. Requests specifying other tools will be rejected. Structured outputs use text.format like other models (see Structured outputs).
```python
response = datagrid.converse(
    prompt="What is our refund policy?",
    config={
        "agent_model": "magpie-1.1-flash",
        "tools": ["semantic_search"]
    }
)
```

Direct LLM (llm-only)

Direct LLM response with no planning or tool execution. Lowest latency, best for simple conversational or structured JSON answers that do not need retrieval or actions.
| agent_model | Description |
| --- | --- |
| llm-only | Direct LLM conversation with no tool calls. |
llm-only is the API’s explicit tool-free model key. It is not the same thing as selecting Ask in the web app: Ask is chat_mode: llm_router (see the Ask section at the top of this page). No tools are executed for llm-only. Requests specifying tools will be rejected. Structured outputs use text.format (see Structured outputs).
```python
response = datagrid.converse(
    prompt="Summarize the key differences between GAAP and IFRS",
    config={
        "agent_model": "llm-only"
    }
)
```
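Since llm-only is recommended above for structured JSON answers, here is a sketch of the same request with a text.format schema attached. The text.format mechanism is documented on this page; the "json_schema" wrapper shape and payload nesting are assumptions.

```python
# Sketch: llm-only with a structured JSON response via text.format.
# No tools may be specified for llm-only; the wrapper shape is assumed.
request = {
    "prompt": "Summarize the key differences between GAAP and IFRS",
    "config": {"agent_model": "llm-only"},
    "text": {
        "format": {
            "type": "json_schema",  # assumed wrapper shape
            "schema": {
                "type": "object",
                "properties": {
                    "gaap": {"type": "string"},
                    "ifrs": {"type": "string"},
                },
                "required": ["gaap", "ifrs"],
            },
        }
    },
}
```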

Choosing a mode

| Tier (in-app label) | API surface | Use when | Latency | Tools | Structured outputs |
| --- | --- | --- | --- | --- | --- |
| Ask | chat_mode: llm_router (not an agent_model string) | Routed LLM-first answers with agent context in the product | Lower | Per routing / resolved path | Yes |
| Execute | magpie-2.0, magpie-2.5, magpie-1.1 | Multi-step reasoning, tool calls, or data analysis | Higher | All | Yes |
| Extended | magpie-1.1-flash | Fast answers from knowledge bases (RAG) with semantic_search only | Medium | semantic_search only | Yes |
| Direct LLM | llm-only | Tool-free conversational or structured JSON answers from a fixed model key | Lowest | None | Yes |
When config.agent_model is omitted, the API defaults to magpie-2.0 (Execute tier).
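The table above can be collapsed into a small lookup that maps in-app tier labels to Converse parameters. The mode and model strings come from this page; the request-body shape and the helper itself are hypothetical.

```python
# Sketch: map in-app tier labels to Converse request parameters, per the
# table above. Payload shape is an assumption; strings come from this page.
TIER_CONFIG = {
    "ask":      {"chat_mode": "llm_router"},                 # routing mode, no model key
    "execute":  {"config": {"agent_model": "magpie-2.0"}},   # also the API default
    "extended": {"config": {"agent_model": "magpie-1.1-flash"}},
    "direct":   {"config": {"agent_model": "llm-only"}},
}

def converse_params(tier: str, prompt: str) -> dict:
    """Return a Converse request body for an in-app tier label."""
    params = {"prompt": prompt}
    params.update(TIER_CONFIG[tier])
    return params
```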