Supported models
Specify models in `provider:model` format (for example, `google_genai:gemini-3.1-pro-preview`, `openai:gpt-5.4`, or `anthropic:claude-sonnet-4-6`). For valid provider strings, see the `model_provider` parameter of `init_chat_model`. For provider-specific configuration, see chat model integrations.
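The format is simply the provider name and model name joined by a colon. The helper below is illustrative only (it is not how `init_chat_model` resolves strings internally), and just shows how a spec decomposes:

```python
def split_model_spec(spec: str) -> tuple[str, str]:
    """Split a provider:model string on the first colon.

    Illustrative only -- real resolution happens inside
    init_chat_model, which can also infer the provider
    from a bare model name.
    """
    provider, _, model = spec.partition(":")
    return provider, model

print(split_model_spec("anthropic:claude-sonnet-4-6"))
# → ('anthropic', 'claude-sonnet-4-6')
```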
Suggested models
These models perform well on the Deep Agents eval suite, which tests basic agent operations. Passing these evals is necessary but not sufficient for strong performance on longer, more complex tasks.

| Provider | Models |
|---|---|
| Google | gemini-3.1-pro-preview, gemini-3-flash-preview |
| OpenAI | gpt-5.4, gpt-4o, o4-mini, gpt-5.2-codex, gpt-4o-mini, o3 |
| Anthropic | claude-opus-4-6, claude-opus-4-5, claude-sonnet-4-6, claude-sonnet-4, claude-sonnet-4-5, claude-haiku-4-5, claude-opus-4-1 |
| Open-weight | GLM-5, Kimi-K2.5, MiniMax-M2.5, qwen3.5-397B-A17B, devstral-2-123B |
Configure model parameters
Pass a model string to `createDeepAgent` in `provider:model` format, or pass a configured model instance for full control. Under the hood, model strings are resolved via `init_chat_model`.
To configure model-specific parameters, use `init_chat_model` or instantiate a provider model class directly:
Available parameters vary by provider. See the chat model integrations page for provider-specific configuration options.
Provider profiles
A `ProviderProfile` packages initialization parameters that apply when you pass a `provider:model` string when creating the deep agent. It does not apply when you pass a model you have already configured with `init_chat_model`.
You can register profiles at two levels, and both can coexist:

- Provider level: a bare provider key like `"openai"` applies to every model from the `openai` provider.
- Model level: a `provider:model` key like `"openai:gpt-5.4"` applies only to that specific model, and merges on top of any matching provider-level profile.
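The merge precedence can be illustrated with plain dictionaries. The profile contents below are invented for illustration, and `ProviderProfile` is not literally a dict; this sketch only shows how model-level entries layer on top of provider-level ones:

```python
# Hypothetical registered profiles, keyed as described above.
profiles = {
    "openai": {"temperature": 0.5, "timeout": 60},  # provider level
    "openai:gpt-5.4": {"temperature": 0},           # model level
}

def resolve_profile(spec: str) -> dict:
    """Merge the model-level entry on top of the provider-level entry."""
    provider = spec.split(":", 1)[0]
    merged = dict(profiles.get(provider, {}))  # provider-level base
    merged.update(profiles.get(spec, {}))      # model-level overrides win
    return merged

print(resolve_profile("openai:gpt-5.4"))
# → {'temperature': 0, 'timeout': 60}
```

Note that the model-level `temperature` overrides the provider-level value, while `timeout` is inherited from the provider-level profile.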
Select a model at runtime
If your application lets users choose a model (for example, via a dropdown in the UI), use middleware to swap the model at runtime without rebuilding the agent.

Learn more
- Models in LangChain: chat model features including tool calling, structured output, and multimodality

