# Connect CrewAI to LLMs
CrewAI connects to LLMs through native SDK integrations for the most popular providers (OpenAI, Anthropic, Google Gemini, Azure, and AWS Bedrock), and uses LiteLLM as a flexible fallback for all other providers.

By default, CrewAI uses the `gpt-4o-mini` model. This is determined by the `OPENAI_MODEL_NAME` environment variable, which defaults to `gpt-4o-mini` if not set.
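For instance, here is a minimal sketch of overriding that default through the environment variable; the `gpt-4o` model name and the agent's role, goal, and backstory are placeholders, not recommendations:

```python
import os

from crewai import Agent

# Placeholder model name; any OpenAI model your account can access works.
os.environ["OPENAI_MODEL_NAME"] = "gpt-4o"

# With no explicit `llm` argument, the agent uses the model named in
# OPENAI_MODEL_NAME (gpt-4o-mini when the variable is unset).
agent = Agent(
    role="Researcher",
    goal="Summarize recent AI papers",
    backstory="A meticulous analyst.",
)
```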
You can easily configure your agents to use a different model or provider as described in this guide.

## Supported Providers
LiteLLM supports a wide range of providers, including but not limited to:

- OpenAI
- Anthropic
- Google (Vertex AI, Gemini)
- Azure OpenAI
- AWS (Bedrock, SageMaker)
- Cohere
- VoyageAI
- Hugging Face
- Ollama
- Mistral AI
- Replicate
- Together AI
- AI21
- Cloudflare Workers AI
- DeepInfra
- Groq
- SambaNova
- Nebius AI Studio
- NVIDIA NIMs
- And many more!
To use any provider not covered by a native integration, add LiteLLM as a dependency to your project. Native providers (OpenAI, Anthropic, Google Gemini, Azure, and AWS Bedrock) use their own SDK extras; see the Provider Configuration Examples.
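When a request routes through LiteLLM, the model is typically addressed with a `provider/model` string. A minimal sketch, assuming Groq as the example provider and a placeholder model name:

```python
from crewai import LLM

# Assumption for illustration: Groq routed through LiteLLM using the
# "provider/model" naming convention; the matching API key is read from
# the provider's environment variable (GROQ_API_KEY in this case).
llm = LLM(
    model="groq/llama-3.1-70b-versatile",
    temperature=0.3,
)
```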
## Changing the LLM
To use a different LLM with your CrewAI agents, you have several options:

- Using a String Identifier
- Using the LLM Class
Pass the model name as a string when initializing the agent, or pass a configured `LLM` instance, as shown in the sketch below:
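A minimal sketch of both options; the model names and the agent fields are placeholders rather than recommendations:

```python
from crewai import Agent, LLM

# Option 1: pass the model name directly as a string.
writer = Agent(
    role="Writer",
    goal="Draft release notes",
    backstory="A concise technical writer.",
    llm="gpt-4o",
)

# Option 2: configure an LLM instance and pass it to the agent.
claude = LLM(model="anthropic/claude-3-5-sonnet-20241022", temperature=0.7)

analyst = Agent(
    role="Analyst",
    goal="Review the draft for accuracy",
    backstory="A detail-oriented reviewer.",
    llm=claude,
)
```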
## Configuration Options

When configuring an LLM for your agent, you have access to a wide range of parameters:

| Parameter | Type | Description |
|---|---|---|
| model | str | The name of the model to use (e.g., “gpt-4”, “claude-2”) |
| temperature | float | Controls randomness in output (0.0 to 1.0) |
| max_tokens | int | Maximum number of tokens to generate |
| top_p | float | Controls diversity of output (0.0 to 1.0) |
| frequency_penalty | float | Penalizes new tokens based on their frequency in the text so far |
| presence_penalty | float | Penalizes new tokens based on their presence in the text so far |
| stop | str, List[str] | Sequence(s) to stop generation |
| base_url | str | The base URL for the API endpoint |
| api_key | str | Your API key for authentication |
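As a sketch of how these parameters combine on a single `LLM` instance (the values are arbitrary illustrations, not tuned recommendations):

```python
from crewai import LLM

# Each argument below corresponds to a row in the table above.
llm = LLM(
    model="gpt-4o",
    temperature=0.2,        # lower = more deterministic output
    max_tokens=1024,        # cap on generated tokens
    top_p=0.9,              # nucleus sampling threshold
    frequency_penalty=0.1,  # discourage repeating frequent tokens
    presence_penalty=0.1,   # discourage reusing tokens already present
    stop=["END"],           # stop generation at this sequence
)
```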
## Connecting to OpenAI-Compatible LLMs

You can connect to OpenAI-compatible LLMs using either environment variables or by setting specific attributes on the `LLM` class:

- Using Environment Variables
- Using LLM Class Attributes
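A minimal sketch of both approaches, assuming a hypothetical OpenAI-compatible endpoint at `https://api.example.com/v1`; the `OPENAI_API_BASE` variable name follows the usual OpenAI-style convention and may differ for your setup:

```python
import os

from crewai import LLM

# Approach 1 (assumption for illustration): point the OpenAI-style
# environment variables at the compatible endpoint.
os.environ["OPENAI_API_KEY"] = "your-api-key"
os.environ["OPENAI_API_BASE"] = "https://api.example.com/v1"  # hypothetical URL

# Approach 2: set the same values explicitly on the LLM instance.
llm = LLM(
    model="openai/my-hosted-model",  # placeholder model name
    base_url="https://api.example.com/v1",
    api_key="your-api-key",
)
```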
## Using Local Models with Ollama

For local models like those provided by Ollama:

1. Download and install Ollama
2. Pull the model you want to run locally (for example, `ollama pull llama3`)
3. Configure your agent to use the local model, as in the sketch below
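A minimal sketch, assuming Ollama is serving on its default local port (11434) and that the placeholder `llama3` model has already been pulled:

```python
from crewai import Agent, LLM

# Assumes a local Ollama server on the default port with llama3 pulled.
local_llm = LLM(
    model="ollama/llama3",
    base_url="http://localhost:11434",
)

agent = Agent(
    role="Local Assistant",
    goal="Answer questions without calling external APIs",
    backstory="Runs entirely on local hardware.",
    llm=local_llm,
)
```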
## Changing the Base API URL

You can change the base API URL for any LLM provider by setting the `base_url` parameter:
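A minimal sketch, assuming a hypothetical gateway URL; substitute your provider's actual endpoint:

```python
from crewai import LLM

# Hypothetical alternative endpoint, e.g. a proxy or regional gateway.
llm = LLM(
    model="openai/gpt-4o",
    base_url="https://gateway.example.com/v1",
    api_key="your-api-key",
)
```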