Overview
CrewAI supports custom LLM implementations through the BaseLLM abstract base class. This allows you to integrate any LLM provider that doesn't have built-in support in LiteLLM, or implement custom authentication mechanisms.
Quick Start
Here’s a minimal custom LLM implementation:
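The sketch below shows one way this can look. The import path and the exact call() signature can vary between CrewAI versions, and the endpoint URL, request payload, and response shape are OpenAI-style assumptions rather than requirements of BaseLLM:

```python
from typing import Any, Dict, List, Optional, Union

import requests

from crewai import BaseLLM  # adjust the import path to your CrewAI version


class CustomLLM(BaseLLM):
    def __init__(self, model: str, api_key: str, endpoint: str, temperature: Optional[float] = None):
        # Required: initialize the base class with model and temperature
        super().__init__(model=model, temperature=temperature)
        self.api_key = api_key
        self.endpoint = endpoint

    def call(
        self,
        messages: Union[str, List[Dict[str, str]]],
        tools: Optional[List[dict]] = None,
        callbacks: Optional[List[Any]] = None,
        available_functions: Optional[Dict[str, Any]] = None,
    ) -> str:
        # Normalize a plain string into the chat message format
        if isinstance(messages, str):
            messages = [{"role": "user", "content": messages}]

        response = requests.post(
            self.endpoint,
            headers={
                "Authorization": f"Bearer {self.api_key}",
                "Content-Type": "application/json",
            },
            json={"model": self.model, "messages": messages},
            timeout=30,
        )
        response.raise_for_status()
        return response.json()["choices"][0]["message"]["content"]
```

Using Your Custom LLM
Assuming the CustomLLM class above, pass an instance wherever an agent accepts an llm (the values shown are hypothetical):

```python
from crewai import Agent

llm = CustomLLM(
    model="my-custom-model",  # hypothetical model name
    api_key="your-api-key",
    endpoint="https://api.example.com/v1/chat/completions",  # hypothetical endpoint
)

agent = Agent(
    role="Research Assistant",
    goal="Answer questions accurately",
    backstory="An assistant backed by a custom LLM endpoint.",
    llm=llm,
)
```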
Required Methods
Constructor: __init__()
Critical: You must call super().__init__(model, temperature) with the required parameters:
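For example (the extra api_key field is illustrative):

```python
from typing import Optional

from crewai import BaseLLM


class CustomLLM(BaseLLM):
    def __init__(self, model: str, api_key: str, temperature: Optional[float] = None):
        # Required: initialize BaseLLM with model and temperature
        super().__init__(model=model, temperature=temperature)
        # Provider-specific state (illustrative)
        self.api_key = api_key
```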
Abstract Method: call()
The call() method is the heart of your LLM implementation. It must do the following (see the sketch after this list):
- Accept messages (a string, or a list of dicts with 'role' and 'content' keys)
- Return a string response
- Handle tools and function calling if supported
- Raise appropriate exceptions for errors
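A sketch that maps to the list above, reusing the CustomLLM fields from the Quick Start (the payload and response shapes are OpenAI-style assumptions):

```python
def call(
    self,
    messages: Union[str, List[Dict[str, str]]],
    tools: Optional[List[dict]] = None,
    callbacks: Optional[List[Any]] = None,
    available_functions: Optional[Dict[str, Any]] = None,
) -> str:
    # 1. Accept a string or a list of {'role': ..., 'content': ...} dicts
    if isinstance(messages, str):
        messages = [{"role": "user", "content": messages}]

    payload: Dict[str, Any] = {"model": self.model, "messages": messages}
    # 3. Forward tool schemas when function calling is supported
    if tools and self.supports_function_calling():
        payload["tools"] = tools

    response = requests.post(
        self.endpoint,
        headers={"Authorization": f"Bearer {self.api_key}"},
        json=payload,
        timeout=30,
    )
    # 4. Raise appropriate exceptions for errors
    response.raise_for_status()

    # 2. Return a string response (content may be None on tool calls)
    message = response.json()["choices"][0]["message"]
    return message.get("content") or ""
```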
Optional Methods
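The base class provides default implementations you can override. The two hooks below are the ones referenced elsewhere on this page; check your CrewAI version for the full set:

```python
class CustomLLM(BaseLLM):
    # ... __init__ and call() as above ...

    def supports_function_calling(self) -> bool:
        # Return True only if your provider can handle tool schemas
        # and emits tool_calls in its responses
        return True

    def supports_stop_words(self) -> bool:
        # Return True if the provider honors a stop parameter
        return True
```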
Common Patterns
Error Handling
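A common pattern is to translate transport and parsing failures into clear exceptions. A method sketch; the exception choices are illustrative:

```python
import requests


def call(self, messages, tools=None, callbacks=None, available_functions=None) -> str:
    if isinstance(messages, str):
        messages = [{"role": "user", "content": messages}]
    try:
        response = requests.post(
            self.endpoint,
            headers={"Authorization": f"Bearer {self.api_key}"},
            json={"model": self.model, "messages": messages},
            timeout=30,
        )
        response.raise_for_status()
        return response.json()["choices"][0]["message"]["content"]
    except requests.Timeout:
        raise TimeoutError(f"LLM request to {self.endpoint} timed out")
    except requests.HTTPError as exc:
        raise RuntimeError(f"LLM provider returned HTTP {exc.response.status_code}")
    except (KeyError, IndexError, ValueError) as exc:
        raise ValueError(f"Malformed LLM response: {exc}") from exc
```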
Custom Authentication
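For providers that don't use bearer tokens, keep the credential handling in the constructor and a helper method. The header names here are hypothetical:

```python
from typing import Dict, Optional

from crewai import BaseLLM


class CustomAuthLLM(BaseLLM):
    def __init__(self, model: str, api_key: str, client_id: str, temperature: Optional[float] = None):
        super().__init__(model=model, temperature=temperature)
        self.api_key = api_key
        self.client_id = client_id

    def _auth_headers(self) -> Dict[str, str]:
        # Hypothetical scheme: API key plus a client identifier header
        return {
            "X-Api-Key": self.api_key,
            "X-Client-Id": self.client_id,
            "Content-Type": "application/json",
        }
```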
Stop Words Support
CrewAI automatically adds "\nObservation:" as a stop word to control agent behavior. If your LLM supports stop words:
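A sketch, assuming the provider accepts an OpenAI-style stop parameter and that configured stop words are available on self.stop (verify the attribute name against your CrewAI version):

```python
def supports_stop_words(self) -> bool:
    return True  # CrewAI will then rely on stop words such as "\nObservation:"


def call(self, messages, tools=None, callbacks=None, available_functions=None) -> str:
    if isinstance(messages, str):
        messages = [{"role": "user", "content": messages}]
    payload = {"model": self.model, "messages": messages}
    # Assumption: BaseLLM stores configured stop words on self.stop
    if getattr(self, "stop", None):
        payload["stop"] = self.stop
    response = requests.post(
        self.endpoint,
        headers={"Authorization": f"Bearer {self.api_key}"},
        json=payload,
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]
```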
Function Calling
If your LLM supports function calling, implement the complete flow:
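A sketch of that loop, assuming an OpenAI-style tool_calls response; your provider's wire format may differ:

```python
import json

import requests


def call(self, messages, tools=None, callbacks=None, available_functions=None) -> str:
    if isinstance(messages, str):
        messages = [{"role": "user", "content": messages}]

    payload = {"model": self.model, "messages": messages}
    if tools and self.supports_function_calling():
        payload["tools"] = tools

    response = requests.post(
        self.endpoint,
        headers={"Authorization": f"Bearer {self.api_key}"},
        json=payload,
        timeout=30,
    )
    response.raise_for_status()
    message = response.json()["choices"][0]["message"]

    tool_calls = message.get("tool_calls")
    if tool_calls and available_functions:
        # Execute the first requested function via available_functions
        call_spec = tool_calls[0]["function"]
        fn = available_functions.get(call_spec["name"])
        if fn is not None:
            result = fn(**json.loads(call_spec["arguments"]))
            # Send the tool result back so the model can produce a final answer
            messages.append(message)
            messages.append({
                "role": "tool",
                "tool_call_id": tool_calls[0]["id"],
                "content": str(result),
            })
            payload["messages"] = messages
            response = requests.post(
                self.endpoint,
                headers={"Authorization": f"Bearer {self.api_key}"},
                json=payload,
                timeout=30,
            )
            response.raise_for_status()
            message = response.json()["choices"][0]["message"]

    return message.get("content") or ""
```

Troubleshooting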
Common Issues
Constructor Errors
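The usual culprit is skipping the base initializer:

```python
# Incorrect: BaseLLM never gets initialized
class BrokenLLM(BaseLLM):
    def __init__(self, model: str):
        self.model = model  # super().__init__() is missing


# Correct: always delegate to the base class first
class WorkingLLM(BaseLLM):
    def __init__(self, model: str, temperature: Optional[float] = None):
        super().__init__(model=model, temperature=temperature)
```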
Function Calling Issues
- Ensure supports_function_calling() returns True
- Check that you handle tool_calls in the response
- Verify the available_functions parameter is used correctly
Authentication Failures
- Verify API key format and permissions
- Check authentication header format
- Ensure endpoint URLs are correct
Response Parsing Errors
- Validate response structure before accessing nested fields
- Handle cases where content might be None
- Add proper error handling for malformed responses