Learn how to create custom LLM implementations in CrewAI.
CrewAI supports custom LLM implementations through the `BaseLLM` abstract base class. This allows you to integrate any LLM provider that doesn't have built-in support in LiteLLM, or to implement custom authentication mechanisms.
In your `__init__()` method, always call `super().__init__(model, temperature)` with the required parameters:
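A minimal sketch of a constructor that follows this rule. The `api_key` and `endpoint` parameters are hypothetical examples of provider-specific configuration, and the tiny `BaseLLM` stand-in exists only so the snippet runs standalone; in a real project you would import it from `crewai` instead.

```python
from typing import Optional


# Stand-in for crewai.BaseLLM so this sketch runs standalone;
# in a real project, replace this with `from crewai import BaseLLM`.
class BaseLLM:
    def __init__(self, model: str, temperature: Optional[float] = None):
        self.model = model
        self.temperature = temperature


class CustomLLM(BaseLLM):
    def __init__(self, model: str, api_key: str, endpoint: str,
                 temperature: Optional[float] = None):
        # Always call super().__init__ with the required parameters.
        super().__init__(model=model, temperature=temperature)
        self.api_key = api_key    # hypothetical: provider credential
        self.endpoint = endpoint  # hypothetical: provider base URL


llm = CustomLLM(model="my-model", api_key="sk-demo",
                endpoint="https://api.example.com/v1", temperature=0.7)
```

Skipping the `super().__init__()` call would leave `model` and `temperature` unset, which breaks anything in CrewAI that reads those attributes.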
The `call()` method is the heart of your LLM implementation. It must:

- accept messages as either a plain string or a list of dicts with `"role"` and `"content"` keys
- return the model's final response as a string
- handle any tools passed to it when function calling is supported
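A sketch of a `call()` implementation under those requirements. The parameter list mirrors the `tools`, `callbacks`, and `available_functions` arguments CrewAI passes, and the `_send` helper is a hypothetical placeholder standing in for a real HTTP request to your provider.

```python
from typing import Dict, List, Optional, Union


class CustomLLM:
    """Sketch focusing only on call(); other methods omitted."""

    def call(self,
             messages: Union[str, List[Dict[str, str]]],
             tools: Optional[list] = None,
             callbacks: Optional[list] = None,
             available_functions: Optional[dict] = None) -> str:
        # CrewAI may pass either a bare string or a list of
        # {"role": ..., "content": ...} dicts -- normalize first.
        if isinstance(messages, str):
            messages = [{"role": "user", "content": messages}]
        # Send `messages` to your provider here; this placeholder
        # just echoes the last message so the sketch is runnable.
        return self._send(messages)

    def _send(self, messages: List[Dict[str, str]]) -> str:
        # Hypothetical transport helper -- replace with a real API call.
        return messages[-1]["content"]


llm = CustomLLM()
```

Whatever your provider returns, `call()` must hand back plain text; returning a raw response object will confuse the agent loop.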
"\nObservation:"
as a stop word to control agent behavior. If your LLM supports stop words:
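Pass the stop sequences through to your provider's request. If the provider cannot enforce them server-side, you can truncate the completion yourself, as in this sketch (the `_apply_stop_words` helper name is an illustration, not part of the CrewAI API):

```python
from typing import List, Optional


class CustomLLM:
    def __init__(self, stop: Optional[List[str]] = None):
        self.stop = stop or []

    def _apply_stop_words(self, text: str) -> str:
        # Hypothetical helper: if the provider cannot enforce stop
        # sequences server-side, cut the completion at the first
        # stop word client-side before returning it.
        for word in self.stop:
            idx = text.find(word)
            if idx != -1:
                text = text[:idx]
        return text


llm = CustomLLM(stop=["\nObservation:"])
```

Without this truncation, the model may hallucinate its own `Observation:` lines and derail the agent's reasoning loop.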
To support function calling, verify that:

- `supports_function_calling()` returns `True`
- your `call()` method handles `tool_calls` in the response
- the `available_functions` parameter is used correctly
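The checklist above can be sketched as follows. The `_execute_tool_calls` helper name and the OpenAI-style `{"name", "arguments"}` shape of each tool call are assumptions for illustration; adapt them to whatever your provider actually returns.

```python
import json
from typing import Any, Callable, Dict, List


class CustomLLM:
    def supports_function_calling(self) -> bool:
        # Signals to CrewAI that tools may be passed into call().
        return True

    def _execute_tool_calls(self,
                            tool_calls: List[Dict[str, Any]],
                            available_functions: Dict[str, Callable]) -> List[Any]:
        # Hypothetical helper: `available_functions` maps tool names
        # to Python callables; dispatch each tool call to its function.
        results = []
        for call in tool_calls:
            fn = available_functions.get(call["name"])
            if fn is None:
                results.append(f"Unknown tool: {call['name']}")
                continue
            args = call["arguments"]
            if isinstance(args, str):  # arguments may arrive as JSON text
                args = json.loads(args)
            results.append(fn(**args))
        return results


def add(a: int, b: int) -> int:
    return a + b


llm = CustomLLM()
result = llm._execute_tool_calls(
    [{"name": "add", "arguments": '{"a": 2, "b": 3}'}],
    {"add": add},
)
```

If `supports_function_calling()` returns `False` (the default), CrewAI will fall back to prompt-based tool use instead of passing tools to `call()`.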