Configure Azure OpenAI with Crew Studio for enterprise LLM connections
This guide walks you through connecting Azure OpenAI with Crew Studio for seamless enterprise AI operations.
Access Azure OpenAI Studio
1. Navigate to Azure AI Services > select your deployment > open Azure OpenAI Studio.
2. Go to Deployments. If you don't have one, create a deployment with your desired model.
3. Note the Target URI and Key on the right side of the page. Keep this page open, as you'll need this information.
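The Target URI embeds both the deployment name and the API version, so it is worth sanity-checking before you paste it anywhere. A small sketch of how you might pull those pieces out with the Python standard library — the URL below is an illustrative placeholder, not a real endpoint:

```python
from urllib.parse import parse_qs, urlparse

def describe_target_uri(target_uri: str) -> dict:
    """Extract the deployment name and api-version from an Azure OpenAI Target URI."""
    parsed = urlparse(target_uri)
    parts = parsed.path.split("/")
    # The path has the shape /openai/deployments/<deployment>/chat/completions
    deployment = parts[parts.index("deployments") + 1]
    api_version = parse_qs(parsed.query)["api-version"][0]
    return {"deployment": deployment, "api_version": api_version}

# Hypothetical example value:
info = describe_target_uri(
    "https://your-deployment.openai.azure.com/openai/deployments"
    "/gpt-4o/chat/completions?api-version=2024-08-01-preview"
)
```

If the deployment name it reports doesn't match the model you intend to use, you copied the URI from the wrong deployment.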
Configure CrewAI Enterprise Connection
1. Go to CrewAI Enterprise > LLM Connections.
2. Name your LLM Connection, select Azure as the provider, and choose the same model you selected in Azure.
3. Add the environment variable AZURE_DEPLOYMENT_TARGET_URL (using the Target URI). The URL should look like this: https://your-deployment.openai.azure.com/openai/deployments/gpt-4o/chat/completions?api-version=2024-08-01-preview
4. Add the environment variable AZURE_API_KEY (using the Key).
5. Click Add Connection to save your LLM Connection.

Set Default Configuration
In CrewAI Enterprise > Settings > Defaults > Crew Studio LLM Settings, set the new LLM Connection and model as defaults.

Configure Network Access
1. Go to Azure OpenAI > select your deployment.
2. Navigate to Resource Management > Networking.
3. Ensure Allow access from all networks is enabled. If this setting is restricted, CrewAI may be blocked from accessing your Azure OpenAI endpoint.

You're all set! Crew Studio will now use your Azure OpenAI connection. Test the connection by creating a simple crew or task to ensure everything is working properly.
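Besides testing from Crew Studio, you can verify the endpoint and key directly from your own machine, which helps separate Azure-side problems from CrewAI-side ones. A minimal sketch using only the Python standard library; it assumes you have exported AZURE_DEPLOYMENT_TARGET_URL and AZURE_API_KEY locally with the same values you entered above (the prompt and max_tokens value are arbitrary):

```python
import json
import os
import urllib.request

def build_chat_request(target_url: str, api_key: str, prompt: str) -> urllib.request.Request:
    """Build a minimal chat-completions request for an Azure OpenAI deployment.

    Azure OpenAI authenticates with an api-key header (not a Bearer token),
    and the deployment name and api-version are already encoded in the
    Target URI, so the URL needs no extra parameters.
    """
    payload = {"messages": [{"role": "user", "content": prompt}], "max_tokens": 16}
    return urllib.request.Request(
        target_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"api-key": api_key, "Content-Type": "application/json"},
        method="POST",
    )

# Only attempt the live call when both variables are actually set.
if os.environ.get("AZURE_DEPLOYMENT_TARGET_URL") and os.environ.get("AZURE_API_KEY"):
    req = build_chat_request(
        os.environ["AZURE_DEPLOYMENT_TARGET_URL"],
        os.environ["AZURE_API_KEY"],
        "Say hello.",
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        reply = json.loads(resp.read())
        print(resp.status, reply["choices"][0]["message"]["content"])
```

A 401 response here points at the key, a 404 at the Target URI, and a connection timeout at the networking settings above.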
If you encounter issues: