This guide walks you through connecting Azure OpenAI with Crew Studio for seamless enterprise AI operations.

Setup Process

Step 1: Access Azure OpenAI Studio

  1. In Azure, go to Azure AI Services > select your Azure OpenAI resource > open Azure OpenAI Studio.
  2. On the left menu, click Deployments. If you don’t have one, create a deployment with your desired model.
  3. Once created, select your deployment and locate the Target URI and Key on the right side of the page. Keep this page open, as you’ll need this information.
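For reference, the Target URI for a chat-capable deployment generally follows this pattern (your resource name, deployment name, and API version will differ):

```text
https://<your-resource>.openai.azure.com/openai/deployments/<your-deployment>/chat/completions?api-version=<api-version>
```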
Step 2: Configure CrewAI Enterprise Connection

  1. In another tab, open CrewAI Enterprise > LLM Connections. Name your LLM Connection, select Azure as the provider, and choose the same model you selected in Azure.
  2. On the same page, add the environment variables using the Target URI and Key you copied in step 1 (see the example after this list).
  3. Click Add Connection to save your LLM Connection.
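As a sketch, the two values from step 1 typically map onto the connection's environment variables like this; the variable names below are placeholders, so use whatever names the LLM Connection form asks for:

```text
AZURE_DEPLOYMENT_TARGET_URL=<Target URI copied from Azure OpenAI Studio>
AZURE_API_KEY=<Key copied from Azure OpenAI Studio>
```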
Step 3: Set Default Configuration

  1. In CrewAI Enterprise > Settings > Defaults > Crew Studio LLM Settings, set the new LLM Connection and model as defaults.
Step 4: Configure Network Access

  1. Verify the resource's network access settings:
    • In Azure, go to Azure OpenAI > select your Azure OpenAI resource.
    • Navigate to Resource Management > Networking.
    • Ensure that Allow access from all networks is enabled. If this setting is restricted, CrewAI may be blocked from accessing your Azure OpenAI endpoint.

Verification

You’re all set! Crew Studio will now use your Azure OpenAI connection. Test the connection by creating a simple crew or task to ensure everything is working properly.
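
If you want to double-check the Azure side independently of Crew Studio, the minimal sketch below calls the deployment directly over its REST API. It assumes the Target URI and Key from step 1 are exported under the placeholder variable names AZURE_DEPLOYMENT_TARGET_URL and AZURE_API_KEY, and that the requests package is installed.

```python
import os
import requests

# Target URI and Key copied from Azure OpenAI Studio (step 1).
# The environment variable names are placeholders; export them however you like.
target_uri = os.environ["AZURE_DEPLOYMENT_TARGET_URL"]
api_key = os.environ["AZURE_API_KEY"]

# Azure OpenAI authenticates REST calls with an "api-key" header.
response = requests.post(
    target_uri,
    headers={"api-key": api_key, "Content-Type": "application/json"},
    json={
        "messages": [{"role": "user", "content": "Reply with the word: pong"}],
        "max_tokens": 10,
    },
    timeout=30,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```

A successful response confirms that the endpoint, key, deployment, and network settings are all in order, which also covers most of the troubleshooting items below.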

Troubleshooting

If you encounter issues:

  • Verify the Target URI matches the pattern shown in step 1, including the /chat/completions path and the api-version query parameter
  • Check that the API key is correct and has proper permissions
  • Ensure network access is configured to allow CrewAI connections
  • Confirm the deployment model matches what you’ve configured in CrewAI