Agent Monitoring with Arize Phoenix
Learn how to integrate Arize Phoenix with CrewAI via OpenTelemetry using OpenInference
Integrate Arize Phoenix with CrewAI
This guide demonstrates how to integrate Arize Phoenix with CrewAI using OpenTelemetry via the OpenInference SDK. By the end of this guide, you will be able to trace your CrewAI agents and debug them more easily.
What is Arize Phoenix? Arize Phoenix is an LLM observability platform that provides tracing and evaluation for AI applications.
Get Started
We’ll walk through a simple example of building a CrewAI application and instrumenting it with Arize Phoenix via OpenTelemetry and OpenInference.
You can also access this guide on Google Colab.
Step 1: Install Dependencies
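The original notebook's exact package list isn't reproduced here; a typical setup installs CrewAI, its tools package, the Phoenix OTel helper, and the OpenInference CrewAI instrumentor, for example `pip install crewai crewai-tools arize-phoenix-otel openinference-instrumentation-crewai`.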
Step 2: Set Up Environment Variables
Set up your Phoenix Cloud API keys and configure OpenTelemetry to send traces to Phoenix. Phoenix Cloud is a hosted version of Arize Phoenix; you can also point this integration at a self-hosted Phoenix instance.
You can get a free Serper API key from the Serper website; the example crew below uses it for web search.
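A minimal sketch of the environment setup, assuming Phoenix Cloud and the `PHOENIX_*` variable names from the Phoenix docs (adjust the endpoint and header format to match your Phoenix version, or omit them if you self-host):

```python
import os
from getpass import getpass

# Phoenix Cloud credentials; not needed for a local/self-hosted Phoenix instance.
os.environ["PHOENIX_API_KEY"] = getpass("Phoenix API key: ")
os.environ["PHOENIX_CLIENT_HEADERS"] = f"api_key={os.environ['PHOENIX_API_KEY']}"
os.environ["PHOENIX_COLLECTOR_ENDPOINT"] = "https://app.phoenix.arize.com"

# Keys used by the example crew below.
os.environ["OPENAI_API_KEY"] = getpass("OpenAI API key: ")
os.environ["SERPER_API_KEY"] = getpass("Serper API key: ")
```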
Step 3: Initialize OpenTelemetry with Phoenix
Initialize the OpenInference OpenTelemetry instrumentation SDK to start capturing traces and send them to Phoenix.
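A minimal sketch, assuming the packages installed in Step 1; the `crewai-tracing` project name is only a placeholder and should match the project you want to open in the Phoenix UI:

```python
from phoenix.otel import register
from openinference.instrumentation.crewai import CrewAIInstrumentor

# Register an OpenTelemetry tracer provider that exports spans to Phoenix,
# using the endpoint and credentials from the environment variables above.
tracer_provider = register(project_name="crewai-tracing")

# Attach OpenInference's CrewAI instrumentation so agent, task, tool,
# and LLM activity is captured as spans on that tracer provider.
CrewAIInstrumentor().instrument(tracer_provider=tracer_provider)
```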
Step 4: Create a CrewAI Application
We’ll create a CrewAI application where two agents collaborate to research and write a blog post about AI advancements.
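The sketch below follows that shape; the agent roles, backstories, and task text are illustrative rather than the guide's exact copy, and it assumes the `SERPER_API_KEY` and `OPENAI_API_KEY` set in Step 2:

```python
from crewai import Agent, Crew, Process, Task
from crewai_tools import SerperDevTool

# Web search tool backed by the Serper API (reads SERPER_API_KEY from the environment).
search_tool = SerperDevTool()

researcher = Agent(
    role="Senior Research Analyst",
    goal="Uncover the latest advancements in AI",
    backstory="You work at a tech think tank and excel at finding and summarizing emerging research.",
    tools=[search_tool],
    verbose=True,
)

writer = Agent(
    role="Tech Content Strategist",
    goal="Turn research findings into an engaging blog post",
    backstory="You are known for making complex AI topics accessible to a general audience.",
    verbose=True,
)

research_task = Task(
    description="Research the most significant AI advancements from the past year.",
    expected_output="A bullet-point summary of the key advancements with sources.",
    agent=researcher,
)

writing_task = Task(
    description="Using the research summary, write a blog post about recent AI advancements.",
    expected_output="A four-paragraph blog post in markdown.",
    agent=writer,
)

crew = Crew(
    agents=[researcher, writer],
    tasks=[research_task, writing_task],
    process=Process.sequential,  # run the research task, then the writing task
    verbose=True,
)

# Kicking off the crew triggers the instrumented agent, tool, and LLM calls,
# which show up as traces in Phoenix (Step 5).
result = crew.kickoff()
print(result)
```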
Step 5: View Traces in Phoenix
After running the agent, you can view the traces generated by your CrewAI application in Phoenix. You should see detailed steps of the agent interactions and LLM calls, which can help you debug and optimize your AI agents.
Log in to your Phoenix Cloud account and navigate to the project you specified in the project_name parameter. You'll see a timeline view of your trace with all of the agent interactions, tool calls, and LLM calls.
Version Compatibility Information
- Python 3.8+
- CrewAI >= 0.86.0
- Arize Phoenix >= 7.0.1
- OpenTelemetry SDK >= 1.31.0
References
- Phoenix Documentation - Overview of the Phoenix platform.
- CrewAI Documentation - Overview of the CrewAI framework.
- OpenTelemetry Documentation - Overview of OpenTelemetry.
- OpenInference GitHub - Source code for OpenInference SDK.