Integrate Datadog with CrewAI
This guide will demonstrate how to integrate Datadog LLM Observability with CrewAI using Datadog auto-instrumentation. By the end of this guide, you will be able to submit LLM Observability traces to Datadog and view your CrewAI agent runs in Datadog LLM Observability’s Agentic Execution View.
What is Datadog LLM Observability?
Datadog LLM Observability helps AI engineers, data scientists, and application developers quickly develop, evaluate, and monitor LLM applications. It lets you confidently improve output quality, performance, and cost, and manage overall risk, using structured experiments, end-to-end tracing across AI agents, and evaluations.
Getting Started
Install Dependencies
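A minimal sketch of the install step, assuming a pip-based environment; crewai provides the agent framework and ddtrace provides Datadog's tracer and auto-instrumentation:

```bash
pip install crewai ddtrace
```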
Set Environment Variables
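The exact values depend on your setup; the sketch below assumes agentless submission (traces sent directly to Datadog, with no local Datadog Agent), the default datadoghq.com site, and OpenAI-backed agents, with crewai-demo as a placeholder ML Application name:

```bash
# Datadog LLM Observability configuration (agentless mode)
export DD_API_KEY=<YOUR_DATADOG_API_KEY>
export DD_SITE=datadoghq.com            # change if your org uses a different Datadog site
export DD_LLMOBS_ENABLED=1
export DD_LLMOBS_ML_APP=crewai-demo     # your ML Application name
export DD_LLMOBS_AGENTLESS_ENABLED=1

# LLM provider credentials (assuming OpenAI-backed agents)
export OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>
```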
If you do not have a Datadog API key, you can create a Datadog account and generate one. You will also need to specify an ML Application name in the environment variables above. An ML Application is a grouping of LLM Observability traces associated with a specific LLM-based application; see the ML Application Naming Guidelines for limitations on ML Application names.
Create a CrewAI Agent Application
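A minimal sketch of a single-agent, single-task crew, saved here as app.py; the agent, task, and file name are illustrative, and any CrewAI application can be traced the same way:

```python
# app.py - a minimal CrewAI application to trace with Datadog LLM Observability
from crewai import Agent, Crew, Task

# One agent with a simple role; the model is resolved from your provider credentials
researcher = Agent(
    role="Travel Researcher",
    goal="Find interesting activities for a weekend trip",
    backstory="You are an experienced travel researcher with deep local knowledge.",
)

# One task assigned to that agent
itinerary_task = Task(
    description="Put together a one-day itinerary for Paris with three activities.",
    expected_output="A short itinerary listing three activities with a one-line reason for each.",
    agent=researcher,
)

# Assemble the crew and run it
crew = Crew(agents=[researcher], tasks=[itinerary_task])

if __name__ == "__main__":
    result = crew.kickoff()
    print(result)
```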
Run the Application with Datadog Auto-Instrumentation
With the environment variables set, you can now run the application with Datadog auto-instrumentation.
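Assuming the application above is saved as app.py and the environment variables are exported in the same shell, wrapping the process with ddtrace-run enables auto-instrumentation, so the CrewAI and LLM provider calls are traced without code changes:

```bash
ddtrace-run python app.py
```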
View the Traces in Datadog
After running the application, you can view the traces in Datadog LLM Observability’s Traces View by selecting the ML Application name you chose from the top-left dropdown. Clicking a trace shows its details, including total tokens used, number of LLM calls, models used, and estimated cost. Clicking into a specific span narrows these details and shows the related input, output, and metadata.

