How to monitor cost, latency, and performance of CrewAI Agents using Langtrace, an external observability tool.
Langtrace is an open-source, external tool that helps you set up observability and evaluations for Large Language Models (LLMs), LLM frameworks, and Vector Databases. While not built directly into CrewAI, Langtrace can be used alongside CrewAI to gain deep visibility into the cost, latency, and performance of your CrewAI Agents. This integration allows you to log hyperparameters, monitor performance regressions, and establish a process for continuous improvement of your Agents.
Sign up for Langtrace
Sign up by visiting https://langtrace.ai/signup.
Create a project
Set the project type to CrewAI and generate an API key.
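To keep the key out of source code, one common approach (our suggestion, not a Langtrace requirement) is to export it as an environment variable, which the snippets below read:

```bash
export LANGTRACE_API_KEY="<your-langtrace-api-key>"
```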
Install Langtrace in your CrewAI project
Use the following command:
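Langtrace's Python SDK is published on PyPI as langtrace-python-sdk, so a standard pip install is all that is needed:

```bash
pip install langtrace-python-sdk
```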
Import Langtrace
Import and initialize Langtrace at the beginning of your script, before any CrewAI imports:
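A minimal initialization might look like the following; it assumes the API key was exported as LANGTRACE_API_KEY in the earlier step:

```python
import os

# Initialize Langtrace first so subsequent CrewAI/LLM calls are instrumented.
from langtrace_python_sdk import langtrace

langtrace.init(api_key=os.environ["LANGTRACE_API_KEY"])

# Import CrewAI only after Langtrace has been initialized.
from crewai import Agent, Task, Crew
```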
Once Langtrace is initialized, the following features can be applied to your CrewAI Agents:
LLM Token and Cost Tracking: monitor the tokens consumed and the cost incurred by each LLM call your Agents make.
Trace Graph for Execution Steps: visualize the sequence and latency of each step in a crew's execution.
Dataset Curation with Manual Annotation: curate datasets from captured traces and annotate them manually.
Prompt Versioning and Management: keep track of prompt versions and manage changes over time.
Prompt Playground with Model Comparisons: experiment with prompts and compare outputs across models.
Testing and Evaluations: run tests and evaluations against your Agents' outputs.
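For reference, here is a minimal end-to-end sketch, assuming an OpenAI-backed default LLM configured via OPENAI_API_KEY and a purely illustrative single-agent crew; once it runs, token usage, cost, and the per-step trace graph appear in your Langtrace project dashboard:

```python
import os

# Initialize Langtrace before any CrewAI imports so the run is traced.
from langtrace_python_sdk import langtrace

langtrace.init(api_key=os.environ["LANGTRACE_API_KEY"])

from crewai import Agent, Task, Crew

# Illustrative agent and task; the role, goal, and wording are placeholders.
researcher = Agent(
    role="Researcher",
    goal="Summarize recent developments in LLM observability",
    backstory="You track tooling for monitoring LLM applications.",
    verbose=True,
)

summary_task = Task(
    description="Write a three-bullet summary of why tracing LLM calls matters.",
    expected_output="Three concise bullet points.",
    agent=researcher,
)

crew = Crew(agents=[researcher], tasks=[summary_task])
result = crew.kickoff()
print(result)
```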