CrewAI
Learn how to instrument the CrewAI Python SDK with LangWatch.
LangWatch does not have a built-in auto-tracking integration for CrewAI. However, you can use community-provided instrumentors to integrate CrewAI with LangWatch.
Community Instrumentors
There are two main community instrumentors available for CrewAI:
- OpenLLMetry, via the opentelemetry-instrumentation-crewai package
- OpenInference, via the openinference-instrumentation-crewai package
To use these instrumentors with LangWatch, you configure them to export telemetry data via OpenTelemetry, which LangWatch then ingests.
Integrating Community Instrumentors with LangWatch
Community-provided OpenTelemetry instrumentors for CrewAI, like those from OpenLLMetry or OpenInference, allow you to automatically capture detailed trace data from your CrewAI agents and tasks. LangWatch can seamlessly integrate with these instrumentors.
There are two main ways to integrate these:
1. Via langwatch.setup()
You can pass an instance of the CrewAI instrumentor to the instrumentors list in the langwatch.setup() call. LangWatch will then manage the lifecycle of this instrumentor.
Ensure you have the respective community instrumentation library installed:
- For OpenLLMetry: pip install opentelemetry-instrumentation-crewai
- For OpenInference: pip install openinference-instrumentation-crewai
Consult each library's documentation to confirm the exact package name and instrumentor class, as these may change between releases.
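A minimal sketch of this approach, assuming the OpenInference package exposes a CrewAIInstrumentor class (verify the import path and class name against the library you actually install; the agent and task fields shown are illustrative):

```python
import langwatch
# Assumed import path for the OpenInference instrumentor; OpenLLMetry's
# equivalent would live under opentelemetry.instrumentation.crewai.
from openinference.instrumentation.crewai import CrewAIInstrumentor

from crewai import Agent, Crew, Task

# LangWatch sets up its OpenTelemetry exporter and manages the
# instrumentor's lifecycle for you.
langwatch.setup(
    api_key="sk-lw-...",  # or set the LANGWATCH_API_KEY environment variable
    instrumentors=[CrewAIInstrumentor()],
)

researcher = Agent(
    role="Researcher",
    goal="Summarize the latest LLM observability practices",
    backstory="An analyst who tracks LLM tooling.",
)
task = Task(
    description="Write a short summary of LLM observability.",
    expected_output="A single paragraph.",
    agent=researcher,
)
crew = Crew(agents=[researcher], tasks=[task])

# Spans for agents, tasks, and tool calls are exported to LangWatch.
result = crew.kickoff()
```

Because LangWatch owns the instrumentor here, you do not need to call instrument() yourself or wire up a TracerProvider manually.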
2. Direct Instrumentation
If you have an existing OpenTelemetry TracerProvider configured in your application (or if LangWatch is configured to use the global provider), you can use the community instrumentor's instrument() method directly. LangWatch will automatically pick up the spans generated by these instrumentors as long as its exporter is part of the active TracerProvider.
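A sketch of the direct approach, again assuming a CrewAIInstrumentor class; the tracer_provider keyword is a common convention across OpenTelemetry instrumentors, but check the specific library's docs:

```python
import langwatch
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider

# Assumed import path for the OpenLLMetry instrumentor.
from opentelemetry.instrumentation.crewai import CrewAIInstrumentor

# Create and register a global TracerProvider before setting up LangWatch,
# so LangWatch attaches its exporter to this same provider.
provider = TracerProvider()
trace.set_tracer_provider(provider)

langwatch.setup(api_key="sk-lw-...")  # or LANGWATCH_API_KEY in the environment

# Instrument CrewAI against the same provider; its spans now flow
# through the exporter LangWatch registered.
CrewAIInstrumentor().instrument(tracer_provider=provider)
```

The important invariant is that the instrumentor and LangWatch share one TracerProvider; if they end up on different providers, the CrewAI spans are generated but never reach LangWatch.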
Key points for community instrumentors:
- These instrumentors typically patch CrewAI at a global level or integrate deeply with its execution flow, meaning all CrewAI operations (agents, tasks, tools) should be captured once instrumented.
- If using langwatch.setup(instrumentors=[...]), LangWatch handles the setup and lifecycle of the instrumentor.
- If instrumenting directly (e.g., CrewAIInstrumentor().instrument()), ensure that the TracerProvider used by the instrumentor is the same one LangWatch is exporting from. This usually means LangWatch is configured to use an existing global provider or one you explicitly pass to langwatch.setup().
- Always refer to the specific documentation of the community instrumentor (OpenLLMetry or OpenInference) for the most accurate and up-to-date installation and usage instructions, including the correct class names for instrumentors and any specific setup requirements.