Other OpenTelemetry Instrumentors
Learn how to use any OpenTelemetry-compatible instrumentor with LangWatch.
LangWatch is designed to be compatible with the broader OpenTelemetry ecosystem. Beyond the specifically documented integrations, you can use LangWatch with any Python library that has an OpenTelemetry instrumentor, provided that the instrumentor adheres to the standard OpenTelemetry Python `BaseInstrumentor` interface.
Using Custom/Third-Party OpenTelemetry Instrumentors
If you have a specific library you want to trace, and there’s an OpenTelemetry instrumentor available for it (either a community-provided one not yet listed in our specific integrations, or one you’ve developed yourself), you can integrate it with LangWatch.
The key is that the instrumentor should be an instance of a class that inherits from `opentelemetry.instrumentation.instrumentor.BaseInstrumentor`. The official documentation for this base class is available in the OpenTelemetry Python documentation.
Integration via langwatch.setup()
To use such an instrumentor, you simply pass an instance of it to the `instrumentors` list in the `langwatch.setup()` call. LangWatch will then manage its lifecycle, calling its `instrument()` and `uninstrument()` methods appropriately.
Here’s a conceptual example using the OpenTelemetry `LoggingInstrumentor`:
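A minimal sketch, assuming the `opentelemetry-instrumentation-logging` package is installed and your LangWatch credentials are configured (for example via environment variables):

```python
import logging

import langwatch
from opentelemetry.instrumentation.logging import LoggingInstrumentor

# langwatch.setup() manages the instrumentor's lifecycle, calling its
# instrument() and uninstrument() methods for you.
langwatch.setup(
    instrumentors=[LoggingInstrumentor()],
)

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

# With the instrumentor active, log records carry the current trace context,
# so they can be correlated with the spans LangWatch captures.
logger.info("Hello from an instrumented application")
```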
When this code runs, the `LoggingInstrumentor` (managed by `langwatch.setup()`) injects the current trace and span context into records emitted by the standard Python `logging` module, so your application logs can be correlated with the spans LangWatch captures.
Discovering More Community Instrumentors
Many Python libraries, especially in the AI/ML space, are instrumented by community-driven OpenTelemetry projects. If you’re looking for pre-built instrumentors, these are excellent places to start:
- OpenInference (by Arize AI): https://github.com/Arize-ai/openinference
  This project provides instrumentors for a wide range of AI/ML libraries and frameworks. Examples include:
  - OpenAI
  - Anthropic
  - LiteLLM
  - Haystack
  - LlamaIndex
  - LangChain
  - Groq
  - Google Gemini
  - And more (check their repository for the full list).
- OpenLLMetry (by Traceloop): https://github.com/traceloop/openllmetry
  This project also offers a comprehensive suite of instrumentors for LLM applications and related tools. Examples include:
  - OpenAI
  - CrewAI
  - Haystack
  - LangChain
  - LlamaIndex
  - Pinecone
  - ChromaDB
  - And more (explore their repository for details).
You can browse these repositories to find instrumentors for other libraries you might be using. If an instrumentor from these projects (or any other source) adheres to the `BaseInstrumentor` interface, you can integrate it with LangWatch using the `langwatch.setup(instrumentors=[...])` method described above.
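For example, an instrumentor from one of these projects is wired up the same way as any other. A minimal sketch, assuming the `openinference-instrumentation-openai` package (and the OpenAI client it instruments) is installed:

```python
import langwatch
# OpenInference's OpenAI instrumentor, installed with:
#   pip install openinference-instrumentation-openai
from openinference.instrumentation.openai import OpenAIInstrumentor

# LangWatch calls instrument() on the instance, so spans from OpenAI API
# calls are exported alongside the rest of your LangWatch traces.
langwatch.setup(instrumentors=[OpenAIInstrumentor()])
```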
Key Considerations:
- BaseInstrumentor Compliance: Ensure the instrumentor correctly implements the `BaseInstrumentor` interface, particularly the `instrument()` and `uninstrument()` methods, and `instrumentation_dependencies()` (see the sketch after this list).
- Installation: You’ll need to have the custom instrumentor package installed in your Python environment, along with the library it instruments.
- TracerProvider: LangWatch configures an OpenTelemetry `TracerProvider`. The instrumentor, when activated by LangWatch, will use this provider to create spans. If you are managing your OpenTelemetry setup more directly (e.g., providing your own `TracerProvider` to `langwatch.setup()`), the instrumentor will use that instead.
- Data Quality: The quality and detail of the telemetry data captured will depend on how well the custom instrumentor is written.
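To illustrate the compliance point above, here is a hypothetical skeleton of a custom instrumentor. The `mylibrary` package and its patching details are placeholders; only the `BaseInstrumentor` hooks come from the OpenTelemetry interface (subclasses implement the underscore-prefixed `_instrument()`/`_uninstrument()` hooks, which the base class invokes from its public methods):

```python
from typing import Collection

from opentelemetry import trace
from opentelemetry.instrumentation.instrumentor import BaseInstrumentor


class MyLibraryInstrumentor(BaseInstrumentor):
    """Hypothetical instrumentor for an imaginary `mylibrary` package."""

    def instrumentation_dependencies(self) -> Collection[str]:
        # Declare the package(s) this instrumentor targets.
        return ("mylibrary >= 1.0",)

    def _instrument(self, **kwargs):
        # Called by the public instrument() method. Patch the library's entry
        # points here and create spans with the globally configured tracer
        # (backed by the TracerProvider that langwatch.setup() configures).
        self._tracer = trace.get_tracer(__name__)
        # ... wrap mylibrary's calls so each one opens a span ...

    def _uninstrument(self, **kwargs):
        # Undo whatever _instrument() patched.
        pass
```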
By leveraging the `BaseInstrumentor` interface, LangWatch remains flexible and extensible, allowing you to bring telemetry from a wide array of Python libraries into your observability dashboard.