OpenTelemetry Integration Guide
Use OpenTelemetry to capture LLM traces and send them to LangWatch
OpenTelemetry is a standard protocol for tracing, and LangWatch is fully compatible with it: you can use any OpenTelemetry-compatible library to capture your LLM traces and send them to LangWatch.
This guide demonstrates the OpenTelemetry integration using Python, but the same principles apply to OpenTelemetry instrumentation in other languages.
Prerequisites
- Obtain your LANGWATCH_API_KEY from the LangWatch dashboard.
Installation
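If you are setting up OpenTelemetry from scratch, a typical baseline is the OpenTelemetry API, the SDK, and an OTLP/HTTP exporter (these are the standard OpenTelemetry distributions, not LangWatch-specific packages):

```bash
pip install opentelemetry-api opentelemetry-sdk opentelemetry-exporter-otlp-proto-http
```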
Configuration
Set up LangWatch as the OpenTelemetry exporter endpoint:
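Below is a minimal sketch using the Python SDK. The endpoint URL and header format are assumptions based on LangWatch’s OTLP/HTTP traces endpoint; confirm the exact values in your LangWatch dashboard.

```python
import os

from opentelemetry import trace
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor

# Export spans over OTLP/HTTP to LangWatch, authenticating with your API key
exporter = OTLPSpanExporter(
    endpoint="https://app.langwatch.ai/api/otel/v1/traces",  # assumed endpoint; check your dashboard
    headers={"Authorization": f"Bearer {os.environ['LANGWATCH_API_KEY']}"},
)

# Register a global tracer provider that batches spans before sending them
provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)
```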
Capturing LLM Traces
Currently, there are different open initiatives for LLM instrumentation libraries. Here we show some examples of how to capture LLM traces with two of them, OpenInference and OpenLLMetry.
OpenInference
OpenAI
Installation:
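For example, with pip:

```bash
pip install openinference-instrumentation-openai
```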
Then, instrument your OpenAI calls:
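A minimal sketch; once instrumented, every OpenAI call is traced through the tracer provider configured above:

```python
from openinference.instrumentation.openai import OpenAIInstrumentor

# Patches the openai client so each request/response becomes a span
OpenAIInstrumentor().instrument()
```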
That’s it! You can now see the traces for your OpenAI calls in the LangWatch dashboard.
Capturing Metadata
You can use OpenInference’s using_attributes context manager to capture additional information for your LLM calls, such as the user_id, session_id (equivalent to a thread id), tags, and metadata. The same context manager works with any of the OpenInference instrumentors in this guide.
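A minimal sketch, with placeholder values:

```python
from openinference.instrumentation import using_attributes

with using_attributes(
    user_id="user-123",
    session_id="thread-456",  # shows up as the thread id in LangWatch
    tags=["staging"],
    metadata={"experiment": "prompt-v2"},
):
    # Any instrumented LLM call made inside this block carries the attributes above
    ...
```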
AWS Bedrock
Installation:
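For example, with pip:

```bash
pip install openinference-instrumentation-bedrock
```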
Then, instrument your AWS Bedrock calls:
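A minimal sketch:

```python
from openinference.instrumentation.bedrock import BedrockInstrumentor

# Patches boto3 Bedrock runtime clients so model invocations become spans
BedrockInstrumentor().instrument()
```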
That’s it! You can now see the traces for your AWS Bedrock calls in the LangWatch dashboard.
DSPy
Installation:
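For example, with pip:

```bash
pip install openinference-instrumentation-dspy
```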
Then, instrument your DSPy calls:
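A minimal sketch:

```python
from openinference.instrumentation.dspy import DSPyInstrumentor

# Traces DSPy modules and the LM calls they make
DSPyInstrumentor().instrument()
```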
That’s it! You can now see the traces for your DSPy calls in the LangWatch dashboard.
Haystack
Installation:
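For example, with pip:

```bash
pip install openinference-instrumentation-haystack
```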
Then, instrument your Haystack calls:
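A minimal sketch:

```python
from openinference.instrumentation.haystack import HaystackInstrumentor

# Traces Haystack pipelines and their component calls
HaystackInstrumentor().instrument()
```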
That’s it! You can now see the traces for your Haystack calls in the LangWatch dashboard.
LangChain
Installation:
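For example, with pip:

```bash
pip install openinference-instrumentation-langchain
```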
Then, instrument your LangChain calls:
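A minimal sketch:

```python
from openinference.instrumentation.langchain import LangChainInstrumentor

# Traces LangChain runs (chains, tools, LLM calls) as spans
LangChainInstrumentor().instrument()
```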
That’s it! You can now see the traces for your LangChain calls in the LangWatch dashboard.
CrewAI
Installation:
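For example, with pip; the LangChain instrumentor is the one that applies here:

```bash
pip install openinference-instrumentation-langchain
```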
Then, instrument your LangChain calls. CrewAI uses LangChain under the hood, so instrumenting LangChain covers both:
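A minimal sketch, assuming CrewAI’s LLM calls flow through LangChain:

```python
from openinference.instrumentation.langchain import LangChainInstrumentor

# CrewAI drives LangChain internally, so LangChain spans cover the crew's LLM calls too
LangChainInstrumentor().instrument()
```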
That’s it! You can now see the traces for your CrewAI calls in the LangWatch dashboard.
Autogen
Installation:
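This example assumes Autogen’s completions go through the openai client, so the OpenAI instrumentor is what gets installed:

```bash
pip install openinference-instrumentation-openai
```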
Note: The Autogen integration is currently experimental and may have limitations or unexpected behavior.
Then, instrument Autogen:
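A minimal sketch under that assumption:

```python
from openinference.instrumentation.openai import OpenAIInstrumentor

# Autogen calls the OpenAI client internally, so instrumenting it captures the agents' LLM calls
OpenAIInstrumentor().instrument()
```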
That’s it! You can now see the traces from inside Autogen in the LangWatch dashboard.
OpenLLMetry
OpenAI
Installation:
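For example, with pip (OpenLLMetry ships its instrumentors as opentelemetry-instrumentation-* packages):

```bash
pip install opentelemetry-instrumentation-openai
```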
Then, instrument your OpenAI calls:
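A minimal sketch:

```python
from opentelemetry.instrumentation.openai import OpenAIInstrumentor

# Patches the openai client; spans flow to the exporter configured earlier
OpenAIInstrumentor().instrument()
```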
That’s it! You can now see the traces for your OpenAI calls in the LangWatch dashboard.
Anthropic
Installation:
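For example, with pip:

```bash
pip install opentelemetry-instrumentation-anthropic
```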
Then, instrument your Anthropic calls:
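A minimal sketch:

```python
from opentelemetry.instrumentation.anthropic import AnthropicInstrumentor

# Traces Anthropic SDK calls as spans
AnthropicInstrumentor().instrument()
```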
That’s it! You can now see the traces for your Anthropic calls in the LangWatch dashboard.
LangChain
Installation:
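For example, with pip:

```bash
pip install opentelemetry-instrumentation-langchain
```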
Then, instrument your LangChain calls:
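A minimal sketch, assuming OpenLLMetry’s LangchainInstrumentor class name:

```python
from opentelemetry.instrumentation.langchain import LangchainInstrumentor

# Traces LangChain chains, tools, and LLM calls as spans
LangchainInstrumentor().instrument()
```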
That’s it! You can now see the traces for your LangChain calls in the LangWatch dashboard.
LlamaIndex
Installation:
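For example, with pip:

```bash
pip install opentelemetry-instrumentation-llamaindex
```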
Then, instrument your LlamaIndex calls:
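A minimal sketch:

```python
from opentelemetry.instrumentation.llamaindex import LlamaIndexInstrumentor

# Traces LlamaIndex query and retrieval operations as spans
LlamaIndexInstrumentor().instrument()
```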
That’s it! You can now see the traces for your LlamaIndex calls in the LangWatch dashboard.