OpenTelemetry Integration Guide
OpenTelemetry is a standard protocol for tracing, and LangWatch is fully compatible with it: you can use any OpenTelemetry-compatible library to capture your LLM traces and send them to LangWatch.
This guide demonstrates the integration using Python, but the same principles apply to OpenTelemetry instrumentation in other languages.
Prerequisites
- Obtain your `LANGWATCH_API_KEY` from the LangWatch dashboard.
Installation
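A typical Python setup needs the OpenTelemetry SDK and an OTLP exporter. These are the standard OpenTelemetry package names; the instrumentation libraries shown later may also pull them in as dependencies:

```bash
pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http
```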
Configuration
Set up LangWatch as the OpenTelemetry exporter endpoint:
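A minimal sketch of that setup in Python, assuming the OTLP/HTTP endpoint path shown below (confirm the exact URL in your LangWatch dashboard) and a `LANGWATCH_API_KEY` environment variable:

```python
import os

from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Point the OTLP HTTP exporter at LangWatch and authenticate with your API key.
# The endpoint path here is an assumption; check your dashboard for the exact URL.
exporter = OTLPSpanExporter(
    endpoint="https://app.langwatch.ai/api/otel/v1/traces",
    headers={"Authorization": f"Bearer {os.environ['LANGWATCH_API_KEY']}"},
)

# Register a tracer provider that batches spans and ships them to LangWatch.
provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)
```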
Capturing LLM Traces
There are currently several open initiatives for LLM instrumentation libraries; below are examples of how to capture LLM traces with a couple of them.
Installation:
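As one example, OpenInference provides an instrumentor for the OpenAI SDK (the package name below comes from the OpenInference project):

```bash
pip install openinference-instrumentation-openai
```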
Then, instrument your OpenAI calls:
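A minimal sketch, assuming the OpenInference OpenAI instrumentor from the install step above and the exporter configuration from the previous section:

```python
from openinference.instrumentation.openai import OpenAIInstrumentor
from openai import OpenAI

# Register the instrumentor once at startup; it patches the OpenAI client
# so every call emits OpenTelemetry spans through the configured exporter.
OpenAIInstrumentor().instrument()

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# This call is now traced automatically; the model name is a placeholder.
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Write a haiku about tracing."}],
)
print(response.choices[0].message.content)
```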
That’s it! You can now see the traces for your OpenAI calls in the LangWatch dashboard.
Capturing Metadata
You can use OpenInference’s `using_attributes` context manager to capture additional information for your LLM calls, such as `user_id`, `session_id` (equivalent to a thread id), `tags`, and `metadata`:
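A minimal sketch, assuming the instrumented `client` from the previous example; the attribute values are placeholders:

```python
from openinference.instrumentation import using_attributes

# Every call traced inside this block is annotated with the given
# user, session (thread), tags, and metadata.
with using_attributes(
    user_id="user-123",
    session_id="thread-456",
    tags=["production", "experiment-a"],
    metadata={"source": "checkout-page"},
):
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": "Hello!"}],
    )
```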