OpenTelemetry is a standard protocol for tracing, and LangWatch is fully compatible with it: you can use any OpenTelemetry-compatible library to capture your LLM traces and send them to LangWatch.

This guide demonstrates the OpenTelemetry integration using Python, but the same principles apply to OpenTelemetry instrumentation in other languages.

Prerequisites

You will need a LangWatch API key, exported as the LANGWATCH_API_KEY environment variable.

Installation

pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http

Configuration

Set up LangWatch as the OpenTelemetry exporter endpoint:

import os
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

# Set up OpenTelemetry trace provider with LangWatch as the endpoint
tracer_provider = trace_sdk.TracerProvider()
tracer_provider.add_span_processor(
    SimpleSpanProcessor(
        OTLPSpanExporter(
            endpoint="https://app.langwatch.ai/api/otel/v1/traces",
            headers={"Authorization": "Bearer " + os.environ["LANGWATCH_API_KEY"]},
        )
    )
)
# Optionally, you can also print the spans to the console.
tracer_provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
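
Optionally, you can also register this provider as the global tracer provider, so that instrumentations that are not passed a tracer_provider explicitly still export to LangWatch (a minimal sketch using the standard OpenTelemetry API):

from opentelemetry import trace

# Register the LangWatch-backed provider as the global default
trace.set_tracer_provider(tracer_provider)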

Capturing LLM Traces

Currently, there are several open initiatives for LLM instrumentation libraries. Below is an example of how to capture LLM traces with one of them, OpenInference.

Installation:

pip install openinference-instrumentation-openai

Then, instrument your OpenAI calls:

from openinference.instrumentation.openai import OpenAIInstrumentor

OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)

That’s it! You can now see the traces for your OpenAI calls in the LangWatch dashboard.
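
For example, an ordinary OpenAI call like the sketch below will now show up as a trace in LangWatch (the model name and prompt are just illustrative):

import os
from openai import OpenAI

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

# This call is captured automatically by the OpenAI instrumentor
response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name
    messages=[{"role": "user", "content": "Write a haiku about tracing."}],
)
print(response.choices[0].message.content)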

Capturing Metadata

You can use OpenInference’s using_attributes context manager to capture additional information for your LLM calls, such as the user_id, session_id (equivalent to the thread id), tags, and metadata:

from openinference.instrumentation import using_attributes

def main():
  with using_attributes(
      session_id="my-test-session",
      user_id="my-test-user",
      tags=["tag-1", "tag-2"],
      metadata={"foo": "bar"},
  ):
      # Your LLM call (see the complete example below)
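
Putting it together, here is a minimal runnable sketch, reusing the instrumented OpenAI client from the previous example (the attribute values and model name are placeholders):

import os
from openai import OpenAI
from openinference.instrumentation import using_attributes

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

def main():
    with using_attributes(
        session_id="my-test-session",  # maps to the LangWatch thread id
        user_id="my-test-user",
        tags=["tag-1", "tag-2"],
        metadata={"foo": "bar"},
    ):
        # Any instrumented LLM call made inside this block carries the attributes above
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # illustrative model name
            messages=[{"role": "user", "content": "Hello!"}],
        )
        return response.choices[0].message.content

if __name__ == "__main__":
    print(main())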