Prerequisites
- Install the LangWatch SDK.
- Install LlamaIndex and the OpenInference LlamaIndex instrumentor.
- Set up your LLM provider: configure your preferred LLM provider (OpenAI, Anthropic, etc.) with the appropriate API keys.
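Assuming the standard PyPI distribution names, the two installation steps above might be:

```shell
# LangWatch Python SDK
pip install langwatch

# LlamaIndex plus the OpenInference instrumentor for it
pip install llama-index openinference-instrumentation-llama-index
```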
Instrumentation with OpenInference
LangWatch supports seamless observability for LlamaIndex using the OpenInference LlamaIndex instrumentor. This approach automatically captures traces from your LlamaIndex applications and sends them to LangWatch.
Basic Setup (Automatic Tracing)
Here's the simplest way to instrument your application:
Optional: Using Decorators for Additional Context
If you want to add additional context or metadata to your traces, you can optionally use the @langwatch.trace() decorator:
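As a minimal sketch of both steps above: this assumes the `llama-index` and `openinference-instrumentation-llama-index` packages are installed, `LANGWATCH_API_KEY` and your LLM provider key are set in the environment, and a `./data` directory with documents exists. Import paths can vary between LlamaIndex versions.

```python
import langwatch
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from openinference.instrumentation.llama_index import LlamaIndexInstrumentor

# Initialize LangWatch and register the OpenInference instrumentor.
# This reads LANGWATCH_API_KEY from the environment and configures
# the OpenTelemetry trace exporter for you.
langwatch.setup(instrumentors=[LlamaIndexInstrumentor()])

# From here on, LlamaIndex operations (document loading, indexing,
# querying, LLM calls, retrieval) are traced automatically.
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()

# Optional: the decorator groups everything inside the function under
# one LangWatch trace, so you can attach extra context or metadata.
@langwatch.trace()
def answer(question: str) -> str:
    return str(query_engine.query(question))

print(answer("What is this document about?"))
```

The decorator is purely additive here; removing it would not change what the instrumentor captures, only how the spans are grouped.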
How it Works
- langwatch.setup(): Initializes the LangWatch SDK, which includes setting up an OpenTelemetry trace exporter. This exporter is ready to receive spans from any OpenTelemetry-instrumented library in your application.
- LlamaIndexInstrumentor(): The OpenInference instrumentor automatically patches LlamaIndex components to create OpenTelemetry spans for their operations, including:
  - Document loading and processing
  - Index creation and updates
  - Query execution
  - LLM calls
  - Retrieval operations
- Optional decorators: You can optionally use @langwatch.trace() to add additional context and metadata to your traces, but it's not required for basic functionality.
Notes
- You do not need to set any OpenTelemetry environment variables or configure exporters manually; langwatch.setup() handles everything.
- You can combine LlamaIndex instrumentation with other instrumentors (e.g., OpenAI, LangChain) by adding them to the instrumentors list.
- The @langwatch.trace() decorator is optional; the OpenInference instrumentor will capture all LlamaIndex activity automatically.
- For advanced configuration (custom attributes, endpoint, etc.), see the Python integration guide.
Troubleshooting
- Make sure your LANGWATCH_API_KEY is set in the environment.
- If you see no traces in LangWatch, check that the instrumentor is included in langwatch.setup() and that your LlamaIndex code is actually being executed.
- Ensure you have the correct API keys set for your chosen LLM provider.