Protip: want to get started even faster? Copy our llms.txt and ask an AI to do this integration.
Prerequisites
- Obtain your `LANGWATCH_API_KEY` from the LangWatch dashboard
- Install the OpenTelemetry SDK for your programming language
LangWatch OTEL API Endpoint
LangWatch provides a standard OpenTelemetry Protocol (OTLP) endpoint for receiving traces:
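```
https://app.langwatch.ai/api/otel/v1/traces
```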
General Setup Pattern
The setup follows this general pattern across all languages:
- Install the OpenTelemetry SDK for your language
- Configure the OTLP exporter to point to LangWatch’s endpoint
- Set up authentication using your API key
- Initialize the trace provider with the exporter
- Instrument your LLM calls using available instrumentation libraries
Language-Specific Examples
Examples are available for Java and C#/.NET. The steps below walk through the Java (Maven) setup; the .NET setup follows the same pattern with the OpenTelemetry .NET SDK.
1. Install OpenTelemetry
Add the OpenTelemetry dependencies to your `pom.xml`:
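A minimal sketch of the required dependencies, assuming the OTLP exporter is used over HTTP; the version shown is only an example, so pin whatever matches your project:

```xml
<dependencies>
  <!-- OpenTelemetry SDK for creating and managing spans -->
  <dependency>
    <groupId>io.opentelemetry</groupId>
    <artifactId>opentelemetry-sdk</artifactId>
    <version>1.38.0</version>
  </dependency>
  <!-- OTLP exporter for shipping spans to LangWatch -->
  <dependency>
    <groupId>io.opentelemetry</groupId>
    <artifactId>opentelemetry-exporter-otlp</artifactId>
    <version>1.38.0</version>
  </dependency>
</dependencies>
```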
2. Configure the exporter
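A sketch of wiring the exporter to LangWatch, assuming the API key is read from the `LANGWATCH_API_KEY` environment variable; the class name is illustrative:

```java
import io.opentelemetry.api.OpenTelemetry;
import io.opentelemetry.exporter.otlp.http.trace.OtlpHttpSpanExporter;
import io.opentelemetry.sdk.OpenTelemetrySdk;
import io.opentelemetry.sdk.trace.SdkTracerProvider;
import io.opentelemetry.sdk.trace.export.BatchSpanProcessor;

public class LangWatchTelemetry {
    public static OpenTelemetry init() {
        // OTLP/HTTP exporter pointed at LangWatch, authenticated
        // with a Bearer token built from your API key.
        OtlpHttpSpanExporter exporter = OtlpHttpSpanExporter.builder()
            .setEndpoint("https://app.langwatch.ai/api/otel/v1/traces")
            .addHeader("Authorization",
                "Bearer " + System.getenv("LANGWATCH_API_KEY"))
            .build();

        // Batch processor sends spans asynchronously in the background.
        SdkTracerProvider tracerProvider = SdkTracerProvider.builder()
            .addSpanProcessor(BatchSpanProcessor.builder(exporter).build())
            .build();

        return OpenTelemetrySdk.builder()
            .setTracerProvider(tracerProvider)
            .buildAndRegisterGlobal();
    }
}
```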
3. Instrument your LLM calls
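A minimal sketch of wrapping an LLM call in a span, assuming the SDK was registered globally in step 2; `callChatModel` is a placeholder for your provider call:

```java
import io.opentelemetry.api.GlobalOpenTelemetry;
import io.opentelemetry.api.trace.Span;
import io.opentelemetry.api.trace.Tracer;
import io.opentelemetry.context.Scope;

public class LlmClient {
    private static final Tracer tracer =
        GlobalOpenTelemetry.getTracer("my-llm-app");

    public String chat(String prompt) {
        // One span per LLM call; each shows up as a trace in LangWatch.
        Span span = tracer.spanBuilder("llm.chat").startSpan();
        try (Scope ignored = span.makeCurrent()) {
            return callChatModel(prompt); // placeholder for your provider call
        } finally {
            span.end();
        }
    }

    private String callChatModel(String prompt) {
        return "..."; // call your LLM provider here
    }
}
```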
Available Instrumentation Libraries
LangWatch works with any OpenTelemetry-compatible instrumentation library. Here are some popular options:
Java Libraries
- Spring AI: provides built-in observability support for AI applications, including OpenTelemetry integration for tracing LLM calls and AI operations
- OpenTelemetry Java SDK: create custom spans directly with the SDK
.NET Libraries
- Azure Monitor OpenTelemetry: provides comprehensive OpenTelemetry support for .NET applications, including automatic instrumentation and Azure-specific features
- OpenTelemetry .NET SDK: create custom instrumentation directly with the SDK
Manual Instrumentation
If no automatic instrumentation is available for your LLM provider, you can manually create spans:
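As an illustration, a hand-rolled span with a few descriptive attributes; the `gen_ai.*` names follow the OpenTelemetry GenAI semantic conventions, and the values are placeholders:

```java
import io.opentelemetry.api.GlobalOpenTelemetry;
import io.opentelemetry.api.trace.Span;
import io.opentelemetry.api.trace.StatusCode;
import io.opentelemetry.api.trace.Tracer;
import io.opentelemetry.context.Scope;

public class ManualLlmSpan {
    public String completion(String prompt) {
        Tracer tracer = GlobalOpenTelemetry.getTracer("my-llm-app");
        Span span = tracer.spanBuilder("openai.chat.completion").startSpan();
        try (Scope ignored = span.makeCurrent()) {
            // Describe the call so the trace carries model details.
            span.setAttribute("gen_ai.system", "openai");
            span.setAttribute("gen_ai.request.model", "gpt-4o");
            String result = "..."; // perform the LLM call here
            span.setAttribute("gen_ai.usage.output_tokens", 42);
            return result;
        } catch (RuntimeException e) {
            // Record failures so broken calls are visible in traces.
            span.setStatus(StatusCode.ERROR, e.getMessage());
            span.recordException(e);
            throw e;
        } finally {
            span.end();
        }
    }
}
```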
Environment Variables
Set these environment variables for authentication:
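For example, in a shell; the `OTEL_EXPORTER_OTLP_*` variables are the standard OpenTelemetry SDK settings, useful if you prefer configuring the exporter through the environment rather than in code:

```bash
export LANGWATCH_API_KEY="your-api-key"

# Optional: standard OTLP settings, picked up by SDK autoconfiguration
export OTEL_EXPORTER_OTLP_TRACES_ENDPOINT="https://app.langwatch.ai/api/otel/v1/traces"
export OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer your-api-key"
```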
Verification
After setting up your instrumentation, you can verify that traces are being sent to LangWatch by:
- Making a few LLM calls in your application
- Checking the LangWatch dashboard for incoming traces
- Looking for spans with your service name and LLM call details
Troubleshooting
Traces not appearing in LangWatch
- Verify your API key is correct and has proper permissions
- Check that the endpoint URL is correct: `https://app.langwatch.ai/api/otel/v1/traces`
- Ensure your application is making LLM calls after instrumentation is set up
- Check network connectivity to the LangWatch endpoint
Authentication errors
- Verify the Authorization header format: `Bearer YOUR_API_KEY`
- Ensure the API key is valid and not expired
- Check that the API key has the necessary permissions for trace ingestion
Performance issues
- Consider using batch span processors for high-volume applications (see the sketch after this list)
- Implement sampling to reduce the number of traces sent
- Use async span processors to avoid blocking your application
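A sketch of those three tips together, using the OpenTelemetry Java SDK; the queue, batch, and sampling numbers are illustrative starting points, not recommendations:

```java
import io.opentelemetry.sdk.trace.SdkTracerProvider;
import io.opentelemetry.sdk.trace.export.BatchSpanProcessor;
import io.opentelemetry.sdk.trace.export.SpanExporter;
import io.opentelemetry.sdk.trace.samplers.Sampler;
import java.time.Duration;

public class TracingTuning {
    public static SdkTracerProvider tunedProvider(SpanExporter exporter) {
        return SdkTracerProvider.builder()
            // Batch processor queues spans and exports them
            // asynchronously, off your application's hot path.
            .addSpanProcessor(BatchSpanProcessor.builder(exporter)
                .setMaxQueueSize(2048)
                .setMaxExportBatchSize(512)
                .setScheduleDelay(Duration.ofSeconds(5))
                .build())
            // Head-based sampling: keep roughly 10% of traces.
            .setSampler(Sampler.traceIdRatioBased(0.1))
            .build();
    }
}
```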
Next Steps
- Explore the LangWatch dashboard to view your traces
- Set up custom evaluations for your LLM calls