Protip: want to get started even faster? Copy our llms.txt and ask an AI to do this integration for you.
Prerequisites
Before you begin, ensure you have:
- Go 1.19 or later installed on your system
- A LangWatch account at app.langwatch.ai
- An OpenAI API key (or other LLM provider key)
- Basic familiarity with Go and OpenTelemetry concepts
If you’re new to OpenTelemetry, don’t worry! The LangWatch SDK handles most of the complexity for you. You only need to understand the basic concepts of traces and spans.
Setup
Get started in just a few minutes by installing the SDK and instrumenting your application.

1. Get your LangWatch API Key
Sign up at app.langwatch.ai and find your API key in your project settings. Set it as an environment variable:
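For example, in a POSIX shell (replace the placeholder with your actual key):

```shell
# Make the key available to your application's process
export LANGWATCH_API_KEY="your-api-key-here"
```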
You can verify your API key is set by running `echo $LANGWATCH_API_KEY`.
2. Install SDK Packages
Add the required dependencies to your Go module:
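For example (the module paths below are illustrative assumptions; confirm them against the SDK's installation instructions):

```shell
# Module paths are illustrative -- check the official docs for the exact ones
go get github.com/langwatch/langwatch/sdk-go
go get github.com/openai/openai-go
```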
Verify installation by running `go mod tidy` and checking that all dependencies are resolved.

3. Configure OpenTelemetry
Set up the LangWatch exporter in your application initialization:
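The SDK provides a helper for this; as an illustrative sketch of what it does under the hood, the same wiring can be done with the standard OpenTelemetry Go SDK by pointing an OTLP/HTTP exporter at the LangWatch trace endpoint. This sketch assumes the API key is sent as a Bearer token, which you should confirm in the SDK reference.

```go
package main

import (
	"context"
	"log"
	"os"

	"go.opentelemetry.io/otel"
	"go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracehttp"
	sdktrace "go.opentelemetry.io/otel/sdk/trace"
)

// setupTracing configures an OTLP/HTTP exporter pointed at LangWatch and
// returns a shutdown function that flushes any buffered spans.
func setupTracing(ctx context.Context) (func(context.Context) error, error) {
	exporter, err := otlptracehttp.New(ctx,
		// Trace-specific endpoint (see the Environment Variables section).
		otlptracehttp.WithEndpointURL("https://app.langwatch.ai/api/otel/v1/traces"),
		// Assumption: the API key is passed as a Bearer token header.
		otlptracehttp.WithHeaders(map[string]string{
			"Authorization": "Bearer " + os.Getenv("LANGWATCH_API_KEY"),
		}),
	)
	if err != nil {
		return nil, err
	}
	tp := sdktrace.NewTracerProvider(sdktrace.WithBatcher(exporter))
	otel.SetTracerProvider(tp)
	return tp.Shutdown, nil
}

func main() {
	ctx := context.Background()
	shutdown, err := setupTracing(ctx)
	if err != nil {
		log.Fatal(err)
	}
	defer shutdown(ctx) // flush any remaining traces on exit
	// ... your application ...
}
```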
Always call the shutdown function to ensure traces are flushed when your application exits.
4. Instrument Your OpenAI Client
Add the LangWatch middleware to your OpenAI client:
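As an illustrative sketch (the instrumentation import path and `Middleware` function name are assumptions; use the names from the SDK's API reference):

```go
package main

import (
	"github.com/openai/openai-go"
	"github.com/openai/openai-go/option"

	// Assumption: import path and package name for the LangWatch OpenAI
	// instrumentation -- confirm both in the Go SDK API Reference.
	otelopenai "github.com/langwatch/langwatch/sdk-go/instrumentation/openai"
)

func newInstrumentedClient() {
	// option.WithMiddleware wraps every request/response, letting the
	// instrumentation record model, token usage, and streaming output.
	client := openai.NewClient(
		option.WithMiddleware(otelopenai.Middleware("my-app")),
	)
	_ = client // use the client as you normally would
}
```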
The middleware automatically captures all OpenAI API calls, including streaming responses, token usage, and model information.
5. Create Your First Trace
Start a root span to capture your LLM interaction:
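A minimal sketch using the plain OpenTelemetry tracer API (which the SDK wraps): starting a span with no active span in the context creates the trace's root, and spans started from that context become its children.

```go
package main

import (
	"context"

	"go.opentelemetry.io/otel"
	"go.opentelemetry.io/otel/attribute"
)

func handleRequest(ctx context.Context, userQuery string) {
	tracer := otel.Tracer("my-app")

	// No active span in ctx yet, so this span becomes the trace's root.
	ctx, rootSpan := tracer.Start(ctx, "handle-user-request")
	defer rootSpan.End()
	rootSpan.SetAttributes(attribute.String("user.query", userQuery))

	// ctx now carries the root span, so this is nested under it.
	_, llmSpan := tracer.Start(ctx, "llm-call")
	defer llmSpan.End()
	// ... make your OpenAI call here ...
}
```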
View your traces at app.langwatch.ai within seconds of making your first API call.
Complete Working Example
Here’s a minimal working example that combines all the setup steps.

Core Concepts

The Go SDK is designed to feel familiar to anyone who has used OpenTelemetry. It provides a thin wrapper to simplify LangWatch-specific functionality.
- Each message triggering your LLM pipeline as a whole is captured with a Trace.
- A Trace contains multiple Spans, which are the steps inside your pipeline.
- Traces can be grouped into a conversation by assigning a common `thread_id`.
- Use User ID to track individual user interactions.
- Apply Labels for custom categorization and filtering.
For detailed API documentation of all available methods and options, see the Go SDK API Reference.
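A sketch of attaching these concepts as span attributes. The attribute keys below are assumptions for illustration; confirm the exact keys and any helper methods in the Go SDK API Reference.

```go
package main

import (
	"context"

	"go.opentelemetry.io/otel"
	"go.opentelemetry.io/otel/attribute"
)

func chatTurn(ctx context.Context) {
	ctx, span := otel.Tracer("my-app").Start(ctx, "chat-turn")
	defer span.End()

	// Assumption: attribute keys shown here are illustrative.
	span.SetAttributes(
		attribute.String("langwatch.thread_id", "conversation-1234"), // groups traces into one conversation
		attribute.String("langwatch.user_id", "user-5678"),           // per-user tracking
		attribute.StringSlice("langwatch.labels", []string{"beta"}),  // custom filtering
	)
	_ = ctx
}
```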
Creating a Trace
A trace represents a single, end-to-end task, like handling a user request. You create a trace by starting a “root” span. All other spans created within its context will be nested under it. Before creating traces, ensure you have configured the SDK as shown in the Setup Guide.

For complete API documentation of the `Tracer()` and `LangWatchSpan` methods, see the Core SDK section in the reference.

Creating Nested Spans
To instrument specific parts of your pipeline (like a RAG query or a tool call), create nested spans within an active trace.

The context `ctx` is crucial. It carries the active span information, ensuring that `tracer.Start()` correctly creates a nested span instead of a new trace.

Integrations
LangWatch offers integrations with OpenAI and any other OpenAI-compatible provider. See the dedicated guides for more details:
- Anthropic - Claude models via OpenAI-compatible API
- OpenAI - GPT models and OpenAI API
- Azure OpenAI - Azure-hosted OpenAI models
- Groq - Fast inference with Groq API
- Google Gemini - Google’s Gemini models
- Ollama - Local model inference
- OpenRouter - Multi-provider model access
For detailed configuration options and middleware settings, see the OpenAI Instrumentation section in the API reference.
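Because these providers speak the OpenAI API, the same instrumented client works against them by overriding the base URL. A sketch using Ollama's local OpenAI-compatible endpoint (the base URL and dummy key shown are Ollama conventions; other providers use their own URL and a real key):

```go
package main

import (
	"github.com/openai/openai-go"
	"github.com/openai/openai-go/option"
)

func newOllamaClient() {
	client := openai.NewClient(
		option.WithBaseURL("http://localhost:11434/v1"), // Ollama's OpenAI-compatible endpoint
		option.WithAPIKey("ollama"),                     // Ollama ignores the key, but the client requires one
	)
	_ = client // add the LangWatch middleware option here as in the setup steps
}
```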
Environment Variables
The SDK respects these environment variables for configuration:

| Variable | Description |
|---|---|
| LANGWATCH_API_KEY | Required. Your LangWatch project API key. |
| LANGWATCH_ENDPOINT | The LangWatch collector endpoint. Defaults to https://app.langwatch.ai. |
| OTEL_EXPORTER_OTLP_TRACES_ENDPOINT | A standard OpenTelemetry variable that can also be used to set the endpoint URL. If set, it overrides LANGWATCH_ENDPOINT. Must be set to the trace-specific endpoint https://app.langwatch.ai/api/otel/v1/traces. |
Features
- 🔗 Seamless OpenTelemetry integration - Works with your existing OTel setup
- 🚀 OpenAI instrumentation - Automatic tracing for OpenAI API calls
- 🌐 Multi-provider support - OpenAI, Anthropic, Azure, local models, and more
- 📊 Rich LLM telemetry - Capture inputs, outputs, token usage, and model information
- 🔍 Specialized span types - LLM, Chain, Tool, Agent, RAG, and more
- 🧵 Thread support - Group related LLM interactions together
- 📝 Custom input/output recording - Fine-grained control over what’s captured
- 🔄 Streaming support - Real-time capture of streaming responses
Examples
Explore real working examples to learn different patterns and use cases:

| Example | What It Shows | Description |
|---|---|---|
| Simple | Basic OpenAI instrumentation | Simple chat completion with tracing |
| Custom Input/Output | Recording custom data | Fine-grained control over captured data |
| Streaming | Streaming completions | Real-time capture of streaming responses |
| Threads | Grouping conversations | Managing multi-turn conversations |
| RAG | Retrieval patterns | Document retrieval and context tracking |
For comprehensive code examples including RAG pipelines, error handling, and best practices, see the Complete Example section in the API reference.
Troubleshooting
Common Issues
No traces appearing in LangWatch dashboard:
- Verify your LANGWATCH_API_KEY is set correctly
- Check that you’re calling the shutdown function to flush traces
- Ensure your application is making OpenAI API calls
- Run `go mod tidy` to ensure all dependencies are properly resolved
- Verify you’re using Go 1.19 or later
- Check that the LangWatch endpoint URL is correct
- Verify your API key has the correct permissions
Getting Help
If you’re still having issues:
- Check the API reference for detailed function documentation
- Review the OpenTelemetry Go documentation
- Join our community support for additional help
Next Steps
- Learn about specific OpenAI integration patterns
- Check the API reference for detailed documentation
- Explore span types for specialized LLM operations
- Review collected attributes for comprehensive tracing data