Prerequisites
- Install the LangWatch SDK:
- Install AutoGen and the OpenInference AutoGen instrumentor:
- Set up your LLM provider: You’ll need to configure your preferred LLM provider (OpenAI, Anthropic, etc.) with the appropriate API keys.
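For example, with pip (the package names below are assumptions to verify against the LangWatch and OpenInference docs; note that AutoGen has historically been published on PyPI as pyautogen):

```shell
# Install the LangWatch SDK
pip install langwatch

# Install AutoGen and the OpenInference AutoGen instrumentor
pip install pyautogen openinference-instrumentation-autogen
```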
Instrumentation with OpenInference
LangWatch supports seamless observability for AutoGen using the OpenInference AutoGen instrumentor. This approach automatically captures traces from your AutoGen agents and sends them to LangWatch.

Basic Setup (Automatic Tracing)
Here’s the simplest way to instrument your application:
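A minimal sketch of the automatic setup; the openinference.instrumentation.autogen import path is an assumption based on OpenInference’s package conventions, so check the instrumentor’s own docs:

```python
import langwatch
# Import path assumed from OpenInference package naming conventions
from openinference.instrumentation.autogen import AutoGenInstrumentor

# One call initializes the SDK, wires up the OpenTelemetry exporter,
# and activates the instrumentor. Reads LANGWATCH_API_KEY from the environment.
langwatch.setup(instrumentors=[AutoGenInstrumentor()])

# From here on, your AutoGen agents are traced automatically, e.g.:
# assistant = autogen.AssistantAgent("assistant", llm_config=llm_config)
```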
Optional: Using Decorators for Additional Context
If you want to add additional context or metadata to your traces, you can optionally use the @langwatch.trace() decorator:
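A sketch of the decorator usage; only @langwatch.trace() itself comes from this guide, while the function name and body here are illustrative assumptions:

```python
import langwatch
# Import path assumed from OpenInference package naming conventions
from openinference.instrumentation.autogen import AutoGenInstrumentor

langwatch.setup(instrumentors=[AutoGenInstrumentor()])

@langwatch.trace()  # groups the spans below under one named trace
def run_support_conversation(question: str):
    # Run your AutoGen conversation here as usual; spans emitted by the
    # instrumentor (LLM calls, tool executions, messages) are collected
    # under this trace.
    ...
```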
How it Works
- langwatch.setup(): Initializes the LangWatch SDK, which includes setting up an OpenTelemetry trace exporter. This exporter is ready to receive spans from any OpenTelemetry-instrumented library in your application.
- AutoGenInstrumentor(): The OpenInference instrumentor automatically patches AutoGen components to create OpenTelemetry spans for their operations, including:
  - Agent initialization
  - Multi-agent conversations
  - LLM calls
  - Tool executions
  - Code execution
  - Message passing between agents
- Optional decorators: You can optionally use @langwatch.trace() to add additional context and metadata to your traces, but it’s not required for basic functionality.
Notes
- You do not need to set any OpenTelemetry environment variables or configure exporters manually; langwatch.setup() handles everything.
- You can combine AutoGen instrumentation with other instrumentors (e.g., OpenAI, LangChain) by adding them to the instrumentors list.
- The @langwatch.trace() decorator is optional; the OpenInference instrumentor will capture all AutoGen activity automatically.
- For advanced configuration (custom attributes, endpoint, etc.), see the Python integration guide.
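As a sketch of combining instrumentors, a setup with both AutoGen and OpenAI instrumentation might look like this (both import paths are assumptions based on OpenInference package naming; verify against the respective packages):

```python
import langwatch
# Both import paths assumed from OpenInference package naming conventions
from openinference.instrumentation.autogen import AutoGenInstrumentor
from openinference.instrumentation.openai import OpenAIInstrumentor

# Every instrumentor in the list sends its spans through the same
# exporter configured by langwatch.setup().
langwatch.setup(instrumentors=[AutoGenInstrumentor(), OpenAIInstrumentor()])
```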
Troubleshooting
- Make sure your LANGWATCH_API_KEY is set in the environment.
- If you see no traces in LangWatch, check that the instrumentor is included in langwatch.setup() and that your agent code is being executed.
- Ensure you have the correct API keys set for your chosen LLM provider.
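For example (placeholder values shown):

```shell
export LANGWATCH_API_KEY="your-langwatch-api-key"  # placeholder
export OPENAI_API_KEY="your-openai-api-key"        # placeholder, if using OpenAI
```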