Azure OpenAI Instrumentation
Learn how to instrument Azure OpenAI API calls with the LangWatch Python SDK
LangWatch offers robust integration with Azure OpenAI, allowing you to capture detailed information about your LLM calls automatically. There are two primary approaches to instrumenting your Azure OpenAI interactions:
- Using `autotrack_openai_calls()`: This method, part of the LangWatch SDK, dynamically patches your `AzureOpenAI` client instance to capture calls made through it within a specific trace.
- Using Community OpenTelemetry Instrumentors: Leverage existing OpenTelemetry instrumentation libraries, such as those from OpenInference or OpenLLMetry. These can be integrated with LangWatch either by passing them to the `langwatch.setup()` function or by using their native `instrument()` methods if you're managing your OpenTelemetry setup more directly.
This guide will walk you through both methods.
Using `autotrack_openai_calls()`
The `autotrack_openai_calls()` function provides a straightforward way to capture all Azure OpenAI calls made with a specific client instance for the duration of the current trace.
You typically call this method on the trace object obtained via `langwatch.get_current_trace()` inside a function decorated with `@langwatch.trace()`, as in the sketch below.
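Here is a minimal sketch of that pattern. The endpoint, API key, API version, and deployment name are placeholders to replace with your own values, and `LANGWATCH_API_KEY` is assumed to be set in the environment:

```python
import langwatch
from openai import AzureOpenAI

langwatch.setup()  # assumes LANGWATCH_API_KEY is set in the environment

# Placeholder Azure configuration -- substitute your own endpoint, key,
# API version, and deployment name
client = AzureOpenAI(
    azure_endpoint="https://your-resource.openai.azure.com/",
    api_key="YOUR_AZURE_OPENAI_API_KEY",
    api_version="2024-02-01",
)

@langwatch.trace()
def generate_reply(prompt: str) -> str:
    # Patch this specific client instance for the duration of the current trace
    langwatch.get_current_trace().autotrack_openai_calls(client)
    response = client.chat.completions.create(
        model="my-gpt-4o-deployment",  # your Azure deployment name
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content
```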
Key points for `autotrack_openai_calls()` with Azure OpenAI:
- It must be called on an active trace object (e.g., one obtained via `langwatch.get_current_trace()`).
- It instruments a specific instance of the `AzureOpenAI` client. If you have multiple clients, you'll need to call it for each one you want to track.
- Ensure your `AzureOpenAI` client is correctly configured with `azure_endpoint`, `api_key`, and `api_version`, and that you pass the deployment name as the `model` parameter.
Using Community OpenTelemetry Instrumentors
If you prefer to use broader OpenTelemetry-based instrumentation, or are already using libraries like OpenInference or OpenLLMetry, LangWatch can seamlessly integrate with them. These libraries provide instrumentors that automatically capture data from the `openai` library, which `AzureOpenAI` is part of.
There are two main ways to integrate these:
1. Via `langwatch.setup()`
You can pass an instance of the instrumentor (e.g., `OpenAIInstrumentor` from OpenInference) to the `instrumentors` list in the `langwatch.setup()` call. LangWatch will then manage the lifecycle of this instrumentor, as sketched below.
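A minimal sketch using the OpenInference instrumentor (assumes `openinference-instrumentation-openai` is installed and `LANGWATCH_API_KEY` is set; the Azure values are placeholders):

```python
import langwatch
from openai import AzureOpenAI
from openinference.instrumentation.openai import OpenAIInstrumentor

# LangWatch sets up the tracer provider and manages the instrumentor's lifecycle
langwatch.setup(instrumentors=[OpenAIInstrumentor()])

# Placeholder Azure configuration -- substitute your own values
client = AzureOpenAI(
    azure_endpoint="https://your-resource.openai.azure.com/",
    api_key="YOUR_AZURE_OPENAI_API_KEY",
    api_version="2024-02-01",
)

@langwatch.trace()
def generate_reply(prompt: str) -> str:
    # No per-client patching needed: the instrumentor captures openai calls globally
    response = client.chat.completions.create(
        model="my-gpt-4o-deployment",  # your Azure deployment name
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content
```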
Ensure you have the respective community instrumentation library installed (e.g., `pip install openinference-instrumentation-openai` for OpenInference, or `pip install opentelemetry-instrumentation-openai` for OpenLLMetry). The instrumentor works with `AzureOpenAI` because it's part of the same `openai` Python package.
2. Direct Instrumentation
If you have an existing OpenTelemetry `TracerProvider` configured in your application (or if LangWatch is configured to use the global provider), you can call the community instrumentor's `instrument()` method directly. LangWatch will automatically pick up the spans generated by these instrumentors, as long as its exporter is part of the active `TracerProvider`.
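A sketch of this direct approach, again assuming the OpenInference package (OpenTelemetry instrumentors attach to the global `TracerProvider` by default):

```python
import langwatch
from openinference.instrumentation.openai import OpenAIInstrumentor

# LangWatch initializes (or attaches to) the tracer provider and its exporter
langwatch.setup()

# Instrument the openai library directly; its spans flow through the same
# provider that LangWatch exports from
OpenAIInstrumentor().instrument()
```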
Key points for community instrumentors with Azure OpenAI:
- These instrumentors often patch the `openai` library at a global level, meaning all calls from any `OpenAI` or `AzureOpenAI` client instance will be captured once instrumented.
- If using `langwatch.setup(instrumentors=[...])`, LangWatch handles the instrumentor's setup.
- If instrumenting directly (e.g., `OpenAIInstrumentor().instrument()`), ensure that the `TracerProvider` used by the instrumentor is the same one LangWatch is exporting from. This typically happens automatically if LangWatch initializes the global provider, or if you configure both to use the same explicit provider.
Which Approach to Choose?
- `autotrack_openai_calls()` is ideal for targeted instrumentation within specific traces, or when you want fine-grained control over which `AzureOpenAI` client instances are tracked. It's simpler if you're not deeply invested in a separate OpenTelemetry setup.
- Community instrumentors are powerful if you're already using OpenTelemetry, want to capture Azure OpenAI calls globally across your application, or need to instrument other libraries alongside Azure OpenAI with a consistent OpenTelemetry approach. They provide a more holistic observability solution when you have multiple OpenTelemetry-instrumented components.
Choose the method that best fits your existing setup and instrumentation needs. Both approaches effectively send Azure OpenAI call data to LangWatch for monitoring and analysis.