Learn how to instrument Azure OpenAI API calls with the LangWatch Python SDK
LangWatch offers two primary ways to integrate:

1. `autotrack_openai_calls()`: This method, part of the LangWatch SDK, dynamically patches your `AzureOpenAI` client instance to capture calls made through it within a specific trace.
2. Community OpenTelemetry instrumentors: registered via the `langwatch.setup()` function, or by using their native `instrument()` methods if you're managing your OpenTelemetry setup more directly.

## Using `autotrack_openai_calls()`

The `autotrack_openai_calls()` function provides a straightforward way to capture all Azure OpenAI calls made with a specific client instance for the duration of the current trace.
You typically call this method on the trace object obtained via `langwatch.get_current_trace()` inside a function decorated with `@langwatch.trace()`. Once enabled, every call made through the tracked `AzureOpenAI` client within that trace is captured automatically.
Key points:

- Call `autotrack_openai_calls()` on the current trace object (obtained via `langwatch.get_current_trace()`).
- The tracking is specific to the `AzureOpenAI` client instance you pass in. If you have multiple clients, you'll need to call it for each one you want to track.
- Ensure your `AzureOpenAI` client is correctly configured with `azure_endpoint`, `api_key`, and `api_version`, and that you use the deployment name for the `model` parameter.

## Using community OpenTelemetry instrumentors

If you use libraries like OpenInference or OpenLLMetry, LangWatch can seamlessly integrate with them. These libraries provide instrumentors that automatically capture data from the `openai` library, which `AzureOpenAI` is part of.
There are two main ways to integrate these:
### 1. Via `langwatch.setup()`

Pass an instance of the instrumentor (e.g. `OpenAIInstrumentor` from OpenInference) to the `instrumentors` list in the `langwatch.setup()` call. LangWatch will then manage the lifecycle of this instrumentor.
Ensure you have the relevant instrumentation library installed (e.g. `pip install openllmetry-instrumentation-openai` or `pip install openinference-instrumentation-openai`). The instrumentor works with `AzureOpenAI` because it's part of the same `openai` Python package.

### 2. Direct instrumentation

If you have an existing OpenTelemetry `TracerProvider` configured in your application (or if LangWatch is configured to use the global provider), you can call the community instrumentor's `instrument()` method directly. LangWatch will automatically pick up the spans generated by these instrumentors as long as its exporter is part of the active `TracerProvider`.
Key points for community instrumentors:

- These instrumentors typically patch the `openai` library at a global level, so calls from any `OpenAI` or `AzureOpenAI` client instance are captured once instrumented.
- If you use `langwatch.setup(instrumentors=[...])`, LangWatch handles the instrumentor's setup.
- If you instrument directly (e.g. `OpenAIInstrumentor().instrument()`), ensure the `TracerProvider` used by the instrumentor is the same one LangWatch exports from. This typically happens automatically if LangWatch initializes the global provider, or if you configure both to use the same explicit provider.

In contrast, `autotrack_openai_calls()` is ideal for targeted instrumentation within specific traces, or when you want fine-grained control over which `AzureOpenAI` client instances are tracked. It's the simpler option if you're not deeply invested in a separate OpenTelemetry setup.