Learn how to instrument OpenAI API calls with the LangWatch Go SDK using middleware.
The LangWatch Go SDK instruments the official `openai-go` client library through a dedicated middleware. This approach captures detailed information about your OpenAI API calls, including requests, responses, token usage, and streaming data, without requiring manual tracing around each call.

To enable it, add the `otelopenai.Middleware` to your OpenAI client's configuration. The middleware will then automatically create detailed trace spans for each API call.
The `Middleware` function accepts a required `instrumentationName` string followed by optional configuration functions to customize its behavior.
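As a minimal sketch, wiring the middleware into an `openai-go` client might look like the following. The `otelopenai` import path shown here is an assumption (check the LangWatch Go SDK module documentation for the exact path); `option.WithMiddleware` is the standard `openai-go` hook for request middleware.

```go
package main

import (
	"github.com/openai/openai-go"
	"github.com/openai/openai-go/option"

	// Import path assumed; consult the LangWatch Go SDK docs for the exact module.
	otelopenai "github.com/langwatch/langwatch/sdk-go/instrumentation/openai"
)

func main() {
	client := openai.NewClient(
		option.WithAPIKey("YOUR_OPENAI_API_KEY"),
		// "my-service" is the required instrumentationName; optional
		// configuration functions would follow it as extra arguments.
		option.WithMiddleware(otelopenai.Middleware("my-service")),
	)
	_ = client // use the client as normal; calls are now traced
}
```

Once the middleware is attached, every call made through this client is traced automatically, with no per-call instrumentation code.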
The optional configuration functions let you:

- Capture the full request body in the `llm.request.body` span attribute.
- Capture the full response body in the `llm.response.body` span attribute. For streams, this contains the final, accumulated response.
- Set the `gen_ai.system` attribute on spans, which is useful for identifying the underlying model provider. Defaults to `"openai"`.
- Set the `TracerProvider` to use. Defaults to the global provider.

The middleware supports the following OpenAI APIs:

| API | Support Level | Details |
|---|---|---|
| Chat Completions | Full | Captures request/response, token usage, model info. |
| Chat Completions Streaming | Full | Accumulates streaming chunks into a final response. |
| Embeddings | Full | Captures input text and embedding vector dimensions. |
| Images | Partial (Input) | Captures image generation prompts and parameters. |
| Audio | Partial (Input) | Captures audio generation request parameters. |
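To illustrate how middleware-based capture works in general, here is a self-contained sketch using only the standard library. It is not the SDK's actual code: the `attrs` map stands in for span attributes, and `captureBody` plays the role the middleware's request-body capture does when it populates `llm.request.body`.

```go
package main

import (
	"bytes"
	"fmt"
	"io"
	"net/http"
	"net/http/httptest"
)

// roundTripFunc adapts a plain function to http.RoundTripper.
type roundTripFunc func(*http.Request) (*http.Response, error)

func (f roundTripFunc) RoundTrip(r *http.Request) (*http.Response, error) { return f(r) }

// captureBody wraps a transport and records the outgoing request body,
// analogous to how a tracing middleware would populate the
// llm.request.body span attribute. Illustrative sketch only.
func captureBody(next http.RoundTripper, attrs map[string]string) http.RoundTripper {
	return roundTripFunc(func(req *http.Request) (*http.Response, error) {
		if req.Body != nil {
			body, err := io.ReadAll(req.Body)
			if err != nil {
				return nil, err
			}
			attrs["llm.request.body"] = string(body)
			// Restore the body so the real request still carries it.
			req.Body = io.NopCloser(bytes.NewReader(body))
		}
		return next.RoundTrip(req)
	})
}

func demo() string {
	attrs := map[string]string{}

	// A stub server plays the role of the OpenAI API.
	srv := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprint(w, `{"ok":true}`)
	}))
	defer srv.Close()

	client := &http.Client{Transport: captureBody(http.DefaultTransport, attrs)}
	resp, err := client.Post(srv.URL, "application/json",
		bytes.NewBufferString(`{"model":"gpt-4o"}`))
	if err != nil {
		panic(err)
	}
	resp.Body.Close()

	return attrs["llm.request.body"]
}

func main() {
	fmt.Println(demo()) // prints the captured request body
}
```

The key design point is that the capture happens at the transport layer, so the application code issuing the request stays unchanged, which is exactly why no manual tracing is needed around each API call.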
Captured span attributes include:

- `gen_ai.system` (e.g., `"openai"`)
- `gen_ai.request.model`
- `gen_ai.request.temperature`
- `gen_ai.request.top_p`
- `gen_ai.request.top_k`
- `gen_ai.request.frequency_penalty`