LangWatch provides automatic instrumentation for the official openai-go client library through a dedicated middleware that captures detailed information about your OpenAI API calls.
Installation
go get github.com/langwatch/langwatch/sdk-go github.com/openai/openai-go
Usage
By default, the LangWatch API key is read from the LANGWATCH_API_KEY environment variable.
Add the otelopenai.Middleware to your OpenAI client configuration. The middleware automatically creates detailed trace spans for each API call.
package main

import (
	"context"
	"log"
	"os"

	otelopenai "github.com/langwatch/langwatch/sdk-go/instrumentation/openai"
	"github.com/openai/openai-go"
	oaioption "github.com/openai/openai-go/option"
)

func main() {
	ctx := context.Background()

	// Create an OpenAI client with the LangWatch middleware attached.
	client := openai.NewClient(
		oaioption.WithAPIKey(os.Getenv("OPENAI_API_KEY")),
		oaioption.WithMiddleware(otelopenai.Middleware("<project_name>",
			// Record full request and response payloads on each span.
			otelopenai.WithCaptureInput(),
			otelopenai.WithCaptureOutput(),
		)),
	)

	response, err := client.Chat.Completions.New(ctx, openai.ChatCompletionNewParams{
		Model: openai.ChatModelGPT5,
		Messages: []openai.ChatCompletionMessageParamUnion{
			openai.SystemMessage("You are a helpful assistant."),
			openai.UserMessage("Hello, OpenAI!"),
		},
	})
	if err != nil {
		log.Fatalf("Chat completion failed: %v", err)
	}

	log.Printf("Chat completion: %s", response.Choices[0].Message.Content)
}
The middleware automatically captures request and response content, token usage, and model information. Streaming responses are fully supported: the middleware accumulates the streamed chunks and records them as a single completed span.
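As a sketch of what a streaming call might look like with the same instrumented client: the client construction mirrors the example above, and the ChatCompletionAccumulator helper from openai-go is used here only to rebuild the full message client-side (the prompt text and <project_name> placeholder are illustrative).

```go
package main

import (
	"context"
	"fmt"
	"log"
	"os"

	otelopenai "github.com/langwatch/langwatch/sdk-go/instrumentation/openai"
	"github.com/openai/openai-go"
	oaioption "github.com/openai/openai-go/option"
)

func main() {
	ctx := context.Background()

	client := openai.NewClient(
		oaioption.WithAPIKey(os.Getenv("OPENAI_API_KEY")),
		oaioption.WithMiddleware(otelopenai.Middleware("<project_name>",
			otelopenai.WithCaptureInput(),
			otelopenai.WithCaptureOutput(),
		)),
	)

	// Start a streaming chat completion; the middleware accumulates the
	// streamed chunks and records them on the span when the stream ends.
	stream := client.Chat.Completions.NewStreaming(ctx, openai.ChatCompletionNewParams{
		Model: openai.ChatModelGPT5,
		Messages: []openai.ChatCompletionMessageParamUnion{
			openai.UserMessage("Stream a short greeting."),
		},
	})

	// Accumulate chunks client-side as well, printing deltas as they arrive.
	acc := openai.ChatCompletionAccumulator{}
	for stream.Next() {
		chunk := stream.Current()
		acc.AddChunk(chunk)
		if len(chunk.Choices) > 0 {
			fmt.Print(chunk.Choices[0].Delta.Content)
		}
	}
	if err := stream.Err(); err != nil {
		log.Fatalf("Streaming failed: %v", err)
	}

	// The accumulator now holds the assembled completion.
	log.Printf("Full response: %s", acc.Choices[0].Message.Content)
}
```

This requires valid OPENAI_API_KEY and LANGWATCH_API_KEY values at runtime; the traced span carries the accumulated output rather than one span per chunk.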