LangWatch traces Azure OpenAI API calls with the same otelopenai middleware used for the standard OpenAI API; the only Azure-specific steps are pointing the client at your Azure endpoint and using your Azure API key.

Installation

go get github.com/langwatch/langwatch/sdk-go github.com/openai/openai-go
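If you also set up the OpenTelemetry exporter sketched in the Usage section below, fetch the OTel SDK and OTLP exporter modules as well:

go get go.opentelemetry.io/otel go.opentelemetry.io/otel/sdk go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracehttp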

Usage

The LangWatch API key is read from the LANGWATCH_API_KEY environment variable by default.
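The middleware emits standard OpenTelemetry spans, so your application needs a tracer provider that exports to LangWatch. If you don't have one yet, here is a minimal sketch using the stock OTLP/HTTP exporter; the endpoint URL is an assumption based on LangWatch's OTLP ingestion and setupTracing is a hypothetical helper name, so verify both against your LangWatch setup:

import (
	"context"
	"os"

	"go.opentelemetry.io/otel"
	"go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracehttp"
	sdktrace "go.opentelemetry.io/otel/sdk/trace"
)

// setupTracing wires a global tracer provider that exports spans to
// LangWatch over OTLP/HTTP, authenticated with LANGWATCH_API_KEY.
func setupTracing(ctx context.Context) (*sdktrace.TracerProvider, error) {
	exporter, err := otlptracehttp.New(ctx,
		// Assumed LangWatch OTLP traces endpoint; check your dashboard.
		otlptracehttp.WithEndpointURL("https://app.langwatch.ai/api/otel/v1/traces"),
		otlptracehttp.WithHeaders(map[string]string{
			"Authorization": "Bearer " + os.Getenv("LANGWATCH_API_KEY"),
		}),
	)
	if err != nil {
		return nil, err
	}
	tp := sdktrace.NewTracerProvider(sdktrace.WithBatcher(exporter))
	otel.SetTracerProvider(tp)
	return tp, nil
}

Call setupTracing at the start of main and defer tp.Shutdown(ctx) so buffered spans are flushed before the process exits.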
Configure the OpenAI client with your Azure endpoint and API key:
package main

import (
	"context"
	"log"
	"os"

	otelopenai "github.com/langwatch/langwatch/sdk-go/instrumentation/openai"
	"github.com/openai/openai-go"
	oaioption "github.com/openai/openai-go/option"
)

func main() {
	ctx := context.Background()

	// The standard OpenAI client works against Azure; only the API key and
	// base URL change.
	client := openai.NewClient(
		oaioption.WithAPIKey(os.Getenv("AZURE_OPENAI_API_KEY")),
		oaioption.WithBaseURL(os.Getenv("AZURE_OPENAI_ENDPOINT")),
		// The LangWatch middleware traces every request; the capture options
		// also record the prompt and completion on the span.
		oaioption.WithMiddleware(otelopenai.Middleware("<project_name>",
			otelopenai.WithCaptureInput(),
			otelopenai.WithCaptureOutput(),
		)),
	)

	response, err := client.Chat.Completions.New(ctx, openai.ChatCompletionNewParams{
		// With Azure, the deployment in the endpoint URL determines which
		// model actually serves the request.
		Model: openai.ChatModelGPT5,
		Messages: []openai.ChatCompletionMessageParamUnion{
			openai.SystemMessage("You are a helpful assistant."),
			openai.UserMessage("Hello, OpenAI!"),
		},
	})
	if err != nil {
		log.Fatalf("Chat completion failed: %v", err)
	}

	log.Printf("Chat completion: %s", response.Choices[0].Message.Content)
}
Set AZURE_OPENAI_ENDPOINT to your Azure OpenAI resource endpoint URL (e.g., https://your-resource.openai.azure.com/openai/deployments/your-deployment).
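To try the example end to end, export the three environment variables the program reads and run it (the resource and deployment names below are placeholders):

export LANGWATCH_API_KEY="your-langwatch-api-key"
export AZURE_OPENAI_API_KEY="your-azure-openai-api-key"
export AZURE_OPENAI_ENDPOINT="https://your-resource.openai.azure.com/openai/deployments/your-deployment"
go run main.go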