LangWatch can instrument Anthropic’s Claude models through its OpenAI middleware by pointing the OpenAI client at Anthropic’s OpenAI-compatible API. This lets you capture detailed traces of your Claude API calls with minimal code changes.

Setup

The key to this integration is configuring the openai.Client to point at Anthropic’s API endpoint and authenticate with your Anthropic API key, which you can find in your Anthropic dashboard. Set it as an environment variable:
export ANTHROPIC_API_KEY="your-anthropic-api-key"
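
If you want your application to fail fast when the key is missing, you can add a small guard before constructing the client. This is an optional sketch using only the standard library:

// Optional: stop early with a clear message instead of failing on the first API call.
apiKey := os.Getenv("ANTHROPIC_API_KEY")
if apiKey == "" {
	log.Fatal("ANTHROPIC_API_KEY is not set")
}

You can then pass apiKey to oaioption.WithAPIKey instead of calling os.Getenv inline, as the example below does.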

Example

When creating your openai.Client, configure it with Anthropic’s base URL and your API key. It’s also a best practice to set the gen_ai.system attribute to "anthropic" for clear identification in LangWatch.
The following example assumes you have already configured the LangWatch SDK. See the Go setup guide for details. For detailed configuration options and middleware settings, see the OpenAI Instrumentation section in the reference.
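If tracing is not set up yet, one way to do it is to register an OpenTelemetry tracer provider that exports spans to LangWatch over OTLP. The sketch below is illustrative rather than the canonical setup: the endpoint URL and the LANGWATCH_API_KEY variable are assumptions, so follow the Go setup guide for the exact configuration.

import (
	"context"
	"os"

	"go.opentelemetry.io/otel"
	"go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracehttp"
	sdktrace "go.opentelemetry.io/otel/sdk/trace"
)

// setupTracing registers a global tracer provider that sends spans to LangWatch.
// Call it at the start of main and defer tp.Shutdown(ctx) before exiting.
func setupTracing(ctx context.Context) (*sdktrace.TracerProvider, error) {
	exporter, err := otlptracehttp.New(ctx,
		// Assumed LangWatch OTLP traces endpoint; see the Go setup guide.
		otlptracehttp.WithEndpointURL("https://app.langwatch.ai/api/otel/v1/traces"),
		otlptracehttp.WithHeaders(map[string]string{
			"Authorization": "Bearer " + os.Getenv("LANGWATCH_API_KEY"),
		}),
	)
	if err != nil {
		return nil, err
	}

	tp := sdktrace.NewTracerProvider(sdktrace.WithBatcher(exporter))
	otel.SetTracerProvider(tp)
	return tp, nil
}

With a tracer provider registered, the middleware in the example below emits a span for each Claude call.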
package main

import (
	"context"
	"log"
	"os"

	"github.com/langwatch/langwatch/sdk-go"
	otelopenai "github.com/langwatch/langwatch/sdk-go/instrumentation/openai"
	"github.com/openai/openai-go"
	oaioption "github.com/openai/openai-go/option"
)

func main() {
	ctx := context.Background()

	client := openai.NewClient(
		// Use the Anthropic API endpoint
		oaioption.WithBaseURL("https://api.anthropic.com/v1"),

		// Use your Anthropic API key
		oaioption.WithAPIKey(os.Getenv("ANTHROPIC_API_KEY")),

		// Add the middleware, identifying the system as "anthropic"
		oaioption.WithMiddleware(otelopenai.Middleware("my-anthropic-app",
			otelopenai.WithGenAISystem("anthropic"),
			otelopenai.WithCaptureInput(),
			otelopenai.WithCaptureOutput(),
		)),
	)

	// Make a call to a Claude model
	response, err := client.Chat.Completions.New(ctx, openai.ChatCompletionNewParams{
		Model: "claude-3-5-sonnet-20241022",
		Messages: []openai.ChatCompletionMessageParamUnion{
			openai.UserMessage("Hello, Claude! What are you?"),
		},
		MaxTokens: openai.Int(100), // required by Anthropic
	})

	if err != nil {
		log.Fatalf("Anthropic API call failed: %v", err)
	}

	log.Printf("Response from Claude: %s", response.Choices[0].Message.Content)
}
When using the OpenAI-compatible endpoint, set the MaxTokens parameter on your requests: Anthropic requires max_tokens, unlike the OpenAI API, where it is optional.
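Because the middleware is configured with WithCaptureInput and WithCaptureOutput and identifies the system as "anthropic", the resulting trace in LangWatch includes the request and response messages and is attributed to Anthropic rather than OpenAI.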