LangWatch offers robust integration with Anthropic, automatically capturing detailed information about your Claude API calls. The recommended approach is the OpenInference instrumentation library, which provides comprehensive tracing for Anthropic API calls and integrates seamlessly with LangWatch.

Using OpenInference Instrumentation

OpenInference ships an `AnthropicInstrumentor` that patches the Anthropic SDK and emits OpenTelemetry spans, which LangWatch collects alongside its own traces.
Installation and Setup
If you prefer broader OpenTelemetry-based instrumentation, or are already using libraries like OpenInference or OpenLLMetry, LangWatch can seamlessly integrate with them. These libraries provide instrumentors that automatically capture data from various LLM providers, including Anthropic.

There are two main ways to integrate these:

1. Via `langwatch.setup()`

You can pass an instance of the instrumentor (e.g., `AnthropicInstrumentor` from OpenInference or OpenLLMetry) to the `instrumentors` list in the `langwatch.setup()` call. LangWatch will then manage the lifecycle of this instrumentor.
```python
import langwatch
from anthropic import Anthropic
import os

# Example using OpenInference's AnthropicInstrumentor
from openinference.instrumentation.anthropic import AnthropicInstrumentor

# Initialize LangWatch with the AnthropicInstrumentor
langwatch.setup(
    instrumentors=[AnthropicInstrumentor()]
)

client = Anthropic(api_key=os.getenv("ANTHROPIC_API_KEY"))

@langwatch.trace(name="Anthropic Call with Community Instrumentor")
def generate_text_with_community_instrumentor(prompt: str):
    # No need to call autotrack explicitly; the community instrumentor
    # handles Anthropic calls globally.
    response = client.messages.create(
        model="claude-3-5-sonnet-20241022",
        max_tokens=1024,
        messages=[{"role": "user", "content": prompt}]
    )
    return response.content[0].text

if __name__ == "__main__":
    user_query = "Tell me a joke about Python programming."
    response = generate_text_with_community_instrumentor(user_query)
    print(f"User: {user_query}")
    print(f"AI: {response}")
```
Ensure you have the respective community instrumentation library installed (e.g., `pip install openinference-instrumentation-anthropic` or `pip install opentelemetry-instrumentation-anthropic`).
2. Direct Instrumentation

If you have an existing OpenTelemetry `TracerProvider` configured in your application (or if LangWatch is configured to use the global provider), you can call the community instrumentor's `instrument()` method directly. LangWatch will automatically pick up the spans these instrumentors generate, as long as its exporter is part of the active `TracerProvider`.
```python
import langwatch
from anthropic import Anthropic
import os

from openinference.instrumentation.anthropic import AnthropicInstrumentor

langwatch.setup()

client = Anthropic(api_key=os.getenv("ANTHROPIC_API_KEY"))

# Instrument Anthropic directly using the community library
AnthropicInstrumentor().instrument()

@langwatch.trace(name="Anthropic Call with Direct Community Instrumentation")
def get_story_ending(beginning: str):
    # Note: the Anthropic API takes the system prompt as a top-level
    # `system` parameter, not as a message with role "system".
    response = client.messages.create(
        model="claude-3-5-sonnet-20241022",
        max_tokens=1024,
        system="You are a creative writer. Complete the story.",
        messages=[
            {"role": "user", "content": beginning}
        ]
    )
    return response.content[0].text

if __name__ == "__main__":
    story_start = "In a land of dragons and wizards, a young apprentice found a mysterious map..."
    ending = get_story_ending(story_start)
    print(f"Story Start: {story_start}")
    print(f"AI's Ending: {ending}")
```
- These instrumentors often patch Anthropic at a global level, meaning all Anthropic calls from any client instance will be captured once instrumented.
- If using `langwatch.setup(instrumentors=[...])`, LangWatch handles the setup.
- If instrumenting directly (e.g., `AnthropicInstrumentor().instrument()`), ensure that the `TracerProvider` used by the instrumentor is the same one LangWatch is exporting from. This usually means LangWatch is configured to use an existing global provider, or one you explicitly pass to `langwatch.setup()`.
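The "global" capture described in the first bullet comes from patching at the class level. A stdlib-only sketch (with a hypothetical `FakeClient` standing in for the Anthropic SDK, not any real library code) shows why a single `instrument()` call covers every client instance: method lookup happens on the class at call time, so all instances route through the wrapper.

```python
# Stdlib-only sketch of class-level patching, the mechanism these
# instrumentors rely on. FakeClient is a hypothetical stand-in.
captured = []

class FakeClient:
    def create(self, prompt):
        return f"response to {prompt!r}"

# Patch the method on the class, as an instrumentor would.
_original_create = FakeClient.create

def _traced_create(self, prompt):
    captured.append(prompt)  # record the call (a span, in real tracing)
    return _original_create(self, prompt)

FakeClient.create = _traced_create

# Both instances are captured, because the wrapped method is looked up
# on the class each time create() is called.
a, b = FakeClient(), FakeClient()
a.create("hello")
b.create("world")
print(captured)  # ['hello', 'world']
```

Un-instrumenting works the same way in reverse: restoring the original attribute on the class (`FakeClient.create = _original_create`) stops the capture for all instances at once.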
Advanced Usage Examples

Tool Calling

When using Anthropic's tool calling capabilities, the instrumentation captures both the initial request and the tool execution:
```python
import langwatch
from anthropic import Anthropic
import os

langwatch.setup()

client = Anthropic(api_key=os.getenv("ANTHROPIC_API_KEY"))

@langwatch.trace(name="Anthropic Tool Call")
def get_weather_with_tools(city: str):
    # Anthropic tool definitions use a flat schema with `name`,
    # `description`, and `input_schema` (not the OpenAI-style
    # nested "function" object).
    response = client.messages.create(
        model="claude-3-5-sonnet-20241022",
        max_tokens=1024,
        tools=[{
            "name": "get_weather",
            "description": "Get the current weather for a city",
            "input_schema": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "The city name"}
                },
                "required": ["city"]
            }
        }],
        messages=[{"role": "user", "content": f"What's the weather like in {city}?"}]
    )
    return response

if __name__ == "__main__":
    result = get_weather_with_tools("San Francisco")
    print(f"Response: {result}")
```
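The example above stops at the model's `tool_use` response; in a full loop you would execute the requested tool and send a `tool_result` message back. The dispatch step can be sketched without a live API call by stubbing a response dict that mirrors the shape of an Anthropic Message (`stop_reason`, content blocks with `type`/`name`/`input`); `get_weather` here is a hypothetical local implementation of the tool, not a library function.

```python
# Sketch of dispatching a tool call from a (stubbed) Claude response.
# The dict mirrors the shape of an Anthropic Message with
# stop_reason "tool_use"; no API call is made.
stub_response = {
    "stop_reason": "tool_use",
    "content": [
        {"type": "text", "text": "Let me check the weather."},
        {"type": "tool_use", "id": "toolu_01", "name": "get_weather",
         "input": {"city": "San Francisco"}},
    ],
}

def get_weather(city: str) -> str:
    # Hypothetical local implementation of the declared tool.
    return f"Sunny in {city}"

TOOLS = {"get_weather": get_weather}

def run_tools(response: dict) -> list:
    """Execute each tool_use block and build tool_result blocks to send back."""
    results = []
    if response["stop_reason"] == "tool_use":
        for block in response["content"]:
            if block["type"] == "tool_use":
                output = TOOLS[block["name"]](**block["input"])
                results.append({
                    "type": "tool_result",
                    "tool_use_id": block["id"],
                    "content": output,
                })
    return results

print(run_tools(stub_response))
# [{'type': 'tool_result', 'tool_use_id': 'toolu_01', 'content': 'Sunny in San Francisco'}]
```

In a real loop, the `tool_result` blocks are appended as a `{"role": "user", "content": results}` message and the conversation is sent back to `messages.create`; the instrumentation then records both round trips under the same trace.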
Streaming Responses
For streaming responses, the instrumentation captures the entire streaming session:
```python
import langwatch
from anthropic import Anthropic
import os

langwatch.setup()

client = Anthropic(api_key=os.getenv("ANTHROPIC_API_KEY"))

@langwatch.trace(name="Anthropic Streaming")
def stream_response(prompt: str):
    with client.messages.stream(
        model="claude-3-5-sonnet-20241022",
        max_tokens=1024,
        messages=[{"role": "user", "content": prompt}]
    ) as stream:
        for text in stream.text_stream:
            print(text, end="", flush=True)
    print()  # New line after streaming

if __name__ == "__main__":
    stream_response("Write a short story about a robot learning to paint.")
```
Which Approach to Choose?

- `autotrack_anthropic_calls()` is ideal for targeted instrumentation within specific traces, or when you want fine-grained control over which Anthropic client instances are tracked. It’s simpler if you’re not deeply invested in a separate OpenTelemetry setup.
- Community instrumentors are powerful if you’re already using OpenTelemetry, want to capture Anthropic calls globally across your application, or need to instrument other libraries alongside Anthropic with a consistent OpenTelemetry approach. They provide a more holistic observability solution when you have multiple OpenTelemetry-instrumented components.

Choose the method that best fits your existing setup and instrumentation needs. Both approaches send Anthropic call data to LangWatch for monitoring and analysis.