LangWatch integrates with Strands Agents to automatically capture traces of agent interactions, model calls, and tool executions through OpenTelemetry.
Installation
pip install langwatch strands-agents strands-agents-tools
Usage
By default, LangWatch reads its API key from the LANGWATCH_API_KEY environment variable.
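If you prefer to set the key programmatically rather than in your shell, a minimal sketch is to export it before calling langwatch.setup(); the placeholder value below is an assumption and should come from your own secret management:

import os

# Hypothetical: pull the key from wherever you store secrets and expose it
# so langwatch.setup() can pick it up from the environment.
os.environ["LANGWATCH_API_KEY"] = "<your-langwatch-api-key>"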
Initialize LangWatch and create your Strands Agent. All agent interactions will be automatically traced.
import os

import langwatch
from strands import Agent
from strands.models.litellm import LiteLLMModel

langwatch.setup()


class MyAgent:
    def __init__(self):
        # Configure the model using LiteLLM for provider flexibility
        self.model = LiteLLMModel(
            client_args={"api_key": os.getenv("OPENAI_API_KEY")},
            model_id="openai/gpt-5-mini",
        )

        # Create the agent with tracing attributes
        self.agent = Agent(
            name="my-agent",
            model=self.model,
            system_prompt="You are a helpful AI assistant.",
        )

    def run(self, prompt: str):
        return self.agent(prompt)


agent = MyAgent()
response = agent.run("Tell me a joke")
print(response)
LangWatch automatically captures all agent interactions, model calls, and tool executions through Strands Agents’ built-in OpenTelemetry support. Use @langwatch.trace() decorators to add custom traces and metadata as needed.
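For example, here is a hedged sketch of wrapping the agent entry point in a custom trace; the name and metadata arguments, and the ask_for_joke helper, are illustrative assumptions, so check the LangWatch Python SDK reference for the exact decorator parameters:

@langwatch.trace(name="joke-request", metadata={"user_id": "demo-user"})
def ask_for_joke(agent: MyAgent):
    # Everything inside this function, including the Strands agent call,
    # is grouped under a single LangWatch trace.
    return agent.run("Tell me a joke")

print(ask_for_joke(agent))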