Observability for Mastra With LangWatch
This guide shows you how to integrate Mastra with LangWatch for observability and tracing. By following these steps, you’ll be able to monitor and debug your Mastra agents in the LangWatch dashboard.
Integration
1. Create a Mastra project
Create a Mastra project using the Mastra CLI and move into the project directory. For more information, view the Mastra installation instructions here.
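A minimal sketch of these CLI steps, assuming the create-mastra scaffolder and a hypothetical project name of my-mastra-app (prompts and flags may differ by version):

```bash
# Scaffold a new Mastra project (project name is hypothetical)
npx create-mastra@latest my-mastra-app

# Move into the project directory
cd my-mastra-app
```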
2. Set up LangWatch project
Create a project in LangWatch and get your API keys from the project settings page.
3. Add environment variables
Create or update your .env file with the following variables:
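A sketch of the variables involved; the LangWatch key comes from your project settings page, and a model provider key (OpenAI is assumed here for the example agent) is typically needed as well:

```bash
# LangWatch API key from your project settings page
LANGWATCH_API_KEY=your-langwatch-api-key

# Model provider key used by your agent (OpenAI assumed for this example)
OPENAI_API_KEY=your-openai-api-key
```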
4. Install required packages
Add the necessary packages to your project:
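For example, assuming an npm-based project scaffolded with create-mastra (which already includes @mastra/core), the LangWatch SDK is the main addition:

```bash
# Install the LangWatch TypeScript SDK
npm install langwatch
```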
5. Set up LangWatch observability
Set up LangWatch observability in your main application file using `setupObservability`:
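A minimal sketch, assuming the `setupObservability` helper exported by the LangWatch Node SDK; the import path, option names, and service name below are assumptions and may differ between SDK versions:

```typescript
// src/instrumentation.ts (file name is an assumption)
// Registers LangWatch tracing for the Node.js process.
import { setupObservability } from "langwatch/observability/node";

setupObservability({
  langwatch: {
    apiKey: process.env.LANGWATCH_API_KEY, // usually picked up from .env automatically
  },
  serviceName: "mastra-weather-agent", // hypothetical service name
});
```

Typically this setup file is imported before the rest of your application so instrumentation is registered first.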
6. Configure your Mastra instance
Configure your Mastra instance with telemetry enabled:
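A sketch of what this can look like; the agent module and the telemetry option names are assumptions based on Mastra's configuration shape:

```typescript
// src/mastra/index.ts (path is an assumption)
import { Mastra } from "@mastra/core";
import { weatherAgent } from "./agents/weather-agent"; // hypothetical agent from the example project

export const mastra = new Mastra({
  agents: { weatherAgent },
  telemetry: {
    serviceName: "mastra-weather-agent", // keep consistent with setupObservability
    enabled: true,
  },
});
```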
7. Add tracing to your agent calls
Use the LangWatch tracer to add detailed tracing to your agent interactions:
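One possible shape for this, assuming the `getLangWatchTracer` helper and LangWatch's span extensions (setInput/setOutput); treat the exact method and attribute names as assumptions to verify against your SDK version:

```typescript
import { getLangWatchTracer } from "langwatch";
import { mastra } from "./mastra";

const tracer = getLangWatchTracer("weather-agent-cli"); // tracer name is hypothetical

export async function askWeatherAgent(message: string, threadId: string) {
  // Wrap the agent call in a span so input, output, and timing show up in LangWatch.
  return tracer.withActiveSpan("weather agent chat", async (span) => {
    span.setAttribute("langwatch.thread_id", threadId); // group turns into one conversation thread
    span.setInput(message);

    const agent = mastra.getAgent("weatherAgent");
    const result = await agent.generate(message);

    span.setOutput(result.text);
    return result;
  });
}
```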
8. Run your Mastra application
Start your Mastra development server, or run your application directly:
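For example, assuming the scripts generated by a typical create-mastra project and a hypothetical entry point path:

```bash
# Start the Mastra development server
npm run dev

# Or run your own entry point directly (path is an assumption)
npx tsx src/index.ts
```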
Visit your LangWatch dashboard to explore detailed insights into your agent interactions. Monitor and analyze every aspect of your AI conversations, from prompt engineering to response quality, helping you optimize your AI applications.
Example Project
You can find a complete example project demonstrating Mastra integration with LangWatch on our GitHub. This example includes:
- Weather Agent: An AI agent that fetches weather data and suggests activities
- Weather Tool: A tool that fetches real-time weather data from Open-Meteo API
- CLI Chatbox Interface: Interactive command-line interface for chatting with the weather agent
- Workflow Example: Demonstrates Mastra workflows for programmatic weather data fetching
- Full LangWatch Integration: Complete observability and tracing setup
Key Features
- Automatic Tracing: All agent interactions are automatically traced and sent to LangWatch
- Custom Spans: Create custom spans for detailed monitoring of specific operations
- Input/Output Tracking: Track conversation history and agent responses
- Thread Management: Organize conversations by thread ID for better analysis
- Tagging: Add custom tags to categorize and filter your traces
- Tool Integration: Demonstrates how to trace custom tools and their usage
- Workflow Patterns: Shows how to build and trace complex agent workflows
Related Documentation
For more advanced Mastra integration patterns and best practices:
- Integration Guide - Basic setup and core concepts
- Manual Instrumentation - Advanced span management for Mastra operations
- Semantic Conventions - Mastra-specific attributes and conventions
- Debugging and Troubleshooting - Debug Mastra integration issues
- Capturing Metadata - Adding custom metadata to Mastra calls
For production Mastra applications, combine manual instrumentation with Semantic Conventions for consistent observability and better analytics.