Observability for Mastra With LangWatch

This guide shows you how to integrate Mastra with LangWatch for observability and tracing. By following these steps, you’ll be able to monitor and debug your Mastra agents in the LangWatch dashboard.

Integration

1. Create a Mastra project

Create a Mastra project using the Mastra CLI:
npx create-mastra
Move into the project directory:
cd your-mastra-project
For more information, see the Mastra installation documentation.
2. Set up a LangWatch project

Create a project in LangWatch and copy your API key from the project settings page.
3. Add environment variables

Create or update your .env file with the following variables:
# Your LLM API key
OPENAI_API_KEY=your-api-key

# LangWatch credentials
LANGWATCH_API_KEY=sk-...
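Since a missing key usually surfaces only as a silent lack of traces, it can help to fail fast at startup. A minimal sketch (the `requireEnv` helper below is hypothetical, not part of LangWatch or Mastra):

```typescript
// Hypothetical helper (not from LangWatch or Mastra): throw at startup
// if a required environment variable is unset or empty.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (value === undefined || value === "") {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Example usage in your entry point:
//   const langwatchKey = requireEnv("LANGWATCH_API_KEY");
//   const openaiKey = requireEnv("OPENAI_API_KEY");
```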
4. Install required packages

Add the necessary packages to your project:
npm install langwatch @opentelemetry/context-async-hooks @opentelemetry/sdk-node
5. Set up LangWatch observability

Initialize LangWatch in your application's entry point, as early as possible (before your Mastra instance is created), by calling setupObservability:
import { setupObservability } from "langwatch/observability/node";

setupObservability();
6. Configure your Mastra instance

Configure your Mastra instance with telemetry enabled:
import { Mastra } from '@mastra/core/mastra';
import { PinoLogger } from '@mastra/loggers';
import { LibSQLStore } from '@mastra/libsql';
import { weatherAgent } from './agents/weather-agent.js';

export const mastra = new Mastra({
  agents: { weatherAgent },
  storage: new LibSQLStore({
    url: ":memory:", // or "file:./mastra.db" for persistence
  }),
  logger: new PinoLogger({
    name: 'Mastra',
    level: 'info',
  }),
  telemetry: {
    enabled: true,
  },
});
7. Add tracing to your agent calls

Use the LangWatch tracer to add detailed tracing to your agent interactions:
import { getLangWatchTracer } from "langwatch";

const tracer = getLangWatchTracer("mastra-weather-agent-example");

// In your agent interaction code
await tracer.withActiveSpan("agent-interaction", {
  attributes: {
    "langwatch.thread_id": threadId,
    "langwatch.tags": ["mastra.sdk.example"],
  },
}, async (span) => {
  // Set input for tracing
  span.setInput("chat_messages", conversationHistory);

  const agent = mastra.getAgent("weatherAgent");
  const response = await agent.generate(conversationHistory, {
    // Optionally provide a tracer to have more control over tracing
    telemetry: { isEnabled: true, tracer: tracer },
  });

  // Set output for tracing
  span.setOutput("chat_messages", [{ role: "assistant", content: response.text }]);
});
8. Run your Mastra application

Start your Mastra development server:
npm run dev
Or run your application:
npm start
Visit your LangWatch dashboard to explore detailed traces of your agent interactions and analyze everything from prompts to response quality.

Example Project

You can find a complete example project demonstrating Mastra integration with LangWatch on our GitHub. This example includes:
  • Weather Agent: An AI agent that fetches weather data and suggests activities
  • Weather Tool: A tool that fetches real-time weather data from the Open-Meteo API
  • CLI Chatbox Interface: Interactive command-line interface for chatting with the weather agent
  • Workflow Example: Demonstrates Mastra workflows for programmatic weather data fetching
  • Full LangWatch Integration: Complete observability and tracing setup
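The weather tool in the example calls Open-Meteo's public forecast endpoint. A minimal sketch of the request URL such a tool might build (the helper name and parameter choices are assumptions for illustration, not code from the example project):

```typescript
// Hypothetical helper: build an Open-Meteo forecast URL for a coordinate.
// Open-Meteo's forecast endpoint is public and needs no API key.
function buildForecastUrl(latitude: number, longitude: number): string {
  const url = new URL("https://api.open-meteo.com/v1/forecast");
  url.searchParams.set("latitude", latitude.toString());
  url.searchParams.set("longitude", longitude.toString());
  url.searchParams.set("current_weather", "true");
  return url.toString();
}

// Inside the tool's execute function, the fetch would then look like:
//   const res = await fetch(buildForecastUrl(52.52, 13.41));
//   const data = await res.json();
```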

Key Features

  • Automatic Tracing: All agent interactions are automatically traced and sent to LangWatch
  • Custom Spans: Create custom spans for detailed monitoring of specific operations
  • Input/Output Tracking: Track conversation history and agent responses
  • Thread Management: Organize conversations by thread ID for better analysis
  • Tagging: Add custom tags to categorize and filter your traces
  • Tool Integration: Demonstrates how to trace custom tools and their usage
  • Workflow Patterns: Shows how to build and trace complex agent workflows
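For thread management, each conversation needs a stable `langwatch.thread_id` so its turns group together in the dashboard. One simple approach (an assumption, not prescribed by LangWatch) is to generate a UUID when the chat session starts and reuse it for every span in that session:

```typescript
import { randomUUID } from "node:crypto";

// Generate one thread id per chat session and pass it as the
// "langwatch.thread_id" attribute on every span in that conversation.
const threadId = randomUUID();
```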
For production Mastra applications, combine manual instrumentation with the OpenTelemetry semantic conventions for consistent observability and better analytics.
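As a sketch of what that can look like, the attribute names below follow the OpenTelemetry GenAI semantic conventions; the values are illustrative, and which attributes you record is up to you:

```typescript
// Span attributes named per the OpenTelemetry GenAI semantic conventions
// (the gen_ai.* keys come from the OTel spec; the values here are examples).
const genAiAttributes = {
  "gen_ai.system": "openai",
  "gen_ai.request.model": "gpt-4o-mini",
  "gen_ai.usage.input_tokens": 128,
  "gen_ai.usage.output_tokens": 56,
};

// These could be attached when starting a span, e.g.:
//   tracer.withActiveSpan("agent-interaction", { attributes: genAiAttributes }, ...)
```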