
Observability for Mastra With LangWatch

This guide shows you how to integrate Mastra with LangWatch for observability and tracing. By following these steps, you’ll be able to monitor and debug your Mastra agents in the LangWatch dashboard.

Integration

1. Create a Mastra project

Create a Mastra project using the Mastra CLI:
npx create-mastra
Move into the project directory:
cd your-mastra-project
For more information, see the Mastra installation instructions.
2. Set up a LangWatch project

Create a project in LangWatch and copy your API key from the project settings page.
3. Add environment variables

Create or update your .env file with the following variables:
# Your LLM API key
OPENAI_API_KEY=your-api-key

# LangWatch credentials
LANGWATCH_API_KEY=sk-...
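If your runtime doesn't load .env automatically, use a loader such as dotenv. As an optional sanity check (an illustrative sketch, not part of the required setup), you can fail fast when the key is missing:
import "dotenv/config"; // only needed if your runtime doesn't load .env itself

if (!process.env.LANGWATCH_API_KEY) {
  throw new Error("LANGWATCH_API_KEY is not set; traces will not reach LangWatch");
}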
4. Install required packages

Add the necessary packages to your project:
npm install langwatch mastra @mastra/core @mastra/libsql @mastra/loggers @mastra/otel-exporter @ai-sdk/openai
5. Configure your Mastra instance

Configure your Mastra instance with observability enabled:
import { Agent } from "@mastra/core/agent";
import { Mastra } from "@mastra/core";
import { openai } from "@ai-sdk/openai";
import { OtelExporter } from "@mastra/otel-exporter";
import { PinoLogger } from "@mastra/loggers";
import { LibSQLStore } from "@mastra/libsql";

export const mastra = new Mastra({
  agents: {
    assistant: new Agent({
      name: "assistant",
      instructions: "You are a helpful assistant.",
      model: openai("gpt-5"),
    }),
  },
  // Storage is required for tracing in Mastra
  storage: new LibSQLStore({ url: ":memory:" }),
  logger: new PinoLogger({ name: "mastra", level: "info" }),
  observability: {
    configs: {
      otel: {
        serviceName: "<project_name>",
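        // Send traces to LangWatch's OTLP endpoint, authenticated with your API key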
        exporters: [
          new OtelExporter({
            provider: {
              custom: {
                endpoint: "https://app.langwatch.ai/api/otel/v1/traces",
                headers: { "Authorization": `Bearer ${process.env.LANGWATCH_API_KEY}` },
              },
            },
          }),
        ],
      },
    },
  },
});
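
With this in place, every agent call is traced and exported. As a quick test, you can call the agent from a script (a minimal sketch; it assumes the instance above is exported from src/mastra/index.ts, the default location in a create-mastra project):

import { mastra } from "./src/mastra";

const agent = mastra.getAgent("assistant");
const result = await agent.generate("Say hello and introduce yourself.");
console.log(result.text); // the matching trace appears in your LangWatch dashboard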

6. Run your Mastra application

Start your Mastra development server:
npm run dev
Or run your application:
npm run start
Visit your LangWatch dashboard to explore detailed insights into your agent interactions. You can monitor and analyze every aspect of your AI conversations, from prompt engineering to response quality, and use those insights to optimize your AI applications.

Example Project

You can find a complete example project demonstrating Mastra integration with LangWatch on our GitHub. This example includes:
  • Weather Agent: An AI agent that fetches weather data and suggests activities
  • Weather Tool: A tool that fetches real-time weather data from the Open-Meteo API (sketched after this list)
  • CLI Chatbox Interface: Interactive command-line interface for chatting with the weather agent
  • Workflow Example: Demonstrates Mastra workflows for programmatic weather data fetching
  • Full LangWatch Integration: Complete observability and tracing setup
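To give a flavor of the example, a weather tool in Mastra might look roughly like this (an illustrative sketch using Mastra's createTool helper and zod, not the exact code from the repo; the id and schema are assumptions):

import { createTool } from "@mastra/core/tools";
import { z } from "zod";

export const weatherTool = createTool({
  id: "get-weather",
  description: "Fetch current weather from the Open-Meteo API",
  inputSchema: z.object({
    latitude: z.number(),
    longitude: z.number(),
  }),
  execute: async ({ context }) => {
    // Open-Meteo is a free weather API that requires no API key
    const res = await fetch(
      `https://api.open-meteo.com/v1/forecast?latitude=${context.latitude}&longitude=${context.longitude}&current_weather=true`,
    );
    return await res.json();
  },
});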
For more advanced integration patterns and best practices in production Mastra applications, combine this setup with manual instrumentation using OpenTelemetry Semantic Conventions for consistent observability and better analytics.