LangWatch integrates with Mastra through OpenTelemetry to capture traces from your Mastra agents automatically.

Installation

npm i langwatch @mastra/core @ai-sdk/openai @mastra/otel-exporter @mastra/loggers @mastra/libsql
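
The exporter configuration below reads your API key from the environment. A minimal setup, assuming a `LANGWATCH_API_KEY` variable (the key itself comes from your LangWatch project settings; the value here is a placeholder):

```shell
# Placeholder value — substitute your real LangWatch API key
export LANGWATCH_API_KEY="your-api-key"
```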

Usage

Configure your Mastra instance with an OpenTelemetry exporter pointing to LangWatch:

import { Agent } from "@mastra/core/agent";
import { Mastra } from "@mastra/core";
import { openai } from "@ai-sdk/openai";
import { OtelExporter } from "@mastra/otel-exporter";
import { PinoLogger } from "@mastra/loggers";
import { LibSQLStore } from "@mastra/libsql";

export const mastra = new Mastra({
  agents: {
    assistant: new Agent({
      name: "assistant",
      instructions: "You are a helpful assistant.",
      model: openai("gpt-5"),
    }),
  },
  // Storage is required for tracing in Mastra
  storage: new LibSQLStore({ url: ":memory:" }),
  logger: new PinoLogger({ name: "mastra", level: "info" }),
  observability: {
    configs: {
      otel: {
        serviceName: "<project_name>",
        exporters: [
          new OtelExporter({
            provider: {
              custom: {
                endpoint: "https://app.langwatch.ai/api/otel/v1/traces",
                headers: { "Authorization": `Bearer ${process.env.LANGWATCH_API_KEY}` },
              },
            },
          }),
        ],
      },
    },
  },
});

With this configuration, Mastra sends traces to LangWatch through the OpenTelemetry exporter automatically: all agent interactions, tool calls, and workflow executions are captured.
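
Once configured, any agent call produces a trace. A minimal sketch of invoking the agent registered above, assuming the instance is exported from a local module (the import path is an assumption) and using Mastra's `getAgent`/`generate` API:

```typescript
import { mastra } from "./mastra"; // the Mastra instance configured above

async function main() {
  // Look up the registered agent and run a single generation;
  // the resulting spans are exported to LangWatch automatically.
  const agent = mastra.getAgent("assistant");
  const result = await agent.generate("What is OpenTelemetry?");
  console.log(result.text);
}

main();
```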