The LangWatch library is the easiest way to integrate your TypeScript application with LangWatch. Messages are synced in the background, so it doesn't intercept or block your LLM calls.
Protip: want to get started even faster? Copy our llms.txt and ask an AI to do this integration

Installation

npm install langwatch

Configuration

Set your LANGWATCH_API_KEY in one of two ways:
  • As an environment variable
  • As a client parameter (see the sketch below)
.env
LANGWATCH_API_KEY='your_api_key_here'
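
If you prefer to pass the key programmatically instead of via the environment, here is a minimal sketch. The LangWatch client class and its apiKey option are assumptions here; check the SDK reference for the exact signature:

import { LangWatch } from "langwatch";

// Assumed client-parameter form: pass the API key directly
// instead of relying on the LANGWATCH_API_KEY environment variable.
const langwatch = new LangWatch({
  apiKey: "your_api_key_here",
});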

Basic Concepts

  • Each message that triggers your LLM pipeline is captured as a whole in a Trace.
  • A Trace contains multiple Spans, which are the steps inside your pipeline.
    • A span can be an LLM call, a database query for a RAG retrieval, or a simple function transformation.
    • Different types of Spans capture different parameters.
    • Spans can be nested to capture the pipeline structure.
  • Traces can be grouped together on the LangWatch dashboard by sharing the same thread_id in their metadata, making the individual messages part of a conversation.
    • It is also recommended to provide user_id metadata to track user analytics; see the sketch after this list.
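
With the Vercel AI SDK integration shown below, these identifiers can be attached through the telemetry metadata. This is a minimal sketch; the exact metadata keys (thread_id, user_id) are assumed here and should be verified against the metadata documentation:

import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

const response = await generateText({
  model: openai("gpt-5-mini"),
  prompt: "Hey, tell me a joke",
  experimental_telemetry: {
    isEnabled: true,
    // Assumed keys: group this trace into a conversation and tie it to a user.
    metadata: { thread_id: "conversation-123", user_id: "user-456" },
  },
});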

Integration with the Vercel AI SDK

Install langwatch alongside the Vercel AI SDK packages:

npm i langwatch ai @ai-sdk/openai

Usage

The LangWatch API key is configured by default via the LANGWATCH_API_KEY environment variable.
Set up observability and enable telemetry on your Vercel AI SDK calls:
import { setupObservability } from "langwatch/observability/node";
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

// Initialize LangWatch tracing once, before any LLM calls are made.
setupObservability({ serviceName: "<project_name>" });

async function main(message: string): Promise<string> {
  const response = await generateText({
    model: openai("gpt-5-mini"),
    prompt: message,
    // Opt in to telemetry so this call is exported to LangWatch.
    experimental_telemetry: { isEnabled: true },
  });
  return response.text;
}

console.log(await main("Hey, tell me a joke"));
The Vercel AI SDK automatically sends traces to LangWatch when experimental_telemetry.isEnabled is set to true. For Next.js applications, configure OpenTelemetry in your instrumentation.ts file using LangWatchExporter.
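
For example, here is a minimal instrumentation.ts sketch, assuming @vercel/otel's registerOTel and the LangWatchExporter export from langwatch (verify the constructor options against the SDK reference):

// instrumentation.ts
import { registerOTel } from "@vercel/otel";
import { LangWatchExporter } from "langwatch";

export function register() {
  registerOTel({
    serviceName: "<project_name>",
    // Assumed: the exporter reads LANGWATCH_API_KEY from the environment.
    traceExporter: new LangWatchExporter(),
  });
}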