LangWatch integrates with Haystack through OpenInference instrumentation to capture traces from your Haystack pipelines and components.

Installation

pip install langwatch openinference-instrumentation-haystack haystack-ai

Usage

By default, LangWatch reads its API key from the LANGWATCH_API_KEY environment variable.
To instrument Haystack with OpenInference, pass a HaystackInstrumentor instance to langwatch.setup():

import langwatch

from haystack.components.agents import Agent
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage
from openinference.instrumentation.haystack import HaystackInstrumentor

# Initialize LangWatch with the Haystack instrumentor; the API key is read
# from the LANGWATCH_API_KEY environment variable.
langwatch.setup(instrumentors=[HaystackInstrumentor()])

# A minimal agent with no tools, backed by an OpenAI chat model.
basic_agent = Agent(
    chat_generator=OpenAIChatGenerator(model="gpt-4o-mini"),
    system_prompt="You are a helpful web agent.",
    tools=[],
)

result = basic_agent.run(messages=[ChatMessage.from_user("Tell me a joke")])

print(result["last_message"].text)

The HaystackInstrumentor automatically captures Haystack pipeline operations, component executions, and model interactions. Use @langwatch.trace() to create a parent trace under which Haystack operations will be nested.
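
As a minimal sketch, assuming the basic_agent defined above (the tell_joke helper is only illustrative), you can wrap the agent call in a function decorated with @langwatch.trace() so the Haystack spans are grouped under one parent trace:

import langwatch

from haystack.dataclasses import ChatMessage

# Illustrative helper (not part of the LangWatch or Haystack APIs): the
# decorator opens a parent trace, and the spans emitted by the
# HaystackInstrumentor for the agent run are nested under it.
@langwatch.trace()
def tell_joke(question: str) -> str:
    result = basic_agent.run(messages=[ChatMessage.from_user(question)])
    return result["last_message"].text

print(tell_joke("Tell me a joke"))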