Learn how to instrument Haystack pipelines with LangWatch using community OpenTelemetry instrumentors.
To use a community instrumentor with LangWatch, pass an instance of it in the `instrumentors` list of the `langwatch.setup()` call. LangWatch will then manage the lifecycle of this instrumentor.
Install the instrumentation library you plan to use, for example:

`pip install openinference-instrumentation-haystack`

or:

`pip install opentelemetry-instrumentation-haystack`

You'll also need Haystack itself: `pip install farm-haystack[openai]` (if using OpenAI models).
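With one of these packages installed, the setup-managed approach can look like the following minimal sketch (the OpenInference import path is an assumption; use the class exposed by whichever instrumentation library you chose):

```python
import langwatch
# Assumed import from the openinference-instrumentation-haystack package;
# adjust it if your instrumentation library exposes a different module/class.
from openinference.instrumentation.haystack import HaystackInstrumentor

# LangWatch configures tracing and manages the instrumentor's lifecycle,
# so there is no separate instrument()/uninstrument() call to make.
langwatch.setup(instrumentors=[HaystackInstrumentor()])

# Any Haystack pipelines or nodes run after this point are traced automatically.
```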
Consult the specific library's documentation for the exact package name and instrumentor class if the above assumptions are incorrect.

If you already have a `TracerProvider` configured in your application (or if LangWatch is configured to use the global provider), you can call the community instrumentor's `instrument()` method directly. LangWatch will automatically pick up the spans generated by these instrumentors as long as its exporter is part of the active `TracerProvider`.
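As a sketch of that direct approach, assuming you manage the global provider yourself and LangWatch's exporter is attached to it:

```python
import langwatch
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
# Same assumed OpenInference instrumentor as above.
from openinference.instrumentation.haystack import HaystackInstrumentor

# Your application owns the TracerProvider and registers it globally.
provider = TracerProvider()
trace.set_tracer_provider(provider)

# Configure LangWatch so its exporter is part of this (global) provider,
# as described above.
langwatch.setup()

# Instrument Haystack against the same provider; you manage its lifecycle.
instrumentor = HaystackInstrumentor()
instrumentor.instrument(tracer_provider=provider)

# ... run your Haystack pipelines here; their spans flow to LangWatch ...

# Optionally undo the patches on shutdown.
instrumentor.uninstrument()
```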
Key points:

- Community instrumentors typically hook into Haystack at a low level (e.g., `BaseComponent` or `Pipeline` hooks), meaning most Haystack operations should be captured once instrumented.
- When using `langwatch.setup(instrumentors=[...])`, LangWatch handles the setup and lifecycle of the instrumentor.
- If instrumenting directly (e.g., `HaystackInstrumentor().instrument()`), ensure that the `TracerProvider` used by the instrumentor is the same one LangWatch is exporting from. This usually means LangWatch is configured to use an existing global provider or one you explicitly pass to `langwatch.setup()`.
The examples on this page use a `PromptNode` with an OpenAI model for simplicity. Ensure you have the necessary API keys (e.g., `OPENAI_API_KEY`) set in your environment if you run these examples.
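For reference, here is a sketch of what a small Haystack 1.x pipeline traced this way can look like (the pipeline layout, node name, model, and query are illustrative):

```python
import os

import langwatch
from haystack import Pipeline
from haystack.nodes import PromptNode
from openinference.instrumentation.haystack import HaystackInstrumentor

# Setup-managed instrumentation, as in the first example above.
langwatch.setup(instrumentors=[HaystackInstrumentor()])

prompt_node = PromptNode(
    model_name_or_path="gpt-3.5-turbo",
    api_key=os.environ["OPENAI_API_KEY"],  # required for OpenAI-backed models
)

pipeline = Pipeline()
pipeline.add_node(component=prompt_node, name="prompt_node", inputs=["Query"])

# The pipeline run and the underlying LLM call are captured as spans.
result = pipeline.run(query="In one sentence, what is Haystack?")
print(result["results"])
```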