Prerequisites

- Install the LangWatch SDK.
- Install PromptFlow and the OpenInference instrumentor.
- Set up your LLM provider: configure your preferred LLM provider (OpenAI, Anthropic, etc.) with the appropriate API keys.
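The installs above might look like the following. Note that the instrumentor package name is an assumption; check PyPI for the exact OpenInference PromptFlow package:

```shell
# Install the LangWatch SDK
pip install langwatch

# Install PromptFlow and the OpenInference instrumentor
# (package name below is an assumption -- verify on PyPI)
pip install promptflow openinference-instrumentation-promptflow
```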
Instrumentation with OpenInference
LangWatch supports seamless observability for PromptFlow using the OpenInference PromptFlow instrumentor. This approach automatically captures traces from your PromptFlow flows and sends them to LangWatch.

Basic Setup (Automatic Tracing)

Here's the simplest way to instrument your application: initialize LangWatch with the PromptFlow instrumentor, then run your flows as usual.

Optional: Using Decorators for Additional Context

If you want to add additional context or metadata to your traces, you can optionally use the @langwatch.trace() decorator.
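As an illustrative sketch, the setup plus the optional decorator might look like this. The instrumentor's import path and the decorator's name/metadata parameters are assumptions; adjust them to match the packages you installed:

```python
import langwatch

# NOTE: import path is an assumption -- use the class exported by your
# installed OpenInference PromptFlow instrumentor package.
from openinference.instrumentation.promptflow import PromptFlowInstrumentor

# Initialize LangWatch and register the PromptFlow instrumentor.
# langwatch.setup() reads LANGWATCH_API_KEY from the environment.
langwatch.setup(instrumentors=[PromptFlowInstrumentor()])

# From here, any PromptFlow flow you run is traced automatically.

# Optional: wrap your own entry point with @langwatch.trace() to group
# the flow's spans under a named trace and attach metadata
# (parameters shown are assumptions -- see the Python integration guide).
@langwatch.trace(name="my-promptflow-run", metadata={"env": "dev"})
def run_flow(question: str):
    from promptflow.client import PFClient  # local PromptFlow client

    pf = PFClient()
    # "./my_flow" is a hypothetical flow directory for illustration.
    return pf.test(flow="./my_flow", inputs={"question": question})
```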
How it Works

- langwatch.setup(): Initializes the LangWatch SDK, which includes setting up an OpenTelemetry trace exporter. This exporter is ready to receive spans from any OpenTelemetry-instrumented library in your application.
- PromptFlowInstrumentor(): The OpenInference instrumentor automatically patches PromptFlow components to create OpenTelemetry spans for their operations, including:
  - Flow execution
  - Node execution
  - LLM calls
  - Tool executions
  - Data processing
  - Input/output handling
- Optional decorators: You can optionally use @langwatch.trace() to add additional context and metadata to your traces, but it's not required for basic functionality.
Environment Variables

Make sure LANGWATCH_API_KEY is set in your environment, along with the API key for your chosen LLM provider (for example, OPENAI_API_KEY).

Supported Models

PromptFlow supports various LLM providers, including:
- OpenAI (GPT-5, GPT-4o, etc.)
- Anthropic (Claude models)
- Local models (via Ollama, etc.)
- Other providers supported by PromptFlow
Notes

- You do not need to set any OpenTelemetry environment variables or configure exporters manually; langwatch.setup() handles everything.
- You can combine PromptFlow instrumentation with other instrumentors (e.g., OpenAI, LangChain) by adding them to the instrumentors list.
- The @langwatch.trace() decorator is optional; the OpenInference instrumentor will capture all PromptFlow activity automatically.
- For advanced configuration (custom attributes, endpoint, etc.), see the Python integration guide.
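Combining instrumentors might look like the following sketch. The PromptFlow import path is an assumption, while the OpenAI one matches the openinference-instrumentation-openai package:

```python
import langwatch

# Import paths: PromptFlow instrumentor path is an assumption -- verify it
# against the package you installed; the OpenAI one is from
# openinference-instrumentation-openai.
from openinference.instrumentation.promptflow import PromptFlowInstrumentor
from openinference.instrumentation.openai import OpenAIInstrumentor

# Pass multiple instrumentors to trace PromptFlow flows and direct
# OpenAI calls in the same LangWatch project.
langwatch.setup(instrumentors=[PromptFlowInstrumentor(), OpenAIInstrumentor()])
```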
Troubleshooting

- Make sure your LANGWATCH_API_KEY is set in the environment.
- If you see no traces in LangWatch, check that the instrumentor is included in langwatch.setup() and that your PromptFlow code is actually being executed.
- Ensure you have the correct API keys set for your chosen LLM provider.