AWS Bedrock Instrumentation
Learn how to instrument AWS Bedrock calls with the LangWatch Python SDK using OpenInference.
AWS Bedrock, accessed via the boto3 library, allows you to leverage powerful foundation models. By using the OpenInference Bedrock instrumentor, you can automatically capture OpenTelemetry traces for your Bedrock API calls. LangWatch, being an OpenTelemetry-compatible observability platform, can seamlessly ingest these traces, providing insights into your LLM interactions.
This guide explains how to configure your Python application to send Bedrock traces to LangWatch.
Prerequisites
- Install the LangWatch SDK.
- Install Bedrock instrumentation and dependencies: you'll need boto3 to interact with AWS Bedrock, and the OpenInference instrumentation library for Bedrock. Note: openinference-instrumentation-bedrock will install the necessary OpenTelemetry packages. Ensure your boto3 and botocore versions are compatible with the Bedrock features you intend to use (e.g., botocore >= 1.34.116 for the converse API). Both installs are sketched below.
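For example, the installs might look like the following (a sketch; it assumes the PyPI package names langwatch, boto3, botocore, and openinference-instrumentation-bedrock, and pins botocore for the converse API):

```bash
pip install langwatch
pip install boto3 "botocore>=1.34.116" openinference-instrumentation-bedrock
```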
Instrumenting AWS Bedrock with LangWatch
The integration involves initializing LangWatch to set up the OpenTelemetry environment and then applying the Bedrock instrumentor.
Steps:
- Initialize LangWatch: Call langwatch.setup() at the beginning of your application. This configures the global OpenTelemetry SDK to export traces to LangWatch.
- Instrument Bedrock: Import BedrockInstrumentor and call its instrument() method. This patches boto3 to automatically create spans for Bedrock client calls. A combined sketch follows this list.
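Putting both steps together, a minimal sketch might look like this. The region, model ID, and prompt are placeholder assumptions; credentials come from your standard AWS configuration:

```python
import json

import boto3
import langwatch
from openinference.instrumentation.bedrock import BedrockInstrumentor

# 1. Initialize LangWatch before any instrumented code runs
#    (reads LANGWATCH_API_KEY from the environment by default).
langwatch.setup()

# 2. Patch boto3 so Bedrock runtime calls emit OpenTelemetry spans.
BedrockInstrumentor().instrument()

# Standard Bedrock runtime client; credentials and region come from your AWS config.
bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-east-1")


@langwatch.trace()
def generate_haiku() -> str:
    # The Bedrock span created by OpenInference nests under this parent trace.
    response = bedrock_runtime.invoke_model(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",  # placeholder model ID
        body=json.dumps({
            "anthropic_version": "bedrock-2023-05-31",
            "max_tokens": 256,
            "messages": [{"role": "user", "content": "Write a haiku about observability."}],
        }),
    )
    payload = json.loads(response["body"].read())
    return payload["content"][0]["text"]


if __name__ == "__main__":
    print(generate_haiku())
```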
Key points for this approach:
- langwatch.setup(): Initializes the global OpenTelemetry environment configured for LangWatch. This must be called before any instrumented code runs.
- BedrockInstrumentor().instrument(): Patches the boto3 library. Any subsequent Bedrock calls made using a boto3.client("bedrock-runtime") will automatically generate OpenTelemetry spans.
- @langwatch.trace(): Creates a parent trace in LangWatch. The automated Bedrock spans generated by OpenInference are nested under this parent trace when the Bedrock calls are made within the decorated function, giving a clear hierarchy for your operations.
- API versions: The examples cover both the invoke_model and converse APIs. The converse API requires botocore version 1.34.116 or newer; a converse sketch follows this list.
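A sketch of the same flow using the converse API (again with placeholder region and model ID), assuming botocore >= 1.34.116 is installed:

```python
import boto3
import langwatch
from openinference.instrumentation.bedrock import BedrockInstrumentor

langwatch.setup()
BedrockInstrumentor().instrument()

bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-east-1")


@langwatch.trace()
def ask_with_converse(question: str) -> str:
    # converse() provides a unified request/response shape across Bedrock models.
    response = bedrock_runtime.converse(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",  # placeholder model ID
        messages=[{"role": "user", "content": [{"text": question}]}],
        inferenceConfig={"maxTokens": 256},
    )
    return response["output"]["message"]["content"][0]["text"]


if __name__ == "__main__":
    print(ask_with_converse("What does an OpenTelemetry span capture?"))
```

Because the instrumentor patches the client itself, both invoke_model and converse calls produce spans without any per-call changes.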
By following these steps, your application’s interactions with AWS Bedrock will be traced, and the data will be sent to LangWatch for monitoring and analysis. This lets you observe latencies, errors, and other metadata associated with your foundation model calls. For details on the specific attributes captured by the OpenInference Bedrock instrumentor, refer to the OpenInference Semantic Conventions and the instrumentor's own documentation.
Remember to replace placeholder values for AWS credentials and adapt the model IDs and prompts to your specific use case.