Linking prompts to traces enables tracking of metrics and evaluations per prompt version. It's the foundation of improving prompt quality over time. After linking prompts and traces, you will see information about the prompt in the trace's metadata.
[Screenshot: Prompt information in trace span details]
For more information about traces and spans, see the Concepts guide.

When you use langwatch.prompts.get() within a trace context, LangWatch automatically links the prompt to the trace:
Python SDK:
import langwatch
import litellm  # needed for autotrack_litellm_calls below
from litellm import completion

# Initialize LangWatch
langwatch.setup()

@langwatch.trace()
def customer_support_generation():
    # Autotrack LiteLLM calls
    langwatch.get_current_trace().autotrack_litellm_calls(litellm)

    # Get prompt (automatically linked to trace when API key is present)
    prompt = langwatch.prompts.get("customer-support-bot")

    # Compile prompt with variables
    compiled_prompt = prompt.compile(
        user_name="John Doe",
        user_email="[email protected]",
        input="I need help with my account"
    )

    response = completion(
        model=prompt.model,
        messages=compiled_prompt.messages
    )

    return response.choices[0].message.content

# Call the function
result = customer_support_generation()
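To clarify what compile() does in the example above: it substitutes named variables into the prompt's message templates. The following is a minimal, self-contained sketch of that idea only, not the LangWatch implementation; the template messages and the compile_prompt helper are hypothetical illustrations.

```python
# Hypothetical sketch of template compilation: substitute {{variable}}
# placeholders in each message's content. NOT the LangWatch implementation.

def compile_prompt(messages, **variables):
    """Return a copy of messages with {{name}} placeholders replaced."""
    compiled = []
    for message in messages:
        content = message["content"]
        for name, value in variables.items():
            content = content.replace("{{" + name + "}}", str(value))
        compiled.append({"role": message["role"], "content": content})
    return compiled

# Hypothetical template, mirroring the variables used in the example above
template = [
    {"role": "system",
     "content": "You are a support agent helping {{user_name}} ({{user_email}})."},
    {"role": "user", "content": "{{input}}"},
]

messages = compile_prompt(
    template,
    user_name="John Doe",
    user_email="[email protected]",
    input="I need help with my account",
)
```

The compiled messages can then be passed directly to your LLM client, as compiled_prompt.messages is in the example above.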
For more detailed information about setting up tracing in your application, see the Python Integration Guide or TypeScript Integration Guide.
โ† Back to Prompt Management Overview
โŒ˜I