Learn how to create your first prompt in LangWatch and use it in your application with dynamic variables. This enables your team to update AI interactions without code changes.

Get API keys

  1. Create a LangWatch account or set up self-hosted LangWatch
  2. Create new API credentials in your project settings
  3. Note your API key for use in the steps below
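To make the key available to the SDK used in the examples below, you can expose it as an environment variable. A minimal sketch, assuming the Python SDK reads the LANGWATCH_API_KEY environment variable (set it in your shell or deployment environment rather than hard-coding it):

import langwatch

# Assumes the LANGWATCH_API_KEY environment variable is set with the key
# from your project settings (e.g. exported in your shell or a .env file).
langwatch.setup()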

Create a prompt

Use the LangWatch UI to create a new prompt or update an existing one.
  1. Navigate to your project dashboard
  2. Go to Prompt Management in the sidebar
  3. Click “Create New Prompt”
  4. Fill in the prompt details and save (a sketch of a typical template follows below)
[Screenshot: Editing a prompt in the LangWatch UI]
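The prompt details typically include the model, the message templates, and the variables you plan to fill in at runtime. As an illustration only, assuming LangWatch's double-curly-brace variable syntax, a customer-support template matching the code examples below (saved under the handle customer-support-bot) might look like this:

System prompt:
You are a helpful customer support assistant.
The customer's name is {{user_name}} and their email is {{user_email}}.

User message:
{{input}}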

Use the prompt

At runtime, you can fetch the latest version of your prompt from LangWatch using either the prompt handle or ID.
use_prompt.py
import langwatch
from openai import OpenAI

# Get the latest prompt (can use handle or ID)
prompt = langwatch.prompt.get_prompt("customer-support-bot")  # by handle
# or
# prompt = langwatch.prompt.get_prompt("prompt_TrYXZLsiTJkn9N6PiZiae")  # by ID

# Format messages with variables
messages = prompt.format_messages(
    user_name="John Doe",
    user_email="[email protected]",
    input="How do I reset my password?"
)

# Use with OpenAI client
client = OpenAI()
completion = client.chat.completions.create(
    model=prompt.model.removeprefix("openai/"),  # strip the "openai/" provider prefix
    messages=messages
)

print(completion.choices[0].message.content)
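For reference, format_messages substitutes the variables into the saved template and returns messages in the OpenAI chat format, which is why they can be passed directly to the client above. With a template like the one sketched earlier, the result would look roughly like this (illustrative only; the exact messages depend on what you saved):

# Illustrative shape of the formatted messages (not real SDK output)
messages = [
    {
        "role": "system",
        "content": "You are a helpful customer support assistant.\n"
                   "The customer's name is John Doe and their email is [email protected].",
    },
    {"role": "user", "content": "How do I reset my password?"},
]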
You can link your prompt to LLM generation traces to track performance and see which prompt versions work best. For details, see the Link to Traces page.
tracing.py
import langwatch
from openai import OpenAI

# Initialize LangWatch
langwatch.setup()

# Create a trace function
@langwatch.trace()
def customer_support_generation():
    # Get the prompt (automatically linked to the trace when an API key is present)
    prompt = langwatch.prompt.get_prompt("customer-support-bot")

    # Format messages with variables
    messages = prompt.format_messages(
        user_name="John Doe",
        user_email="[email protected]",
        input="I need help with my account"
    )

    # Use with OpenAI client
    client = OpenAI()
    completion = client.chat.completions.create(
        model=prompt.model.removeprefix("openai/"),  # strip the "openai/" provider prefix
        messages=messages
    )

    return completion.choices[0].message.content

# Call the function
result = customer_support_generation()
