Using Prompts in the Optimization Studio
Use managed prompts in the Optimization Studio to test, version, and optimize them.
Watch: Prompt Management Tutorial
Get a quick visual overview of how to use the prompt management features in LangWatch:
Using Prompts in the Optimization Studio
To get started with prompt versioning in the Optimization Studio:
1. Create a new workflow or open an existing one.
2. Drag a signature node onto the workspace.
3. Click the node to open its configuration options in the right side panel.
4. Make your desired changes to the prompt configuration.
5. Save your changes as a new version.
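Conceptually, each save creates a new immutable version of the node's prompt configuration (template, model, and so on), so earlier versions stay available. The sketch below is a minimal local illustration of that idea only; the class and field names are hypothetical and are not the LangWatch SDK or API.

```python
from dataclasses import dataclass

@dataclass(frozen=True)  # frozen: a saved version is immutable, like a prompt version
class PromptVersion:
    version: int      # hypothetical field names, for illustration only
    model: str
    template: str

    def render(self, **variables) -> str:
        # Fill the template's placeholders with runtime values
        return self.template.format(**variables)

# Two saved versions of the same signature-node prompt:
# editing the configuration and saving produces v2 without touching v1.
v1 = PromptVersion(1, "gpt-4o-mini", "Answer the question: {question}")
v2 = PromptVersion(2, "gpt-4o-mini", "Answer concisely, citing sources: {question}")

print(v2.render(question="What is a signature node?"))
```

Because versions are immutable, you can always re-run or compare against an older version (`v1.render(...)`) after saving a newer one.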