The langwatch prompt command provides dependency management for AI prompts as plain YAML files, enabling you to version prompts locally with Git while synchronizing with the LangWatch platform for testing, evaluation, and team collaboration.

Installation

Install the CLI globally:
npm install -g langwatch
Authenticate:
langwatch login

Quick Start

1. Initialize Your Project

Create a new prompts project:
langwatch prompt init
This will create the following structure to manage your prompts:
├── prompts # directory to create your prompts
│   └── .materialized # where remote fetched prompts will be stored
├── prompts.json # prompt dependencies
└── prompts-lock.json # lock file

2. Add Your First Prompt

Create a local prompt:
langwatch prompt create my-summarizer
Or add an existing remote prompt dependency:
langwatch prompt add agent/customer-service

3. Synchronize

Sync all prompts (fetch remote, push local changes):
langwatch prompt sync
Go to app.langwatch.ai to see your newly synced prompts.

Core Concepts

Dependency Management

The CLI uses two configuration files.
prompts.json - Declares your prompt dependencies:
{
  "prompts": {
    "agent/customer-service": "latest",
    "shared/guidelines": "5",
    "my-local-prompt": "file:./prompts/my-local-prompt.prompt.yaml"
  }
}
prompts-lock.json - Tracks resolved versions and materialized file paths:
{
  "lockfileVersion": 1,
  "prompts": {
    "agent/customer-service": {
      "version": 12,
      "versionId": "prompt_version_scRQwSRMIyJvoSxTqP2nR",
      "materialized": "prompts/.materialized/agent/customer-service.prompt.yaml"
    }
  }
}
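Your application can resolve these files at runtime by reading both configuration files. A minimal sketch in Python (the file names and lock layout follow the examples above; `load_prompt_config` is a hypothetical helper for illustration, not part of the CLI or SDK):

```python
import json

def load_prompt_config(manifest_path="prompts.json", lock_path="prompts-lock.json"):
    """Classify each dependency and resolve it to a YAML file on disk."""
    with open(manifest_path) as f:
        manifest = json.load(f)
    with open(lock_path) as f:
        lock = json.load(f)

    resolved = {}
    for name, spec in manifest["prompts"].items():
        if spec.startswith("file:"):
            # Local prompt: the YAML file lives at the path after the prefix
            resolved[name] = {"kind": "local", "path": spec[len("file:"):]}
        else:
            # Remote prompt: the lock file records the materialized copy
            entry = lock["prompts"][name]
            resolved[name] = {"kind": "remote", "path": entry["materialized"]}
    return resolved
```

Because the lock file pins exact versions and paths, resolving through it (rather than through prompts.json alone) keeps runtime behavior reproducible across machines.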

Local vs Remote Prompts

Remote Prompts (agent/customer-service@latest)
  • Pulled from LangWatch platform
  • Fetched and materialized locally in ./prompts/.materialized/
  • Read-only locally
Local Prompts (file:./prompts/my-prompt.prompt.yaml)
  • Stored as local YAML files
  • Version controlled with Git
  • Pushed to platform during sync for sharing and evaluation

YAML Format

Prompt files use the .prompt.yaml extension and follow this format:
model: openai/gpt-4o-mini
modelParameters:
  temperature: 0.7
  max_tokens: 1000
messages:
  - role: system
    content: You are a helpful assistant specializing in customer service.
  - role: user
    content: |
      Please help the customer with their inquiry:

      {{customer_message}}
This is the same structure as GitHub Prompts.
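Because the format is plain YAML, rendering a prompt at runtime comes down to template substitution on the `{{variable}}` placeholders. A hedged sketch, assuming the `messages` list has already been parsed from the YAML file with any YAML library (`render_prompt` is an illustrative helper, not a LangWatch API):

```python
import re

def render_prompt(messages, variables):
    """Fill {{variable}} placeholders in a parsed prompt's messages."""
    def fill(text):
        return re.sub(r"\{\{(\w+)\}\}", lambda m: str(variables[m.group(1)]), text)
    # Return new message dicts so the parsed template stays reusable
    return [{**msg, "content": fill(msg["content"])} for msg in messages]
```

For example, rendering the user message above with `{"customer_message": "Where is my order?"}` substitutes the placeholder while leaving the system message untouched.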

Commands Reference

langwatch prompt init

Initialize a new prompts project in the current directory.
langwatch prompt init

langwatch prompt add <spec> [localFile]

Add a new prompt dependency and immediately fetch/materialize it.
# Add remote prompt
langwatch prompt add shared/guidelines@latest

# Add specific version
langwatch prompt add agent/support@5

# Add local file as dependency
langwatch prompt add my-prompt ./prompts/my-prompt.prompt.yaml
Arguments:
  • <spec> - Prompt specification (name@version or name for latest)
  • [localFile] - Optional path to local YAML file to add
Behavior:
  • Updates prompts.json with new dependency
  • Fetches prompt from server and materializes locally
  • Updates prompts-lock.json with resolved version
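The `name@version` convention can be parsed in a few lines. This `parse_spec` helper is a hypothetical illustration of the rule that a bare name resolves to latest, not the CLI's actual implementation:

```python
def parse_spec(spec):
    """Split a prompt spec into (name, version); a bare name means "latest"."""
    if "@" in spec:
        # rsplit keeps any "@" that might appear earlier in the name intact
        name, version = spec.rsplit("@", 1)
        return name, version
    return spec, "latest"
```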

langwatch prompt remove <name>

Remove a prompt dependency and clean up associated files.
langwatch prompt remove agent/support
Behavior:
  • Removes entry from prompts.json
  • Removes entry from prompts-lock.json
  • Deletes materialized file
  • For local prompts: deletes source file and warns about server state

langwatch prompt create <name>

Create a new local prompt file with default content.
langwatch prompt create my-new-prompt
Behavior:
  • Creates ./prompts/<name>.prompt.yaml with template content
  • Automatically adds to prompts.json as file: dependency
  • Updates prompts-lock.json

langwatch prompt sync

Synchronize all prompts between local files and the server.
langwatch prompt sync
Behavior:
  • Fetches remote prompts if new versions available
  • Pushes local prompt changes to server
  • Handles conflict resolution interactively
  • Cleans up orphaned materialized files
  • Reports what was synced
Conflict Resolution: When local and remote versions have both changed, the CLI prompts you to choose a resolution:
Conflict detected for my-prompt:
Local changes: Updated system message
Remote changes: Added temperature parameter

Choose resolution:
  l) Use local version (will create new version on server)
  r) Use remote version (will overwrite local file)
  a) Abort sync for this prompt
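Conceptually, conflict detection is a three-way comparison: the local file and the remote version are each checked against the last-synced baseline recorded in the lock file. The real CLI's logic is internal; this `sync_state` helper is only a hypothetical sketch of the idea:

```python
import hashlib

def sync_state(local_text, locked_hash, locked_version, remote_version):
    """Classify a prompt's sync state via a three-way comparison."""
    local_changed = hashlib.sha256(local_text.encode()).hexdigest() != locked_hash
    remote_changed = remote_version != locked_version
    if local_changed and remote_changed:
        return "conflict"     # both sides moved: ask the user (l/r/a)
    if local_changed:
        return "push"         # only local edits: create a new server version
    if remote_changed:
        return "pull"         # only remote edits: re-materialize locally
    return "up-to-date"
```

Only the "conflict" case requires interaction; the other three states can be resolved automatically during sync.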

langwatch prompt list

Display current prompt dependencies and their status.
langwatch prompt list

CI/CD Integration

Integrate prompt materialization into your deployment pipeline:
.github/workflows/deploy.yml
name: Deploy with Prompts

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: "18"

      - name: Install LangWatch CLI
        run: npm install -g langwatch

      - name: Materialize prompts
        env:
          LANGWATCH_API_KEY: ${{ secrets.LANGWATCH_API_KEY }}
        run: langwatch prompt sync

      - name: Build application
        run: npm run build

      - name: Deploy with materialized prompts
        run: |
          # Deploy application including prompts/.materialized/
          # Your deployment commands here
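Before deploying, you may want to fail the build fast if any locked prompt is missing its materialized file. A minimal sketch, assuming the prompts-lock.json layout shown earlier (`check_materialized` is a hypothetical post-sync check, not a CLI command):

```python
import json
import os

def check_materialized(lock_path="prompts-lock.json"):
    """Return the names of locked prompts whose materialized file is missing."""
    with open(lock_path) as f:
        lock = json.load(f)
    return [name for name, entry in lock["prompts"].items()
            if "materialized" in entry and not os.path.exists(entry["materialized"])]
```

An empty return value means every remote prompt was materialized; a non-empty list can be printed and used to exit the CI step with a non-zero status.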

Workflows

Team Collaboration

Setup:
  1. One team member initializes project with langwatch prompt init
  2. Commit prompts.json and prompts-lock.json to Git
  3. Add prompts/.materialized to .gitignore
  4. Team members run langwatch prompt sync after pulling
Adding Shared Prompts:
# Add organization prompts
langwatch prompt add company/brand-guidelines@latest
langwatch prompt add legal/privacy-policy@^2

# Sync to materialize
langwatch prompt sync
Creating Local Prompts:
# Create and develop locally
langwatch prompt create feature/user-onboarding

# Edit the file
vim prompts/feature/user-onboarding.prompt.yaml

# Push to platform for evaluation
langwatch prompt sync

Version Management

Pinning Versions:
{
  "prompts": {
    "critical/system-prompt": "5", // Exact version
    "experimental/new-feature": "latest" // Auto-update
  }
}
Upgrading Dependencies:
# Edit prompts.json to change versions
# Then sync to fetch new versions
langwatch prompt sync
Rolling Back:
# Restore from Git
git checkout prompts.json prompts-lock.json

# Re-sync to match committed state
langwatch prompt sync
Development Workflow:
# Create new feature prompt
langwatch prompt create features/new-capability

# Edit and test locally
# When ready, sync to platform
langwatch prompt sync

# Platform team can now evaluate and provide feedback

Coding Assistant Integration

Since prompts are just YAML files, you can refer to them directly from other tools and coding assistants.

Cursor Integration

Reference prompts in a .cursor/rules/*.mdc file:
---
description:
globs:
alwaysApply: true
---

@/prompts/.materialized/company/java-code-guidelines.prompt.yaml

Cloud Code Integration

Include prompt content in cloud development environments by referencing the YAML files in the prompts/.materialized directory.