Guaranteed availability ensures your application can keep using its prompts even when disconnected from the LangWatch platform. This is achieved through local prompt materialization using the Prompts CLI.

How It Works

When you use the Prompts CLI to manage prompt dependencies, prompts are materialized locally as standard YAML files. The LangWatch SDKs automatically detect and use these materialized prompts when available, falling back to the LangWatch API only when no local copy exists. Benefits:
  • Offline operation - Your application works without internet connectivity
  • Air-gapped deployments - Deploy in secure environments with no external access
  • Reduced latency - No network calls for prompt retrieval
  • Guaranteed consistency - Prompts are locked to specific versions in your deployment

Setting Up Local Materialization

1. Initialize Prompt Dependencies

# Install CLI and authenticate
npm install -g langwatch
langwatch login

# Initialize in your project
langwatch prompt init

2. Add Prompt Dependencies

Add the prompts your application needs:
# Add specific prompts your app uses
langwatch prompt add customer-support-bot@5
langwatch prompt add data-analyzer@latest
langwatch prompt add error-handler@3
This creates a prompts.json file:
{
  "prompts": {
    "customer-support-bot": "5",
    "data-analyzer": "latest",
    "error-handler": "3"
  }
}

3. Materialize Prompts Locally

# Fetch and materialize all prompts locally
langwatch prompt sync
This creates materialized YAML files:
prompts/
└── .materialized/
    ├── customer-support-bot.prompt.yaml
    ├── data-analyzer.prompt.yaml
    └── error-handler.prompt.yaml

4. Deploy with Materialized Prompts

Include the materialized prompts in your deployment package. Your application can now run completely offline.
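If your build produces a self-contained output directory, one straightforward approach is to copy prompts/.materialized/ into it so the relative layout the SDK looks for is preserved. A minimal sketch using Python's standard library; the dist/ output path is an assumption about your build layout:
import shutil
from pathlib import Path

# Source: the directory created by `langwatch prompt sync`.
# Destination: wherever your deployment artifact is assembled ("dist/" is an assumption).
src = Path("prompts/.materialized")
dst = Path("dist/prompts/.materialized")

# Copy the materialized files, keeping the prompts/.materialized/ layout
# the SDK resolves relative to the working directory.
shutil.copytree(src, dst, dirs_exist_ok=True)
print(f"Bundled {len(list(dst.glob('*.prompt.yaml')))} prompt(s) into {dst}")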

Using Materialized Prompts in Code

The SDKs automatically detect and use materialized prompts when available, falling back to API calls only when necessary.
offline_app.py
import langwatch
from litellm import completion

# Initialize LangWatch
langwatch.setup()

# The SDK will automatically use materialized prompts if available
# No network call needed if prompt is materialized locally
prompt = langwatch.prompts.get("customer-support-bot")

# Compile prompt with variables
compiled_prompt = prompt.compile(
    user_name="John Doe",
    user_email="[email protected]",
    input="How do I reset my password?"
)

# Use with LiteLLM (no need to strip provider prefixes)
response = completion(
    model=compiled_prompt.model,
    messages=compiled_prompt.messages
)

print(response.choices[0].message.content)
Behavior:
  1. SDK checks for ./prompts/.materialized/customer-support-bot.prompt.yaml
  2. If found, loads prompt from local file (no network call)
  3. If not found, attempts to fetch from LangWatch API
  4. Throws error if both local file and API are unavailable
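If both the local file and the API are unavailable, the lookup fails (step 4 above). An application that must keep answering can catch that failure and fall back to a default prompt kept in the codebase. A minimal sketch; the SDK's exact exception type is not shown here, so a broad except is used, and the fallback messages are an assumption:
import langwatch

langwatch.setup()

# Assumed last-resort prompt, kept in the codebase rather than fetched.
FALLBACK_MESSAGES = [
    {"role": "system", "content": "You are a helpful customer support assistant."},
]

def get_support_messages(**variables):
    try:
        # Served from prompts/.materialized/ when present, otherwise from the API.
        prompt = langwatch.prompts.get("customer-support-bot")
        return prompt.compile(**variables).messages
    except Exception:
        # Neither the materialized file nor the API was reachable.
        return FALLBACK_MESSAGES + [
            {"role": "user", "content": variables.get("input", "")}
        ]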

Air-Gapped Deployment

For completely air-gapped environments:

1. Prepare on Connected Environment

# On development machine with internet access
langwatch prompt sync

# Verify all prompts are materialized
ls prompts/.materialized/
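Beyond a directory listing, you can cross-check the materialized files against prompts.json before packaging. A minimal sketch, assuming the prompts.json format from step 2, the <name>.prompt.yaml naming shown above, and that prompts.json sits in the directory where the commands are run:
import json
import sys
from pathlib import Path

# Declared prompt dependencies (prompts.json format shown in step 2).
deps = json.loads(Path("prompts.json").read_text())["prompts"]

# Each declared prompt should have a matching materialized YAML file.
materialized_dir = Path("prompts/.materialized")
missing = [
    name for name in deps
    if not (materialized_dir / f"{name}.prompt.yaml").exists()
]

if missing:
    print(f"Missing materialized prompts: {', '.join(missing)}")
    sys.exit(1)
print(f"All {len(deps)} prompt(s) are materialized.")
The same check works as a pipeline step right after langwatch prompt sync, failing the build early if a prompt did not materialize.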

2. Package for Deployment

Include these files in your deployment package:
  • prompts/.materialized/ directory (all YAML files)
  • Your application code
  • Dependencies

3. Deploy to Air-Gapped Environment

The application will run entirely offline, using only materialized prompts. No LangWatch API access required.

CI/CD Integration

Integrate prompt materialization into your deployment pipeline:
.github/workflows/deploy.yml
name: Deploy with Prompts

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '18'

      - name: Install LangWatch CLI
        run: npm install -g langwatch

      - name: Materialize prompts
        env:
          LANGWATCH_API_KEY: ${{ secrets.LANGWATCH_API_KEY }}
        run: langwatch prompt sync

      - name: Build application
        run: npm run build

      - name: Deploy with materialized prompts
        run: |
          # Deploy application including prompts/.materialized/
          # Your deployment commands here