How It Works
When you use the Prompts CLI to manage dependencies, prompts are materialized locally as standard YAML files. The LangWatch SDKs automatically detect and use these materialized prompts when available, providing seamless fallback behavior.

Benefits:

- Offline operation - Your application works without internet connectivity
- Air-gapped deployments - Deploy in secure environments with no external access
- Reduced latency - No network calls for prompt retrieval
- Guaranteed consistency - Prompts are locked to specific versions in your deployment
Setting Up Local Materialization
1. Initialize Prompt Dependencies
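The exact subcommand names below are assumptions, not verbatim from this page; check your CLI's help output (e.g. langwatch prompt --help) for the current syntax. A sketch of the initialization step, which creates the prompts.json manifest referenced later:

```shell
# Initialize prompt dependency tracking in your project
# (assumed subcommand; creates a prompts.json manifest)
langwatch prompt init
```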
2. Add Prompt Dependencies
Add the prompts your application needs to your prompts.json file:
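A sketch of adding a dependency, using the customer-support-bot prompt that appears later in this page; the subcommand name is an assumption:

```shell
# Record a prompt dependency in prompts.json (assumed subcommand)
langwatch prompt add customer-support-bot
```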
3. Materialize Prompts Locally
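A sketch of the materialization step, which downloads the pinned prompt versions as YAML files into the prompts/.materialized/ directory described below; the subcommand name is an assumption:

```shell
# Fetch pinned prompt versions and write them to prompts/.materialized/
# (assumed subcommand name)
langwatch prompt sync
```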
4. Deploy with Materialized Prompts
Include the materialized prompts in your deployment package. Your application can now run completely offline.

Using Materialized Prompts in Code
The SDKs automatically detect and use materialized prompts when available, falling back to API calls only when necessary. In your application code (offline_app.py), requesting a prompt works as follows:
- SDK checks for ./prompts/.materialized/customer-support-bot.prompt.yaml
- If found, loads prompt from local file (no network call)
- If not found, attempts to fetch from LangWatch API
- Throws error if both local file and API are unavailable
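The resolution order above can be sketched in plain Python. This is a simplified stand-in for the SDK's internal logic, not actual SDK code; fetch_from_api is a hypothetical placeholder for the API call:

```python
from pathlib import Path

# Directory where the Prompts CLI materializes YAML files
MATERIALIZED_DIR = Path("./prompts/.materialized")

def load_prompt(prompt_id: str, fetch_from_api=None) -> str:
    """Return prompt YAML text: local file first, then API, else error."""
    local_file = MATERIALIZED_DIR / f"{prompt_id}.prompt.yaml"
    if local_file.exists():
        # Found locally: no network call is made
        return local_file.read_text()
    if fetch_from_api is not None:
        # No local copy: fall back to fetching from the LangWatch API
        return fetch_from_api(prompt_id)
    # Neither the local file nor the API is available
    raise RuntimeError(f"Prompt '{prompt_id}' unavailable locally and no API access")
```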
Air-Gapped Deployment
For completely air-gapped environments:

1. Prepare on Connected Environment
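On a machine that still has LangWatch access, record and materialize all dependencies before packaging; the subcommand names here are assumptions:

```shell
# On a connected machine (assumed subcommand names):
langwatch prompt add customer-support-bot   # record the dependency
langwatch prompt sync                       # download the YAML files locally
```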
2. Package for Deployment
Include these files in your deployment package:

- prompts/.materialized/ directory (all YAML files)
- Your application code
- Dependencies
3. Deploy to Air-Gapped Environment
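One way to bundle everything for transfer; the file names are illustrative, not prescribed by this page:

```shell
# Bundle the app together with the materialized prompts, then move the
# archive into the air-gapped environment via your approved transfer process
tar czf app-bundle.tar.gz prompts/.materialized/ offline_app.py requirements.txt
```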
The application will run entirely offline, using only materialized prompts. No LangWatch API access required.

CI/CD Integration
Integrate prompt materialization into your deployment pipeline (.github/workflows/deploy.yml):
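A sketch of what such a workflow might look like. The CLI install command, package name, secret name, and deploy step are assumptions for illustration, not verbatim from LangWatch docs:

```yaml
name: Deploy
on:
  push:
    branches: [main]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Materialize prompts
        env:
          LANGWATCH_API_KEY: ${{ secrets.LANGWATCH_API_KEY }}  # assumed secret name
        run: |
          npm install -g @langwatch/cli   # assumed package name
          langwatch prompt sync           # assumed subcommand
      - name: Deploy application
        run: ./scripts/deploy.sh          # your existing deploy step
```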