If you are building a custom DSPy optimizer, LangWatch won’t track it out of the box, but adding tracking to any custom optimizer is also very simple.

1. Initialize LangWatch DSPy with optimizer=None

Before the compilation step, explicitly pass None to the optimizer parameter so you can track the steps manually:

import langwatch

langwatch.dspy.init(experiment="dspy-custom-optimizer-example", optimizer=None)

compiled_rag = my_awesome_optimizer.compile(RAG(), trainset=trainset)

2. Track the metric function

Either before instantiating your optimizer, or inside the compilation step, don’t forget to wrap the metric function with langwatch.dspy.track_metric so that every evaluation is tracked:

metric = langwatch.dspy.track_metric(metric)
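
For example, assuming a simple exact-match metric (the metric name and the optimizer constructor below are illustrative; any DSPy-style metric with the (example, prediction, trace=None) signature works), the wrapped metric is then passed to your optimizer as usual:

def exact_match_metric(example, prediction, trace=None):
    # Hypothetical metric: compare the predicted answer to the gold answer
    return example.answer.lower() == prediction.answer.lower()

# Wrap it so every evaluation is reported to LangWatch
metric = langwatch.dspy.track_metric(exact_match_metric)

# Hypothetical custom optimizer from step 1
my_awesome_optimizer = MyAwesomeOptimizer(metric=metric)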

3. Track each step

Now, at each step your optimizer takes, call langwatch.dspy.log_step to capture the score at the current step index, the optimizer info, and the predictors being used in this step’s evaluation:

from langwatch.dspy import DSPyOptimizer

langwatch.dspy.log_step(
    optimizer=DSPyOptimizer(
        name="MyAwesomeOptimizer",
        parameters={
            "hyperparam": 1,
        },
    ),
    index="1",  # step index
    score=0.5,
    label="score",
    predictors=candidate_program.predictors(),
)

The LLM calls and the examples being evaluated will be tracked automatically and logged together with each call to log_step.
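
Putting the three steps together, here is a minimal sketch of a custom optimizer with tracking wired in. Only track_metric, log_step and DSPyOptimizer are the real integration points; MyAwesomeOptimizer itself, its candidate generation (a plain deepcopy here), and the evaluation loop are placeholders for your own logic:

import langwatch.dspy
from langwatch.dspy import DSPyOptimizer

class MyAwesomeOptimizer:
    """Hypothetical custom optimizer sketch; replace the internals with your own search logic."""

    def __init__(self, metric, hyperparam=1, steps=5):
        # Step 2: wrap the metric so every evaluation is reported to LangWatch
        self.metric = langwatch.dspy.track_metric(metric)
        self.hyperparam = hyperparam
        self.steps = steps

    def evaluate(self, program, trainset):
        # Average the tracked metric over the training set
        scores = [self.metric(example, program(**example.inputs())) for example in trainset]
        return sum(scores) / len(scores)

    def compile(self, student, trainset):
        best_program, best_score = student, 0.0
        for step in range(self.steps):
            candidate_program = student.deepcopy()  # stand-in for your own candidate generation
            score = self.evaluate(candidate_program, trainset)

            # Step 3: report this step to LangWatch
            langwatch.dspy.log_step(
                optimizer=DSPyOptimizer(
                    name="MyAwesomeOptimizer",
                    parameters={"hyperparam": self.hyperparam},
                ),
                index=str(step),
                score=score,
                label="score",
                predictors=candidate_program.predictors(),
            )

            if score > best_score:
                best_program, best_score = candidate_program, score
        return best_program

# Usage (after calling langwatch.dspy.init(..., optimizer=None) as in step 1):
# compiled_rag = MyAwesomeOptimizer(metric=exact_match_metric).compile(RAG(), trainset=trainset)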

Wrapping up

That’s it! You should now see the steps of your optimizer in the LangWatch dashboard.

For any questions or issues, feel free to contact our support, join our Discord channel, or open an issue on our GitHub.