Learn how to trace local LLMs running via Ollama in Go using the LangWatch SDK.
Since Ollama exposes an OpenAI-compatible API, you can trace it with the `otelopenai` middleware. By default, Ollama serves this API locally at `http://localhost:11434`. Configure the `openai.Client` to point to the local Ollama endpoint. While Ollama doesn't require an API key, the openai-go library requires one to be set, so you can use a non-empty string like `"ollama"`.
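
A minimal sketch of this configuration, assuming the LangWatch Go SDK's `otelopenai` instrumentation package and the `openai-go` client. The app name, model, and capture options are illustrative, and `openai-go` parameter syntax varies slightly between versions:

```go
package main

import (
	"context"
	"fmt"

	otelopenai "github.com/langwatch/langwatch/sdk-go/instrumentation/openai"
	"github.com/openai/openai-go"
	"github.com/openai/openai-go/option"
)

func main() {
	ctx := context.Background()

	client := openai.NewClient(
		// Ollama's OpenAI-compatible API is served under /v1.
		option.WithBaseURL("http://localhost:11434/v1"),
		// Ollama ignores the key, but openai-go requires a non-empty value.
		option.WithAPIKey("ollama"),
		// Attach the LangWatch tracing middleware ("my-ollama-app" is a placeholder name).
		option.WithMiddleware(otelopenai.Middleware("my-ollama-app",
			otelopenai.WithCaptureInput(),
			otelopenai.WithCaptureOutput(),
		)),
	)

	// Call a locally pulled model, e.g. llama3.
	resp, err := client.Chat.Completions.New(ctx, openai.ChatCompletionNewParams{
		Model: "llama3",
		Messages: []openai.ChatCompletionMessageParamUnion{
			openai.UserMessage("Hello from Ollama!"),
		},
	})
	if err != nil {
		panic(err)
	}
	fmt.Println(resp.Choices[0].Message.Content)
}
```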
Set the `gen_ai.system` attribute to `"ollama"` to identify the provider in your LangWatch traces.
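
If your SDK version doesn't set this attribute for you, one way is to wrap the request in your own OpenTelemetry span and attach it there; a minimal sketch using the standard OTel API (the helper, tracer name, and span name are hypothetical):

```go
package trace

import (
	"context"

	"go.opentelemetry.io/otel"
	"go.opentelemetry.io/otel/attribute"
)

// traceOllamaCall wraps an Ollama request in a span carrying
// gen_ai.system = "ollama" so LangWatch can attribute the provider.
// (Hypothetical helper; the tracer and span names are placeholders.)
func traceOllamaCall(ctx context.Context, do func(context.Context) error) error {
	ctx, span := otel.Tracer("my-ollama-app").Start(ctx, "ollama.chat")
	defer span.End()
	span.SetAttributes(attribute.String("gen_ai.system", "ollama"))
	return do(ctx)
}
```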