Automatic LLM observability for OpenCode using Langfuse via OpenTelemetry.
Zero-config tracing of sessions, messages, tool calls, costs, and performance.
## Installation

```bash
npm install opencode-plugin-langfuse
# or
bun add opencode-plugin-langfuse
```

## Setup

Sign up at cloud.langfuse.com and create a project.
Go to Settings → API Keys and copy your keys.
```bash
export LANGFUSE_PUBLIC_KEY="pk-lf-..."
export LANGFUSE_SECRET_KEY="sk-lf-..."
export LANGFUSE_BASEURL="https://cloud.langfuse.com" # Optional
```

## Configuration

In `.opencode/opencode.json`:
```json
{
  "experimental": {
    "openTelemetry": true
  },
  "plugin": ["opencode-plugin-langfuse"]
}
```

That's it! All traces appear automatically in your Langfuse dashboard.
## How it works

This plugin initializes a `LangfuseSpanProcessor` that captures all OpenTelemetry spans emitted by OpenCode when `experimental.openTelemetry` is enabled.

```
OpenCode (OTEL spans) → LangfuseSpanProcessor → Langfuse Dashboard
```
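Conceptually, that wiring looks like the sketch below. This is not the plugin's actual source; the `@langfuse/otel` package name and the exact `LangfuseSpanProcessor` constructor options are assumptions based on the Langfuse JS SDK, so treat it as an outline of the approach rather than a drop-in snippet.

```typescript
// Illustrative sketch only, not the plugin's source code.
// Assumes the @langfuse/otel package and its LangfuseSpanProcessor export;
// check the Langfuse JS SDK docs for the exact option names.
import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";
import { LangfuseSpanProcessor } from "@langfuse/otel";

const provider = new NodeTracerProvider({
  // Forward every finished OTEL span to Langfuse.
  spanProcessors: [
    new LangfuseSpanProcessor({
      publicKey: process.env.LANGFUSE_PUBLIC_KEY,
      secretKey: process.env.LANGFUSE_SECRET_KEY,
      baseUrl: process.env.LANGFUSE_BASEURL ?? "https://cloud.langfuse.com",
    }),
  ],
});

// Register as the global tracer provider so OpenCode's spans reach the processor.
provider.register();
```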
## Environment Variables

| Variable | Required | Default | Description |
|---|---|---|---|
| `LANGFUSE_PUBLIC_KEY` | Yes | - | Langfuse public key |
| `LANGFUSE_SECRET_KEY` | Yes | - | Langfuse secret key |
| `LANGFUSE_BASEURL` | No | `https://cloud.langfuse.com` | Base URL of your Langfuse instance (set for self-hosted deployments) |
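As a rough illustration of how these variables are consumed, the sketch below shows a hypothetical resolver (not part of the plugin's API): the two keys are required, and the base URL falls back to Langfuse Cloud when unset.

```typescript
// Hypothetical helper, shown only to illustrate the table above.
function resolveLangfuseConfig() {
  const publicKey = process.env.LANGFUSE_PUBLIC_KEY;
  const secretKey = process.env.LANGFUSE_SECRET_KEY;

  // Both keys are required; tracing cannot start without them.
  if (!publicKey || !secretKey) {
    throw new Error("LANGFUSE_PUBLIC_KEY and LANGFUSE_SECRET_KEY must be set");
  }

  return {
    publicKey,
    secretKey,
    // LANGFUSE_BASEURL is optional and defaults to Langfuse Cloud.
    baseUrl: process.env.LANGFUSE_BASEURL ?? "https://cloud.langfuse.com",
  };
}
```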
## Self-hosting

```bash
export LANGFUSE_BASEURL="https://langfuse.yourcompany.com"
```

See the Langfuse self-hosting docs.
## Troubleshooting

- Verify `experimental.openTelemetry: true` is set
- Check credentials: `echo $LANGFUSE_PUBLIC_KEY`
- Check Langfuse health: `curl https://cloud.langfuse.com/api/public/health`
- Ensure `opencode-plugin-langfuse` is in `dependencies` (not `devDependencies`)
- Verify `.opencode/opencode.json` syntax
## License

MIT © omercnet