PyPI-distributable LLM control plane: gateway choke point, cost attribution, OTel instrumentation, and offline reporting as an inspectable engineering artifact.
Databricks App for cross-workspace AI stack governance. Unified view of agents, serving endpoints, knowledge bases, and AI Gateway usage, backed by system tables + Lakebase with auto-refreshing discovery workflows.
Lightweight AWS Lambda proxy for OpenAI Codex CLI — per-developer token auth, full request/response logging, cost attribution, and centralized access control.
OTel-native typed primitives for LLM cost attribution and telemetry — published on PyPI.
Policy-bounded decision traces for AI-assisted financial operations: versioned policy enforcement, human-in-the-loop gating, cost attribution, and auditable case outcomes.
Pre-dispatch policy evaluation and cost attribution for LLM inference, built on llmscope.
FinOps cost attribution for AI code agents: maps Kiro, Cursor, and Claude Code spend to git commits to reveal per-task cost, waste patterns, and agent ROI signals.
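Mapping agent spend to commits can be sketched as a timestamp join: each spend event is attributed to the next commit at or after it. This is a minimal illustration of the idea, not the repo's actual implementation; the data shapes are assumed.

```python
from bisect import bisect_left

def attribute_spend(commits, spend_events):
    """Attribute each spend event to the first commit at or after it.

    commits: list of (unix_ts, sha), sorted by timestamp.
    spend_events: list of (unix_ts, usd) agent spend records.
    Returns {sha: total_usd}; events after the last commit land in
    an "uncommitted" bucket.
    """
    times = [t for t, _ in commits]
    totals = {}
    for ts, usd in spend_events:
        i = bisect_left(times, ts)  # first commit with time >= event time
        key = commits[i][1] if i < len(times) else "uncommitted"
        totals[key] = round(totals.get(key, 0.0) + usd, 6)
    return totals

commits = [(100, "abc123"), (200, "def456")]
events = [(50, 0.12), (150, 0.30), (180, 0.05), (250, 0.02)]
print(attribute_spend(commits, events))
```

A real tool would also handle branch history and long-running sessions that span several commits; the nearest-following-commit rule is just the simplest per-task heuristic.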
AI FinOps engine: GPU and LLM cost attribution for Kubernetes.
Trace and cost capture for Azure-backed LLM calls with local aggregation and deployment-level summaries.
Per-subagent cost attribution for Claude Code. Reads local JSONL session logs and computes shadow cost at marginal Anthropic API rates, including prompt-cache TTL pricing.
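The shadow-cost pattern above reduces to summing token counts from JSONL usage records against a rate table. A minimal sketch, assuming hypothetical field names (`agent`, `usage.input_tokens`, `usage.output_tokens`) and placeholder per-million-token rates rather than real Anthropic pricing:

```python
import json

# Assumed USD per 1M tokens; real model pricing varies and includes cache tiers.
RATES = {"input": 3.00, "output": 15.00}

def shadow_cost_by_agent(jsonl_lines):
    """Sum marginal API cost per subagent from JSONL usage records."""
    totals = {}
    for line in jsonl_lines:
        rec = json.loads(line)
        usage = rec.get("usage", {})
        cost = (usage.get("input_tokens", 0) * RATES["input"]
                + usage.get("output_tokens", 0) * RATES["output"]) / 1_000_000
        agent = rec.get("agent", "main")
        totals[agent] = totals.get(agent, 0.0) + cost
    return totals

log = [
    '{"agent": "planner", "usage": {"input_tokens": 2000, "output_tokens": 500}}',
    '{"agent": "coder", "usage": {"input_tokens": 10000, "output_tokens": 4000}}',
]
print(shadow_cost_by_agent(log))
```

A production version would key rates by model and apply cache-read and cache-write tiers per TTL; the per-agent grouping shown here is the attribution core.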