# Architecture
llm-usage-metrics is organized as a reporting pipeline:
- parse source data (pi, codex, opencode)
- normalize events into shared domain objects
- resolve pricing and estimate unresolved costs
- aggregate by period/source/model
- render (terminal, json, markdown)
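The stages above can be sketched as plain functions over a normalized event type. This is a minimal illustration, not the tool's actual API: `UsageEvent`, `resolveCosts`, `aggregate`, and the flat per-token rate are all simplifying assumptions.

```typescript
// Hypothetical, simplified shapes; the real domain objects carry more fields.
type Source = "pi" | "codex" | "opencode";

interface UsageEvent {
  source: Source;
  model: string;
  inputTokens: number;
  outputTokens: number;
  costUsd?: number; // undefined when the source did not report a cost
}

// Pricing stage: keep reported costs, estimate the unresolved ones
// (here with an assumed flat per-token rate for illustration).
function resolveCosts(events: UsageEvent[], perTokenUsd: number): UsageEvent[] {
  return events.map((e) => ({
    ...e,
    costUsd: e.costUsd ?? (e.inputTokens + e.outputTokens) * perTokenUsd,
  }));
}

// Aggregation stage: total cost keyed by source/model bucket.
function aggregate(events: UsageEvent[]): Map<string, number> {
  const totals = new Map<string, number>();
  for (const e of events) {
    const key = `${e.source}/${e.model}`;
    totals.set(key, (totals.get(key) ?? 0) + (e.costUsd ?? 0));
  }
  return totals;
}

const events: UsageEvent[] = [
  { source: "pi", model: "m1", inputTokens: 100, outputTokens: 50, costUsd: 0.02 },
  { source: "pi", model: "m1", inputTokens: 200, outputTokens: 100 }, // no cost reported
];
const totals = aggregate(resolveCosts(events, 0.0001));
console.log(totals.get("pi/m1")); // 0.02 reported + 300 tokens estimated ≈ 0.05
```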
## Runtime flow

CLI entrypoint → Update notifier → Command parser → buildUsageData → Source adapters → Normalized UsageEvent stream → Pricing resolution → Aggregation → renderUsageReport → stdout report

buildUsageData → emitDiagnostics → stderr diagnostics
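The stream split at the end of the flow can be sketched as follows. The shapes of `UsageData` and `Diagnostic` here are assumptions for illustration; only the contract matters: the report goes to stdout, diagnostics go to stderr.

```typescript
// Hypothetical types; the real buildUsageData returns richer structures.
interface Diagnostic { level: "warn" | "error"; message: string; }
interface UsageData { report: string; diagnostics: Diagnostic[]; }

// Stand-in for adapter parsing, pricing resolution, and aggregation.
function buildUsageData(): UsageData {
  return {
    report: "total: $0.05",
    diagnostics: [{ level: "warn", message: "1 event had no resolved price" }],
  };
}

// Diagnostics never touch stdout, so piped JSON/Markdown stays clean.
function emitDiagnostics(diags: Diagnostic[]): void {
  for (const d of diags) process.stderr.write(`[${d.level}] ${d.message}\n`);
}

const data = buildUsageData();
emitDiagnostics(data.diagnostics);        // stderr: diagnostics only
process.stdout.write(data.report + "\n"); // stdout: data only
```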
## Module boundaries

- src/cli: orchestration, option handling, diagnostics emission
- src/sources: source adapters + discovery/parsing concerns
- src/domain: normalized contracts and constructors
- src/pricing: pricing loader + cost engine
- src/aggregate: period/source bucketing and totals
- src/render: output formatters
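One way to picture the src/sources boundary is a shared adapter contract that keeps discovery and parsing behind one interface per source. The interface, the toy adapter, and the placeholder path below are all illustrative assumptions, not the project's actual types.

```typescript
// Hypothetical adapter contract for src/sources.
interface SourceAdapter {
  readonly name: string;
  discover(): string[];           // locate raw usage files for this source
  parse(raw: string): unknown[];  // source-specific parsing stays isolated here
}

// Toy adapter: parses newline-delimited JSON, as a stand-in for a real format.
const exampleAdapter: SourceAdapter = {
  name: "codex",
  discover: () => ["/tmp/example-usage.jsonl"], // placeholder path
  parse: (raw) =>
    raw
      .split("\n")
      .filter((line) => line.trim().length > 0)
      .map((line) => JSON.parse(line)),
};

const parsed = exampleAdapter.parse('{"model":"m1"}\n{"model":"m2"}\n');
console.log(parsed.length); // 2
```

Downstream code in src/domain then normalizes these raw entries into shared UsageEvent objects, so nothing outside the adapter depends on a source's on-disk format.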
## Design principles

- source-specific parsing is isolated per adapter
- stdout remains data-only for JSON/Markdown modes
- diagnostics are emitted to stderr
- sorting and aggregation are deterministic
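Deterministic ordering typically means sorting buckets by a stable composite key rather than relying on insertion or map order. A minimal sketch, assuming illustrative field names:

```typescript
// Hypothetical bucket shape; field names are assumptions for illustration.
interface Bucket { period: string; source: string; model: string; total: number; }

function sortBuckets(buckets: Bucket[]): Bucket[] {
  // Copy, then sort by period, then source, then model, so identical
  // inputs always render in the same order on every run.
  return [...buckets].sort(
    (a, b) =>
      a.period.localeCompare(b.period) ||
      a.source.localeCompare(b.source) ||
      a.model.localeCompare(b.model),
  );
}

const sorted = sortBuckets([
  { period: "2024-02", source: "pi", model: "m1", total: 1 },
  { period: "2024-01", source: "codex", model: "m2", total: 2 },
  { period: "2024-01", source: "codex", model: "m1", total: 3 },
]);
console.log(sorted.map((b) => `${b.period}/${b.source}/${b.model}`));
// order: 2024-01/codex/m1, 2024-01/codex/m2, 2024-02/pi/m1
```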