Architecture

llm-usage-metrics is organized as a reporting pipeline:

  1. parse source data (pi, codex, opencode)
  2. normalize events into shared domain objects
  3. resolve pricing and estimate unresolved costs
  4. aggregate by period/source/model
  5. render (terminal, json, markdown)
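The five stages can be sketched end to end. Everything below is illustrative: the types, the `resolveCost`/`aggregate` names, and the per-token rates are assumptions for the sketch, not the project's real API.

```typescript
// Hypothetical sketch of the pipeline stages; names are illustrative,
// not llm-usage-metrics' real API.
interface RawRecord {
  source: string;        // e.g. "pi", "codex", "opencode"
  model: string;
  inputTokens: number;
  outputTokens: number;
}

interface UsageEvent extends RawRecord {
  costUsd: number;       // resolved from pricing, or estimated
}

// Stage 3: resolve pricing (assumed per-token USD rates).
const pricing: Record<string, { input: number; output: number }> = {
  "model-a": { input: 0.000001, output: 0.000002 },
};

function resolveCost(r: RawRecord): UsageEvent {
  // Unknown models fall back to a zero-cost estimate in this sketch.
  const p = pricing[r.model] ?? { input: 0, output: 0 };
  return { ...r, costUsd: r.inputTokens * p.input + r.outputTokens * p.output };
}

// Stage 4: aggregate totals by source/model.
function aggregate(events: UsageEvent[]): Map<string, number> {
  const totals = new Map<string, number>();
  for (const e of events) {
    const key = `${e.source}/${e.model}`;
    totals.set(key, (totals.get(key) ?? 0) + e.costUsd);
  }
  return totals;
}

// Stages 1-2 collapsed into an in-memory fixture; stage 5 renders to the terminal.
const events: UsageEvent[] = [
  { source: "codex", model: "model-a", inputTokens: 1000, outputTokens: 500 },
  { source: "pi", model: "model-a", inputTokens: 2000, outputTokens: 100 },
].map(resolveCost);

for (const [key, cost] of aggregate(events)) {
  console.log(`${key}\t$${cost.toFixed(6)}`);
}
```

Keeping each stage a pure function over plain data is what lets the later stages stay source-agnostic.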
The end-to-end flow:

  CLI entrypoint → Update notifier → Command parser → buildUsageData
    → Source adapters → Normalized UsageEvent stream → Pricing resolution
    → Aggregation → renderUsageReport → stdout report

  buildUsageData → emitDiagnostics → stderr diagnostics
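The two sinks in the flow (stdout for the report, stderr for diagnostics) suggest a shape like the following. The return type and messages here are assumptions, not the real signatures:

```typescript
// Hypothetical shapes: buildUsageData returns data plus diagnostics,
// and only the entrypoint decides which stream each goes to.
interface UsageData {
  report: string;        // rendered report body
  diagnostics: string[]; // parse warnings, unresolved pricing, etc.
}

function buildUsageData(): UsageData {
  // Stand-in for parse -> normalize -> price -> aggregate -> render.
  return {
    report: '{"totalCostUsd": 0.0042}',
    diagnostics: ["unresolved pricing for model-b; cost estimated"],
  };
}

function emitDiagnostics(diags: string[]): void {
  for (const d of diags) {
    console.error(`warn: ${d}`); // stderr: never mixed into piped output
  }
}

const data = buildUsageData();
emitDiagnostics(data.diagnostics);
console.log(data.report); // stdout: data only, safe to pipe into jq
```

Routing all diagnostics through one function makes the stdout/stderr invariant checkable at a single point.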
Module layout:

  • src/cli: orchestration, option handling, diagnostics emission
  • src/sources: source adapters + discovery/parsing concerns
  • src/domain: normalized contracts and constructors
  • src/pricing: pricing loader + cost engine
  • src/aggregate: period/source bucketing and totals
  • src/render: output formatters
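A "normalized contract plus constructor" pair in the spirit of src/domain might look like this; the field names, defaults, and helper names are hypothetical:

```typescript
// Hypothetical normalized contract and constructor; field and function
// names are illustrative, not src/domain's real exports.
const SOURCES = ["pi", "codex", "opencode"] as const;
type Source = (typeof SOURCES)[number];

function isSource(s: string): s is Source {
  return (SOURCES as readonly string[]).includes(s);
}

interface UsageEvent {
  source: Source;
  model: string;
  timestamp: Date;
  inputTokens: number;
  outputTokens: number;
}

function makeUsageEvent(raw: {
  source: string;
  model: string;
  timestamp: string;
  inputTokens?: number;
  outputTokens?: number;
}): UsageEvent {
  if (!isSource(raw.source)) {
    throw new Error(`unknown source: ${raw.source}`);
  }
  // The constructor centralizes defaults and validation so adapters stay thin.
  return {
    source: raw.source,
    model: raw.model,
    timestamp: new Date(raw.timestamp),
    inputTokens: raw.inputTokens ?? 0,
    outputTokens: raw.outputTokens ?? 0,
  };
}

const ev = makeUsageEvent({
  source: "opencode",
  model: "model-a",
  timestamp: "2024-05-01T12:00:00Z",
  inputTokens: 128,
});
console.log(`${ev.source} ${ev.model} in=${ev.inputTokens} out=${ev.outputTokens}`);
```

Because every adapter funnels through one constructor, downstream pricing and aggregation can rely on the normalized shape without per-source checks.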
Design invariants:

  • source-specific parsing is isolated per adapter
  • stdout remains data-only for JSON/Markdown modes
  • diagnostics are emitted to stderr
  • sorting and aggregation are deterministic
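Deterministic sorting mostly comes down to an explicit, total comparator. A minimal sketch, with a hypothetical aggregated-row shape:

```typescript
// Hypothetical aggregated rows; determinism comes from an explicit
// total ordering rather than insertion or filesystem order.
interface Row {
  period: string; // e.g. "2024-05"
  source: string;
  model: string;
  costUsd: number;
}

function sortRows(rows: Row[]): Row[] {
  // Tie-break on every key so equal-cost rows still land in one fixed order;
  // copy first so callers' arrays are never mutated.
  return [...rows].sort(
    (a, b) =>
      a.period.localeCompare(b.period) ||
      a.source.localeCompare(b.source) ||
      a.model.localeCompare(b.model),
  );
}

const rows: Row[] = [
  { period: "2024-05", source: "pi", model: "model-a", costUsd: 0.2 },
  { period: "2024-04", source: "codex", model: "model-a", costUsd: 0.1 },
  { period: "2024-05", source: "codex", model: "model-b", costUsd: 0.3 },
];

console.log(sortRows(rows).map((r) => `${r.period}/${r.source}`).join(", "));
```

Sorting on every key, not just the primary one, is what makes reruns byte-identical and diffs in CI meaningful.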