Use the CodexBar CLI's local cost usage data to summarize per-model usage for Codex or Claude, covering either the current (most recent) model or a full per-model breakdown. Trigger when asked for model-level usage or cost data from CodexBar, or when you need a scriptable per-model summary from CodexBar cost JSON.
Quick Install
bunx add-skill clawdbot/clawdbot -s model-usage
Tags: ai, assistant, crustacean, molty, openclaw, own-your-data
Instructions
Model usage
Overview
Get per-model usage cost from CodexBar's local cost logs. Supports "current model" (most recent daily entry) or "all models" summaries for Codex or Claude.
TODO: add Linux CLI support guidance once CodexBar CLI install path is documented for Linux.
Quick start
Fetch cost JSON via the CodexBar CLI or pass a JSON file (see the file-based sketch after the commands below).
Use the bundled script to summarize by model.
python {baseDir}/scripts/model_usage.py --provider codex --mode current
python {baseDir}/scripts/model_usage.py --provider codex --mode all
python {baseDir}/scripts/model_usage.py --provider claude --mode all --format json --pretty
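If you already have the cost JSON saved to disk, a per-model total similar to what model_usage.py prints can be reproduced in a few lines. This is a minimal sketch, not the bundled script: the file name codex-cost.json and the exact nesting of daily rows are assumptions; only the modelBreakdowns and cost field names come from this page.

import json
from collections import defaultdict

def summarize_all_models(path):
    # Assumed shape: a list of daily rows, each with a "modelBreakdowns" list
    # of {"model": ..., "cost": ...} entries (field names from this page; nesting assumed).
    with open(path) as f:
        rows = json.load(f)
    totals = defaultdict(float)
    for row in rows:
        for entry in row.get("modelBreakdowns") or []:
            totals[entry["model"]] += float(entry.get("cost", 0.0))
    return dict(totals)

if __name__ == "__main__":
    # "codex-cost.json" is a hypothetical saved export of CodexBar cost JSON.
    for model, cost in sorted(summarize_all_models("codex-cost.json").items()):
        print(f"{model}: ${cost:.2f}")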
Current model logic
Uses the most recent daily row with modelBreakdowns.
Picks the model with the highest cost in that row.
Falls back to the last entry in modelsUsed when breakdowns are missing (see the sketch after this list).
Override with --model <name> when you need a specific model.
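A minimal sketch of that selection order, assuming the same daily-row shape as above (the date, modelBreakdowns, and modelsUsed field names come from this page; the exact structure is an assumption):

def pick_current_model(rows):
    # Sort newest first; assumes each daily row carries a sortable "date" string (assumption).
    ordered = sorted(rows, key=lambda r: r.get("date", ""), reverse=True)
    for row in ordered:
        breakdowns = row.get("modelBreakdowns") or []
        if breakdowns:
            # Most recent row that has breakdowns: pick the highest-cost model in it.
            return max(breakdowns, key=lambda b: b.get("cost", 0.0))["model"]
    for row in ordered:
        used = row.get("modelsUsed") or []
        if used:
            # Fallback when no breakdowns exist: last model listed in the newest row that has any.
            return used[-1]
    return None

Passing --model <name> to model_usage.py bypasses this pick entirely.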