Token tracking and per-message cost calculation

- Config: COST_PER_1M_INPUT and COST_PER_1M_OUTPUT configurable via .env
- OpenAI adapter: stream_options include_usage to capture real token counts
- base.py: accumulates input/output tokens from each agent iteration
- planner.py: returns usage along with the plan
- engine.py: sums tokens from planner + steps + review, computes cost in USD
- Response includes usage{input_tokens, output_tokens} and total_cost_usd
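
The per-message cost described above can be sketched as a pure function of the usage counts. The rate defaults below are hypothetical placeholders; the real values come from COST_PER_1M_INPUT and COST_PER_1M_OUTPUT in .env, which this commit does not show:

```python
import os

# Hypothetical defaults (USD per 1M tokens); real rates come from .env.
COST_PER_1M_INPUT = float(os.getenv("COST_PER_1M_INPUT", "3.0"))
COST_PER_1M_OUTPUT = float(os.getenv("COST_PER_1M_OUTPUT", "15.0"))


def total_cost_usd(
    input_tokens: int,
    output_tokens: int,
    cost_in: float = COST_PER_1M_INPUT,
    cost_out: float = COST_PER_1M_OUTPUT,
) -> float:
    """Cost in USD for one message at the given per-1M-token rates."""
    return (input_tokens * cost_in + output_tokens * cost_out) / 1_000_000
```

Passing the rates as parameters keeps the function testable independently of the environment.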

Format compatible with the Claude Code CLI bridge for frontend integration
and reporting to the Acai webservice.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Author: Jordan Diaz
Date: 2026-04-03 14:18:23 +00:00
Parent: 2712c2fd49
Commit: 7c891cf023
5 changed files with 60 additions and 10 deletions


@@ -44,6 +44,7 @@ class OpenAIAdapter(ModelAdapter):
             "temperature": config.temperature,
             "messages": messages,
             "stream": True,
+            "stream_options": {"include_usage": True},
         }
         if tools:
             kwargs["tools"] = self._format_tools(tools)
@@ -52,9 +53,22 @@ class OpenAIAdapter(ModelAdapter):
         tool_calls_acc: dict[int, dict[str, str]] = {}
+        final_usage: dict[str, int] = {}
         async for chunk in stream:
+            # With include_usage, the last chunk has usage but no choices
+            if chunk.usage:
+                final_usage = {
+                    "input_tokens": chunk.usage.prompt_tokens or 0,
+                    "output_tokens": chunk.usage.completion_tokens or 0,
+                }
             choice = chunk.choices[0] if chunk.choices else None
             if not choice:
+                # Usage-only chunk (the last one with include_usage): emit it
+                if final_usage:
+                    yield StreamChunk(usage=final_usage)
+                    final_usage = {}  # only emit once
                 continue
             delta = choice.delta
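
The commit message says base.py accumulates these per-iteration usage dicts into running totals. A minimal sketch of that accumulation, with a hypothetical helper name (the actual base.py code is not in this excerpt):

```python
def accumulate_usage(totals: dict[str, int], usage: dict[str, int]) -> dict[str, int]:
    """Add one agent iteration's token usage into the running totals."""
    for key in ("input_tokens", "output_tokens"):
        totals[key] = totals.get(key, 0) + usage.get(key, 0)
    return totals


# Fold the usage emitted by each agent iteration into one total.
totals: dict[str, int] = {}
for usage in [
    {"input_tokens": 100, "output_tokens": 20},
    {"input_tokens": 250, "output_tokens": 75},
]:
    accumulate_usage(totals, usage)
```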
@@ -99,16 +113,15 @@ class OpenAIAdapter(ModelAdapter):
                         tool_arguments=acc["arguments"],
                         finish_reason="tool_use",
                     )
+                # Emit usage after tool_use chunks
+                if final_usage:
+                    yield StreamChunk(usage=final_usage)
             else:
                 yield StreamChunk(
                     finish_reason="end_turn"
                     if choice.finish_reason == "stop"
                     else choice.finish_reason,
-                    usage={
-                        "output_tokens": chunk.usage.completion_tokens
-                        if chunk.usage
-                        else 0
-                    },
+                    usage=final_usage,
                 )
# ------------------------------------------------------------------
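
The final hunk also shows the adapter translating OpenAI's "stop" finish reason into the Anthropic-style "end_turn" while passing other reasons through unchanged. That mapping, pulled out as a standalone function for clarity (a sketch; the adapter does this inline):

```python
def map_finish_reason(reason: str) -> str:
    """Translate an OpenAI finish_reason to the adapter's convention."""
    return "end_turn" if reason == "stop" else reason
```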