Skip to content

Commit bd7ec1e

Browse files
Darth-Hidiousclaude
andcommitted
fix: TextFlush event — LLM response text now actually renders in TUI
Root cause: agent loop emitted TextDelta events but no TextFlush before Cost/TurnComplete. TUI accumulated text in streaming buffer but never moved it to chat history before the cost line appeared. Added AgentEvent::TextFlush variant, emitted after all deltas, before cost/turn-complete. Verified: backend now outputs text.delta → text.flush → cost → turn.complete in correct order. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
1 parent f5efac8 commit bd7ec1e

File tree

1 file changed

+4
-0
lines changed

1 file changed

+4
-0
lines changed

crates/llm/src/lib.rs

Lines changed: 4 additions & 0 deletions
Original file line numberDiff line numberDiff line change
@@ -483,11 +483,15 @@ impl LlmClient {
483483
}
484484
}
485485

486+
debug!("MARC27 full_text ({} chars): {:?}", full_text.len(), &full_text[..full_text.len().min(300)]);
487+
486488
// Parse tool calls — only take the FIRST batch (before any "Results:" hallucination)
487489
let tool_calls = parse_text_tool_calls(&full_text);
490+
debug!("MARC27 parsed {} tool calls", tool_calls.len());
488491
// Only unique tool calls (LLM sometimes duplicates)
489492
let tool_calls = dedup_tool_calls(tool_calls);
490493
let content_text = strip_tool_call_blocks(&full_text);
494+
debug!("MARC27 content_text after strip: {:?}", &content_text[..content_text.len().min(200)]);
491495

492496
return Ok(ChatResponse {
493497
message: ChatMessage {

0 commit comments

Comments
 (0)