fix: detect incomplete Anthropic SSE stream and deduplicate system prompt #26
Open
aguung wants to merge 1 commit into
Conversation
stream_anthropic_sse previously returned partial output silently when the connection dropped before message_stop arrived. It now tracks message_stop_received and returns an explicit error so the user knows to retry rather than seeing a silently truncated response.

Also propagates JSON parse errors in parse_anthropic_sse_line via ? instead of silently returning Ok(false), consistent with the OpenAI SSE parser.

Extracts the duplicated system instructions into a single module-level CHAT_SYSTEM_INSTRUCTIONS constant; previously maintained as two identical inline concat! blocks in send_openai_compatible and send_anthropic.
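A minimal, dependency-free sketch of the detection described above (event names follow Anthropic's SSE protocol; everything else is simplified, and the real code reads the stream inside a tokio::select! loop labeled 'outer rather than iterating a slice):

```rust
// Simplified sketch of the incomplete-stream detection (not the PR's actual
// code). The real loop is a tokio::select! labeled 'outer; a plain loop over
// an iterator stands in for it here so the example has no dependencies.
fn drain_stream(events: &[&str]) -> Result<usize, String> {
    let mut deltas = 0;
    let mut message_stop_received = false;
    let mut iter = events.iter();

    'outer: loop {
        match iter.next() {
            Some(&"message_stop") => {
                message_stop_received = true;
                break 'outer;
            }
            Some(&"content_block_delta") => deltas += 1,
            Some(_) => {}
            // Stream exhausted: either normal teardown after message_stop,
            // or the connection dropped mid-response.
            None => break 'outer,
        }
    }

    // The fix: ending without message_stop is now an explicit error,
    // not a silently truncated Ok result.
    if message_stop_received {
        Ok(deltas)
    } else {
        Err("stream ended before message_stop; response truncated, retry".into())
    }
}

fn main() {
    assert_eq!(drain_stream(&["content_block_delta", "message_stop"]), Ok(1));
    assert!(drain_stream(&["content_block_delta"]).is_err());
    println!("ok");
}
```

Every exit path from 'outer falls through to the completion check, so a dropped connection and a clean finish are distinguished in exactly one place.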
aguung pushed a commit to aguung/enowX-Coder that referenced this pull request on May 14, 2026:
Tracks prompt and completion token counts from both OpenAI-compatible and Anthropic SSE streams and accumulates them per session in the frontend store.

Backend:
- TokenUsage / ChatUsageEvent / UsageAccumulator structs in chat_service.rs using serde::Serialize (no serde_json::json! to avoid clippy::disallowed_methods)
- stream_openai_sse: requests stream_options.include_usage=true so the final SSE chunk carries usage; parsed in parse_openai_sse_line
- stream_anthropic_sse: captures input_tokens from message_start and output_tokens from message_delta events
- Emits a chat-usage Tauri event after each completed completion
- Also fixes stream_anthropic_sse to return an error on a missing message_stop (same as the pending PR enowdev#26)

Frontend:
- TokenUsage / ChatUsageEvent types added to types/index.ts
- useChatStore: sessionUsage record, addTokenUsage (cumulative per-session sum), clearSessionUsage
- AppShell: listens for chat-usage and calls addTokenUsage
- ChatHeader: shows a total token badge next to the session title; the tooltip shows the prompt/completion split; counts are formatted as 4.2k for readability
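A dependency-free sketch of the bookkeeping this commit describes. The struct names come from the commit message, but the field layout and the 4.2k formatting rule are assumptions; the real structs also derive serde::Serialize so they can be emitted as the chat-usage Tauri event, which is omitted here:

```rust
// Hypothetical shape of the usage accumulator (names from the commit message,
// fields assumed). The real structs derive serde::Serialize for the Tauri
// `chat-usage` event; that derive is left out to keep this dependency-free.
#[derive(Debug, Default, Clone, Copy, PartialEq)]
struct TokenUsage {
    prompt_tokens: u64,
    completion_tokens: u64,
}

#[derive(Debug, Default)]
struct UsageAccumulator {
    total: TokenUsage,
}

impl UsageAccumulator {
    // OpenAI-compatible streams deliver usage in the final chunk when
    // stream_options.include_usage is set; Anthropic streams split it across
    // message_start (input_tokens) and message_delta (output_tokens).
    fn add(&mut self, prompt: u64, completion: u64) {
        self.total.prompt_tokens += prompt;
        self.total.completion_tokens += completion;
    }

    fn total_tokens(&self) -> u64 {
        self.total.prompt_tokens + self.total.completion_tokens
    }
}

// Assumed analogue of the ChatHeader badge formatting (4200 -> "4.2k").
fn format_tokens(n: u64) -> String {
    if n >= 1000 {
        format!("{:.1}k", n as f64 / 1000.0)
    } else {
        n.to_string()
    }
}

fn main() {
    let mut acc = UsageAccumulator::default();
    acc.add(1200, 300); // first completion in the session
    acc.add(2500, 200); // second completion, cumulative per-session sum
    assert_eq!(acc.total_tokens(), 4200);
    assert_eq!(format_tokens(acc.total_tokens()), "4.2k");
    println!("ok");
}
```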
enowdev (Owner) reviewed on May 15, 2026 and left a comment:
The change itself looks reasonable, but this PR now conflicts with the current main branch in src-tauri/src/services/chat_service.rs. Please rebase onto the latest main and keep the newer provider-routing/stream handling changes while re-applying the incomplete-Anthropic-stream detection.
Description
stream_anthropic_sse previously returned partial output silently when the network connection dropped before the message_stop event arrived. The user would see a truncated response with no indication that something went wrong. This PR fixes that, and also cleans up two related issues found in the same file.
Changes:

- Track message_stop in the Anthropic SSE stream: adds a message_stop_received flag; if the stream ends without it, returns an explicit error so the user sees a clear message to retry instead of silently getting a truncated response.
- Propagate JSON parse errors in parse_anthropic_sse_line: was silently returning Ok(false) on malformed SSE payloads; now uses ? to propagate the error, consistent with parse_openai_sse_line.
- Deduplicate CHAT_SYSTEM_INSTRUCTIONS: the system instructions string was copy-pasted as two identical inline concat! blocks in send_openai_compatible and send_anthropic; extracted into a single module-level constant.

Type of Change
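The error-propagation change can be sketched as follows. This is a stand-in, not the PR's code: the real parser decodes a JSON SSE payload with serde_json, so an integer parse plays the role of the fallible parse here to keep the example dependency-free:

```rust
// Sketch of the parse-error change (function names echo the PR; the payload
// parse is simplified from serde_json to str::parse to avoid dependencies).
fn parse_payload(data: &str) -> Result<u64, std::num::ParseIntError> {
    data.trim().parse()
}

// Before (the bug): a malformed payload was reported as "no event seen",
// hiding the parse error from the caller.
fn parse_line_old(data: &str) -> Result<bool, std::num::ParseIntError> {
    match parse_payload(data) {
        Ok(_) => Ok(true),
        Err(_) => Ok(false), // error swallowed
    }
}

// After: `?` surfaces the parse error, matching parse_openai_sse_line.
fn parse_line_new(data: &str) -> Result<bool, std::num::ParseIntError> {
    let _value = parse_payload(data)?;
    Ok(true)
}

fn main() {
    assert_eq!(parse_line_old("not json"), Ok(false)); // silently "fine"
    assert!(parse_line_new("not json").is_err());      // error propagated
    assert_eq!(parse_line_new("42"), Ok(true));
    println!("ok");
}
```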
How Has This Been Tested?
- bunx tsc --noEmit passes (TypeScript): no TypeScript changes in this PR
- cargo clippy -- -D warnings (Rust): could not run in the current environment due to missing GTK system deps on WSL

Manual verification:
- message_stop_received flag is set only inside the parse_anthropic_sse_line return path for the message_stop event
- 'outer label correctly breaks out of the tokio::select! loop so the completion check runs
- CHAT_SYSTEM_INSTRUCTIONS constant replaces both occurrences with identical content: no behavioral change

Checklist