
fix(runtime): decouple summarization from state persistence to preserve full conversation history (#1393)#1430

Open
sjhddh wants to merge 1 commit into bytedance:main from sjhddh:fix/issue-1393

Conversation


@sjhddh sjhddh commented Mar 26, 2026

Fixes #1393 and #1000.

Currently, the native SummarizationMiddleware replaces the conversation history in state["messages"] with a generated summary plus a trailing window of recent messages. This permanently erases the original history from the LangGraph checkpoints, so the frontend renders a truncated dialogue history after a page refresh.

This patch introduces a SafeSummarizationMiddleware wrapper that overrides SummarizationMiddleware.abefore_model. Instead of returning a destructive state update containing RemoveMessage(id=REMOVE_ALL_MESSAGES), it captures the summarized window and injects it directly into ModelRequest.messages via awrap_model_call.

As a result, token reduction applies strictly to the inference window passed to the LLM, while the authoritative state["messages"] and the DB checkpoints remain intact and immutable.

yanglinzhi added a commit to yanglinzhi/deer-flow that referenced this pull request Mar 27, 2026
@WillemJiang WillemJiang added the question Further information is requested label Mar 28, 2026

Labels

question Further information is requested

Projects

None yet

Development

Successfully merging this pull request may close these issues.

[runtime] After automatic compression of the conversation history, the original full messages are lost and the frontend cannot retrieve the complete dialogue history

2 participants