feat(ai): implement streamModelCall function for streaming text gener…#13682
…ation

- Added a new `streamModelCall` function to handle streaming text generation with customizable tool choices and retry logic.
- Integrated this function into the existing `DefaultStreamTextResult` class, replacing previous inline logic for improved modularity and readability.
- Enhanced notification handling during the streaming process to include prompt messages and step details.

This change aims to streamline the text generation process and improve the overall architecture by promoting code reuse.
Claude's assessment

Here's the root cause of the abort signal test failures:

Root Cause

The issue is a subtle microtask timing difference caused by extracting `pipeThrough(createStreamTextPartTransform)` into the async function `streamModelCall`.

Original code (working)

After `await retry(() => doStream())`, everything happens synchronously in a single block: there is no microtask boundary between `pipeThrough(createStreamTextPartTransform)` and `addStream`. The internal pipe from modelStream → createStreamTextPartTransform is set up but doesn't get a chance to read ahead before the consumer attaches.

PR code (broken)

`pipeThrough(createStreamTextPartTransform)` is called inside the async function `streamModelCall`, right before `return`. Because `streamModelCall` is async, its `return` implicitly wraps the result in a promise that the caller must await. During this microtask boundary, the pipe from modelStream → createStreamTextPartTransform (set up by `pipeThrough`) gets a chance to run and read ahead from the model stream. This changes the buffering state observed downstream.

Concrete execution trace comparison

Original: after the resilient stream reads `start-step`, its next pull triggers model pull 3 (abort) as a side effect, and the error propagates before `text-start` can be read from the buffer.

PR: after the resilient stream reads `start-step`, the buffering state is different due to the microtask gap. `text-start` is successfully read from the buffer before model pull 3 triggers the abort.

Fix

The fix is to keep `pipeThrough(createStreamTextPartTransform)` in the same synchronous block as `addStream`. Either:
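The timing difference above can be reproduced in isolation. This is a minimal sketch, not SDK code: it shows that returning from an `async` function yields to the microtask queue even when the function body runs synchronously, so a queued microtask (standing in for the internal pipe reading ahead) slips into the gap.

```typescript
const log: string[] = [];

function setupSync(): void {
  log.push("setup");
}

async function setupAsync(): Promise<void> {
  log.push("setup"); // runs synchronously...
  // ...but the implicit promise wrap means the caller resumes in a microtask.
}

async function main(): Promise<void> {
  // Sync path: nothing can run between setup and attach.
  setupSync();
  log.push("attach");

  // Async path: a queued microtask (standing in for the internal pipe
  // reading ahead) runs in the gap opened by `await`.
  queueMicrotask(() => log.push("pipe reads ahead"));
  await setupAsync();
  log.push("attach");
}

const done = main().then(() => {
  // prints: setup -> attach -> setup -> pipe reads ahead -> attach
  console.log(log.join(" -> "));
});
```

The second "attach" lands after the queued microtask, which is exactly how the extracted `streamModelCall` lets the pipe read ahead before `addStream` attaches.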
Option 1 is the simplest and most correct fix.
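A hypothetical sketch of that shape of fix is below. The names (`streamModelCall`, `addStream`, `createStreamTextPartTransform`, `Part`) mirror the PR, but the bodies are stand-ins, not the SDK's real implementation: the async helper returns the raw model stream, and `pipeThrough` is applied in the same synchronous block as `addStream`.

```typescript
type Part = { type: string };

function createStreamTextPartTransform(): TransformStream<Part, Part> {
  return new TransformStream<Part, Part>({
    transform(part, controller) {
      controller.enqueue(part); // the real transform maps model parts to text parts
    },
  });
}

// The async helper keeps the (retried) model call but returns the RAW
// model stream instead of piping it before `return`.
async function streamModelCall(): Promise<ReadableStream<Part>> {
  // ...await retry(() => doStream()) would go here...
  return new ReadableStream<Part>({
    start(controller) {
      controller.enqueue({ type: "start-step" });
      controller.enqueue({ type: "text-start" });
      controller.close();
    },
  });
}

const collected: string[] = [];
function addStream(stream: ReadableStream<Part>): Promise<void> {
  // Stand-in consumer; the SDK adds the stream to a stitchable stream here.
  return (async () => {
    const reader = stream.getReader();
    for (;;) {
      const { done, value } = await reader.read();
      if (done) break;
      collected.push(value.type);
    }
  })();
}

// The fix: pipeThrough and addStream share one synchronous block, so no
// microtask boundary lets the pipe read ahead before the consumer attaches.
async function run(): Promise<void> {
  const rawStream = await streamModelCall();
  await addStream(rawStream.pipeThrough(createStreamTextPartTransform()));
}

const done = run();
```

The `await` still happens (the model call is genuinely async), but it now sits before the pipe is created, so the microtask gap can no longer separate `pipeThrough` from its consumer.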
This breaks the LLM suspense stopStream behavior. There may be fixes for that which we can explore further.
As a next step, we will look deeper into stitchable stream and abort timing. |
Background
Summary
Manual Verification
Checklist
`pnpm changeset` (in the project root)
Future Work
Related Issues