Summary
Improve the streaming experience in the chat UI so that tokens appear with a smooth, natural cadence rather than arriving in visually jarring chunks.
Context
The current streaming implementation (landed in #123) renders tokens as they arrive from the backend, which can feel choppy, especially on slower connections or when the model emits tokens in bursts.
Desired behavior
- Tokens should animate into view with a smooth, typewriter-like cadence
- Buffer incoming tokens and flush them at a consistent visual rate
- Maintain low perceived latency (first token should still appear quickly)
- Smooth scrolling to keep the latest content in view during generation
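The buffering behavior above could be sketched roughly like this (all names here are hypothetical, not from the actual codebase): incoming chunks accumulate in a buffer, and a per-frame drain emits a bounded number of characters so the visual rate stays steady even when the network delivers bursts.

```typescript
// Hypothetical sketch: decouple bursty network arrival from rendering
// by buffering chunks and draining at a steady character rate.
class TokenBuffer {
  private pending = "";
  private rendered = "";

  // Called whenever a chunk arrives from the backend stream.
  push(chunk: string): void {
    this.pending += chunk;
  }

  // Called once per animation frame; emits up to `maxChars` characters
  // so output advances at a consistent visual cadence.
  flush(maxChars: number): string {
    const emit = this.pending.slice(0, maxChars);
    this.pending = this.pending.slice(emit.length);
    this.rendered += emit;
    return emit;
  }

  get text(): string {
    return this.rendered;
  }

  // Characters still waiting to be shown.
  get backlog(): number {
    return this.pending.length;
  }
}

// Usage: a burst of chunks arrives, but only a few chars render per frame.
const buf = new TokenBuffer();
buf.push("Hello, ");
buf.push("world!");
console.log(buf.flush(5)); // "Hello"
console.log(buf.backlog);  // 8
```

In a real UI the `flush` call would be driven by `requestAnimationFrame` (or a timer), with `maxChars` tuned, possibly proportional to the backlog, so the renderer catches up rather than lagging ever further behind a fast model.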
Acceptance criteria
- Streaming text appears smoothly without visible chunking or jitter
- First-token latency is not noticeably degraded
- Auto-scroll remains smooth and doesn't jump
- Works correctly with markdown rendering and code blocks
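For the "auto-scroll remains smooth and doesn't jump" criterion, one common heuristic is to pin the view to the bottom only while the user is already near it, so scrolling up to re-read earlier output is never hijacked mid-generation. A minimal sketch (helper name and threshold are assumptions, not part of the existing code):

```typescript
// Hypothetical helper: decide whether to auto-scroll after new content
// renders. Only stick to the bottom if the user was already near it.
function shouldAutoScroll(
  scrollTop: number,    // current scroll offset
  clientHeight: number, // visible viewport height
  scrollHeight: number, // total content height
  threshold = 40        // px tolerance from the bottom
): boolean {
  const distanceFromBottom = scrollHeight - (scrollTop + clientHeight);
  return distanceFromBottom <= threshold;
}

console.log(shouldAutoScroll(960, 600, 1600)); // true  (exactly 40px from bottom)
console.log(shouldAutoScroll(500, 600, 1600)); // false (user scrolled up to read)
```

The caller would check this before each flush and, when it returns true, scroll the container to the bottom (e.g. with `scrollTo` and `behavior: "smooth"`), leaving the viewport alone otherwise.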