feat(ui): smooth streaming for chat messages #129

@karthikmudunuri

Summary

Improve the streaming experience in the chat UI so that tokens appear with a smooth, natural cadence rather than arriving in visually jarring chunks.

Context

The current streaming implementation (landed in #123) renders tokens as they arrive from the backend, which can feel choppy — especially on slower connections or when the model emits tokens in bursts.

Desired behavior

  • Tokens should animate into view with a smooth, typewriter-like cadence
  • Buffer incoming tokens and flush them at a consistent visual rate
  • Maintain low perceived latency (first token should still appear quickly)
  • Smooth scrolling to keep the latest content in view during generation
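The buffering behavior described above could be sketched roughly as follows. This is a minimal illustration only; `StreamSmoother`, `charsPerTick`, and `tickMs` are hypothetical names, not identifiers from the codebase:

```typescript
type TextSink = (chunk: string) => void;

// Hypothetical sketch: buffer incoming tokens and drain them at a steady
// character rate, so bursty backend output renders as smooth typewriter text.
class StreamSmoother {
  private buffer = "";
  private timer: ReturnType<typeof setInterval> | null = null;

  constructor(
    private sink: TextSink,
    private charsPerTick = 3, // characters flushed per interval
    private tickMs = 16,      // ~60fps flush cadence
  ) {}

  // Called whenever a token arrives from the backend stream.
  // A real implementation might flush the very first chunk immediately
  // to preserve first-token latency.
  push(token: string): void {
    this.buffer += token;
    if (!this.timer) this.start();
  }

  private start(): void {
    this.timer = setInterval(() => {
      if (this.buffer.length === 0) {
        this.stop();
        return;
      }
      // Flush a small, fixed-size slice so output paces evenly
      // regardless of how bursty the incoming tokens were.
      const chunk = this.buffer.slice(0, this.charsPerTick);
      this.buffer = this.buffer.slice(this.charsPerTick);
      this.sink(chunk);
    }, this.tickMs);
  }

  private stop(): void {
    if (this.timer) {
      clearInterval(this.timer);
      this.timer = null;
    }
  }

  // Drain everything immediately, e.g. when the stream ends.
  finish(): void {
    this.stop();
    if (this.buffer) {
      this.sink(this.buffer);
      this.buffer = "";
    }
  }
}
```

The `sink` would typically append text to the rendered message; decoupling it this way keeps the pacing logic independent of markdown rendering.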

Acceptance criteria

  • Streaming text appears smoothly without visible chunking or jitter
  • First-token latency is not noticeably degraded
  • Auto-scroll remains smooth and doesn't jump
  • Works correctly with markdown rendering and code blocks
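For the auto-scroll criterion, one common approach is to scroll only while the user is already near the bottom, so a reader who scrolled up to review earlier content isn't yanked back down on every flush. A small sketch (the function name and threshold are illustrative, not from the codebase):

```typescript
// Hypothetical helper: decide whether the chat container should stick
// to the bottom as new streamed content arrives.
function shouldStickToBottom(
  scrollTop: number,     // current scroll offset of the chat container
  clientHeight: number,  // visible height of the container
  scrollHeight: number,  // total height of the scrollable content
  thresholdPx = 48,      // how close to the bottom still counts as "at bottom"
): boolean {
  const distanceFromBottom = scrollHeight - (scrollTop + clientHeight);
  return distanceFromBottom <= thresholdPx;
}
```

When this returns true, the component could call `element.scrollTo({ top: element.scrollHeight, behavior: "smooth" })` after each flush; when false, it leaves the scroll position alone.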
