fix: now reasoning output is rendered in the UI #29
Conversation
Force-pushed from 079b6b1 to 42cb73b
Pull Request Overview
This PR updates the chat API to support streaming reasoning output and adjusts UI components to render that output progressively.
- Enables streaming of reasoning output when the model name starts with "o3" or "o4"
- Buffers reasoning summary updates and flushes them upon completion
- Replaces ReactMarkdown with a custom MarkdownContent component in chat-related UI components
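The model-prefix gate described above might look something like the sketch below. The function name and the shape of the `reasoning` request field are assumptions based on the PR summary, not the project's actual code:

```typescript
// Sketch: enable reasoning-summary output only for models whose names
// start with "o3" or "o4", per the PR summary. The returned object shape
// (`{ summary: "auto" }`) is an assumption, not the actual API payload.
function reasoningConfigFor(model: string): { summary: string } | undefined {
  if (model.startsWith("o3") || model.startsWith("o4")) {
    return { summary: "auto" };
  }
  // Other models get no reasoning configuration at all.
  return undefined;
}
```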
Reviewed Changes
Copilot reviewed 6 out of 6 changed files in this pull request and generated 2 comments.
Show a summary per file
| File | Description |
|---|---|
| src/routes/api/chat.ts | Adds reasoning configuration to API request based on model prefix |
| src/lib/streaming.ts | Buffers and flushes reasoning summary text events in the stream |
| src/components/ReasoningMessage.tsx | Creates a dedicated component to render reasoning messages |
| src/components/MarkdownContent.tsx | Extracts markdown rendering into a reusable component |
| src/components/ChatMessage.tsx | Replaces direct ReactMarkdown usage with MarkdownContent |
| src/components/Chat.tsx | Incorporates streaming reasoning events into the chat UI |
Force-pushed from 1ff6422 to d60c113
This is good to go. Do you still get crashes with this @wasaga? I was never able to replicate the reported crashes. Also, for … it's already set to …
Force-pushed from d60c113 to 4c5b4f7
Force-pushed from 4c5b4f7 to 6eed40a
✅ Deploy Preview for mcp-app-demo ready!
Now reasoning output is rendered in the UI. Currently, all the chunks for a reasoning summary are flushed at once; we can probably stream them in incrementally.
Closes #16
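The suggested follow-up of streaming chunks in as they arrive could look like this sketch: instead of buffering until completion, each delta is forwarded to the UI immediately. Names are hypothetical, for illustration only:

```typescript
// Sketch: forward each reasoning-summary chunk as it arrives so the UI
// can render progressively, rather than waiting for a single flush.
// Event and callback names are assumptions, not the project's code.
function makeIncrementalForwarder(onChunk: (text: string) => void) {
  return (event: { type: string; text?: string }) => {
    if (event.type === "reasoning-summary-delta" && event.text !== undefined) {
      onChunk(event.text); // emit immediately; no buffering
    }
  };
}
```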
Before
After