Conversation

@mare5x mare5x commented Oct 30, 2025

Stream LLM outputs token by token to stdout.

TODO / Future ideas:

  • Use colored output to distinguish between different message types.
  • Jupyter notebook specific formatting (e.g., markdown rendering).
  • Add streaming to edaplot.

Closes #32
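The core idea of the change can be sketched roughly as follows: consume the LLM response as an iterable of string chunks and write each one to stdout immediately, flushing per token so the output appears incrementally rather than all at once. This is a minimal illustration, not the PR's actual code; the function name and the fake token source are hypothetical.

```python
import sys
from typing import Iterable


def stream_to_stdout(tokens: Iterable[str]) -> str:
    """Write each token to stdout as it arrives; return the full text.

    `tokens` can be any iterable of string chunks, e.g. an LLM
    streaming response. (Hypothetical helper, not from the PR.)
    """
    parts = []
    for token in tokens:
        sys.stdout.write(token)
        sys.stdout.flush()  # flush per token so output shows up immediately
        parts.append(token)
    return "".join(parts)


# Usage with a fake token source standing in for a streaming LLM client:
full = stream_to_stdout(iter(["Hello", ", ", "world", "!"]))
```

Flushing after every chunk matters because stdout is typically line- or block-buffered when not attached to a terminal, which would otherwise delay the token-by-token effect.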

@mare5x mare5x changed the title Add text-based token streaming Add stdout token streaming Oct 30, 2025
@mare5x mare5x requested a review from kosstbarz October 30, 2025 15:38
@mare5x mare5x merged commit e8b03b7 into main Nov 3, 2025
2 checks passed
@mare5x mare5x deleted the text-streaming branch November 3, 2025 13:01

Development

Successfully merging this pull request may close these issues.

Stream LLM responses
