fix: ollama returns io.EOF on Done chunk #499
Closed
Describe your changes
Returning a zero-value response with a nil error on the Done chunk caused consumers of Recv() to process an extra, empty/duplicated message. This previously resulted in output like 'hello' from the LLM being printed as 'helloo' in mods. Recv() now returns io.EOF on Done, signaling end-of-stream as expected.
Related issue/discussion:
N/A - I can open one if desired. Ollama streams currently print the last chunk multiple times (at least for me, locally).
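A minimal sketch of the bug and the fix (the `Stream` and `ChatResponse` types here are hypothetical stand-ins, not ollama's actual API): when the final chunk has `Done` set, `Recv()` should return `io.EOF` rather than a zero-value response with a nil error, so the consumer loop terminates without emitting an extra message.

```go
package main

import (
	"fmt"
	"io"
)

// ChatResponse is a hypothetical stand-in for a streaming chunk.
type ChatResponse struct {
	Content string
	Done    bool
}

// Stream is a hypothetical in-memory stream for illustration.
type Stream struct {
	chunks []ChatResponse
	i      int
}

// Recv returns io.EOF on the Done chunk instead of a zero-value
// response, so consumers don't process an extra empty message.
func (s *Stream) Recv() (ChatResponse, error) {
	if s.i >= len(s.chunks) {
		return ChatResponse{}, io.EOF
	}
	c := s.chunks[s.i]
	s.i++
	if c.Done {
		// Before the fix this path returned (ChatResponse{}, nil),
		// causing the consumer to print a duplicated last chunk.
		return ChatResponse{}, io.EOF
	}
	return c, nil
}

// drain consumes the stream the way a typical caller would.
func drain(s *Stream) string {
	var out string
	for {
		c, err := s.Recv()
		if err == io.EOF {
			break
		}
		out += c.Content
	}
	return out
}

func main() {
	s := &Stream{chunks: []ChatResponse{
		{Content: "hell"}, {Content: "o"}, {Done: true},
	}}
	fmt.Println(drain(s))
}
```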
Checklist before requesting a review
CONTRIBUTING.md
If this is a feature:
N/A - bugfix