v0.21.1 Streaming for llama.cpp

@neph1 neph1 released this 10 Jan 19:32
· 300 commits to master since this release
dd7a3bb

What's Changed

  • The llama.cpp backend now streams location descriptions as well.
  • Some I/O refactoring as part of that.
  • Error messages from the server are now handled (needs further work, but at least it no longer throws an exception).
  • I'm experimenting with prompts, moving much of the prompt context into 'memory' for kobold. Should it go in 'system' for llama.cpp/openai? I'll try it out.
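
The error-handling change above can be sketched roughly like this (not the project's actual code; the payload field names "error" and "content" are assumptions about the server's JSON response):

```python
def extract_response(payload: dict) -> str:
    """Return generated text, or '' if the server reported an error."""
    if "error" in payload:
        # Log and degrade gracefully instead of throwing an exception.
        print(f"server error: {payload['error']}")
        return ""
    return payload.get("content", "")
```

The caller can then treat an empty string as "no output this turn" rather than crashing mid-game.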

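A minimal sketch of the prompt-context idea in the last bullet, assuming KoboldCpp's generate API (which accepts persistent context in a "memory" field) versus an OpenAI-style chat endpoint (where it would go in a "system" message); the function name and backend labels are illustrative, not the project's API:

```python
def build_request(backend: str, context: str, prompt: str) -> dict:
    """Place shared prompt context where the given backend expects it."""
    if backend == "kobold":
        # KoboldCpp: persistent context goes in the 'memory' field.
        return {"memory": context, "prompt": prompt}
    # llama.cpp server / OpenAI-compatible chat: context as a system message.
    return {"messages": [
        {"role": "system", "content": context},
        {"role": "user", "content": prompt},
    ]}
```
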
Full Changelog: v0.21.0...v0.21.1