v2.18.0 #2673

mudler announced in Announcements
⭐ Highlights

Here’s a quick overview of what’s new in 2.18.0:

- `translate` for the transcription endpoint
- `repeat_last_n` and `properties_order` as model configurations

🐋 Support for OCI Images and Ollama Models
You can now specify models using the `oci://` and `ollama://` prefixes in your YAML config files. You can also start an Ollama model directly from the CLI, or download just the model without running it.
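For instance, a model YAML referencing an Ollama model might look like this (a sketch: the model name `gemma:2b` and the exact field layout are illustrative assumptions, not taken from the release notes):

```yaml
# Hedged example: reference an Ollama-hosted model from a model YAML.
name: gemma
parameters:
  model: ollama://gemma:2b
```

From the CLI, a command along the lines of `local-ai run ollama://gemma:2b` would start the model directly, while an install-style subcommand (e.g. `local-ai models install ollama://gemma:2b`, hypothetical) would only download it.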
For OCI images, use the `oci://` prefix. To build a compatible container image, a minimal Dockerfile that simply copies the model file into the image is enough.

🌋 Vulkan Support for Llama.cpp
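To try one of the new Vulkan images (a sketch: the `localai/localai` Docker Hub repository name and the port mapping are assumptions; the tag itself comes from this release):

```shell
# Hedged example: run the Vulkan-enabled image from this release.
docker run -p 8080:8080 localai/localai:v2.18.0-vulkan-ffmpeg-core
```

A real deployment would also need to pass the GPU device through to the container; the exact flags depend on your setup.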
We’ve introduced Vulkan support for Llama.cpp! Check out our new image tags `latest-vulkan-ffmpeg-core` and `v2.18.0-vulkan-ffmpeg-core`.
🗣️ Transcription and Translation
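A hedged sketch of such a request: the `/v1/audio/transcriptions` path and form-field names assume LocalAI’s OpenAI-compatible audio API, and the model name is a placeholder; only the `translate: true` option itself comes from the release notes.

```shell
# Hedged example: transcribe audio.mp3 and translate the result to English.
# Endpoint path, field names, and "whisper-1" are illustrative assumptions.
curl http://localhost:8080/v1/audio/transcriptions \
  -F file="@audio.mp3" \
  -F model="whisper-1" \
  -F translate="true"
```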
Our transcription endpoint now supports translation! Simply add `translate: true` to your transcription requests to translate the transcription to English.

⚙️ Enhanced Model Configuration
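A hedged sketch of how these options might sit in a model YAML file; the exact nesting (in particular, that `properties_order` belongs under the function-grammar settings) is an assumption, not confirmed by the release notes:

```yaml
# Hedged sketch of a model definition using the new options.
name: my-model
parameters:
  repeat_last_n: 64        # Llama.cpp only: how many recent tokens the
                           # repeat penalty looks back over (value illustrative)
function:
  grammar:
    properties_order: "name,arguments"   # assumed location: fixes the order of
                                         # properties in grammar-constrained JSON
```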
We’ve added new configuration options `repeat_last_n` and `properties_order` to give you more control. Both are set in your model YAML file; note that `repeat_last_n` is specific to Llama.cpp.

💎 Gemma 2!
Google has just dropped Gemma 2 models (blog post here); you can already install and run Gemma 2 models in LocalAI.
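For instance, assuming a Gemma 2 entry is available in the model gallery under a name like `gemma-2-9b-it` (the exact gallery name is a guess), it could be started with:

```shell
# Hypothetical gallery name; check the available models for the real one.
local-ai run gemma-2-9b-it
```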
What's Changed
Bug fixes 🐛
Exciting New Features 🎉
- feat(options): add `repeat_last_n` by @mudler in #2660

🧠 Models
📖 Documentation and examples
👒 Dependencies
Other Changes
New Contributors
Full Changelog: v2.17.1...v2.18.0
This discussion was created from the release v2.18.0.