Major Architecture Update: Simplified LlamaDeploy #545
masci announced in Announcements · Replies: 1 comment
When are you planning to release v1.0.0? Any estimate would be appreciated. v0.9.1 is currently missing a few things, such as
I assume these things will be back in the upcoming major release? Very much looking forward to it :).
We're excited to announce a significant architectural change in LlamaDeploy that dramatically simplifies how workflows are deployed and executed. This change represents our commitment to making LlamaDeploy more accessible while maintaining its core Workflow orchestration capabilities. The new architecture prioritizes simplicity and performance over distributed scaling, making it ideal for most deployment scenarios.
♻️ What Changed
We've removed the control plane and message queue concepts from LlamaDeploy, moving from a distributed microservices architecture to a streamlined single-process execution model.
Before: a distributed architecture where a control plane and a message queue coordinate workflow services (diagram).
After: a single process that runs workflows directly (diagram).
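To make the "after" picture concrete, here is a minimal sketch of what a single-process deployment description could look like. This is illustrative only: the field names (`name`, `services`, `source`, `path`) and overall schema are assumptions, not the confirmed 1.0 format, so check the LlamaDeploy documentation for the exact syntax.

```yaml
# deployment.yml — hypothetical sketch of a single-process deployment.
# Note what is ABSENT compared to the old architecture: there is no
# control-plane section and no message-queue section to configure.
name: QuickStart
services:
  echo_workflow:            # illustrative service name
    name: Echo Workflow
    source:
      type: local           # workflow code lives alongside the config
      name: ./src
    path: workflow:echo_workflow  # module:attribute of the Workflow instance
```

The point of the change is visible in the config itself: everything needed to run the workflow fits in one file and one process, instead of being split across coordinating services.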
💆 Benefits
- Simpler setup: no control plane or message queue to deploy and operate.
- Better performance: workflows execute in a single process, with no inter-service messaging overhead.
🛑 Breaking Changes
The update will be rolled out as a major 1.0 release to limit the impact on existing LlamaDeploy installations, but given the size and scope of the breaking changes we cannot guarantee a seamless migration path. In general:
- If the only message queue you use is `SimpleMessageQueue`, then nothing changes: your project can be upgraded to 1.0.
- Otherwise, pin the `llama_deploy` requirement in your dependencies as `<1.0`.
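For projects that need to stay on the pre-1.0 distributed architecture, the pin would look like this in a requirements file (assuming the distribution is published on PyPI as `llama-deploy`; adjust the name to match your dependency manager):

```
# requirements.txt — stay on the pre-1.0 architecture until you can migrate
llama-deploy<1.0
```

An upper-bound pin like this keeps `pip install` from pulling in 1.0 automatically while still accepting 0.x patch releases.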