Releases: agent0ai/agent-zero
v0.9.7 - Projects
- Project management
- Support for custom instructions
- Integration with memory, knowledge, files
- Project specific secrets
- New Welcome screen/Dashboard
- New Wait tool
- Subordinate agent configuration override support
- Support for multiple documents at once in document_query_tool
- Improved context on interventions
- OpenRouter embedding support
- Frontend components refactor and polishing
- SSH metadata output fix
- Support for Windows PowerShell in the local TTY utility
- More efficient selective streaming for LLMs
- UI output length limit improvements
- Update checker
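The subordinate agent configuration override mentioned above suggests a layered-config pattern: a subordinate starts from the parent's settings and selectively overrides them. A minimal sketch, assuming a recursive dict merge (names and structure are illustrative, not Agent Zero's actual API):

```python
def merge_config(base: dict, override: dict) -> dict:
    """Recursively overlay `override` onto `base` without mutating either.

    Nested dicts are merged key by key; any other value in `override`
    replaces the base value outright.
    """
    merged = dict(base)
    for key, value in override.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = merge_config(merged[key], value)
        else:
            merged[key] = value
    return merged


# Hypothetical parent config and subordinate override:
base = {"model": "gpt-4.1", "tools": {"browser": True, "code": True}}
override = {"tools": {"browser": False}}
print(merge_config(base, override))
# {'model': 'gpt-4.1', 'tools': {'browser': False, 'code': True}}
```

Only the keys the subordinate overrides change; everything else is inherited from the parent.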
v0.9.6 - Memory Dashboard
- Memory Management Dashboard
- Kali update
- Python update + dual installation
- Browser Use update
- New login screen
- LiteLLM retry on temporary errors
- GitHub Copilot provider support
v0.9.5.1 - Secrets
0.9.5.1
- added support for Agent Zero Venice provider
- added support for xAI provider
v0.9.5 - Secrets
- Secrets management - agent can use credentials without seeing them
- Agent can copy and paste messages and files without rewriting them
- LiteLLM global configuration field
- Custom HTTP headers field for browser agent
- Progressive web app support
- Extra model params support for JSON
- Short IDs for files and memories to prevent LLM errors
- Tunnel component frontend rework
- Fix for timezone change bug
- Notifications z-index fix
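The secrets feature above ("agent can use credentials without seeing them") is commonly implemented with placeholder substitution: the model only ever handles opaque tokens, and the runtime injects real values just before execution. A minimal sketch, assuming a hypothetical `<<SECRET:NAME>>` placeholder syntax (not Agent Zero's actual format):

```python
import re

# Hypothetical secret store; in practice this would be loaded from
# an encrypted file or environment, never shown to the model.
SECRETS = {"API_KEY": "sk-real-value"}


def resolve_secrets(text: str) -> str:
    """Replace <<SECRET:NAME>> placeholders with real values.

    Called at execution time only, so the agent's context and logs
    contain just the placeholder, never the credential itself.
    """
    return re.sub(r"<<SECRET:(\w+)>>", lambda m: SECRETS[m.group(1)], text)


cmd = "curl -H 'Authorization: Bearer <<SECRET:API_KEY>>' https://api.example.com"
print(resolve_secrets(cmd))
```

A real implementation would also redact the resolved values from any output fed back to the model.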
v0.9.4 - Connectivity, UI
- External API endpoints
- Streamable HTTP MCP A0 server
- A2A (Agent to Agent) protocol - server+client
- New notifications system
- New local terminal interface for stability
- Rate limiter integration for models
- Delayed memory recall
- Smarter autoscrolling in UI
- Action buttons in messages
- Multiple API keys support
- Tunnel URL QR code
- Download streaming
- Internal fixes and optimizations
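The model rate limiter listed above can be pictured as a token bucket: requests spend tokens that refill at a fixed rate, allowing short bursts while capping sustained throughput. A minimal sketch, not Agent Zero's implementation:

```python
import time


class TokenBucket:
    """Allow roughly `rate` requests per second, bursting up to `capacity`."""

    def __init__(self, rate: float, capacity: float):
        self.rate = rate          # tokens added per second
        self.capacity = capacity  # maximum stored tokens (burst size)
        self.tokens = capacity
        self.last = time.monotonic()

    def acquire(self) -> bool:
        """Spend one token if available; refuse the request otherwise."""
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False


bucket = TokenBucket(rate=2, capacity=2)
print([bucket.acquire() for _ in range(3)])  # the third call in a rapid burst is refused
```

A caller that gets `False` would typically sleep and retry rather than drop the request.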
v0.9.3 - Subordinates, memory, providers
- Faster startup/restart
- Subordinate agents can have dedicated prompts, tools and system extensions
- Streamable HTTP MCP server support
- Memory loading enhanced by AI filter
- Memory AI consolidation when saving memories
- Auto memory system configuration in settings
- Available LLM providers are defined by the providers.yaml configuration file
- Venice.ai LLM provider supported
- Initial agent message shown to the user, also serving as an example for the LLM
- Docker build support for local images
- File browser fix
v0.9.2 - Kokoro TTS, Attachments
- Kokoro text-to-speech integration
- New message attachments system
- Minor updates: log truncation, hyperlink targets, component examples, API cleanup
v0.9.1.1 - LiteLLM, UI improvements
- LLM API base URL field fix
v0.9.1 - LiteLLM, UI improvements
- LangChain replaced with LiteLLM
- Support for reasoning models streaming
- Support for more providers
- OpenRouter set as default instead of OpenAI
- UI improvements
- New message grouping system
- Communication smoother and more efficient
- Collapsible messages by type
- Code execution tool output improved
- Tables and code blocks scrollable
- More space efficient on mobile
- Streamable HTTP MCP servers support
- LLM API URL added to models config for Azure, local and custom providers
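The message grouping system above (consecutive messages rendered as one collapsible section per type) can be sketched as grouping adjacent messages by their type; the message shape here is illustrative, not Agent Zero's actual data model:

```python
from itertools import groupby

# Hypothetical chat log: consecutive messages of the same type
# get collapsed into one UI section.
messages = [
    {"type": "tool", "text": "ls"},
    {"type": "tool", "text": "cat notes.txt"},
    {"type": "agent", "text": "Done."},
]

# itertools.groupby only groups *consecutive* equal keys, which is
# exactly the behavior a chronological chat view needs.
grouped = [
    (kind, [m["text"] for m in group])
    for kind, group in groupby(messages, key=lambda m: m["type"])
]
print(grouped)  # [('tool', ['ls', 'cat notes.txt']), ('agent', ['Done.'])]
```

Each tuple then maps to one collapsible block in the UI, labeled by its message type.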