
Releases: agent0ai/agent-zero

v0.9.7 - Projects

19 Nov 11:41

Release video

  • Projects management
    • Support for custom instructions
    • Integration with memory, knowledge, files
    • Project specific secrets
  • New Welcome screen/Dashboard
  • New Wait tool
  • Subordinate agent configuration override support
  • Support for multiple documents at once in document_query_tool
  • Improved context on interventions
  • OpenRouter embedding support
  • Frontend components refactor and polishing
  • SSH metadata output fix
  • Support for Windows PowerShell in the local TTY utility
  • More efficient selective streaming for LLMs
  • UI output length limit improvements
  • Update checker

v0.9.6 - Memory Dashboard

02 Oct 14:05

Release video

  • Memory Management Dashboard
  • Kali update
  • Python update + dual installation
  • Browser Use update
  • New login screen
  • LiteLLM retry on temporary errors
  • GitHub Copilot provider support

v0.9.5.1 - Secrets

19 Sep 07:16

0.9.5.1

  • Added support for the Agent Zero Venice provider
  • Added support for the xAI provider

0.9.5

  • Secrets management - agent can use credentials without seeing them
  • Agent can copy-paste messages and files without rewriting them
  • LiteLLM global configuration field
  • Custom HTTP headers field for browser agent
  • Progressive web app support
  • Extra model params support for JSON
  • Short IDs for files and memories to prevent LLM errors
  • Tunnel component frontend rework
  • Fix for timezone change bug
  • Notifications z-index fix

v0.9.5 - Secrets

03 Sep 08:57

  • Secrets management - agent can use credentials without seeing them (see the conceptual sketch after this list)
  • Agent can copy-paste messages and files without rewriting them
  • LiteLLM global configuration field
  • Custom HTTP headers field for browser agent
  • Progressive web app support
  • Extra model params support for JSON
  • Short IDs for files and memories to prevent LLM errors
  • Tunnel component frontend rework
  • Fix for timezone change bug
  • Notifications z-index fix
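
The secrets management item above means the model only ever works with placeholders, while the framework substitutes the real values at execution time. Below is a minimal conceptual sketch in Python, assuming a made-up «secret.NAME» placeholder syntax; it is illustrative only, not Agent Zero's actual format or implementation.

```python
import re

# Real values live only on the execution side; the LLM never sees them.
SECRETS = {"GITHUB_TOKEN": "ghp_real_value_kept_out_of_the_prompt"}

def resolve_placeholders(text: str) -> str:
    """Replace «secret.NAME» placeholders with real values just before execution."""
    return re.sub(
        r"«secret\.(\w+)»",
        lambda m: SECRETS.get(m.group(1), m.group(0)),
        text,
    )

# What the model produced (it only ever saw the placeholder):
command = "curl -H 'Authorization: Bearer «secret.GITHUB_TOKEN»' https://api.github.com/user"
print(resolve_placeholders(command))  # the real token is injected only here
```

The design point is that the prompt, the transcript, and the model's output contain nothing but the placeholder; the credential exists solely where the tool actually runs.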

v0.9.4 - Connectivity, UI

18 Aug 11:09

  • External API endpoints
  • Streamable HTTP MCP A0 server (see the connection sketch after this list)
  • A2A (Agent to Agent) protocol - server+client
  • New notifications system
  • New local terminal interface for stability
  • Rate limiter integration for models
  • Delayed memory recall
  • Smarter autoscrolling in UI
  • Action buttons in messages
  • Multiple API keys support
  • Tunnel URL QR code
  • Download streaming
  • Internal fixes and optimizations
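
For the streamable HTTP MCP server item above, here is a minimal client-side sketch using the official `mcp` Python SDK. The endpoint URL is a placeholder; the real address depends on how your Agent Zero instance is exposed.

```python
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

async def main() -> None:
    # Placeholder URL - point this at your instance's MCP endpoint.
    async with streamablehttp_client("http://localhost:8000/mcp") as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()          # MCP handshake
            tools = await session.list_tools()  # discover exposed tools
            print([tool.name for tool in tools.tools])

asyncio.run(main())
```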

v0.9.3 - Subordinates, memory, providers

04 Aug 17:45

  • Faster startup/restart
  • Subordinate agents can have dedicated prompts, tools and system extensions
  • Streamable HTTP MCP server support
  • Memory loading enhanced by AI filter
  • Memory AI consolidation when saving memories
  • Auto memory system configuration in settings
  • Available LLM providers are defined by the providers.yaml configuration file (see the sketch after this list)
  • Venice.ai LLM provider support
  • Initial agent message shown to the user and used as an example for the LLM
  • Docker build support for local images
  • File browser fix
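
Regarding the providers.yaml item above: the set of selectable LLM providers now comes from a configuration file rather than being hard-coded. A small sketch of reading such a file with PyYAML follows; the keys shown are illustrative assumptions, not the project's actual schema.

```python
import yaml

# Hypothetical providers.yaml content - field names are illustrative only.
EXAMPLE = """
providers:
  - name: openrouter
    litellm_prefix: openrouter
    api_key_env: OPENROUTER_API_KEY
  - name: venice
    litellm_prefix: venice
    api_key_env: VENICE_API_KEY
"""

config = yaml.safe_load(EXAMPLE)
for provider in config["providers"]:
    print(f'{provider["name"]} -> {provider["litellm_prefix"]}')
```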

v0.9.2 - Speech, attachments

21 Jul 08:44

v0.9.2 - Kokoro TTS, Attachments

  • Kokoro text-to-speech integration
  • New message attachments system
  • Minor updates: log truncation, hyperlink targets, component examples, API cleanup

v0.9.1.1 - LiteLLM, UI improvements

09 Jul 08:46

v0.9.1.1

  • LLM API base URL field fix

v0.9.1 - LiteLLM, UI improvements

  • LangChain replaced with LiteLLM
    • Support for reasoning models streaming
    • Support for more providers
    • OpenRouter set as default instead of OpenAI
  • UI improvements
    • New message grouping system
    • Communication smoother and more efficient
    • Collapsible messages by type
    • Code execution tool output improved
    • Tables and code blocks scrollable
    • More space efficient on mobile
  • Streamable HTTP MCP servers support
  • LLM API URL added to models config for Azure, local and custom providers

v0.9.1 - LiteLLM, UI improvements

07 Jul 15:03

  • LangChain replaced with LiteLLM (see the sketch after this list)
    • Support for reasoning models streaming
    • Support for more providers
    • OpenRouter set as default instead of OpenAI
  • UI improvements
    • New message grouping system
    • Communication smoother and more efficient
    • Collapsible messages by type
    • Code execution tool output improved
    • Tables and code blocks scrollable
    • More space efficient on mobile
  • Streamable HTTP MCP servers support
  • LLM API URL added to models config for Azure, local and custom providers
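
To illustrate what the switch to LiteLLM standardizes, here is a minimal streaming call. The model string and environment variable are examples only (set OPENROUTER_API_KEY before running), not Agent Zero's internal wiring.

```python
from litellm import completion

# Streamed completion through LiteLLM's unified interface.
# "openrouter/..." routes the request via OpenRouter; any supported
# provider prefix works the same way.
response = completion(
    model="openrouter/openai/gpt-4o-mini",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
    stream=True,
)
for chunk in response:
    print(chunk.choices[0].delta.content or "", end="", flush=True)
print()
```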

v0.9.0

26 Jun 08:14

Starting the 0.9 series of updates

  • Backup and restore feature for easy upgradability
  • Subordinate agent prompt profiles (roles)
  • Developer and researcher agent prototypes
  • Security fixes