All notable changes to Flexo will be documented in this file.
The format is based on Keep a Changelog, and this project adheres to Semantic Versioning.

## v0.2.2 - 2025-03-25

- Added `/models` endpoint and enabled CORS support.
- Initiated MCP client and tool registry functionality (WIP).
- Updated tool patterns for enhanced consistency.
- Fixed multi-tool accumulation issue.
- Fixed mid-response tool buffer leak.
- Updated chat completions data models: updated `FunctionDetail` and removed `name` from `ToolCall` (a sketch follows this list).
- MCP configuration in `agent.yaml` is now commented out by default.
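
As an illustration of the `ToolCall` change above, here is a minimal sketch of what the updated models could look like if they mirror the common OpenAI-style chat completions tool-call layout, with `name` carried on `FunctionDetail` rather than on `ToolCall`. Only the two class names come from this changelog; the individual fields and the use of Pydantic are assumptions, not Flexo's confirmed schema.

```python
# Hypothetical sketch only: the class names FunctionDetail and ToolCall come from
# the changelog entry above, but the fields are assumed to follow the usual
# OpenAI-style chat completions tool-call shape.
from pydantic import BaseModel


class FunctionDetail(BaseModel):
    name: str       # the function name lives here rather than on ToolCall
    arguments: str  # JSON-encoded arguments, as in the common chat completions format


class ToolCall(BaseModel):
    id: str
    type: str = "function"
    function: FunctionDetail  # `name` was removed from ToolCall itself
```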

## v0.2.1 - 2025-03-14

- Fixed an issue where `LLMFactory` did not recognize `openai-compat` vendor names

## v0.2.0 - 2025-03-10

- Added multiple new LLM adapters:
  - Anthropic adapter with dedicated prompt builder
  - OpenAI-compatible adapter with dedicated prompt builder (allows connecting to vLLM, Ollama, etc.)
  - xAI adapter with dedicated prompt builder
  - Mistral AI adapter with SSE conversion
- Refactored tool registration to use `agent.yaml` config definitions instead of class decorators (a sketch follows this list)
- Enhanced pattern detection with improved Aho-Corasick method that handles spaces and line breaks
- Added fallback JSON parsing logic for improved robustness
- Added example DuckDuckGo tool implementation
- Fixed an issue so that `context` in `ChatCompletionRequest` can be an empty dict
- Improved tool registry logging
- Updated documentation across multiple components
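
To make the registration change above concrete, the following is a minimal, self-contained sketch of the general pattern of config-driven tool registration (YAML definitions resolved to classes by import path), not Flexo's actual `agent.yaml` schema; the keys (`tools`, `name`, `class_path`, `enabled`) and the `EchoTool` class are hypothetical.

```python
# Hypothetical illustration of config-driven tool registration; the YAML keys below
# are assumptions and are NOT Flexo's actual agent.yaml schema.
import importlib

import yaml  # requires PyYAML


class EchoTool:
    """Toy stand-in for a real tool implementation (e.g. a search tool)."""

    def run(self, query: str) -> str:
        return f"echo: {query}"


# In Flexo these definitions would live in agent.yaml; key names are assumed.
CONFIG = """
tools:
  - name: echo
    class_path: __main__.EchoTool
    enabled: true
"""


def load_tool_registry(config_text: str) -> dict:
    """Build a name -> class mapping from YAML tool definitions."""
    registry = {}
    for spec in yaml.safe_load(config_text).get("tools", []):
        if not spec.get("enabled", True):
            continue
        module_path, _, class_name = spec["class_path"].rpartition(".")
        registry[spec["name"]] = getattr(importlib.import_module(module_path), class_name)
    return registry


if __name__ == "__main__":
    registry = load_tool_registry(CONFIG)
    print(registry["echo"]().run("hello"))  # -> "echo: hello"
```

The point is only that tools are declared in configuration and resolved at load time, rather than being discovered through class decorators.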

## v0.1.1 - 2024-02-14

- Streamlined tool creation and registration workflow with new loading approach
- Added Wikipedia tool support and documentation
- Enhanced streaming implementation with improved context handling and LLM integration
- Added Llama tool structure example
- Improved nested JSON tool parsing capability
- Fixed streaming process bug and removed unused type adapters
- Updated tool configuration and path structures
- Added Elasticsearch SSL certificate documentation

## v0.1.0 - 2024-01-31

- Configurable AI agent framework with YAML-based configuration
- FastAPI-based interaction endpoint with streaming support
- Tool calling capabilities for Python functions and REST APIs
- Integration with IBM watsonx.ai models (Granite, Mistral, Llama)
- Integration with OpenAI models
- Docker and Podman containerization support
- Complete documentation and deployment guides
- Database integration (Milvus and Elastic)
- Robust prompt building and parsing systems
- Comprehensive LLM integration components