Releases · abacusai/codellm-releases
1.96.4.25115
Changes in 1.96.4.25115
- Improve autocomplete responsiveness and quality
1.96.4.25114
- Built-in support for WSL (Windows Subsystem for Linux)
- Autocomplete improvements
- Device authentication flow and auth improvements
- Bug fixes
1.96.4.25087
- Stream agent responses
- Consider file context for the agent
- Improve autocomplete UI and functionality
- Fix autocomplete suggestions preview ordering
- Disable autocomplete when the suggestion widget is open
1.96.4.25084
What's Changed
- Built-in language server support for Python
- Autocomplete UI improvements
- Autocomplete quality improvements
- Compute points reported for Chat messages
- Agent diff UI improvements
1.96.4.25069
- Improve autocomplete efficiency
1.96.4.25067
- Improve autocomplete efficiency
1.96.4.25066
- Improved UI for agent, chat, code edit, diff, etc.
- Added editor diagnostics context for autocomplete
- Use the file context for the agent
- Improved various autocomplete behaviours
- Added pasting images functionality in chat
- Implemented autocomplete status bar indicator
- Updated the built-in remote-ssh extension to the latest version
- Fixed the remote-ssh server downloading issue
- Improved agent stop behaviour
- Improved file path handling for workspace paths
- Resolved autofocus issues on selection context and chat opening
- Added error and crash reporting
- Fixed CodeLLM release notes display inside CodeLLM
1.96.4.25052
- Update VSCode base to 1.96.4
- Fixes for o1 in chat
- Autocomplete improvements
1.95.3.25046
- Introducing CodeLLM Agent
- Move selection into context for chat directly
- Code Apply improvements
- Autocomplete improvements
- Autocomplete works on a larger range
- More crash fixes
1.95.3.25035
- Windows code apply fixes