Releases: logancyang/obsidian-copilot
3.1.3
This release includes:
- Significant enhancements to AWS Bedrock support
- A new automatic text-selection-to-chat-context feature (defaults to off under the Basic settings)
- A better composer experience: skip confirmation by giving an explicit instruction
- Fewer popups during onboarding
More details in the changelog:
Improvements
- #2023 Enable agent by default @logancyang
- #2018 Add auto selection to context setting @logancyang
- #2017 Implement auto context inclusion on text selection @logancyang
- #2015 Improve onboarding by removing the popups @logancyang
- #2011 Update bedrock model support @logancyang
- #2008 Add anthropic version required field for bedrock @logancyang
- #2010 Multiple UX improvements @zeroliu
- #2002 Enhance writeToFile tool with confirmation option @wenzhengjiang
- #2014 Update log file @logancyang
- #2007 Add AWS Bedrock cross-region inference profile guidance @vedmichv
Bug Fixes
- #2016 Fix thinking model verification @logancyang
- #2024 Do not show thinking if reasoning is not checked @logancyang
- #2012 Fix bedrock model image support @logancyang
- #2001 Fix template note processing @zeroliu
3.1.2
Release time again! We are ramping up to reach our big goals sooner. Some major changes:
- Drag-n-drop files from the file navbar to Copilot Chat as context!
- Revamped context management system that saves tokens by maximizing token cache hits
- Better context note loading from saved chats
- New setting under the Basic tab to set the send key: Enter / Shift + Enter
- Embedded notes (`![[note]]`) are now supported in context (see the sketch below)
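For the curious, here is a minimal TypeScript sketch of how an embed could be inlined into context using the public Obsidian API. The `expandEmbeds` helper and its inlining format are illustrative assumptions, not the plugin's actual implementation.

```ts
import { App, TFile } from "obsidian";

// Hypothetical helper (illustration only): inline the content of ![[note]] embeds
// found in a context note so the model can read them alongside the note itself.
const EMBED_RE = /!\[\[([^\]|#^]+)[^\]]*\]\]/g;

export async function expandEmbeds(app: App, sourceFile: TFile, text: string): Promise<string> {
  let result = text;
  for (const match of text.matchAll(EMBED_RE)) {
    const linkpath = match[1].trim();
    // Resolve the embed target the same way Obsidian resolves links from this note.
    const target = app.metadataCache.getFirstLinkpathDest(linkpath, sourceFile.path);
    if (!target) continue; // unresolved embed: leave the raw ![[...]] text in place
    const content = await app.vault.cachedRead(target);
    result = result.replace(match[0], `\n--- Embedded note: ${target.basename} ---\n${content}\n`);
  }
  return result;
}
```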
More details in the changelog:
Improvements
- #1996 Support Tasks codeblock in AI response @logancyang
- #1995 Support embedded note in context @logancyang
- #1988 Update Corpus-in-Context and web search tool guide @logancyang
- #1979 Add SiliconFlow support for chat and embedding models @qychen2001
- #1982 Simplify log file @logancyang
- #1968 Add configurable send shortcut for chat messages @Emt-lin
- #1973 Integrate ProjectChainRunner and ChatManager with new layered context @logancyang
- #1971 Context revamp - Introduces layered context handling @logancyang
- #1964 Support drag-n-drop files from file navbar @zeroliu
- #1962 Prompt Improvement: Use getFileTree to explore ambiguous notes and folders @wenzhengjiang
- #1963 Stop condensing history in plus nonagent route @logancyang
Bug Fixes
- #1997 Enhance local search guidance prompt @logancyang
- #1994 Fixes rendering issues in saved chat notes when model names contain special characters @logancyang
- #1992 Fix HyDE calling the wrong model @logancyang
- #1976 Fix ENAMETOOLONG @logancyang
- #1975 Fix indexing complete UI hanging @logancyang
- #1977 Fix thinking block duplication text for openrouter thinking models @logancyang
- #1987 Focus on click copilot chat icon in left ribbon @logancyang
- #1986 Focus to chat input on opening chat window command @logancyang
3.1.1
This patch release 3.1.1 packs a punch with some significant upgrades and critical bug fixes.
- OpenRouter thinking models are supported now! As long as "Reasoning" is checked for a reasoning model from OpenRouter, the thinking block will render in chat. If you don't want to see it, simply uncheck "Reasoning" to hide it.
- Copilot can see Dataview results in the active note! Simply add the active note with Dataview queries to context, and the LLM will see the executed results of those queries and use them as context!
- New model provider Amazon Bedrock added! (We only support API key and region settings for now; other ways of accessing Bedrock are not supported. See the sketch below.)
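As a rough illustration of what "API key and region only" means for configuration, here is a hypothetical TypeScript sketch of a Bedrock model entry. The field names and the example model ID are assumptions for illustration, not the plugin's actual settings schema.

```ts
// Hypothetical shape of a Bedrock custom model entry (illustration only;
// the plugin's real settings fields may be named differently).
interface BedrockModelConfig {
  name: string;        // Bedrock model ID
  provider: "bedrock";
  apiKey: string;      // Bedrock API key (the only supported auth method for now)
  region: string;      // AWS region that hosts the model
}

const example: BedrockModelConfig = {
  name: "anthropic.claude-3-5-sonnet-20240620-v1:0", // example model ID
  provider: "bedrock",
  apiKey: "<your Bedrock API key>",
  region: "us-east-1",
};
```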
More details in the changelog:
Improvements
- #1955 Add bedrock provider @logancyang
- #1954 Enable Openrouter thinking tokens @logancyang
- #1942 Improve custom command @zeroliu
- #1931 Improve error handling architecture across chain runners @Emt-lin
- #1929 Add CRUD to Saved Memory @wenzhengjiang
- #1928 Enhance canvas creation spec with JSON Canvas Spec @wenzhengjiang
- #1923 Turn autosaveChat ON by default @wenzhengjiang
- #1922 Sort notes in typeahead menu by creation time @zeroliu
- #1919 Implement tag list builtin tool @logancyang
- #1918 Support dataview result in active note @logancyang
- #1914 Turn on memory feature by default @wenzhengjiang
Bug Fixes
- #1957 Fix ENAMETOOLONG error on chat save @logancyang
- #1956 Enhance error handling @logancyang
- #1950 Fix new note (renamed) not discoverable in Copilot chat @logancyang
- #1947 Stop rendering dataview result in AI response @logancyang
- #1927 Properly render pills in custom command @zeroliu
3.1.0
Copilot for Obsidian - Release v3.1.0
3.1.0 finally comes out of preview!! This release introduces significant advancements in chat functionality and memory management, alongside various improvements and bug fixes.
New Features
- Brand New Copilot Chat Input: A completely redesigned chat input! This is a huge update we introduced after studying the industry-leading solutions.
- Enhanced Context Referencing: A new typeahead system allows direct referencing of notes, folders, tags, URLs, and tools using familiar syntax like `@`, `[[`, `#`, and `/`.
  - Interactive "Pills": Referenced items appear as interactive pills for a cleaner interface and easier management. No tripping over typos again!
- Long-Term Memory (plus): A major roadmap item, this feature allows Copilot to reference recent conversations and save relevant information to long-term memory. Memories are saved as `.md` files in the `copilot/memory` directory by default (configurable), allowing for inspection and manual updates. (See the sketch after this list.)
  - Major item on the roadmap, making its debut
  - Enable "Reference Recent Conversation" and "Reference Saved Memory" in Plus settings
  - AI can see a summary of recent chats
  - AI can save and reference relevant info to long-term memory on its own
  - Option to manually trigger a save by asking the agent or using the new `@memory` tool
  - Memories are saved as `.md` files under `copilot/memory` by default
  - Users can inspect or update memories as they like
- Note Read Tool (plus agent mode): A new built-in agentic tool that can read linked notes when necessary.
- Token Counter: Displays the number of tokens in the current chat session's context window, resetting with each new chat.
- Max-Token Limit Warning: Alerts users when AI output is cut off due to a low token limit in the user settings.
- YouTube Transcript Automation (plus): YouTube transcripts are now fetched automatically when a YouTube URL is entered in the chat input. A new command, `Copilot: Download YouTube Transcript`, is available for raw transcript retrieval.
- Projects Mode Enhancements (plus): Includes a new Chat History Picker and an enhanced progress bar.
- Backend & Tooling:
  - Optimized agentic tool calls for smoother operation
  - Migration of backend model services
  - Better search coverage when the Semantic Search toggle is on
  - Better agent debugging infrastructure
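Because memories are plain `.md` files, they can be created and edited with ordinary vault operations. The sketch below is a purely illustrative TypeScript example of writing one such file via the Obsidian API; the plugin's actual file naming, format, and save logic are not documented here and will differ.

```ts
import { App, normalizePath } from "obsidian";

// Illustration only: persist one memory entry as a markdown file under
// copilot/memory (the default, configurable location). The naming scheme
// and content format used here are assumptions, not the plugin's real ones.
export async function saveMemory(app: App, text: string): Promise<void> {
  const folder = normalizePath("copilot/memory");
  if (!app.vault.getAbstractFileByPath(folder)) {
    await app.vault.createFolder(folder);
  }
  const stamp = new Date().toISOString().replace(/[:.]/g, "-");
  const path = normalizePath(`${folder}/memory-${stamp}.md`);
  await app.vault.create(path, `- ${text}\n`);
}
```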
Breaking Changes
- The `@pomodoro` and `@youtube` tools have been removed from the tool picker.
- (plus) Sentence and word autocomplete features are temporarily disabled due to unstable performance, with plans to reintroduce them with user-customizable options.
Bug Fixes
- Fixed a random blank screen in the Copilot Chat UI.
- Addressed issues with extracting response text, mobile typeahead menu size, chat crashes, tool call UI freezes, and chat saving.
- Fixed illegal saved chat file names and improved image passing with `copilot-plus-flash`.
- Avoided unnecessary index rebuilds upon semantic search toggle changes.
- Ensured autonomous agent workflows use consistent tool call IDs and helper orchestration.
- Resolved issues with dropdown colors, badge borders, search result numbers, folder context, and spaces in typeahead triggers.
- Fixed model addition in the "Set Keys" window; "Verification" is no longer required.
- Fixed verification of certain Claude models (it previously complained about top p -1; it works now).
Troubleshooting
- If models are missing, navigate to Copilot settings -> Models tab and click "Refresh Built-in Models".
- Users are encouraged to report any issues in the pre-release channel.
3.0.3
This release has some big changes despite being a patch version. Notable changes:
- Introducing Inline Citations! Now any vault search response has inline citations and a collapsible sources section below the AI response. You have the option to toggle it off in QA settings. (This feature is experimental; if it's not working, please report back!)
- Implemented a log file: you can now share the Copilot log from the Advanced settings, no more dev console!
- Removed User / Bot icons to save space in the Copilot Chat UI
- Add OpenRouter GPT 4.1 models and grok-4-fast to Projects mode
- The AI-generated title for saved chats is now optional; it's a toggle in the Basic settings
- Added a new default `copilot/` parent folder for saved conversations and custom prompts
- The embedding model picker is no longer hidden under the QA settings tab
Detailed changelog:
Improvements
- #1838 Update sources styling @logancyang
- #1837 Drop user and bot icons to save space and add shade to user message @logancyang
- #1813 Add mobile-responsive components for settings @Emt-lin
- #1832 Add OpenRouter GPT-4.1 models to projects mode @logancyang
- #1831 Refactor active note inclusion and index event handling to respect setting @logancyang
- #1821 Implement inline citation @logancyang
- #1829 Agent Mode: Map copilot `@command` to builtin agent tools @wenzhengjiang
- #1817 Conditionally initialize VectorStoreManager @logancyang
- #1816 Ensure nested folder paths exist when enhancing folder management @logancyang
- #1811 Make AI chat title optional @logancyang
- #1810 Move context menu and markdown image handling settings @logancyang
- #1809 Show embedding model @logancyang
- #1805 Add search explanation table in log @logancyang
- #1804 Implement log file @logancyang
- #1788 Only scroll to bottom when user messages are added @zeroliu
Bug Fixes
- #1840 Adjust vertical positioning in ModelTable component @logancyang
- #1830 Ensure proper QA exclusion on copilot data folders @logancyang
- #1827 Fix chat crash issue @zeroliu
- #1796 Support creating new folders in composer tools @wenzhengjiang
- #1795 Add safe area bottom padding to view content @Emt-lin
- #1793 Fix mobile embedded image passing @logancyang
- #1787 Improve loading state management in project context updates @Emt-lin
- #1786 Optimize modal height and close button display on mobile @Emt-lin
- #1778 Improve regex for composer codeblock @wenzhengjiang
3.0.2
Improvements
- #1775 Switch to the new file when creating files with composer tools. @wenzhengjiang
Bug Fixes
- #1776 Fix url processing with image false triggers @logancyang
- #1770 Fix chat input responsiveness @zeroliu
- #1773 Fix canvas parsing in writeToFile tool @wenzhengjiang
3.0.1
Quick Hotfixes
- Fix a critical bug that stopped `[[note]]` references from working in the free chat mode after introducing the context menu in v3.
- Optimize the replace writer tool
- Add a MSeeP security badge
3.0.0
Copilot for Obsidian v3.0.0!
We are thrilled to announce the official release of Copilot for Obsidian v3.0.0! After months of hard work, this major update brings a new era of intelligent assistance to your Obsidian vault, focusing on enhanced AI capabilities, a new search system, and significant user experience improvements.
Image Support and Chat Context Menu
Image support and the chat context menu are available for free users now! As long as your model supports vision, you can check the vision box and send image(s) to it.
Copilot Vault Search v3 - Index-Free & Optional Semantic Search
We've completely reimagined how Copilot finds notes in your vault, making the search feature significantly more intelligent, robust, and efficient.
- Smart Index-Free Search: Search now works out-of-the-box without requiring an index build, eliminating index corruption issues.
- Enhanced Relevance: Copilot leverages keywords from titles, headings, tags, note properties, Obsidian links, co-citations, and parent folders to find relevant notes.
- Optional Semantic Engine: For semantic understanding, you can enable Semantic Search under QA settings, which uses an embedding index, same as before.
- Memory Efficient: Uses minimal RAM; you can tune it under QA settings.
- Privacy First: The search infrastructure remains local; no data leaves your device unless you use an online model provider.
- New QA Settings:
  - The embedding model is moved here from the Basic tab.
  - Lexical Search RAM Limit: Control RAM usage for index-free search, allowing optimization for performance or memory constraints.
Introducing Inline Quick Command
Transform your inline editing workflow with the brand new "Copilot: trigger quick command." This feature replaces the legacy "apply adhoc custom prompt" and allows you to insert quick prompts to edit selected blocks inline, integrating seamlessly with your custom command workflow. Assigning it to a hotkey like Cmd (Ctrl) + K is highly recommended!
Autonomous Agent (Plus Feature)
Experience a new level of AI interaction with the Autonomous Agent. When enabled in Plus settings, your Copilot can now automatically trigger tool calls based on your queries, eliminating the need for explicit `@tool` commands.
- Intelligent Tool Calling: The agent can automatically use tools like vault search, web search, composer and YouTube processing to fulfill your requests.
- Tool Call Banner: See exactly which tools the agent used and their results with expandable banners.
- Configurable Tools: Gain fine-grained control by enabling or disabling specific tools that the agent can call (Local vault search, Web search, Composer operations, YouTube processing) in the Plus settings.
- Max Iterations Control: Adjust the agent's reasoning depth (4-8 iterations) for more complex queries (see the sketch after this list).
- Supported Models: Optimized for `copilot-plus-flash` (Gemini 2.5 models), Claude 4, GPT-4.1, GPT-4.1-mini, and now GPT-5 models. (Note: agent mode performs best with Gemini models, followed by Claude and GPT; performance can vary a lot if you choose other models.)
- Control Remains Yours: For more control, turn the agent toggle off. Vault search and web search are conveniently available as toggle buttons below the chat input.
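To make the iteration setting concrete, here is a heavily simplified, hypothetical TypeScript sketch of a bounded tool-calling loop. `runAgent`, `callModel`, `runTool`, and the data shapes are placeholders for illustration; the real agent orchestration in the plugin is more involved.

```ts
// Hypothetical, simplified agent loop (illustration only, not the plugin's code).
interface ToolCall { tool: string; args: Record<string, unknown> }
interface ModelTurn { text: string; toolCalls: ToolCall[] }

export async function runAgent(
  query: string,
  maxIterations: number, // maps to the 4-8 "reasoning depth" setting
  callModel: (history: string[]) => Promise<ModelTurn>,
  runTool: (call: ToolCall) => Promise<string>
): Promise<string> {
  const history: string[] = [query];
  for (let i = 0; i < maxIterations; i++) {
    const turn = await callModel(history);
    if (turn.toolCalls.length === 0) {
      return turn.text; // the model answered without requesting more tools
    }
    // Execute each requested tool (e.g. vault search, web search, composer)
    // and feed the results back in for the next iteration.
    for (const call of turn.toolCalls) {
      history.push(`[${call.tool}] ${await runTool(call)}`);
    }
  }
  return "Iteration limit reached; returning the last partial result.";
}
```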
Other Key Improvements
- Tool Execution Banner: Visual feedback when the agent uses tools.
- Better Tool Visibility: Tool toggle buttons in chat input when the agent is off (vault search, web search, composer).
- Improved Settings UI: Dedicated "Agent Accessible Tools" section with clear framing.
- ChatGPT-like Auto-Scroll: Chat messages now auto-scroll when a new user message is posted.
- Image Support: Improved embedded image reading, no longer requiring the "absolute path" setting for same-title disambiguation. Supports markdown-style embedded image links.
- AI Message Regeneration: Fixed issues with AI message regeneration.
- Tool Result Formatting: Enhanced formatting for tool results.
- UI Responsiveness: Better UI responsiveness during tool execution.
- Context Menu: Moved context menu items to a dedicated "Copilot" submenu.
- Model Parameters: Top P, frequency penalty, verbosity, and reasoning effort are now optional and can be toggled manually (see the sketch after this list).
- Project Mode Context UI: A new progress bar indicates when project context is loading, with status visible via the context status icon.
- Embedding Models: Gemini embedding 001 is added as a built-in embedding model. The embedding model picker is now under the QA tab.
- OpenRouter: Now the top provider in settings.
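For illustration, optional model parameters can be thought of as a set of overrides that are only sent when their toggles are enabled. The TypeScript shape below is a hypothetical sketch; the field names and allowed values are assumptions, not the plugin's actual schema.

```ts
// Hypothetical per-model parameter overrides: each field is optional and only
// sent to the provider when its toggle is enabled in the model settings.
// Field names and allowed values are illustrative assumptions.
interface ModelParamOverrides {
  topP?: number;              // e.g. 0.9
  frequencyPenalty?: number;  // e.g. 0.2
  verbosity?: "low" | "medium" | "high";
  reasoningEffort?: "minimal" | "low" | "medium" | "high";
}

const overrides: ModelParamOverrides = {
  verbosity: "low",
  reasoningEffort: "medium",
};
```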
Thanks
Huge thanks to all our contributors and users; Copilot for Obsidian is nothing without its community! Please provide feedback if you encounter any issues.
2.9.5
Adding GPT-5 series models as built-in models, fresh out of the oven! They support the new parameters `reasoning_effort` and `verbosity`. To see them, you may have to click "Refresh Built-in Models" under your chat model table in Copilot settings.
You can also add OpenRouter GPT-5 models such as `openai/gpt-5-chat` as a Custom Model with the OpenRouter provider.
This is an unscheduled release to add GPT-5. Copilot v3 is under construction and will be released officially very soon, stay tuned!
2.9.4
Yet another quick release fixing a few bugs: fix the composer canvas codeblock and update copilot-plus-small (it hasn't been stable recently; it should be stable now after a complete reindex).
PRs
- #1621 Exclude copilot folders from indexing by default @logancyang
- #1620 Disallow file types in context @logancyang
- #1619 Fix copilot-plus-small @logancyang
- #1617 Fix composer canvas codeblock @wenzhengjiang
Troubleshooting
- If you find models missing in any model table or dropdown, go to Copilot settings -> Models tab, find "Refresh Built-in Models" and click it. If it doesn't help, please report back!
- For @Believer and @poweruser who are on a preview version, now you can use BRAT to install official versions as well!