Releases: logancyang/obsidian-copilot

3.1.3

12 Nov 01:11
88feebf

This release includes:

  • Significant enhancements to AWS Bedrock support
  • A new feature that automatically adds the current text selection to the chat context (off by default; toggle it under the Basic settings tab)
  • A better composer experience: skip the confirmation step by giving an explicit instruction
  • Fewer popups during onboarding

More details in the full changelog.

3.1.2

01 Nov 01:46
ba4eba5

Release time again πŸŽ‰ We are ramping up to reach our big goals sooner! Some major changes:

  • 🫳 Drag-n-drop files from the file navigator to Copilot Chat as context!
  • 🧠 Revamped context management system that saves tokens by maximizing cache hits
  • πŸ“‚ Better context note loading from saved chats
  • ↩️ New setting under the Basic tab to set the send key: Enter or Shift + Enter
  • πŸ”— Embedded notes (![[note]]) are now supported in context

More details in the full changelog.

3.1.1

24 Oct 00:42
95247f0

This patch release 3.1.1 packs a punch πŸ’ͺ with some significant upgrades and critical bug fixes.

  • OpenRouter thinking models are now supported! As long as "Reasoning" is checked for a reasoning model from OpenRouter, the thinking block will render in chat. If you don't want to see it, simply uncheck "Reasoning" to hide it.
  • Copilot can see Dataview results in the active note! πŸ”₯πŸ”₯πŸ”₯ Simply add the active note containing Dataview queries to context, and the LLM will see the executed results of those queries (see the example after this list).
  • New model provider Amazon Bedrock added! (Only API key and region settings are supported for now; other ways of accessing Bedrock are not.)
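For example, if the active note contains a Dataview block like the hypothetical one below, adding that note to the chat context lets the model see the query's executed results rather than the raw query text (the folder and field names here are illustrative, not from the release notes):

```dataview
TABLE file.mtime AS "Modified", status
FROM "Projects"
WHERE status != "done"
SORT file.mtime DESC
```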

More details in the full changelog.

3.1.0

11 Oct 15:37
3b338b0

Copilot for Obsidian - Release v3.1.0 πŸ”₯

3.1.0 finally comes out of preview!! πŸŽ‰πŸŽ‰πŸŽ‰ This release introduces significant advancements in chat functionality and memory management, alongside various improvements and bug fixes.

New Features

  • Brand New Copilot Chat Input: A completely redesigned chat input! This is a huge update, built after studying the industry-leading solutions.
    • Enhanced Context Referencing: A new typeahead system allows direct referencing of notes, folders, tags, URLs, and tools using familiar syntax like @, [[, #, and /.
    • Interactive "Pills": Referenced items appear as interactive pills for a cleaner interface and easier management. No more tripping over typos!
  • Long-Term Memory (plus): A major roadmap item making its debut, this feature allows Copilot to reference recent conversations and save relevant information to long-term memory. Memories are saved as .md files in the copilot/memory directory by default (configurable), allowing for inspection and manual updates.
    • Enable "Reference Recent Conversation" and "Reference Saved Memory" in Plus settings
    • The AI can see a summary of recent chats
    • The AI can save and reference relevant info in long-term memory on its own
    • You can manually trigger a save by asking the agent or using the new @memory tool
  • Note Read Tool (plus agent mode): A new built-in agentic tool that can read linked notes when necessary.
  • Token Counter: Displays the number of tokens in the current chat session's context window, resetting with each new chat.
  • Max-Token Limit Warning: Alerts users when AI output is cut off due to a low max-token limit in user settings.
  • YouTube Transcript Automation (plus): YouTube transcripts are now fetched automatically when a YouTube URL is entered in the chat input. A new command, Copilot: Download YouTube Transcript, is available for raw transcript retrieval.
  • Projects Mode Enhancements (plus): Includes a new Chat History Picker and an enhanced progress bar.
  • Backend & Tooling:
    • Optimized agentic tool calls for smoother operation
    • Migrated backend model services
    • Better search coverage when the Semantic Search toggle is on
    • Better agent debugging infrastructure

Breaking Changes

  • The @pomodoro and @youtube tools have been removed from the tool picker.
  • (plus) Sentence and word autocomplete features are temporarily disabled due to unstable performance, with plans to reintroduce them with user-customizable options.

Bug Fixes

  • Fixed random blank screens in the Copilot Chat UI
  • Addressed issues with extracting response text, mobile typeahead menu size, chat crashes, tool call UI freezes, and chat saving
  • Fixed illegal saved chat file names and improved image passing with copilot-plus-flash
  • Avoided unnecessary index rebuilds when the Semantic Search toggle changes
  • Ensured autonomous agent workflows use consistent tool call IDs and helper orchestration
  • Resolved issues with dropdown colors, badge borders, search result numbers, folder context, and spaces in typeahead triggers
  • Fixed model addition in the "Set Keys" window; "Verification" is no longer required
  • Fixed verification of certain Claude models (it previously complained about top_p being -1; it now works)

Troubleshoot

  • If models are missing, navigate to Copilot settings -> Models tab and click "Refresh Built-in Models".
  • Users are encouraged to report any issues in the pre-release channel.

3.0.3

23 Sep 03:44
731a8e8

This release has some big changes despite being a patch version. Notable changes:

  • Introducing Inline Citations! Now any vault search response has inline citations and a collapsible sources section below the AI response. You have the option to toggle it off in QA settings. (This feature is experimental; if it's not working, please report back!)
  • Implemented a log file: you can now share the Copilot log from the Advanced settings, no more digging through the dev console!
  • Removed User / Bot icons to save space in the Copilot Chat UI
  • Added OpenRouter GPT-4.1 models and grok-4-fast to Projects mode
  • AI-generated titles for saved chats are now optional via a toggle in the Basic settings
  • Added a new default copilot/ parent folder for saved conversations and custom prompts
  • The embedding model picker is no longer hidden under the QA settings tab

Full details in the changelog.

3.0.2

31 Aug 06:08
d6ad716

Minor improvements and bug fixes.

3.0.1

28 Aug 05:13
d987682

Quick Hotfixes

  • Fixed a critical bug that stopped [[note]] references from working in free chat mode after the context menu was introduced in v3
  • Optimized the replace writer tool
  • Added an MSeeP security badge

3.0.0

26 Aug 04:32
d85420b

Copilot for Obsidian v3.0.0!

We are thrilled to announce the official release of Copilot for Obsidian v3.0.0! After months of hard work, this major update brings a new era of intelligent assistance to your Obsidian vault, focusing on enhanced AI capabilities, a new search system, and significant user experience improvements.

🏞️ Image Support and Chat Context Menu

Image support and the chat context menu are available for free users now! As long as your model supports vision, you can check the vision box and send image(s) to it.

πŸ”₯ Copilot Vault Search v3 - Index-Free & Optional Semantic Search

We've completely reimagined how Copilot finds notes in your vault, making the search feature significantly more intelligent, robust, and efficient.

  • Smart Index-Free Search: Search now works out of the box without requiring an index build, eliminating index corruption issues.
  • Enhanced Relevance: Copilot leverages keywords from titles, headings, tags, note properties, Obsidian links, co-citations, and parent folders to find relevant notes (see the sketch after this list).
  • Optional Semantic Engine: For semantic understanding, you can enable Semantic Search under QA settings, which uses an embedding index as before.
  • Memory Efficient: Uses minimal RAM, and you can tune it under QA settings.
  • Privacy First: The search infrastructure remains local; no data leaves your device unless you use an online model provider.
  • New QA Settings:
    • The embedding model picker is moved here from the Basic tab.
    • Lexical Search RAM Limit: Control RAM usage for index-free search, allowing optimization for performance or memory constraints.
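To make the "Enhanced Relevance" point concrete, here is a minimal, hypothetical sketch of how an index-free lexical ranker could weight those signals. The field names, weights, and scoring are illustrative assumptions for explanation only, not the plugin's actual implementation:

```typescript
// Hypothetical sketch of index-free lexical scoring (not the actual Copilot code).
interface NoteMeta {
  title: string;
  headings: string[];
  tags: string[];
  properties: Record<string, string>;
  links: string[];        // outgoing Obsidian links
  coCitations: string[];  // notes frequently linked alongside this one
  parentFolder: string;
  body: string;
}

// Score a note against query keywords, weighting title/heading/tag matches
// above plain body matches.
function scoreNote(note: NoteMeta, keywords: string[]): number {
  const weights = {
    title: 5, heading: 3, tag: 3, property: 2,
    link: 2, coCitation: 1.5, folder: 1, body: 0.5,
  };
  let score = 0;
  for (const kw of keywords.map((k) => k.toLowerCase())) {
    const hit = (text: string) => text.toLowerCase().includes(kw);
    if (hit(note.title)) score += weights.title;
    if (note.headings.some(hit)) score += weights.heading;
    if (note.tags.some(hit)) score += weights.tag;
    if (Object.values(note.properties).some(hit)) score += weights.property;
    if (note.links.some(hit)) score += weights.link;
    if (note.coCitations.some(hit)) score += weights.coCitation;
    if (hit(note.parentFolder)) score += weights.folder;
    if (hit(note.body)) score += weights.body;
  }
  return score;
}

// Rank the vault's notes for a query and keep the top results.
function searchVault(notes: NoteMeta[], query: string, topK = 10): NoteMeta[] {
  const keywords = query.split(/\s+/).filter(Boolean);
  return notes
    .map((note) => ({ note, score: scoreNote(note, keywords) }))
    .filter((r) => r.score > 0)
    .sort((a, b) => b.score - a.score)
    .slice(0, topK)
    .map((r) => r.note);
}
```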

⌘ Introducing Inline Quick Command

Transform your inline editing workflow with the brand new "Copilot: trigger quick command." This feature replaces the legacy "apply adhoc custom prompt" and allows you to insert quick prompts to edit selected blocks inline, integrating seamlessly with your custom command workflow. Assigning it to a hotkey like Cmd (Ctrl) + K is highly recommended!

πŸš€ Autonomous Agent (Plus Feature)

Experience a new level of AI interaction with the Autonomous Agent. When enabled in Plus settings, your Copilot can now automatically trigger tool calls based on your queries, eliminating the need for explicit @tool commands.

  • Intelligent Tool Calling: The agent can automatically use tools like vault search, web search, composer and YouTube processing to fulfill your requests.
  • Tool Call Banner: See exactly which tools the agent used and their results with expandable banners.
  • Configurable Tools: Gain fine-grained control by enabling or disabling specific tools that the agent can call (Local vault search, Web search, Composer operations, YouTube processing) in the Plus settings.
  • Max Iterations Control: Adjust the agent's reasoning depth (4-8 iterations) for more complex queries (see the sketch after this list).
  • Supported Models: Optimized for copilot-plus-flash (Gemini 2.5 models), Claude 4, GPT-4.1, GPT-4.1-mini, and now GPT-5 models. (Note: agent mode performs best with Gemini models, followed by Claude and GPT; performance can vary a lot with other models.)
  • Control Remains Yours: For more control, turn the agent toggle off. Vault search and web search are conveniently available as toggle buttons below the chat input.
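A rough, hypothetical sketch of the iteration loop described above, with a max-iterations cap. The types, tool dispatch, and finishing condition are illustrative assumptions for explanation, not the plugin's internal API:

```typescript
// Illustrative agent loop: the model either answers or requests tool calls,
// and the loop stops at a final answer or when the iteration budget runs out.
type ToolCall = { tool: string; args: Record<string, unknown> };
type ModelStep = { toolCalls: ToolCall[]; answer?: string };

// The model call and tool dispatcher are injected so the loop itself stays simple.
async function runAgent(
  userQuery: string,
  callModel: (history: string[]) => Promise<ModelStep>,
  runTool: (call: ToolCall) => Promise<string>,
  maxIterations = 4,
): Promise<string> {
  const history: string[] = [`user: ${userQuery}`];
  for (let i = 0; i < maxIterations; i++) {
    const step = await callModel(history);
    // A final answer with no pending tool calls ends the loop.
    if (step.answer && step.toolCalls.length === 0) return step.answer;
    // Execute each requested tool (e.g. vault search, web search, composer,
    // YouTube processing) and feed the results back into the conversation.
    for (const call of step.toolCalls) {
      history.push(`tool ${call.tool}: ${await runTool(call)}`);
    }
  }
  // Iteration budget exhausted: ask for a final answer with what was gathered.
  const last = await callModel([...history, "system: answer now"]);
  return last.answer ?? "No answer within the iteration limit.";
}
```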

✨ Other Key Improvements

  • Tool Execution Banner: Visual feedback when the agent uses tools.
  • Better Tool Visibility: Tool toggle buttons in chat input when the agent is off (vault search, web search, composer).
  • Improved Settings UI: Dedicated "Agent Accessible Tools" section with clear framing.
  • ChatGPT-like Auto-Scroll: Chat messages now auto-scroll when a new user message is posted.
  • Image Support: Improved embedded image reading, no longer requiring "absolute path" setting for same-title disambiguation. Supports markdown-style embedded image links ![](link).
  • AI Message Regeneration: Fixed issues with AI message regeneration.
  • Tool Result Formatting: Enhanced formatting for tool results.
  • UI Responsiveness: Better UI responsiveness during tool execution.
  • Context Menu: Moved context menu items to a dedicated "Copilot" submenu.
  • Model Parameters: Top P, frequency penalty, verbosity, and reasoning effort model parameters are now optional and can be toggled manually.
  • Project Mode Context UI: A new progress bar indicates when project context is loading, with status visible via the context status icon.
  • Embedding Models: Gemini embedding 001 is added as a built-in embedding model. The embedding model picker is now under the QA tab.
  • OpenRouter: Now the top provider in settings.

πŸ™ Thanks

Huge thanks to all our contributors and users, Copilot for Obsidian is nothing without its community! Please provide feedback if you encounter any issues.

2.9.5

08 Aug 16:30

Adding GPT-5 series models as built-in models, fresh out of the oven! They support the new parameters reasoning_effort and verbosity. To see them, you may have to click "Refresh Built-in Models" under your chat model table in Copilot settings.

You can also add OpenRouter GPT-5 models such as openai/gpt-5-chat as a Custom Model with the OpenRouter provider.
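If you want to sanity-check these parameters outside the plugin, a minimal sketch of a request through OpenRouter's OpenAI-compatible chat completions endpoint might look like the following. This assumes OpenRouter forwards reasoning_effort and verbosity to the underlying model; support can vary by model and provider:

```typescript
// Minimal sketch: call a GPT-5 model via OpenRouter's chat completions API.
async function askGpt5(prompt: string, apiKey: string): Promise<string> {
  const res = await fetch("https://openrouter.ai/api/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "openai/gpt-5-chat",
      messages: [{ role: "user", content: prompt }],
      reasoning_effort: "medium", // new GPT-5 parameter (if supported by the model)
      verbosity: "low",           // new GPT-5 parameter (if supported by the model)
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}
```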

This is an unscheduled release to add GPT-5. Copilot v3 is under construction and will be released officially very soon; stay tuned!

2.9.4

11 Jul 00:00
607b3c0

Yet another quick release fixing a few bugs: fixed the composer canvas codeblock and updated copilot-plus-small (it had not been stable recently; it should be stable now after a complete reindex).

Troubleshoot

  • If you find models missing in any model table or dropdown, go to Copilot settings -> Models tab, find "Refresh Built-in Models" and click it. If it doesn't help, please report back!
  • For @Believer and @poweruser who are on a preview version, now you can use BRAT to install official versions as well!