New user questions #2044
Replies: 2 comments 2 replies
@NickEastNL I totally get where you're coming from. The idea of something that "just works" is appealing, but I haven't seen any AI tool yet that truly does that out of the box. Every single one I've tried (ChatGPT, Gemini, Open WebUI, GitHub Copilot, Copilot Studio) needed effort, skill, and tinkering to align it with my workflow. For me, the sweet spot has been using Obsidian Copilot projects to build domain-specific assistants. Once you set them up, they feel much closer to "just works" because they're tailored to my own notes and context.

On subscriptions: I hear the hesitation. But with the Believer plan you actually get unlimited access to chat and embedding models (copilot-plus-flash, which currently looks like Gemini Pro) without extra API costs. The Believer lifetime option is priced at roughly three years of subscription, so if you plan to stick with Obsidian Copilot long term, it's a fair offer compared to juggling separate API bills.

About Fountain syntax: Fountain is just plain text. That means your screenplays can live inside Markdown files, and you can pull them directly into chat or project context.

For CSVs: I've shifted to dense Markdown pipe tables instead. Basically, I replace commas with pipes.

And on the bigger picture: Obsidian Copilot really is the only choice right now if you want AI deeply integrated with your vault; this is where the development energy is. Contributing to its growth seems like the best way forward, both to shape the features you'd like to see and to ensure the tool keeps evolving in directions that benefit workflows like yours.
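The comma-to-pipe conversion described above can be sketched in a few lines of JavaScript. This is only an illustration, assuming simple CSVs with no quoted or escaped commas; `csvToPipeTable` is a made-up name, not part of any plugin:

```js
// Sketch: convert plain CSV text (no quoted/escaped commas)
// into a dense Markdown pipe table.
function csvToPipeTable(csv) {
  const lines = csv.trim().split("\n").map(l => l.split(","));
  const header = "| " + lines[0].join(" | ") + " |";
  const separator = "|" + lines[0].map(() => " --- |").join("");
  const body = lines.slice(1).map(r => "| " + r.join(" | ") + " |");
  return [header, separator, ...body].join("\n");
}

console.log(csvToPipeTable("name,age\nAda,36\nAlan,41"));
```

A real CSV with quoted fields would need a proper parser, but for the dense tables described here this is usually enough.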
I'm not sure what your exact use case for CSV & Dataview is. This is how I handle Markdown tables in a DataviewJS block:

```js
// getAbstractFileByPath is synchronous; only vault.read() returns a Promise
const file = app.vault.getAbstractFileByPath("data/mydata.md");
const content = await app.vault.read(file);
// Parse the dense pipe table: split into lines, then cells, dropping the
// empty strings produced by the leading/trailing pipes
const rows = content.trim().split("\n")
  .map(r => r.split("|").map(c => c.trim()).filter(c => c.length > 0));
// rows[0] is the header, rows[1] is the "---" separator, the rest is data
dv.table(rows[0], rows.slice(2));
```
Yup, I totally agree. I'm having issues with that as well. It appears to be a transient issue; once Obsidian Copilot nears release-candidate status we should have full access to each model's context window (roughly 1M tokens for Gemini, 200k for Claude, 400k for OpenAI). That should be enough even for larger Fountain scripts, but you'd still have the problem of context dilution (garbage in the context); OpenAI models in particular struggle with that. I completely support the idea of context splitting, e.g. by allowing
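Since Fountain is plain text, one way to approach context splitting is to chunk a script at scene headings (which in Fountain start with `INT.` or `EXT.`) so each chunk stays under a rough token budget. A minimal sketch, assuming the common ~4-characters-per-token rule of thumb; `splitFountain` is an illustrative name, not an existing API:

```js
// Sketch: split a Fountain script into scene-aligned chunks that stay
// under a rough token budget, to limit context dilution.
function splitFountain(script, maxTokens = 2000) {
  const maxChars = maxTokens * 4; // crude chars-per-token estimate
  // Split before each scene heading; the lookahead keeps the heading
  // attached to its own scene.
  const scenes = script.split(/\n(?=(?:INT\.|EXT\.))/);
  const chunks = [];
  let current = "";
  for (const scene of scenes) {
    if (current && current.length + scene.length > maxChars) {
      chunks.push(current);
      current = scene;
    } else {
      current += (current ? "\n" : "") + scene;
    }
  }
  if (current) chunks.push(current);
  return chunks;
}
```

Each chunk could then be fed to the model separately, or only the relevant scenes pulled into context.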
Indeed! I do that too. I think Obsidian Copilot's development trajectory is quite appealing, but your guess as to where it will end up is as good as mine. At this point the Believer plan investment is a leap of faith.
Hi everyone. I predominantly use Obsidian for writing, worldbuilding, and roleplaying. Ever since I started experimenting with AI assistants a few months ago, I've found that they significantly boost my creativity and all but eliminate the struggles I'd been facing for years.
I've been looking for a way to use Obsidian and my own notes so the LLMs produce better and more relevant responses. For a while I've used GitHub Copilot in VSCode to work directly with the Markdown files, and it has worked reasonably well, but it's obviously not tailored for this.
To be honest, I've been hesitant to try Obsidian Copilot, because I want to avoid subscriptions as much as possible (especially if I also have to pay for model APIs). I also have certain more personal requirements. I've therefore been working on my own assistant for Obsidian, but it's proving to take much more time and energy than I can realistically spare. So at this point I'd rather pay a bit more money to get something that "just works". But I still have my own personal requirements, so I'm wondering what the possibilities are.
Perhaps a bit of an unusual request or remark, but I might be able and willing to help add the features I'd like to have (if they're considered appropriate, of course), if it means getting them faster. As I've only just begun using Copilot, I'm not sure exactly what is part of the open-source plugin and what is part of the closed-source backend. All I know is that this plugin is much more mature than anything I could've made, and if I can somehow make it exactly what I'm looking for, so much the better.
I apologize for the really long post; it's a bit difficult to describe my use case without being verbose. And I'm looking for as much information as possible, given the limited time left to get the Believer tier.