Hi,
The current Agentic Chunker feature works well with hosted providers such as Gemini and OpenAI. I would like to request support for Ollama as well, so the chunker can also run against locally hosted models.
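Since Ollama exposes an OpenAI-compatible chat endpoint, the existing OpenAI code path might only need a configurable base URL. Below is a rough sketch of the idea; the `llama3` model name and the prompt are just placeholders, and I'm assuming the chunker talks to an OpenAI-style chat client internally:

```python
from openai import OpenAI

# Ollama serves an OpenAI-compatible API at this address by default;
# the api_key is required by the client but ignored by Ollama.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

response = client.chat.completions.create(
    model="llama3",  # placeholder: any model pulled locally via `ollama pull`
    messages=[{"role": "user", "content": "<chunking prompt goes here>"}],
)
print(response.choices[0].message.content)
```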
Adding Ollama support would make the repository considerably more versatile. Thanks for considering this enhancement!