## Description

Right now, the dataset generator module ([code](https://github.com/SeaseLtd/llm-search-quality-evaluation/tree/main/src/llm_search_quality_evaluation/dataset_generator)) supports two LLM providers: Google's Gemini and OpenAI chat models ([code](https://github.com/SeaseLtd/llm-search-quality-evaluation/blob/main/src/llm_search_quality_evaluation/dataset_generator/llm/llm_provider_factory.py)). We want to extend support to local LLMs.

## Just an idea

Use the [Ollama](https://ollama.com/) framework to implement this new feature.
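A minimal sketch of what an Ollama-backed provider might look like, using only the standard library against Ollama's REST API (`POST /api/generate` on the default port 11434). The class name `OllamaProvider`, its constructor parameters, and the `generate` method are assumptions for illustration, not the repository's actual provider interface:

```python
import json
import urllib.request


class OllamaProvider:
    """Hypothetical provider that calls a locally running Ollama server.

    Assumes Ollama is serving on its default host/port; the model name
    must already be pulled locally (e.g. via `ollama pull llama3`).
    """

    def __init__(self, model: str = "llama3", host: str = "http://localhost:11434"):
        self.model = model
        self.host = host

    def _build_payload(self, prompt: str) -> dict:
        # /api/generate takes the model name and prompt; stream=False
        # asks for a single JSON object instead of streamed chunks.
        return {"model": self.model, "prompt": prompt, "stream": False}

    def generate(self, prompt: str) -> str:
        data = json.dumps(self._build_payload(prompt)).encode("utf-8")
        req = urllib.request.Request(
            f"{self.host}/api/generate",
            data=data,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            body = json.load(resp)
        # The completed text is returned under the "response" key.
        return body["response"]
```

Alternatively, the official `ollama` Python client could be wrapped instead of raw HTTP; either way, the new provider would be registered alongside the Gemini and OpenAI providers in the existing factory.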