AI Book Reports runs a local Llama 2 model on Ollama, orchestrated through LangChain.
By combining a traditional NoSQL database with a vector database, AI Book Reports can store all of the data needed for efficient report generation and later retrieval.
By augmenting the LLM with this custom data, AI Book Reports generates more accurate book reports more consistently.
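The flow above can be sketched with a minimal, self-contained stand-in. This is not the project's actual code: the dictionary plays the role of the NoSQL metadata store, the in-memory list plays the vector database, and `embed()` is a hypothetical toy letter-frequency embedding standing in for the real embeddings the Ollama/LangChain stack would produce. It only illustrates the store-then-retrieve-then-prompt pattern that grounds the LLM.

```python
import math

def embed(text: str) -> list[float]:
    """Hypothetical toy embedding: normalized letter-frequency vector.
    The real project would get embeddings from its local model stack."""
    vec = [0.0] * 26
    for ch in text.lower():
        if "a" <= ch <= "z":
            vec[ord(ch) - ord("a")] += 1.0
    norm = math.sqrt(sum(x * x for x in vec)) or 1.0
    return [x / norm for x in vec]

def cosine(a: list[float], b: list[float]) -> float:
    # Vectors are already unit-length, so the dot product is the cosine.
    return sum(x * y for x, y in zip(a, b))

class BookStore:
    """Two stores, mirroring the architecture described above:
    a metadata store (stand-in for the NoSQL database) and an
    embedding index (stand-in for the vector database)."""

    def __init__(self) -> None:
        self.metadata: dict[int, dict] = {}          # NoSQL stand-in
        self.vectors: list[tuple[int, list[float]]] = []  # vector DB stand-in

    def add(self, book_id: int, title: str, summary: str) -> None:
        self.metadata[book_id] = {"title": title, "summary": summary}
        self.vectors.append((book_id, embed(summary)))

    def retrieve(self, query: str, k: int = 1) -> list[dict]:
        # Rank stored summaries by similarity to the query embedding.
        q = embed(query)
        ranked = sorted(self.vectors, key=lambda iv: cosine(q, iv[1]),
                        reverse=True)
        return [self.metadata[book_id] for book_id, _ in ranked[:k]]

store = BookStore()
store.add(1, "Moby-Dick", "A whaling voyage and obsession with a white whale.")
store.add(2, "Dracula", "A vampire count travels from Transylvania to England.")

# The most similar record is retrieved and prepended to the prompt so the
# local LLM grounds its book report in the stored custom data.
context = store.retrieve("whale hunting at sea")[0]
prompt = (f"Write a book report on {context['title']}.\n"
          f"Context: {context['summary']}")
```

In the real pipeline the same pattern applies: the query is embedded, the vector database returns the nearest stored passages, the NoSQL database supplies the matching book metadata, and the combined context is passed to the Llama 2 model through LangChain.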