Conversation
SeanGolez
left a comment
Great job setting up the tables! One thing I'm noticing is that you are not logging the messages in the right place, and that you are not querying the SQLite db for the conversation history to insert into the prompt before the LLM generates a response.
Notice that in rag.py, lines 59 and 70 log the messages to memory, and line 63 takes the message history as a parameter.
The rag.search() function is essentially adding the message history to the prompt, doing the rag search, then generating a response.
GraphRAG documentation here
MessageHistory documentation here
What you need to do is replace the MessageHistory portion with all of this. Essentially, you need to get the message history from the database and add it to the prompt before rag.search() is called. I am open to discussing this in more detail, because there is a lot of nuance to how RAG and message history interact in LLM systems.
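A minimal sketch of what this could look like, using the stdlib sqlite3 module. The table and column names (chatRecords, session_id, role, content) are assumptions based on this PR's description, and the rag.search() call is left as a comment since its exact signature depends on the library version:

```python
import sqlite3

def get_history(conn: sqlite3.Connection, session_id: str, limit: int = 10):
    """Fetch the most recent messages for a session, oldest first.

    Assumes a chatRecords table with (id, session_id, role, content) columns.
    """
    rows = conn.execute(
        "SELECT role, content FROM chatRecords "
        "WHERE session_id = ? ORDER BY id DESC LIMIT ?",
        (session_id, limit),
    ).fetchall()
    return list(reversed(rows))  # reverse DESC fetch back into chronological order

def build_prompt(history, user_message: str) -> str:
    """Prepend the formatted history to the new message before RAG search."""
    lines = [f"{role}: {content}" for role, content in history]
    lines.append(f"user: {user_message}")
    return "\n".join(lines)

# Usage inside the /chat handler (hypothetical wiring):
#   history = get_history(conn, request.session_id)
#   prompt = build_prompt(history, request.message)
#   response = rag.search(prompt)
#   then log both the user message and the response back into chatRecords
```

The key ordering point is that the history lookup and prompt assembly happen before rag.search(), and the logging of both sides of the exchange happens after.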
```python
@app.post("/chat")
def chat(request: ChatRequest):
```
Your logic here seems sound, but this code should be moved into a function to reduce the amount of code in main.py.
I am referring to the code under the highlighted block (I don't know how to use GitHub).
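One way to do this, as a sketch: keep the route body a one-liner and put the logic in a helper (which could also live in its own module). The names handle_chat and the ChatRequest fields here are illustrative, not taken from the actual PR; ChatRequest is shown as a dataclass stand-in for the real Pydantic model:

```python
from dataclasses import dataclass

@dataclass
class ChatRequest:
    """Stand-in for the Pydantic request model in main.py."""
    session_id: str
    message: str

def handle_chat(request: ChatRequest) -> dict:
    """All chat logic lives here, keeping the FastAPI route thin.

    The real version would fetch history, call rag.search(), and log
    both messages; this placeholder just echoes the input.
    """
    return {"response": f"echo: {request.message}"}

# In main.py the route then just delegates:
# @app.post("/chat")
# def chat(request: ChatRequest):
#     return handle_chat(request)
```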
….py, update session management in db.py
Multi User chat functionality - database changes (2 new tables: chatRecords and chatMetaData)
Resolves #26