Description
Problem
When developing LLM-based chat applications, developers need a chat interface to see how the application behaves. Often they reach for a library like Streamlit, putting the UI in a separate file that runs in a separate process.
This works, but there would be advantages to prototyping a chat application directly in JupyterLab:
- a single JupyterLab instance can be shared by multiple people; once it's stood up, it can be reused
- the JupyterLab instance doesn't require knowledge of the terminal CLI, making it friendlier for non-technical users. This matters more and more as Red Teaming and AI UX experts need to be included in the AI development process, even though they don't necessarily know how to run and modify a Python application from the CLI
- a Jupyter notebook tends to be a single file encapsulating everything needed to prototype a particular behavior, so an individual user could clone a notebook and change parts of it with a feeling of "safety"
Given that there isn't an idiomatic chat widget built into Jupyter notebooks yet, it would be nice if a notebook running in a JupyterLab instance could use the jupyter-chat extension panel as an interface for prototyping chat applications.
Proposed Solution
Provide a jupyter_chat SDK that makes it possible to:
- listen to message events
- write message events
- read all messages
- clear all messages
By calling the SDK from inside the cells of a Jupyter notebook, one could control the chat interface so as to drive an LLM-powered AI chat.
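As a rough sketch of what notebook code using such an SDK might look like, here is an in-memory stand-in covering the four operations above. Every name here (`ChatPanel`, `on_message`, `send`, `get_messages`, `clear`) is hypothetical, since the SDK doesn't exist yet; the stand-in class exists only so the example runs on its own.

```python
# Hypothetical sketch: a minimal in-memory stand-in for the proposed
# jupyter_chat SDK. In the real extension, these calls would read from and
# write to the jupyter-chat panel instead of a Python list.

class ChatPanel:
    """In-memory stand-in for a jupyter-chat panel."""

    def __init__(self):
        self._messages = []
        self._listeners = []

    def on_message(self, callback):
        # 1. listen to message events
        self._listeners.append(callback)

    def send(self, sender, body):
        # 2. write a message event
        message = {"sender": sender, "body": body}
        self._messages.append(message)
        for callback in self._listeners:
            callback(message)

    def get_messages(self):
        # 3. read all messages
        return list(self._messages)

    def clear(self):
        # 4. clear all messages
        self._messages.clear()


# Usage inside a notebook cell: wire an "assistant" (here a trivial echo
# stub standing in for an LLM call) to the panel.
chat = ChatPanel()

def assistant(message):
    if message["sender"] == "user":
        reply = "echo: " + message["body"]  # replace with an LLM call
        chat.send("assistant", reply)

chat.on_message(assistant)
chat.send("user", "hello")
print([m["body"] for m in chat.get_messages()])  # ['hello', 'echo: hello']
```

With an API shaped like this, each notebook cell can register a handler, inspect the transcript, or reset the conversation, which is enough to prototype a full chat loop without leaving JupyterLab.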