Analyze table rows with Ollama #1182

Draft · wants to merge 1 commit into master

Conversation

kavehshahedi

What it does

Tables, such as the Events Table, contain various information in their rows, each row representing a specific type of data. However, this information can be hard to interpret, since it is not always clear or well structured (i.e., human-readable). This commit provides an LLM-based solution for interpreting table rows using Ollama, a local LLM runtime. The user can select any row of a table and, if Ollama is running on their machine, see an interpretation of the row's data.
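The core idea can be sketched as follows: a selected row's column/value pairs are flattened into a plain-text prompt for the local model. This is a minimal illustration only; the row shape and the function name `buildRowPrompt` are assumptions, not the identifiers used in this PR.

```typescript
// Hypothetical sketch: flatten a selected table row into a prompt
// a local LLM can interpret. Names are illustrative, not the PR's code.

interface TableRow {
    [column: string]: string;
}

// Turn "column: value" pairs into a human-readable prompt for the model.
function buildRowPrompt(row: TableRow): string {
    const fields = Object.entries(row)
        .map(([column, value]) => `${column}: ${value}`)
        .join('\n');
    return `Interpret the following trace event row in plain language:\n${fields}`;
}
```

The resulting prompt would then be sent to the local Ollama server, and the model's answer shown to the user.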

(Screenshot: interpretation of a selected table row via Ollama)

How to test

  1. Ensure that Ollama is already installed and running on your machine.
  2. Load a table-based data provider, such as the Events Table.
  3. Select any desired row of the table and right-click on it.
  4. Choose "Interpret via Ollama" and wait for the process to complete.
  5. See the interpretation!

Follow-ups

Choosing a proper model for interpretation is crucial. By default, the code uses llama3.2, which offers a good size/accuracy trade-off. If you specify another model in the code, consider its computational resource requirements.

Review checklist

  • As an author, I have thoroughly tested my changes and carefully followed the instructions in this template

@marcdumais-work (Contributor)

Kaveh, have you considered using this library?
https://www.npmjs.com/package/ollama

Repo: https://github.com/ollama/ollama-js

@kavehshahedi (Author)

Kaveh, have you considered using this library? https://www.npmjs.com/package/ollama

Repo: https://github.com/ollama/ollama-js

Yes, but for our use case (and many others), calling the HTTP API directly works fine, so I didn't add an extra dependency to the project.
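For illustration, the "direct HTTP" approach amounts to a single `fetch` against Ollama's `/api/generate` endpoint (served on port 11434 by default), with no npm dependency. The helper name below is hypothetical; only the endpoint and request fields (`model`, `prompt`, `stream`) come from Ollama's API.

```typescript
// Sketch of calling Ollama over plain HTTP, without the ollama-js package.
// Only the endpoint and payload fields reflect Ollama's actual API.

const OLLAMA_GENERATE_URL = 'http://localhost:11434/api/generate';

// Build the POST request for a non-streaming generate call.
function makeGenerateRequest(
    model: string,
    prompt: string
): { method: string; headers: Record<string, string>; body: string } {
    return {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        // stream: false returns the full answer in one JSON object
        body: JSON.stringify({ model, prompt, stream: false })
    };
}

// Usage (requires a running Ollama server):
// const res = await fetch(OLLAMA_GENERATE_URL, makeGenerateRequest('llama3.2', 'Explain this row'));
// const { response } = await res.json();
```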

Using Ollama's API, we can now analyze and interpret the table rows.
Depending on the model, the user can get various insights from the data.

Signed-off-by: Kaveh Shahedi <[email protected]>