What it does
Tables, such as the Events table, contain various pieces of information in their rows, each indicating a specific type of data. Interpreting this information can be challenging because it is not always clear or well structured (i.e., human-readable). This commit provides an LLM-based solution for interpreting table rows using Ollama, a tool for running LLMs locally. The user can select any rows of a table and, if Ollama is running on their machine, see an interpretation of the selected rows' data.
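For reviewers unfamiliar with Ollama, here is a minimal sketch of how a selected row could be sent to a locally running Ollama server for interpretation. It is an illustration only, written in Python against Ollama's standard REST API (`POST http://localhost:11434/api/generate`), not the code added by this commit; the helper name `interpret_row`, the prompt wording, and the row fields are made up for the example.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def interpret_row(row: dict, model: str = "llama3.2") -> str:
    """Ask a locally running Ollama instance to explain one table row.

    `row` maps column names to the values of the selected row; the prompt
    below is illustrative, not the one used in this commit.
    """
    prompt = (
        "Explain the following table row in plain, human-readable terms:\n"
        + json.dumps(row, indent=2)
    )
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req, timeout=60) as resp:
        return json.loads(resp.read())["response"]

# Requires `ollama serve` to be running and the model pulled (`ollama pull llama3.2`).
print(interpret_row({"timestamp": "12:00:00", "event": "example_event", "pid": 1234}))
```

Swapping in another model is just a matter of changing the `model` argument, subject to the resource considerations noted under Follow-ups below.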
How to test
Follow-ups
Choosing a proper model for interpretation is crucial. By default, the code uses `llama3.2`, which offers a reasonable balance between size and accuracy. If you specify another model in the code, consider its required computational resources.

Review checklist