
Ollama - Remote hosts #8234

Open · wants to merge 8 commits into base: master
Conversation

@Fried-Squid Fried-Squid commented Sep 30, 2024

Background

Currently, AutoGPT only supports Ollama servers running locally. Often that is not the case: the Ollama server may be running on a better-suited machine, such as a Jetson board. This PR adds an "Ollama host" input to all LLM blocks, letting users select which Ollama host each block talks to.

Changes 🏗️

  • Changes contained within blocks/llm.py:
    • Added an Ollama host input to all LLM blocks
    • Fixed incorrect parsing of the prompt when passing it to Ollama in the StructuredResponse block
    • Used ollama.Client instances to accomplish this

Testing 🔍

Tested all LLM blocks with Ollama remote hosts as well as with the default localhost value.

Related issues

#8225

@Fried-Squid Fried-Squid requested a review from a team as a code owner September 30, 2024 21:19
@Fried-Squid Fried-Squid requested review from Torantulino and kcze and removed request for a team September 30, 2024 21:19
CLAassistant commented Sep 30, 2024

CLA assistant check
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution.
1 out of 3 committers have signed the CLA.

✅ Fried-Squid
❌ Ace (Fried_Squid / therift)
❌ AceH-Joblogic


Ace (Fried_Squid / therift) does not appear to be a GitHub user. You need a GitHub account to be able to sign the CLA. If you already have a GitHub account, please add the email address used for this commit to your account.
You have signed the CLA already but the status is still pending? Let us recheck it.


PR Reviewer Guide 🔍

Here are some key observations to aid the review process:

⏱️ Estimated effort to review: 2 🔵🔵⚪⚪⚪
🧪 No relevant tests
🔒 Security concerns

Potential remote code execution:
The addition of the ollama_host parameter allows users to specify arbitrary hosts for Ollama connections. If not properly validated and sanitized, this could potentially be exploited to connect to unintended hosts or execute arbitrary code. It's crucial to implement strict input validation for the ollama_host parameter to ensure only authorized and safe connections are allowed.

⚡ Recommended focus areas for review

Potential Security Risk
The ollama_host parameter is set with a default value of "localhost:11434". This could potentially allow users to connect to arbitrary hosts if not properly validated.

Error Handling
The changes introduce new network calls to potentially remote Ollama hosts, but there's no visible error handling for network-related issues.

Code Duplication
The ollama_host parameter is added to multiple Input classes, which could lead to duplication and maintenance issues.
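The validation concern above could be addressed with a small shared helper that every block reuses (which would also reduce the duplication noted); network calls through the resulting client would still want a try/except of their own. A minimal, stdlib-only sketch with a hypothetical helper name:

```python
from urllib.parse import urlparse

def validate_ollama_host(host: str) -> str:
    """Accept 'host[:port]' or a full http(s) URL; reject anything else.

    Returns a normalized 'host:port' string, defaulting to Ollama's
    standard port 11434 when none is given.
    """
    candidate = host if "://" in host else f"//{host}"
    parsed = urlparse(candidate, scheme="http")
    if parsed.scheme not in ("http", "https"):
        raise ValueError(f"Unsupported scheme in ollama host: {host!r}")
    if not parsed.hostname:
        raise ValueError(f"Invalid ollama host: {host!r}")
    port = parsed.port or 11434
    return f"{parsed.hostname}:{port}"
```

This rejects malformed values early rather than surfacing an opaque connection error from deep inside a block run.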


netlify bot commented Sep 30, 2024

Deploy Preview for auto-gpt-docs canceled.

Name Link
🔨 Latest commit 25df9d6
🔍 Latest deploy log https://app.netlify.com/sites/auto-gpt-docs/deploys/66fbeb64d49d8d00087af9bd

Bentlybro (Member) commented Sep 30, 2024

This is a super nice change and it's much needed 🙏 once the CI tests pass it should be good to go!

@Bentlybro Bentlybro self-assigned this Sep 30, 2024
Fried-Squid (Author) commented

CI/CD all passing now 😄
Just forgot to run the formatter and change a type lol

@@ -104,6 +108,11 @@ class Input(BlockSchema):
    prompt_values: dict[str, str] = SchemaField(
        advanced=False, default={}, description="Values used to fill in the prompt."
    )
    ollama_host: str = SchemaField(
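The diff is truncated here; the field presumably continues along these lines. This is a hypothetical completion: the default matches the "localhost:11434" value the reviewer guide mentions, the description wording is assumed, and `SchemaField` is stubbed so the snippet runs standalone (the real one comes from the AutoGPT backend):

```python
def SchemaField(advanced=False, default=None, description=""):
    """Stand-in stub mimicking the call shape of AutoGPT's SchemaField."""
    return default

class Input:
    ollama_host: str = SchemaField(
        advanced=True,
        default="localhost:11434",
        description="Host and port of the Ollama server to call.",
    )
```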
ntindle (Member) commented
@Bentlybro is there a way to conditionally show stuff like this if Ollama is the selected model?

Bentlybro (Member) commented Oct 1, 2024
@ntindle I don't think we have that set up at the moment, but maybe that's something we should look into, because I already see quite a lot of use cases for it.

Fried-Squid (Author) commented
Just thinking, would it be worth splitting model selection into provider then model, rather than just model? Some providers (e.g. Ollama) offer a wide variety of models which may overlap with other providers, so being able to choose both the provider and the model would give users more control. It'd also make it a tad easier to have conditional inputs on the blocks, I imagine, as we wouldn't have to look up the provider based on the model; it'd be right there in the block inputs.

Let me know and I can try to get a PR out for that functionality soon.
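The provider-then-model idea could be modeled as a simple mapping checked at block input time; a minimal sketch with hypothetical names and example model lists (not AutoGPT's actual model registry):

```python
# Map each provider to the models it offers. Overlapping model names
# across providers are fine because the provider is chosen first.
PROVIDER_MODELS: dict[str, set[str]] = {
    "openai": {"gpt-4o", "gpt-4o-mini"},
    "ollama": {"llama3.1", "mistral"},
}

def validate_selection(provider: str, model: str) -> None:
    """Raise if the provider is unknown or does not offer the model."""
    models = PROVIDER_MODELS.get(provider)
    if models is None:
        raise ValueError(f"Unknown provider: {provider!r}")
    if model not in models:
        raise ValueError(f"Model {model!r} is not offered by {provider!r}")
```

With the provider explicit in the block inputs, conditional fields like `ollama_host` could be shown only when `provider == "ollama"`, without reverse-looking-up the provider from the model name.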

Bentlybro (Member) commented Oct 2, 2024
@Fried-Squid I think that's a very good idea; it makes a lot more sense being able to do it that way. @ntindle @Torantulino what do we think? It should be pretty easy to do, and as far as I can see it should just be changes to the block itself.

Fried-Squid (Author) commented
@Bentlybro the one issue I see with it is getting the frontend to display that properly: we'd need to change the JSON schema that gets passed to the frontend to render the available model selection. Or we could just throw an error when the model and provider don't match, but that seems pretty hostile to new users who might not understand the differences between providers and models.

Projects
Status: 🆕 Needs initial review
Development

Successfully merging this pull request may close these issues.

5 participants