
feat(blocks): Ollama - Remote hosts #8234


Merged · 44 commits · Dec 13, 2024

Conversation

@Fried-Squid (Contributor) commented Sep 30, 2024

Background

Currently, AutoGPT only supports Ollama servers running locally. In practice, the Ollama server often runs on a better-suited machine, such as a Jetson board, rather than on the same host as AutoGPT. This PR adds an "Ollama host" input to all LLM blocks, allowing users to select which Ollama host each block connects to.

Changes 🏗️

  • Changes contained within blocks/llm.py:
    • Added an Ollama host input to all LLM blocks
    • Fixed incorrect parsing of the prompt when passing it to Ollama in the StructuredResponse block
    • Used ollama.Client instances to accomplish this
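The client-instance approach described above can be sketched roughly as follows. This is an illustrative sketch, not the PR's actual code: the function and default names (`run_ollama_block`, `DEFAULT_OLLAMA_HOST`, the `llama3.2` model tag) are hypothetical, while `ollama.Client(host=...)` and `Client.generate(...)` are APIs from the `ollama` Python library.

```python
# Illustrative sketch: each LLM block takes an "ollama_host" input and builds
# an ollama.Client bound to that host instead of assuming a local server.

DEFAULT_OLLAMA_HOST = "localhost:11434"


def run_ollama_block(prompt: str, model: str = "llama3.2",
                     ollama_host: str = DEFAULT_OLLAMA_HOST) -> str:
    import ollama  # deferred import so the sketch loads without the package

    # A client bound to the block's configured host, local or remote.
    client = ollama.Client(host=ollama_host)
    response = client.generate(model=model, prompt=prompt)
    return response["response"]
```

Each block builds its own client from its `ollama_host` input, so different blocks in one graph can target different Ollama servers.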

Testing 🔍

Tested all LLM blocks with Ollama remote hosts as well as with the default localhost value.

Related issues

#8225

@Fried-Squid Fried-Squid requested a review from a team as a code owner September 30, 2024 21:19
@Fried-Squid Fried-Squid requested review from Torantulino and kcze and removed request for a team September 30, 2024 21:19
@CLAassistant commented Sep 30, 2024

CLA assistant check
All committers have signed the CLA.


PR Reviewer Guide 🔍

Here are some key observations to aid the review process:

⏱️ Estimated effort to review: 2 🔵🔵⚪⚪⚪
🧪 No relevant tests
🔒 Security concerns

Potential remote code execution:
The addition of the ollama_host parameter allows users to specify arbitrary hosts for Ollama connections. If not properly validated and sanitized, this could potentially be exploited to connect to unintended hosts or execute arbitrary code. It's crucial to implement strict input validation for the ollama_host parameter to ensure only authorized and safe connections are allowed.
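One way to mitigate this concern is to validate the `ollama_host` value before constructing a client. The sketch below is not code from the PR: `validate_ollama_host` is a hypothetical stdlib-only helper, and a production check might additionally enforce an allow-list of permitted hosts.

```python
from urllib.parse import urlparse


def validate_ollama_host(host: str) -> str:
    """Reject malformed or non-HTTP(S) host values before they reach a client.

    Accepts either "host:port" (the default form, e.g. "localhost:11434")
    or a full http:// / https:// URL. Hypothetical helper for illustration.
    """
    # urlparse needs a scheme to split host/port correctly, so add one if absent.
    candidate = host if "://" in host else f"http://{host}"
    parsed = urlparse(candidate)
    if parsed.scheme not in ("http", "https") or not parsed.hostname:
        raise ValueError(f"Invalid Ollama host: {host!r}")
    return host
```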

⚡ Recommended focus areas for review

Potential Security Risk
The ollama_host parameter is set with a default value of "localhost:11434". This could potentially allow users to connect to arbitrary hosts if not properly validated.

Error Handling
The changes introduce new network calls to potentially remote Ollama hosts, but there's no visible error handling for network-related issues.
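A minimal way to surface such failures cleanly, again as a hypothetical sketch rather than the PR's code: wrap each client call so connection errors to an unreachable remote host become a clear block-level error instead of a raw traceback. (In practice the `ollama` client raises `httpx`-based connection errors; the stdlib exception types below are a simplification.)

```python
def call_ollama_safely(call, *args, **kwargs):
    """Run a client call, translating low-level network errors.

    `call` would be something like client.generate; hypothetical wrapper.
    """
    try:
        return call(*args, **kwargs)
    except (ConnectionError, TimeoutError, OSError) as exc:
        # Re-raise with a message that names the likely cause for the user.
        raise RuntimeError(
            f"Could not reach the configured Ollama host: {exc}"
        ) from exc
```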

Code Duplication
The ollama_host parameter is added to multiple Input classes, which could lead to duplication and maintenance issues.
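One common way to avoid that duplication, sketched here with plain dataclasses (the real blocks define their own input-schema classes, so all names below are hypothetical): declare the field once in a shared mixin and inherit it in each block's input class.

```python
from dataclasses import dataclass


@dataclass
class OllamaHostMixin:
    # Declared once; every LLM block input class inherits this field.
    ollama_host: str = "localhost:11434"


@dataclass
class StructuredResponseInput(OllamaHostMixin):
    prompt: str = ""


@dataclass
class TextGenerationInput(OllamaHostMixin):
    prompt: str = ""
    model: str = "llama3.2"
```

Changing the default or adding validation then happens in one place instead of in every input class.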


netlify bot commented Sep 30, 2024

Deploy Preview for auto-gpt-docs canceled.

🔨 Latest commit: 849e23d
🔍 Latest deploy log: https://app.netlify.com/sites/auto-gpt-docs/deploys/675b72f29fe1a200080010db

@Bentlybro Bentlybro self-requested a review September 30, 2024 21:32
@Bentlybro (Member) commented Sep 30, 2024

This is a super nice change and it's much needed 🙏 once the CI tests pass it should be good to go!

@Bentlybro Bentlybro self-assigned this Sep 30, 2024
@Fried-Squid (Contributor, Author) commented

CI/CD all passing now 😄
Just forgot to run the formatter and change a type lol

@github-actions github-actions bot added the conflicts Automatically applied to PRs with merge conflicts label Oct 10, 2024
Contributor

This pull request has conflicts with the base branch, please resolve those so we can evaluate the pull request.

@ntindle (Member) commented Oct 10, 2024

@Fried-Squid can I get that CLA signed so we can work towards merging this?

@Fried-Squid (Contributor, Author) commented Oct 12, 2024 via email

@ntindle (Member) commented Oct 12, 2024

Somewhere you're developing probably has an old git config that contains those emails as a default. For example, GitHub.com uses one of them when you update from master, or your dev machine has its git config email set to an old address.

You can reopen the PR with the correct emails after fixing this locally, or maybe try force-pushing over it, but I'm not positive that force-pushing to your branch will work.

@Fried-Squid Fried-Squid requested a review from a team as a code owner October 14, 2024 10:56
@github-actions github-actions bot added platform/frontend AutoGPT Platform - Front end size/xl and removed size/m labels Oct 14, 2024
@github-actions github-actions bot added size/l and removed size/xl labels Dec 12, 2024
@github-actions github-actions bot removed the platform/frontend AutoGPT Platform - Front end label Dec 12, 2024
@github-actions github-actions bot removed the conflicts Automatically applied to PRs with merge conflicts label Dec 12, 2024
Contributor

Conflicts have been resolved! 🎉 A maintainer will review the pull request shortly.

@github-actions github-actions bot added size/m and removed size/l labels Dec 12, 2024
@Bentlybro (Member) left a comment

Tested and this works.

@Bentlybro Bentlybro added this pull request to the merge queue Dec 13, 2024
@Bentlybro Bentlybro changed the title Ollama - Remote hosts feat(blocks): Ollama - Remote hosts Dec 13, 2024
Merged via the queue into Significant-Gravitas:dev with commit 94a312a Dec 13, 2024
19 checks passed
Projects
Status: Done
Development

Successfully merging this pull request may close these issues.

9 participants