Duplicates
Summary 💡
The LLM call blocks should support an additional input for an Ollama host other than localhost.
Will self-assign and PR later this week.
Examples 🌈
No response
Motivation 🔦
Allowing remote Ollama servers would let self-hosted users run AutoGPT + Ollama across multiple hosts, each better suited to particular tasks, e.g. running Ollama on a Jetson board.