Connection refused when streaming with Ollama #464

@oldfieldtc

Description

Describe the bug
When using Ollama as the provider, I get a GuzzleHttp\Exception\ConnectException (Connection refused for URI http://localhost:11434/api/chat) when running the basic streaming code example. A plain ->chat() call against the same agent works fine.

Full error:

GuzzleHttp\Exception\ConnectException 

Connection refused for URI http://localhost:11434/api/chat

at vendor/guzzlehttp/guzzle/src/Handler/StreamHandler.php:84
     80▕                 || false !== \strpos($message, 'Connection refused')
     81▕                 || false !== \strpos($message, "couldn't connect to host") // error on HHVM
     82▕                 || false !== \strpos($message, 'connection attempt failed')
     83▕             ) {
  ➜  84▕                 $e = new ConnectException($e->getMessage(), $request, $e);
     85▕             } else {
     86▕                 $e = RequestException::wrapException($request, $e);
     87▕             }
     88▕             $this->invokeStats($options, $request, $startTime, null, $e);
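Not part of the original report, but one way to narrow this down is to hit the same endpoint with curl, once buffered and once unbuffered, since ->chat() succeeds while ->stream() is refused. This is a hypothetical diagnostic sketch; "llama3" is a placeholder for whatever model the agent is configured with:

```shell
# Buffered request, roughly what a non-streaming ->chat() call does.
curl -s http://localhost:11434/api/chat \
  -d '{"model":"llama3","messages":[{"role":"user","content":"hi"}],"stream":false}'

# -N disables curl's output buffering, mimicking a streaming client.
curl -sN http://localhost:11434/api/chat \
  -d '{"model":"llama3","messages":[{"role":"user","content":"hi"}],"stream":true}'
```

If the second command also fails with "connection refused" while the first succeeds, the problem is in the Docker/Windows networking path rather than in the library.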

To Reproduce
My console command code:

namespace App\Console\Commands;

use App\Neuron\MyAgent;
use Illuminate\Console\Command;
use NeuronAI\Chat\Messages\UserMessage;

class Agent extends Command
{
    /**
     * The name and signature of the console command.
     *
     * @var string
     */
    protected $signature = 'app:agent {prompt}';

    /**
     * The console command description.
     *
     * @var string
     */
    protected $description = 'Command description';

    /**
     * Execute the console command.
     * @throws \Throwable
     */
    public function handle()
    {
        $prompt = $this->argument('prompt');

        $response = MyAgent::make()->stream(new UserMessage($prompt));
        foreach ($response as $text) {
            echo $text;
        }
    }
}

Then running php artisan app:agent "the prompt" in the terminal.

The Neuron AI version you are using:

  • Version 2.12.2

Additional context
I'm running Ollama via Docker on Windows 11, and I've tried with and without WSL.
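One thing worth ruling out (my assumption, not confirmed in the thread): when Ollama runs in a Docker container on Windows, localhost:11434 is only reachable from the host if the container publishes that port. A quick check, assuming the container is named "ollama":

```shell
# Show which ports the Ollama container publishes to the host.
docker ps --filter name=ollama --format '{{.Ports}}'

# If 11434 is not listed, recreate the container with the port published:
docker run -d --name ollama -p 11434:11434 ollama/ollama
```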
