
[Bug]: kubectl-ai with ollama says: starting k8s agent: initializing chat session: Initialize not yet implemented for ollama #504

@jbrinnand

Description

Environment (please complete the following):

  • OS: macOS Sequoia 15.6 (24G84)
  • kubectl-ai version: 0.0.22
    commit: f151614
    date: 2025-08-14T18:00:57Z
  • LLM provider: ollama
  • LLM model: mistral:latest

Describe the bug
Check if ollama is running:

curl http://localhost:11434/; echo
Ollama is running

export OLLAMA_HOST=http://localhost:11434/
kubectl-ai --llm-provider ollama --model mistral:latest --kubeconfig '</path/to/my/kubeconfig>'

Here is the error:
.....

.....
Use "kubectl-ai [command] --help" for more information about a command.

starting k8s agent: initializing chat session: **Initialize not yet implemented for ollama**
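
The wording of the error suggests the ollama provider's chat session has its Initialize method stubbed out, rather than a connectivity problem with the ollama server. A minimal Go sketch of that kind of stub, using hypothetical names rather than the actual kubectl-ai source, would look like:

```go
// Hypothetical illustration of an unimplemented provider method; not the actual kubectl-ai code.
package main

import (
	"context"
	"errors"
	"fmt"
)

// chatSession stands in for a provider-backed chat session.
type chatSession interface {
	Initialize(ctx context.Context, history []string) error
}

// ollamaChat is a placeholder ollama-backed session whose Initialize is still a stub.
type ollamaChat struct{}

func (c *ollamaChat) Initialize(ctx context.Context, history []string) error {
	return errors.New("Initialize not yet implemented for ollama")
}

func main() {
	var chat chatSession = &ollamaChat{}
	if err := chat.Initialize(context.Background(), nil); err != nil {
		// The CLI wraps the error before printing, producing the message shown above.
		fmt.Println("starting k8s agent: initializing chat session: " + err.Error())
	}
}
```

If that is the case, the fix would be on the provider side rather than anything in the user's environment or command line.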

To Reproduce
Steps to reproduce the behavior:

  1. Go to 'CLI'
  2. Run command kubectl-ai --llm-provider ollama --model mistral:latest --kubeconfig '</path/to/my/kubeconfig>'
  3. See error: starting k8s agent: initializing chat session: Initialize not yet implemented for ollama

Expected behavior
I expected to reach the prompt and be able to query for nodes, pods, and namespaces, and in general interact with the k8s agent.

