
[Bug]: Ollama adapter causes error if a single model is installed #2648

@paulodiovani

Description

Pre-submission checklist

  • I have read the documentation
  • I have updated the plugin and all dependencies to the latest versions
  • I have searched for existing issues and discussions
  • My issue is not a minor or cosmetic quirk (e.g., formatting, spacing, or other non-functional details)

Neovim version (nvim -v)

v0.11.1

Operating system/version

Linux paulodiovani-thinkpad 6.6.90-1-MANJARO #1 SMP PREEMPT_DYNAMIC Fri May 9 12:16:05 UTC 2025 x86_64 GNU/Linux

Adapter and model

Ollama (local) and llama3

Describe the bug

When using the Ollama adapter with the default config, if only a single model is installed in Ollama, CodeCompanion Chat throws the following errors (shown as Neovim error messages) when sending prompts:

First time:

E5108: Error executing lua: ...im/lua/codecompanion/adapters/http/ollama/get_models.lua:45: table index is nil

Second and subsequent times:

E5108: Error executing lua: ...im/lua/codecompanion/adapters/http/ollama/get_models.lua:24: Model info is not available in the cache.

CodeCompanion still shows responses after a key is pressed to dismiss the error message.
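
For context, both error messages are consistent with a model cache keyed by name, where the key ends up nil when only one model is installed. Below is a minimal sketch of that failure pattern in plain Lua; the function names and table layout are hypothetical, not the plugin's actual code:

-- Hypothetical sketch of the failure pattern, not CodeCompanion's code.
local cache = {}

local function remember(model)
  -- If model.name is nil (e.g. a single-model response parsed as the
  -- list itself rather than as its only entry), this write raises
  -- "table index is nil" -- matching the first error above.
  cache[model.name] = model
end

local function info(name)
  -- Because the write never succeeded, later lookups miss the cache,
  -- matching the second error above.
  local model = cache[name]
  assert(model, "Model info is not available in the cache.")
  return model
end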

Steps to reproduce

Ollama

  1. Install Ollama
  2. Pull a single model (e.g. ollama pull llama3)

Using Docker (alternative method)

  1. Pull and run the Ollama Docker image (e.g. docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama)
  2. Pull a single model (e.g. docker exec -it ollama ollama pull llama3); a sanity-check snippet follows this list
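
Either way, the model list can be sanity-checked from inside Neovim before reproducing; this assumes Ollama's standard /api/tags endpoint on the default port:

-- Optional check (run via :lua or :luafile): list the models Ollama reports.
local out = vim.fn.system({ "curl", "-s", "http://localhost:11434/api/tags" })
local ok, decoded = pcall(vim.json.decode, out)
if ok and type(decoded) == "table" and decoded.models then
  for _, m in ipairs(decoded.models) do
    print(m.name) -- expect a single entry, e.g. "llama3:latest"
  end
else
  print("Ollama not reachable or unexpected response: " .. out)
end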

Neovim / CodeCompanion

  1. Open Neovim with a minimal config (e.g. nvim -u minimal.lua)
  2. Start CodeCompanion Chat (e.g. :CodeCompanionChat<CR>)
  3. Select the Ollama adapter (e.g. ca, then 18<CR>); alternatively, pin the adapter in the config, as in the snippet after this list
  4. Write any prompt (e.g. ihello<ESC>)
  5. Submit (e.g. <CR>)
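
Pinning the adapter reuses the same interactions table as the minimal.lua below; swapping in "ollama" here is an assumption about the desired setup, not part of the original report:

-- In the codecompanion.nvim opts (cf. minimal.lua below):
interactions = {
  chat = { adapter = "ollama" },
  inline = { adapter = "ollama" },
},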

Expected behavior

The prompt is submitted, as with any other adapter, and the LLM response is shown without errors.

Screenshots or recordings (optional)

(screenshot attached to the issue)

minimal.lua file

---@diagnostic disable: missing-fields

--[[
NOTE: Set the config path to enable the copilot adapter to work.
It will search the following paths for a token:
  - "$CODECOMPANION_TOKEN_PATH/github-copilot/hosts.json"
  - "$CODECOMPANION_TOKEN_PATH/github-copilot/apps.json"
--]]
vim.env["CODECOMPANION_TOKEN_PATH"] = vim.fn.expand("~/.config")

vim.env.LAZY_STDPATH = ".repro"
load(vim.fn.system("curl -s https://raw.githubusercontent.com/folke/lazy.nvim/main/bootstrap.lua"))()

-- Your CodeCompanion setup
local plugins = {
  {
    "olimorris/codecompanion.nvim",
    dependencies = {
      { "nvim-lua/plenary.nvim" },
      {
        "nvim-treesitter/nvim-treesitter",
        lazy = false,
        build = ":TSUpdate",
      },

      -- Test with blink.cmp (delete if not required)
      {
        "saghen/blink.cmp",
        lazy = false,
        version = "*",
        opts = {
          keymap = {
            preset = "enter",
            ["<S-Tab>"] = { "select_prev", "fallback" },
            ["<Tab>"] = { "select_next", "fallback" },
          },
          cmdline = { sources = { "cmdline" } },
          sources = {
            default = { "lsp", "path", "buffer", "codecompanion" },
          },
        },
      },

      -- Test with nvim-cmp
      -- { "hrsh7th/nvim-cmp" },
    },
    opts = {
      --Refer to: https://github.com/olimorris/codecompanion.nvim/blob/main/lua/codecompanion/config.lua
      interactions = {
        --NOTE: Change the adapter as required
        chat = { adapter = "copilot" },
        inline = { adapter = "copilot" },
      },
      opts = {
        log_level = "DEBUG",
      },
    },
  },
}

-- Leaving this comment in to see if the issue author notices ;-)
-- This is so I can tell if they've really tested with their own minimal.lua file

require("lazy.minit").repro({ spec = plugins })

-- CONFIGURE PLUGINS HERE -----------------------------------------------------

-- Setup Tree-sitter
-- NOTE: Please restart Neovim to ensure parsers are loaded correctly
require("nvim-treesitter")
  .install({
    "lua",
    "markdown",
    "markdown_inline",
    "yaml",
  }, { summary = true, max_jobs = 10 })
  :wait(1800000)

-- Setup nvim-cmp
-- local cmp_status, cmp = pcall(require, "cmp")
-- if cmp_status then
--   cmp.setup({
--     mapping = cmp.mapping.preset.insert({
--       ["<C-b>"] = cmp.mapping.scroll_docs(-4),
--       ["<C-f>"] = cmp.mapping.scroll_docs(4),
--       ["<C-Space>"] = cmp.mapping.complete(),
--       ["<C-e>"] = cmp.mapping.abort(),
--       ["<CR>"] = cmp.mapping.confirm({ select = true }),
--       -- Accept currently selected item. Set `select` to `false` to only confirm explicitly selected items.
--     }),
--   })
-- end

Log output (optional)

[DEBUG] 2026-01-11 15:44:14
No models to select for the HTTP adapter
[INFO] 2026-01-11 15:44:17
Chat request started
[INFO] 2026-01-11 15:44:17
Request body file: /tmp/nvim.diovani/iTbhWw/0.json
[DEBUG] 2026-01-11 15:44:23
Copilot Adapter: Skipping non-chat model 'gpt-41-copilot'
[DEBUG] 2026-01-11 15:44:55
Output data:
{"model":"llama3:latest","created_at":"2026-01-11T18:44:55.793091483Z","message":{"role":"assistant","content":"Welcome"},"done":false}
[DEBUG] 2026-01-11 15:44:55
Output data:
{"model":"llama3:latest","created_at":"2026-01-11T18:44:55.929571538Z","message":{"role":"assistant","content":"!"},"done":false}
[DEBUG] 2026-01-11 15:44:56
Output data:
{"model":"llama3:latest","created_at":"2026-01-11T18:44:56.067883479Z","message":{"role":"assistant","content":" I"},"done":false}
[DEBUG] 2026-01-11 15:44:56
Output data:
{"model":"llama3:latest","created_at":"2026-01-11T18:44:56.206869235Z","message":{"role":"assistant","content":"'m"},"done":false}
[DEBUG] 2026-01-11 15:44:56
Output data:
{"model":"llama3:latest","created_at":"2026-01-11T18:44:56.344647658Z","message":{"role":"assistant","content":" Code"},"done":false}
[DEBUG] 2026-01-11 15:44:56
Output data:
{"model":"llama3:latest","created_at":"2026-01-11T18:44:56.494448541Z","message":{"role":"assistant","content":"Com"},"done":false}
[DEBUG] 2026-01-11 15:44:56
Output data:
{"model":"llama3:latest","created_at":"2026-01-11T18:44:56.6403854Z","message":{"role":"assistant","content":"panion"},"done":false}
[DEBUG] 2026-01-11 15:44:56
Output data:
{"model":"llama3:latest","created_at":"2026-01-11T18:44:56.784344043Z","message":{"role":"assistant","content":","},"done":false}
[DEBUG] 2026-01-11 15:44:56
Output data:
{"model":"llama3:latest","created_at":"2026-01-11T18:44:56.924736267Z","message":{"role":"assistant","content":" your"},"done":false}
[DEBUG] 2026-01-11 15:44:57
Output data:
{"model":"llama3:latest","created_at":"2026-01-11T18:44:57.063940862Z","message":{"role":"assistant","content":" AI"},"done":false}
[DEBUG] 2026-01-11 15:44:57
Output data:
{"model":"llama3:latest","created_at":"2026-01-11T18:44:57.203297205Z","message":{"role":"assistant","content":" programming"},"done":false}
[DEBUG] 2026-01-11 15:44:57
Output data:
{"model":"llama3:latest","created_at":"2026-01-11T18:44:57.343127697Z","message":{"role":"assistant","content":" assistant"},"done":false}
[DEBUG] 2026-01-11 15:44:57
Output data:
{"model":"llama3:latest","created_at":"2026-01-11T18:44:57.481371571Z","message":{"role":"assistant","content":" within"},"done":false}
[DEBUG] 2026-01-11 15:44:57
Output data:
{"model":"llama3:latest","created_at":"2026-01-11T18:44:57.624695388Z","message":{"role":"assistant","content":" Ne"},"done":false}
[DEBUG] 2026-01-11 15:44:57
Output data:
{"model":"llama3:latest","created_at":"2026-01-11T18:44:57.763542608Z","message":{"role":"assistant","content":"ov"},"done":false}
[DEBUG] 2026-01-11 15:44:57
Output data:
{"model":"llama3:latest","created_at":"2026-01-11T18:44:57.903216543Z","message":{"role":"assistant","content":"im"},"done":false}
[DEBUG] 2026-01-11 15:44:58
Output data:
{"model":"llama3:latest","created_at":"2026-01-11T18:44:58.046757684Z","message":{"role":"assistant","content":"."},"done":false}
[DEBUG] 2026-01-11 15:44:58
Output data:
{"model":"llama3:latest","created_at":"2026-01-11T18:44:58.194200863Z","message":{"role":"assistant","content":" What"},"done":false}
[DEBUG] 2026-01-11 15:44:58
Output data:
{"model":"llama3:latest","created_at":"2026-01-11T18:44:58.334125305Z","message":{"role":"assistant","content":" would"},"done":false}
[DEBUG] 2026-01-11 15:44:58
Output data:
{"model":"llama3:latest","created_at":"2026-01-11T18:44:58.474055519Z","message":{"role":"assistant","content":" you"},"done":false}
[DEBUG] 2026-01-11 15:44:58
Output data:
{"model":"llama3:latest","created_at":"2026-01-11T18:44:58.614521273Z","message":{"role":"assistant","content":" like"},"done":false}
[DEBUG] 2026-01-11 15:44:58
Output data:
{"model":"llama3:latest","created_at":"2026-01-11T18:44:58.756238093Z","message":{"role":"assistant","content":" to"},"done":false}
[DEBUG] 2026-01-11 15:44:58
Output data:
{"model":"llama3:latest","created_at":"2026-01-11T18:44:58.904816903Z","message":{"role":"assistant","content":" work"},"done":false}
[DEBUG] 2026-01-11 15:44:59
Output data:
{"model":"llama3:latest","created_at":"2026-01-11T18:44:59.048200293Z","message":{"role":"assistant","content":" on"},"done":false}
[DEBUG] 2026-01-11 15:44:59
Output data:
{"model":"llama3:latest","created_at":"2026-01-11T18:44:59.187658459Z","message":{"role":"assistant","content":" or"},"done":false}
[DEBUG] 2026-01-11 15:44:59
Output data:
{"model":"llama3:latest","created_at":"2026-01-11T18:44:59.330003528Z","message":{"role":"assistant","content":" discuss"},"done":false}
[DEBUG] 2026-01-11 15:44:59
Output data:
{"model":"llama3:latest","created_at":"2026-01-11T18:44:59.469527805Z","message":{"role":"assistant","content":" today"},"done":false}
[DEBUG] 2026-01-11 15:44:59
Output data:
{"model":"llama3:latest","created_at":"2026-01-11T18:44:59.608883874Z","message":{"role":"assistant","content":"?"},"done":false}
[DEBUG] 2026-01-11 15:44:59
Output data:
{"model":"llama3:latest","created_at":"2026-01-11T18:44:59.763632226Z","message":{"role":"assistant","content":""},"done":true,"done_reason":"stop","total_duration":41883078832,"load_duration":2155850713,"prompt_eval_count":569,"prompt_eval_duration":35750474534,"eval_count":29,"eval_duration":3937895190}
[INFO] 2026-01-11 15:44:59
Chat request finished

Minimal reproduction confirmation

  • Yes, I have tested and provided a minimal.lua file that reproduces the issue

Labels

bug (Something isn't working), help wanted (Extra attention is needed), reviewed-by-AI (The CodeCompanion agent reviewed this PR)
