
Exception while trying to get models #1121

Open
@Wicus

Description


I get this error when trying to retrieve models:

nvim-data/lazy/CopilotChat.nvim/lua/CopilotChat/client.lua:359: Failed to fetch models from github_models: ...zy/CopilotChat.nvim/lua/CopilotChat/config/providers.lua:327: attempt to perform arithmetic on local 'max_output_tokens' (a nil value)

I've debugged it down to the model below: for it, local max_output_tokens = model.modelLimits.textLimits.maxOutputTokens returns nil, which is what triggers the error.

{
    assetId="azureml://registries/azure-openai/models/o1-preview/versions/1",
    createdTime="2024-09-12T22:23:37.6129559+00:00",
    displayName="OpenAI o1-preview",
    fineTuningTasks={},
    inferenceTasks={"chat-completion"},
    keywords={"Reasoning","Multilingual","Coding"},
    labels={"latest"},
    license="custom",
    modelCapabilities={},
    modelLimits={
        otherLimits=vim.empty_dict(),
        supportedLanguages={"en","it","af","es","de","fr","id","ru","pl","uk","el","lv","zh","ar","tr","ja","sw","cy","ko","is","bn","ur","ne","th","pa","mr","te"},
        textLimits={
            inputContextWindow=128000
        }
    },
    name="o1-preview",
    playgroundLimits={},
    popularity=49.98,
    publisher="OpenAI",
    registryName="azure-openai",
    summary="Focused on advanced reasoning and solving complex problems, including math and science tasks. Ideal for applications that require deep contextual understanding and agentic workflows.",
    tradeRestricted=true,
    version="1"
}

So the issue is this part:

local max_output_tokens = model.modelLimits.textLimits.maxOutputTokens
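
In isolation the failure looks like this (a minimal standalone sketch, not the plugin's code): indexing the missing maxOutputTokens field yields nil, and the subtraction then raises the arithmetic error from the traceback above.

local model = {
  modelLimits = {
    textLimits = {
      inputContextWindow = 128000,
      -- maxOutputTokens is absent for o1-preview
    },
  },
}

local context_window = model.modelLimits.textLimits.inputContextWindow
local max_output_tokens = model.modelLimits.textLimits.maxOutputTokens -- nil
local max_input_tokens = context_window - max_output_tokens
-- Error: attempt to perform arithmetic on local 'max_output_tokens' (a nil value)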

Context:

  get_models = function(headers)
    local response, err = utils.curl_post('https://api.catalog.azureml.ms/asset-gallery/v1.0/models', {
      headers = headers,
      json_request = true,
      json_response = true,
      body = {
        filters = {
          { field = 'freePlayground', values = { 'true' }, operator = 'eq' },
          { field = 'labels', values = { 'latest' }, operator = 'eq' },
        },
        order = {
          { field = 'displayName', direction = 'asc' },
        },
      },
    })

    if err then
      error(err)
    end

    return vim
      .iter(response.body.summaries)
      :filter(function(model)
        return vim.tbl_contains(model.inferenceTasks, 'chat-completion')
      end)
      :map(function(model)
        local context_window = model.modelLimits.textLimits.inputContextWindow
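        -- maxOutputTokens can be missing for some models (e.g. o1-preview), making the next local nil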
        local max_output_tokens = model.modelLimits.textLimits.maxOutputTokens
        local max_input_tokens = context_window - max_output_tokens
        if max_input_tokens <= 0 then
          max_output_tokens = 4096
          max_input_tokens = context_window - max_output_tokens
        end

        return {
          id = model.name,
          name = model.displayName,
          tokenizer = 'o200k_base',
          max_input_tokens = max_input_tokens,
          max_output_tokens = max_output_tokens,
        }
      end)
      :totable()
  end,
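
One possible way to harden this (a sketch only, not a tested patch) is to fall back to the 4096-token default that the function already uses when maxOutputTokens is missing from the response, e.g.:

local context_window = model.modelLimits.textLimits.inputContextWindow
-- Fall back to the existing 4096 default when the API omits maxOutputTokens (as it does for o1-preview)
local max_output_tokens = model.modelLimits.textLimits.maxOutputTokens or 4096
local max_input_tokens = context_window - max_output_tokens
if max_input_tokens <= 0 then
  max_output_tokens = 4096
  max_input_tokens = context_window - max_output_tokens
end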
