Model Settings Not Persisted #34

@metadzin

Description

Cognotik does not persist model settings, does not initialize on startup, and Code Chat sends OpenAI requests with a missing model parameter.

Cognotik is not properly initializing its configuration on IDE startup.
The plugin also fails to propagate the configured AI model to the Code Chat backend (ConversationalMode).
This results in a persistent OpenAI API error:

java.io.IOException: you must provide a model parameter

The plugin UI shows the correct model values, and the session JSON includes the correct model, but the backend still sends OpenAI requests without a model parameter.

This appears to be a combination of:

Settings persistence bug (see the persistence sketch after this list)

Startup initialization bug

Model propagation bug inside ConversationalMode / OpenAIChatClient
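
For reference on the persistence piece: IntelliJ-based plugins normally keep settings across restarts via a PersistentStateComponent. The sketch below uses that standard platform API, but the class, field, and storage-file names are hypothetical, not Cognotik's actual code; it only illustrates the pattern that would make the model choices survive an IDE restart.

    // Hypothetical settings holder using the standard IntelliJ Platform
    // persistence API; class, field, and file names are illustrative only.
    import com.intellij.openapi.components.PersistentStateComponent
    import com.intellij.openapi.components.Service
    import com.intellij.openapi.components.State
    import com.intellij.openapi.components.Storage

    @Service(Service.Level.APP)
    @State(name = "CognotikModelSettings", storages = [Storage("cognotikModelSettings.xml")])
    class CognotikModelSettings : PersistentStateComponent<CognotikModelSettings.ModelState> {

        // Mutable state bean; the platform serializes its public fields to XML.
        class ModelState {
            var smartModel: String = "gpt-4-turbo"
            var fastModel: String = "gpt-4-turbo"
            var imageChatModel: String = "gpt-4-turbo"
        }

        private var state = ModelState()

        // Written to disk by the platform when settings change.
        override fun getState(): ModelState = state

        // Called on startup when a persisted XML file is found; if this never
        // runs, the UI falls back to the defaults above (gpt-4-turbo).
        override fun loadState(loaded: ModelState) {
            state = loaded
        }
    }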

Environment

Android Studio: Narwhal 4 Feature Drop | 2025.1.4

Build: #AI-251.27812.49.2514.14217341

OS: Windows 10

Cognotik Plugin Version: (your version here)

OpenAI API: using correct Base URL (https://api.openai.com/v1)

OpenAI API Key: valid, working in other tools

Steps to Reproduce

  1. Launch Android Studio

Go to Tools > Cognotik → menu is grayed out

Status bar shows: Uninitialized

  2. Open Settings

File > Settings > Tools > Cognotik > Basic Settings

The model fields have reverted to defaults (gpt-4-turbo), not the values from the previous session.

  3. Change the model settings

Set:

Smart Model: gpt-4.1-mini-2025-04-14

Fast Model: gpt-4.1-mini-2025-04-14

Image Chat Model: gpt-4.1-mini-2025-04-14

Click Apply.

Results:

Cognotik icon initializes in status bar

Status bar updates to: gpt-4.1-mini-2025-04-14

Tools > Cognotik menu becomes active

  4. Open Task Planning

Tools > Cognotik > Task Planning

Hit Load.

Session Info displays the correct model configuration:

{
"defaultModel": "gpt-4.1-mini-2025-04-14",
"parsingModel": "gpt-4.1-mini-2025-04-14",
"imageChatModel": "gpt-4.1-mini-2025-04-14",
"cognitiveMode": "Chat",
"temperature": 0.1
}

  5. Open Code Chat

Right-click editor → Cognotik → Code Chat

A browser opens:

http://localhost:3670/#G-20251115-xxxx

Type: “Good morning”

  6. Code Chat fails with an error
    Error: java.util.concurrent.ExecutionException
    java.io.IOException: you must provide a model parameter

Stack trace excerpt:

Caused by: java.io.IOException: you must provide a model parameter
at com.simiacryptus.cognotik.exceptions.ErrorUtil.checkError
at com.simiacryptus.cognotik.chat.OpenAIChatClient.chat
at com.simiacryptus.cognotik.plan.cognitive.ConversationalMode.runAll
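
As a rough illustration of where the propagation appears to break: the chat path should resolve a concrete model before the request is built, falling back to the session's defaultModel and failing fast with a clear message if nothing is configured. This is only a sketch under that assumption; the function and parameter names (resolveModel, requestedModel, sessionDefaultModel) are hypothetical, not Cognotik's actual API.

    // Hypothetical guard/fallback; names are illustrative only.
    fun resolveModel(requestedModel: String?, sessionDefaultModel: String?): String {
        // Prefer the model handed over by the cognitive mode, then the session default.
        val model = requestedModel?.takeIf { it.isNotBlank() }
            ?: sessionDefaultModel?.takeIf { it.isNotBlank() }
        // Fail inside the plugin with an actionable message instead of letting
        // OpenAI reject the request with "you must provide a model parameter".
        return requireNotNull(model) {
            "No chat model configured; check Settings > Tools > Cognotik > Basic Settings"
        }
    }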

Expected Behavior

Cognotik should load persisted model settings on IDE startup

Cognotik should initialize automatically without requiring manual Apply

Code Chat should inherit the model from Basic Settings or the session file

Backend requests to OpenAI should include a valid "model": "gpt-4.1-mini-2025-04-14" (or equivalent); a minimal example payload follows this list
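
For comparison, a well-formed Chat Completions request that the backend should be producing looks like the sketch below. The endpoint and the "model"/"messages" fields are the public OpenAI API; everything else (the OPENAI_API_KEY environment variable, the JDK HTTP client, the hard-coded message) is illustrative only.

    // Minimal standalone request including the "model" field that the failing
    // requests are missing. Assumes the key is in OPENAI_API_KEY.
    import java.net.URI
    import java.net.http.HttpClient
    import java.net.http.HttpRequest
    import java.net.http.HttpResponse

    fun main() {
        val body = """
            {
              "model": "gpt-4.1-mini-2025-04-14",
              "messages": [{"role": "user", "content": "Good morning"}]
            }
        """.trimIndent()
        val request = HttpRequest.newBuilder()
            .uri(URI.create("https://api.openai.com/v1/chat/completions"))
            .header("Authorization", "Bearer " + System.getenv("OPENAI_API_KEY"))
            .header("Content-Type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(body))
            .build()
        val response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString())
        // A response with a "choices" array confirms the key, base URL, and model
        // are accepted; omitting "model" reproduces the error from this issue.
        println(response.body())
    }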

Actual Behavior

Cognotik starts uninitialized on IDE launch

The model fields reset to defaults (gpt-4-turbo) on every restart

Manual Apply is required to initialize the plugin

Code Chat backend still sends OpenAI requests with model = null, causing:

you must provide a model parameter

Session JSON shows the correct model, but ConversationalMode does not use it

Basic Settings do not persist across IDE sessions
