Open
Labels: area:configuration, ide:vscode, kind:bug, os:mac
Description
Before submitting your bug report
- I've tried using the "Ask AI" feature on the Continue docs site to see if the docs have an answer
- I'm not able to find a related conversation on GitHub discussions that reports the same bug
- I'm not able to find an open issue that reports the same bug
- I've seen the troubleshooting guide on the Continue Docs
Relevant environment info
- OS: macOS
- Continue version: 1.2.10
- IDE version: VSCode 1.104.2 (Universal)
- Model: Gemini 3 Pro Preview
- config:
      name: Local Config
      version: 1.0.0
      schema: v1
      models:
        - uses: google/gemini-3-pro-preview
          with:
            GEMINI_API_KEY: API_KEY_HERE
  OR link to agent in Continue hub:
Description
I can't use the Gemini 3 Pro Preview model inside the Continue plugin. My API key has full access to the model, but I simply get the error:
Error loading Local Config. Chat is disabled until a model is available.
Gemini 2.5 Pro still works as expected.
To reproduce
Set up Gemini 2.5 Pro via the config.yaml file.
Update the model to google/gemini-3-pro-preview (the model ID listed on Google's AI Studio page).
Observe that it is not accepted as a local model.
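For reference, a minimal sketch of the config change that triggers the error, assuming the same hub block format as the config above (the google/gemini-2.5-pro slug and the API key value are placeholders; google/gemini-3-pro-preview is the ID shown in Google's AI Studio):

  # Working config: Gemini 2.5 Pro loads and chat is available
  models:
    - uses: google/gemini-2.5-pro
      with:
        GEMINI_API_KEY: API_KEY_HERE

  # Failing config: only the model slug changes, and Continue reports
  # "Error loading Local Config. Chat is disabled until a model is available."
  models:
    - uses: google/gemini-3-pro-preview
      with:
        GEMINI_API_KEY: API_KEY_HERE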
Log output