### Describe the bug

When creating a model with `model: gemini/gemini-3-flash-preview` and a custom `api_base`, LiteLLM throws `Missing gemini_api_key, please set GEMINI_API_KEY` even when `azure_ad_token` is set in `litellm_params`. However, an existing model on the same proxy with identical config works fine.
### Context

We have two models on gsk-prod with identical `litellm_params`:
| Field | Value |
| --- | --- |
| `model` | `gemini/gemini-3-flash-preview` |
| `api_base` | `https://dev.api.gsk.com/co/ent/gcp/gemini3-flash/global` |
| `drop_params` | `true` |
| `azure_ad_token` | set (non-empty) |
- `GSKPlatform_gemini-3.0-flash` — created on an older version, works correctly
- `Kc3ajs8TEUyP7jsX27WQwg_gemini-3.0-flash` — newly created with the same params, fails with `Missing gemini_api_key`
The newly created model only works after adding `api_key` to `litellm_params`. The existing model has no `api_key` field and works without it.
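For reference, here is the failing model expressed as a LiteLLM proxy `model_list` entry. This is a sketch under the assumption that the model was defined with the standard config shape; the `os.environ/AZURE_AD_TOKEN` placeholder is illustrative, and the actual token value is elided:

```yaml
model_list:
  - model_name: Kc3ajs8TEUyP7jsX27WQwg_gemini-3.0-flash
    litellm_params:
      model: gemini/gemini-3-flash-preview
      api_base: https://dev.api.gsk.com/co/ent/gcp/gemini3-flash/global
      drop_params: true
      # Token is set and non-empty; note there is no api_key field here.
      azure_ad_token: os.environ/AZURE_AD_TOKEN
```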
### Error

```
litellm.APIConnectionError: Missing gemini_api_key, please set `GEMINI_API_KEY`
  File ".../litellm/llms/vertex_ai/vertex_llm_base.py", line 360, in _check_custom_proxy
    raise ValueError("Missing gemini_api_key, please set `GEMINI_API_KEY`")
```

Stack: `vertex_and_google_ai_studio_gemini.py:async_completion` → `vertex_llm_base.py:_get_token_and_url` → `_check_custom_proxy`
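The stack trace implies that auth resolution for `gemini/` with a custom `api_base` only considers an explicit API key. Below is a minimal sketch of that *suspected* behavior — illustrative Python, not LiteLLM's actual code; `resolve_gemini_auth` is a hypothetical name used only to show why `azure_ad_token` alone would fail:

```python
import os

# Ensure the env-var fallback is not accidentally satisfied in this demo.
os.environ.pop("GEMINI_API_KEY", None)

def resolve_gemini_auth(litellm_params: dict) -> str:
    """Hypothetical helper (not LiteLLM code) illustrating the check we
    suspect _check_custom_proxy performs: only an explicit api_key or the
    GEMINI_API_KEY env var satisfies it; azure_ad_token is never consulted."""
    api_key = litellm_params.get("api_key") or os.environ.get("GEMINI_API_KEY")
    if not api_key:
        raise ValueError("Missing gemini_api_key, please set `GEMINI_API_KEY`")
    return api_key

# Params shaped like the failing model's: azure_ad_token set, no api_key.
failing = {
    "azure_ad_token": "<token>",
    "api_base": "https://dev.api.gsk.com/co/ent/gcp/gemini3-flash/global",
}
try:
    resolve_gemini_auth(failing)
except ValueError as exc:
    print(exc)  # -> Missing gemini_api_key, please set `GEMINI_API_KEY`

# The observed workaround: adding api_key to the same params makes it pass.
assert resolve_gemini_auth({**failing, "api_key": "dummy"}) == "dummy"
```

If this matches LiteLLM's actual logic, it would explain the workaround we observed, but not why the older model succeeds without `api_key`.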
### Questions to investigate

- Why does `azure_ad_token` not satisfy auth resolution in `_check_custom_proxy` for the `gemini/` provider with a custom `api_base`?
- What is different about the older model that allows it to work without `api_key`? Is there a DB field not returned by `/model/info` that's being used?
- Should `azure_ad_token` be a valid auth mechanism for the `gemini/` provider with a custom `api_base`, or is `api_key` the correct field?
### Environment

- LiteLLM version: v1.81.14
- Provider: `gemini/` with a custom `api_base` (non-Google endpoint)
- Proxy: multi-pod GKE deployment