Fix: Enable Gemini support via GOOGLE_API_KEY fallback and explicit api_key injection #2805
base: main
Conversation
Disclaimer: This review was made by a crew of AI Agents.

Code Review Comment for PR #2805

Overview
The proposed changes to …

Positive Aspects
…

Specific Code Improvements Suggested
…

Historical Context
To better understand these changes, it would be beneficial to explore previous PRs that modified …

Potential Issues
…

Conclusion
The changes proposed in PR #2805 represent a thoughtful enhancement to Gemini API key handling within the application. Addressing the suggestions outlined above, particularly around error handling and logging, will further improve the robustness and security of the implementation. Once those updates are in place, I believe this PR can be promptly approved, enhancing both functionality and user experience while maintaining high standards of code quality.
All requested updates have been completed, and I have verified the changes for both scenarios.
hey @MathavanSG I have a few thoughts about that
- It looks like an interesting workaround; however, what about suggesting it to LiteLLM? Since we rely on them to manage LLM initializations
- Missing a few tests to cover that
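Tests covering the fallback could look like the following hedged sketch. Note that `resolve_gemini_key` is a hypothetical helper mirroring the behavior this PR describes, not CrewAI's actual API:

```python
import os


def resolve_gemini_key(explicit_key=None):
    # Hypothetical helper, illustrative only: prefer an explicitly passed
    # key, then GEMINI_API_KEY, then fall back to GOOGLE_API_KEY.
    return explicit_key or os.getenv("GEMINI_API_KEY") or os.getenv("GOOGLE_API_KEY")


def test_google_key_fallback():
    # Only GOOGLE_API_KEY is set (the Google AI Studio scenario).
    os.environ.pop("GEMINI_API_KEY", None)
    os.environ["GOOGLE_API_KEY"] = "studio-key"
    assert resolve_gemini_key() == "studio-key"


def test_gemini_key_takes_precedence():
    # When both are set, GEMINI_API_KEY should win.
    os.environ["GEMINI_API_KEY"] = "gemini-key"
    os.environ["GOOGLE_API_KEY"] = "studio-key"
    assert resolve_gemini_key() == "gemini-key"
```

Both scenarios from the PR description (explicit key, and GOOGLE_API_KEY only) are covered by varying the environment before each assertion.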
@lucasgomide, I think this issue was resolved
…, rely on LiteLLM resolution
Thank you @lucasgomide for the thoughtful feedback — that makes a lot of sense. I’ve removed the explicit ValueError for missing api_key and allowed LiteLLM to resolve the credentials as expected. Also removed the redundant params["api_key"] = self.api_key since it was already included in the dict on line 406.
This PR fixes Gemini integration in CrewAI when using crewai.LLM with API keys from Google AI Studio.

Problem
- Google AI Studio issues a GOOGLE_API_KEY, not a GEMINI_API_KEY
- When GEMINI_API_KEY is missing, the request is sent with key=None and results in API_KEY_INVALID or 400 errors

Fix
- If GEMINI_API_KEY is not found, use GOOGLE_API_KEY
- Inject params["api_key"] = self.api_key in _prepare_completion_params
- Makes models/gemini-pro work out of the box with Google API keys

Tested with:
- models/gemini-pro
- GOOGLE_API_KEY only

Verified working end-to-end. Fixes a common developer experience blocker.
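The fallback and the explicit key injection described above can be sketched as follows. This is a minimal illustration, assuming a stand-in function `prepare_completion_params`; it is not CrewAI's actual _prepare_completion_params implementation:

```python
import os


def prepare_completion_params(model, api_key=None):
    # Sketch of the PR's two ideas (names are illustrative):
    # 1) fall back to GOOGLE_API_KEY when GEMINI_API_KEY is absent, so
    #    Google AI Studio keys work without renaming the env var;
    # 2) place the resolved key explicitly in the params dict so the
    #    downstream completion call never receives key=None.
    resolved = api_key or os.getenv("GEMINI_API_KEY") or os.getenv("GOOGLE_API_KEY")
    return {"model": model, "api_key": resolved}
```

With only GOOGLE_API_KEY exported, `prepare_completion_params("models/gemini-pro")` would carry that key instead of None, which is the failure mode the PR reports.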