
Fix: Enable Gemini support via GOOGLE_API_KEY fallback and explicit api_key injection #2805

Open · wants to merge 3 commits into main

Conversation

@MathavanSG MathavanSG commented May 10, 2025

This PR fixes Gemini integration in CrewAI when using crewai.LLM with API keys from Google AI Studio.

Problem

  • Most Gemini users export GOOGLE_API_KEY, not GEMINI_API_KEY
  • LiteLLM (used by CrewAI) only checks for GEMINI_API_KEY
  • As a result the key resolves to None and requests fail with API_KEY_INVALID or 400 errors

Fix

  • Adds fallback logic: if GEMINI_API_KEY is not set, fall back to GOOGLE_API_KEY
  • Explicitly injects params["api_key"] = self.api_key in _prepare_completion_params (see the sketch after this list)
  • Gemini models such as models/gemini-pro now work out of the box with Google API keys
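
A minimal sketch of the idea (simplified and illustrative, not the exact diff; it assumes the fallback runs in LLM.__init__ and that _prepare_completion_params builds the dict handed to litellm.completion):

    import os
    from typing import Any, Dict, List, Optional, Union

    class LLM:
        def __init__(self, model: str, api_key: Optional[str] = None, **kwargs: Any):
            # Fallback: mirror GOOGLE_API_KEY into GEMINI_API_KEY so LiteLLM
            # can resolve the credential for Gemini models.
            if "GEMINI_API_KEY" not in os.environ and "GOOGLE_API_KEY" in os.environ:
                os.environ["GEMINI_API_KEY"] = os.environ["GOOGLE_API_KEY"]
            self.model = model
            self.api_key = api_key or os.environ.get("GEMINI_API_KEY")

        def _prepare_completion_params(
            self, messages: Union[str, List[Dict[str, str]]], **kwargs: Any
        ) -> Dict[str, Any]:
            params: Dict[str, Any] = {"model": self.model, "messages": messages, **kwargs}
            # Explicit injection so the resolved key always reaches the completion call.
            params["api_key"] = self.api_key
            return params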

Tested with:

  • models/gemini-pro
  • GOOGLE_API_KEY only
  • Received correct output from Gemini via CrewAI agent

Verified working end-to-end. Fixes a common developer experience blocker.
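
For reference, roughly how the end-to-end check was exercised (a sketch only, not part of the diff; the LLM.call usage and the placeholder key value are assumptions):

    import os
    from crewai import LLM

    # Only the Google AI Studio key is exported; GEMINI_API_KEY is intentionally unset.
    os.environ["GOOGLE_API_KEY"] = "<your-google-ai-studio-key>"

    llm = LLM(model="models/gemini-pro")
    # With the fallback in place, this should return a response from Gemini.
    print(llm.call("Say hello from Gemini"))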

@joaomdmoura
Collaborator

Disclaimer: This review was made by a crew of AI Agents.

Code Review Comment for PR #2805

Overview

The proposed changes to src/crewai/llm.py improve the handling of Gemini API keys by adding an environment-variable fallback that makes key resolution more usable and robust. They also inject the API key explicitly so that a valid key is always passed to the completion call.

Positive Aspects

  1. API Key Fallback Mechanism: The addition of environment variable fallback support is a solid improvement in making the key handling process reliable.
  2. Clear Documentation: The code is well-commented, which aids in understanding the purpose and functioning of the new implementations.
  3. Backward Compatibility: The fallback mechanism respects previous configurations, ensuring that existing users experience no disruption.
  4. User Feedback: The print statements provide immediate user feedback regarding fallback assignments, aiding in diagnostics.

Specific Code Improvements Suggested

  1. Logging Over Print Statements:

    • Replace print with logging to prevent exposure of sensitive key information in standard output in production environments.
    import logging
    
    if "GEMINI_API_KEY" not in os.environ and "GOOGLE_API_KEY" in os.environ:
        os.environ["GEMINI_API_KEY"] = os.environ["GOOGLE_API_KEY"]
        logging.info("[CrewAI Gemini Patch] Set GEMINI_API_KEY from GOOGLE_API_KEY")
  2. Enhanced API Key Validation:

    • Implement error handling in the constructor to manage cases where neither API key is provided, providing a clear exception to users.
    if not api_key:
        raise ValueError("No valid API key found. Please provide GEMINI_API_KEY or GOOGLE_API_KEY")
  3. Type Hinting and Documentation:

    • Add type hints and docstring updates to reflect new functionalities within methods, which help with type checking and maintainability.
    from typing import Any, Dict, List, Union

    def _prepare_completion_params(
        self,
        messages: Union[str, List[Dict[str, str]]],
        **kwargs: Any
    ) -> Dict[str, Any]:
        """
        Prepare parameters for completion API call.
        ...
        """
  4. Update Class Docstring:

    • Refresh the class-level docstring to detail API key sources and updated functionality.
    class LLM:
        """
        LLM class for handling language model interactions.
        Supports multiple sources for API key configuration.
        ...
        """

Historical Context

To better understand these changes, it would help to review previous PRs that modified src/crewai/llm.py and see how similar challenges were addressed. Due to access limitations, specific insights from past PRs can't be provided here, but such modifications typically reflect evolving best practices around API key management and error handling.

Potential Issues

  • Security Considerations: Printing sensitive information related to API keys can lead to security vulnerabilities; transitioning to structured logging mitigates this risk.
  • Error Handling: The current implementation lacks robust error handling, particularly for scenarios where neither API key is present, resulting in potential silent failures.

Conclusion

The changes proposed within PR #2805 represent a thoughtful enhancement to Gemini API key handling within the application. Addressing the suggestions outlined above, particularly around error handling and logging, will further improve the robustness and security of the implementation.

After implementing the proposed updates, I believe this PR can be promptly approved, enhancing both functionality and user experience while maintaining high standards of code quality.

@MathavanSG
Author

MathavanSG commented May 10, 2025

All requested updates have been completed, and I have verified the changes for both scenarios.

Contributor

@lucasgomide lucasgomide left a comment


hey @MathavanSG, I have a few thoughts about that:

  1. It looks like an interesting workaround; however, what about suggesting it to LiteLLM, since we rely on them to manage LLM initialization?
  2. A few tests are missing to cover this; a sketch of one possible test follows.
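
For example, something along these lines could cover the fallback (assuming it runs when LLM() is constructed; the model string and key value are placeholders):

    import os

    from crewai import LLM

    def test_google_api_key_fallback(monkeypatch):
        # Simulate a Google AI Studio setup: only GOOGLE_API_KEY is exported.
        monkeypatch.delenv("GEMINI_API_KEY", raising=False)
        monkeypatch.setenv("GOOGLE_API_KEY", "fake-google-key")

        LLM(model="models/gemini-pro")

        # The fallback should mirror GOOGLE_API_KEY into GEMINI_API_KEY
        # so LiteLLM can pick it up.
        assert os.environ["GEMINI_API_KEY"] == "fake-google-key"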

@Vidit-Ostwal
Contributor

@lucasgomide, I think this issue was resolved in #2804

@MathavanSG
Author

Thank you @lucasgomide for the thoughtful feedback — that makes a lot of sense.

I’ve removed the explicit ValueError for missing api_key and allowed LiteLLM to resolve the credentials as expected.

Also removed the redundant params["api_key"] = self.api_key since it was already included in the dict on line 406.
