feat: Vertex AI support #4458
Conversation
PR Summary
This PR introduces support for the Gemini/VertexAI LLM Provider by updating configuration interfaces, backend completion calls, and UI components to properly handle file-based credentials and streamline provider-specific options.
- Updated backend/onyx/llm/chat_llm.py to conditionally inject 'vertex_credentials' into the Litellm call for vertex_ai.
- Extended backend/onyx/llm/interfaces.py by adding an optional 'credentials_file' in LLMConfig.
- Added Gemini/VertexAI support in backend/onyx/llm/llm_provider_options.py with drag-and-drop custom config keys.
- Modified web/src/app/admin/configuration/llm/LLMConfiguration.tsx and LLMProviderUpdateForm.tsx for simplified UI handling of VertexAI.
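The first two bullets above can be sketched together. This is a hedged illustration only: `LLMConfig` below is a simplified stand-in for the real class in backend/onyx/llm/interfaces.py, and every name other than `credentials_file` and `vertex_credentials` is an assumption, not the PR's actual code.

```python
from dataclasses import dataclass
from typing import Any, Optional


@dataclass
class LLMConfig:
    model_provider: str
    model_name: str
    api_key: Optional[str] = None
    # New in this PR: optional path to a service-account JSON file,
    # used only by the Vertex AI provider.
    credentials_file: Optional[str] = None


def extra_completion_kwargs(config: LLMConfig) -> dict[str, Any]:
    """Build provider-specific kwargs to merge into the LiteLLM completion call."""
    kwargs: dict[str, Any] = {}
    # Only vertex_ai consumes file-based credentials; other providers
    # authenticate with an API key and must not receive this kwarg.
    if config.model_provider == "vertex_ai" and config.credentials_file:
        kwargs["vertex_credentials"] = config.credentials_file
    return kwargs
```

The conditional keeps the kwarg out of calls for every other provider, so non-Vertex configurations are unaffected.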
8 file(s) reviewed, 1 comment(s)
lgtm
disabled={existingLlmProvider ? true : false}
/>
)}
<TextFormField
I def agree that hideAdvanced shouldn't control this, but I think we still need a way to hide this. Basically, when a user first comes to the app, we don't want to ask them to put in a name; it slightly increases friction (see ApiKeyForm).
Ah, you're right. I think I was a little bullish with the removal of this field. I've added it back (and renamed it to be a little more aligned with its purpose). I've renamed it to firstTimeConfiguration. Do you think that's an appropriate name here? It's only used in ApiKeyForm, which I believe is only shown during first-time configurations?
Yep, that sounds great
Force-pushed "…ore complex `CustomConfigKey` types" from 03981de to 8c0bee9.
lgtm 🍾
Description
This PR adds support for the Gemini/VertexAI LLM Provider.
Addresses https://linear.app/danswer/issue/DAN-1704/built-in-gemini-vertex-ai-support.
How Has This Been Tested?
Primarily UI-based changes; tested manually to verify that provider registration and deletion work.