feat: Added support for Azure OpenAI and other LLMs in Semantic Kernel #51
Conversation
@moonbox3, could you please review?
Left one small comment you can change if you want. There is no current pattern, so I will let @moonbox3 comment on which way he wants to go, since he is the maintainer.
But I'm a fan of the new code changes. I know there were a lot of changes, but this looks pretty clean to me.
Thanks for reviewing this change, @jland-redhat. I really appreciate it. I agree about the note markdown formatting. I was just following the pattern that was already set in the README :) I fixed all instances of this now. Cheers.
Thanks for working on this -- a couple of small comments.
Description
See google-a2a/A2A#505 for more information and background; the previous PR was closed because the a2a-examples folder moved to a new repo.
Key changes:
- Enum-based service selection: added a `ChatServices` enum with `AZURE_OPENAI` and `OPENAI` options for clear service identification.
- Added explicit service configuration functions (see the sketch after this list):
  - `get_chat_completion_service()`: main entry point that returns a service for a given enum value
  - `_get_azure_openai_chat_completion_service()`: handles Azure OpenAI configuration
  - `_get_openai_chat_completion_service()`: handles OpenAI configuration
- Added auto-detection with fallback using the `auto_detect_chat_service()` function.
- Provided a `.env.example` file with environment variable examples.
- Updated the README with instructions on how to define the variables for each service.
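
Below is a minimal sketch of the service-selection pattern described above, assuming the Semantic Kernel Python connectors and the environment variable names shown in the comments; the actual sample may use different constructor arguments or variable names.

```python
import os
from enum import Enum

from semantic_kernel.connectors.ai.chat_completion_client_base import ChatCompletionClientBase
from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion, OpenAIChatCompletion


class ChatServices(str, Enum):
    """Supported chat completion services."""
    AZURE_OPENAI = "azure_openai"
    OPENAI = "openai"


def _get_azure_openai_chat_completion_service() -> ChatCompletionClientBase:
    """Builds an Azure OpenAI chat completion service from environment variables (names assumed)."""
    return AzureChatCompletion(
        deployment_name=os.environ["AZURE_OPENAI_CHAT_DEPLOYMENT_NAME"],
        endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
        api_key=os.environ["AZURE_OPENAI_API_KEY"],
    )


def _get_openai_chat_completion_service() -> ChatCompletionClientBase:
    """Builds an OpenAI chat completion service from environment variables (names assumed)."""
    return OpenAIChatCompletion(
        ai_model_id=os.environ.get("OPENAI_CHAT_MODEL_ID", "gpt-4o"),
        api_key=os.environ["OPENAI_API_KEY"],
    )


def get_chat_completion_service(service: ChatServices) -> ChatCompletionClientBase:
    """Returns the chat completion service matching the requested enum value."""
    if service == ChatServices.AZURE_OPENAI:
        return _get_azure_openai_chat_completion_service()
    if service == ChatServices.OPENAI:
        return _get_openai_chat_completion_service()
    raise ValueError(f"Unsupported chat service: {service}")


def auto_detect_chat_service() -> ChatCompletionClientBase:
    """Prefers Azure OpenAI when its endpoint is configured, otherwise falls back to OpenAI."""
    if os.environ.get("AZURE_OPENAI_ENDPOINT"):
        return _get_azure_openai_chat_completion_service()
    return _get_openai_chat_completion_service()
```

Keeping the builders private and routing everything through the enum means callers only ever reference `ChatServices`, which is what keeps adding further services a small, localized change.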
Thanks @moonbox3 for the suggestions.
This change makes it possible to add other LLM chat completion services in the future; a hypothetical example of such an extension is sketched below.
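
Purely as an illustration (building on the sketch above, and not part of this PR), supporting an additional provider would amount to a new enum member, a new private builder, and one extra branch; the `_get_anthropic_chat_completion_service()` helper here is an assumption.

```python
# Hypothetical extension: adding a third provider to the enum and dispatcher.
class ChatServices(str, Enum):
    AZURE_OPENAI = "azure_openai"
    OPENAI = "openai"
    ANTHROPIC = "anthropic"  # new member


def get_chat_completion_service(service: ChatServices) -> ChatCompletionClientBase:
    if service == ChatServices.AZURE_OPENAI:
        return _get_azure_openai_chat_completion_service()
    if service == ChatServices.OPENAI:
        return _get_openai_chat_completion_service()
    if service == ChatServices.ANTHROPIC:
        return _get_anthropic_chat_completion_service()  # hypothetical new builder
    raise ValueError(f"Unsupported chat service: {service}")
```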
Cheers,
Oliver
Before submitting your PR, there are a few things you can do to make sure it goes smoothly:
- Follow the CONTRIBUTING Guide.
- Run `nox -s format` from the repository root to format.

Fixes #45 🦕