
feat: adding support for Azure OpenAI in Semantic Kernel #505


Closed. Wants to merge 21 commits.

Conversation

oliverlabs

@oliverlabs oliverlabs commented May 12, 2025

Description


Fixes google-a2a/a2a-samples#45 🦕


google-cla bot commented May 12, 2025

Thanks for your pull request! It looks like this may be your first contribution to a Google open source project. Before we can look at your pull request, you'll need to sign a Contributor License Agreement (CLA).

View this failed invocation of the CLA check for more information.

For the most up to date status, view the checks section at the bottom of the pull request.

@oliverlabs oliverlabs changed the title feature: adding support for Azure OpenAI for Semantic Kernel - commands are commented feat: adding support for Azure OpenAI for Semantic Kernel - commands are commented May 12, 2025
@oliverlabs oliverlabs closed this May 12, 2025
@oliverlabs oliverlabs reopened this May 12, 2025
@oliverlabs
Author

@swapydapy, could you please review?

@oliverlabs oliverlabs requested a review from a team as a code owner May 15, 2025 11:50
@oliverlabs
Author

@DJ-os perhaps you could review as well?

@oliverlabs
Author

@didier-durand or perhaps you could review as you were the one who committed the semantic kernel sample. Thanks.

@didier-durand
Contributor

Hi, sorry: I am not authorized to review on this repo. Didier

@didier-durand
Contributor

Hi @jland-redhat, this article may help to understand the differences between the Azure endpoint for OpenAI and the native one.

https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/switching-endpoints

@oliverlabs oliverlabs requested a review from jland-redhat May 21, 2025 13:39
@jland-redhat

> Hi @jland-redhat, this article may help to understand the differences between the Azure endpoint for OpenAI and the native one.
> https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/switching-endpoints

Ok, I missed the custom library.

I still have a few suggestions I would like to run by y'all.

There is a lot of commented code here, which I am not in love with.

What if instead we simplified the environment variables? So instead of the current setup, what if we primarily stuck to OPENAI_API_KEY and OPENAI_ENDPOINT? (I see AZURE_OPENAI_DEPLOYMENT_NAME is being set to deployment_name, but it's unclear where this variable is actually used in the code.)

Including OPENAI_ENDPOINT also means that on line 122 we could conditionally set that value if it exists. This would solve the issue of allowing us to connect to other LLMs that use the same specification as OpenAIChatCompletion, such as the ones served through vLLM.

As for the Azure/non-Azure logic, from what I can tell the only code difference is on lines 137 and 155, where service=AzureChatCompletion( is called. We could introduce a single variable, like ENABLE_AZURE_OPENAI, and use it to conditionally leverage the Azure libraries in these two places if it's set to True.

What are y'all's thoughts on that?
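For concreteness, a minimal sketch of the flag approach suggested above. ENABLE_AZURE_OPENAI is the variable name proposed in this comment; everything else here is illustrative, not the PR's actual code. The Semantic Kernel imports are deferred so the module loads even when only one backend's settings are configured:

```python
import os


def use_azure() -> bool:
    """Read the proposed ENABLE_AZURE_OPENAI flag from the environment."""
    flag = os.getenv("ENABLE_AZURE_OPENAI", "false")
    return flag.strip().lower() in ("1", "true", "yes")


def build_chat_service():
    """Return an Azure or native OpenAI chat service based on the flag."""
    if use_azure():
        # Imported lazily so this module loads without Azure settings present.
        from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion
        return AzureChatCompletion()
    from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion
    return OpenAIChatCompletion()
```

With this in place, the two call sites on lines 137 and 155 collapse into `service=build_chat_service()`.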

@oliverlabs
Author

> [...] We could introduce a single variable, like ENABLE_AZURE_OPENAI, and use it to conditionally leverage the Azure libraries in these two places if it's set to True. What are y'all's thoughts on that?

Sure. See the flag implementation in the latest commit. I tested it with Azure OpenAI and it works successfully. However, I don't have an OpenAI API key to test with. If you could test it, that'd be great. Cheers.

@jland-redhat

jland-redhat commented May 22, 2025

> Sure. See the flag implementation in the latest commit. [...] However, I don't have an OpenAI API key to test with. If you could test it, that'd be great.

Hey Oliver,

Validated the change:

Validation using the default OpenAI API:
{"artifacts":[{"artifactId":"89f49deb-ff46-4338-8179-cf8e96bd2c24","description":"Result of request to agent.","name":"current_result","parts":[{"kind":"text","text":"The current exchange rate for USD to EUR is approximately 1 USD to 0.88331 EUR."}]}],"contextId":"c7872837-23e6-4a91-a0d1-3913902bc467","history":[{"contextId":"c7872837-23e6-4a91-a0d1-3913902bc467","kind":"message","messageId":"31a7404a-0b80-48e8-a908-80e264b8b935","parts":[{"kind":"text","text":"Dollar to Euro"}],"role":"user","taskId":"1e05a9a9-95b9-43c5-b2f0-f28524bbf27a"},{"contextId":"c7872837-23e6-4a91-a0d1-3913902bc467","kind":"message","messageId":"333e887b-83ea-41d4-b726-d1de225907af","parts":[{"kind":"text","text":"Processing function calls..."}],"role":"agent","taskId":"1e05a9a9-95b9-43c5-b2f0-f28524bbf27a"},{"contextId":"c7872837-23e6-4a91-a0d1-3913902bc467","kind":"message","messageId":"ff4fcba2-acfe-4e2d-8f76-594938a0632f","parts":[{"kind":"text","text":"Building the output..."}],"role":"agent","taskId":"1e05a9a9-95b9-43c5-b2f0-f28524bbf27a"}],"id":"1e05a9a9-95b9-43c5-b2f0-f28524bbf27a","kind":"task","status":{"state":"completed"}}

I was also hoping that we could provide a custom endpoint to the OpenAIChatCompletion, but the endpoint is hardcoded, and changing it would involve creating a custom service, which is annoyingly more complex than I was hoping. I may try to pursue getting that working once this PR is in.

But this all LGTM!
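On the custom-endpoint wish above: recent semantic-kernel Python releases let you hand OpenAIChatCompletion a preconfigured `AsyncOpenAI` client, which avoids writing a custom service. A hedged sketch, assuming the `async_client` parameter and the OPENAI_ENDPOINT variable name proposed earlier in this thread:

```python
import os


def resolve_base_url() -> "str | None":
    # OPENAI_ENDPOINT is the variable name proposed in this thread; returning
    # None makes the client fall back to the default api.openai.com endpoint.
    return os.getenv("OPENAI_ENDPOINT") or None


def build_openai_service(model_id: str):
    # Imported lazily so this file loads without openai/semantic-kernel installed.
    from openai import AsyncOpenAI
    from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion

    client = AsyncOpenAI(
        api_key=os.environ["OPENAI_API_KEY"],
        base_url=resolve_base_url(),  # e.g. a vLLM server's /v1 endpoint
    )
    return OpenAIChatCompletion(ai_model_id=model_id, async_client=client)
```

This is a sketch under those assumptions, not the PR's implementation; the same client works for any OpenAI-compatible server such as vLLM.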

@oliverlabs
Author

oliverlabs commented May 22, 2025

@jland-redhat this is easy. I could add that in if you like (but you'd have to retest it). This would add one extra variable in the .env file and one extra parameter to the __init__ method. How do I get this PR merged now? :) I believe you need to re-approve yours now, too.

Update: I see, google/a2a-eng needs to approve.

@jland-redhat

@oliverlabs I think you need to get one of the repo owners to approve in order to move forward. Not sure of the best way to poke them to take a look.

@oliverlabs
Author

@moonbox3 @holtskinner @kthota-g @koverholt , perhaps one of you could merge?

@kthota-g
Collaborator

@moonbox3 would you be able to review this PR?

@oliverlabs oliverlabs changed the title feat: adding support for Azure OpenAI for Semantic Kernel - commands are commented feat: adding support for Azure OpenAI for Semantic Kernel May 22, 2025
@oliverlabs oliverlabs changed the title feat: adding support for Azure OpenAI for Semantic Kernel feat: adding support for Azure OpenAI in Semantic Kernel May 22, 2025
@moonbox3
Contributor

> @moonbox3 would you be able to review this PR?

Definitely, @kthota-g. Will have a look today.

Contributor

@moonbox3 moonbox3 left a comment


Looks like this needs a merge from main, as well as some other updates. Appreciate you adding this functionality.

@moonbox3
Contributor

Hi @oliverlabs, please let me know if you'd like some support getting the Azure OpenAI support added. I'm from the SK Python team and will be glad to help add the functionality.

@oliverlabs
Author

oliverlabs commented May 26, 2025

@moonbox3 hi Evan, thanks for reviewing this PR. Apologies, it's a bank holiday weekend in the UK and I was OoO. I will reach out to you tomorrow with some comments.

@oliverlabs oliverlabs requested a review from moonbox3 May 27, 2025 10:28
Author

@oliverlabs oliverlabs left a comment


Implemented

@oliverlabs
Author

oliverlabs commented May 27, 2025

@moonbox3, could you please have a look at the __init__(self) method's env variable logic? I have implemented it and it works, but I can't implement your suggestion to use only chat_service = AzureChatCompletion() with no arguments. It fails to load when I do that.

UPD: fixed. I almost went mad searching for the error, but then noticed that the variable name AZURE_OPENAI_CHAT_DEPLOYMENT_NAME was expected, as opposed to AZURE_OPENAI_DEPLOYMENT_NAME. This should all work now. Please have a look and approve.
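For anyone hitting the same error: the Azure settings that Semantic Kernel's Python AzureChatCompletion reads from the environment look roughly like this (note the CHAT in the deployment-name variable; the placeholder values are illustrative):

```shell
# Variable names expected by AzureChatCompletion's settings loader
AZURE_OPENAI_API_KEY="<your-key>"
AZURE_OPENAI_ENDPOINT="https://<your-resource>.openai.azure.com/"
AZURE_OPENAI_CHAT_DEPLOYMENT_NAME="<your-deployment>"  # not AZURE_OPENAI_DEPLOYMENT_NAME
```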

@moonbox3
Contributor

Hi @oliverlabs, great progress. I would really prefer a more explicit approach to configuring the chat service, versus relying solely on whether a given number of env vars are configured. Something like this (it doesn't have to be that involved, but the enum is nice): https://github.com/microsoft/semantic-kernel/blob/main/python/samples/concepts/setup/chat_completion_services.py

The method get_chat_completion_service_and_request_settings gives one entry point to get the service and settings. To make it easier, we don't need to include the request settings; we can have the method be something like get_chat_completion_service and have it return the service only. It's then pretty easy to include other AI services like Google, Mistral, or Anthropic, for those who want to test with a specific service.

We can also link to SK's docs on configuring chat completion services in the README so devs can understand what needs to be included if there are ever questions: https://learn.microsoft.com/en-us/semantic-kernel/concepts/ai-services/chat-completion/?tabs=csharp-AzureOpenAI%2Cpython-AzureOpenAI%2Cjava-AzureOpenAI&pivots=programming-language-python#creating-a-chat-completion-service. That doc also spells out which env vars are required.
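A stripped-down version of the pattern in the linked chat_completion_services.py sample might look like this. The names here are illustrative, and the imports are deferred so the module loads regardless of which backend's dependencies are installed:

```python
from enum import Enum


class Services(str, Enum):
    """AI service backends this sample can be configured with."""
    OPENAI = "openai"
    AZURE_OPENAI = "azure_openai"


def get_chat_completion_service(service: Services):
    """Return the chat completion service for the selected backend."""
    if service is Services.AZURE_OPENAI:
        # Relies on the AZURE_OPENAI_* env vars described in SK's docs.
        from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion
        return AzureChatCompletion()
    # Relies on OPENAI_API_KEY and related env vars.
    from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion
    return OpenAIChatCompletion()
```

Adding another backend then means adding one enum member and one branch, rather than another tangle of env-var checks.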

@oliverlabs
Author

@moonbox3, well, it looks like they removed the samples from this repository altogether, so this was all for nothing. Thanks for your feedback though.

@oliverlabs oliverlabs closed this May 28, 2025
@oliverlabs
Author

oliverlabs commented May 28, 2025

P.S. I rewrote the auth function to use the functions you mentioned @moonbox3. I will make a PR in the new repo, but I am not sure if you would have the maintainer rights there. I tested it and it works.

Successfully merging this pull request may close these issues.

Supporting change LLM (Azure OpenAI、Deepseek) for Host Agent
5 participants