Python: Bug: Kernel Function plugin not working with AzureAssistantAgent #10141
Description
Describe the bug
Testing the setup described here with the bugfix released in 1.18.0, the kernel function plugin still fails when used with AzureAssistantAgent.
To Reproduce
See the setup here.
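For context, a minimal sketch of the kind of setup being tested. This is not the original repro: the plugin name, instructions, and the exact agent-creation signature are assumptions based on the 1.18.x Python agent API, and Azure OpenAI configuration is expected to come from environment variables.

```python
# Hypothetical repro sketch; assumes the semantic-kernel 1.18.x Python agent API.
import asyncio

from semantic_kernel import Kernel
from semantic_kernel.agents import AgentGroupChat
from semantic_kernel.agents.open_ai import AzureAssistantAgent
from semantic_kernel.contents import ChatMessageContent
from semantic_kernel.contents.utils.author_role import AuthorRole
from semantic_kernel.functions import kernel_function


class MenuPlugin:
    """Illustrative kernel function plugin (not from the original report)."""

    @kernel_function(description="Provides a list of specials from the menu.")
    def get_specials(self) -> str:
        return "Clam chowder; Cobb salad; Chai tea"


async def main() -> None:
    kernel = Kernel()
    kernel.add_plugin(MenuPlugin(), plugin_name="menu")

    # Creation call assumed for 1.18.x; Azure OpenAI settings are read from env vars.
    agent = await AzureAssistantAgent.create(
        kernel=kernel,
        name="Host",
        instructions="Answer questions about the menu.",
    )

    chat = AgentGroupChat(agents=[agent])
    await chat.add_chat_message(
        ChatMessageContent(role=AuthorRole.USER, content="What are the specials today?")
    )
    async for message in chat.invoke():
        print(f"{message.role}: {message.content}")


asyncio.run(main())
```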
Expected behavior
An AzureAssistantAgent with a kernel function plugin works as part of an AgentGroupChat.
Platform
- OS: Windows
- IDE: VS Code
- Language: Python
- Source: semantic-kernel==1.18.0
Additional context
ERROR:
semantic_kernel.exceptions.service_exceptions.ServiceResponseException: ("<class 'semantic_kernel.connectors.ai.open_ai.services.azure_chat_completion.AzureChatCompletion'> service failed to complete the prompt", BadRequestError('Error code: 400 - {\'error\': {\'message\': "An assistant message with \'tool_calls\' must be followed by tool messages responding to each \'tool_call_id\'. The following tool_call_ids did not have response messages: call_74vVFw3smVjsnsoCwcbrUNaN", \'type\': \'invalid_request_error\', \'param\': \'messages.[3].role\', \'code\': None}}'))
According to this, the tool_call_id should be included in messages with AuthorRole.TOOL. I believe this should be handled by Semantic Kernel.
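To illustrate the pairing the service expects, here is a sketch built from Semantic Kernel's content types (the call id is taken from the error above; the function name and result are illustrative): every assistant message carrying FunctionCallContent must be followed by a TOOL-role message whose FunctionResultContent echoes the same id.

```python
# Sketch of the message pairing the Azure OpenAI chat API requires.
from semantic_kernel.contents import (
    ChatMessageContent,
    FunctionCallContent,
    FunctionResultContent,
)
from semantic_kernel.contents.utils.author_role import AuthorRole

call_id = "call_74vVFw3smVjsnsoCwcbrUNaN"  # id from the 400 error above

# Assistant message announcing the tool call.
assistant_msg = ChatMessageContent(
    role=AuthorRole.ASSISTANT,
    items=[FunctionCallContent(id=call_id, name="menu-get_specials", arguments="{}")],
)

# The follow-up message must answer that call_id with role TOOL;
# if it is missing (as in this bug), the service rejects the request.
tool_msg = ChatMessageContent(
    role=AuthorRole.TOOL,
    items=[FunctionResultContent(id=call_id, name="menu-get_specials", result="Clam chowder")],
)
```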
Part of the stack trace:
...
File "c:\Users\<snip>\Projects\semantic_kernel_agents\.venv\Lib\site-packages\semantic_kernel\agents\group_chat\agent_group_chat.py", line 144, in invoke
async for message in super().invoke_agent(selected_agent):
File "c:\Users\<snip>\Projects\semantic_kernel_agents\.venv\Lib\site-packages\semantic_kernel\agents\group_chat\agent_chat.py", line 144, in invoke_agent
async for is_visible, message in channel.invoke(agent):
File "c:\Users\<snip>\Projects\semantic_kernel_agents\.venv\Lib\site-packages\semantic_kernel\agents\channels\chat_history_channel.py", line 71, in invoke
async for response_message in agent.invoke(self):
File "c:\Users\<snip>\Projects\semantic_kernel_agents\.venv\Lib\site-packages\semantic_kernel\agents\chat_completion\chat_completion_agent.py", line 111, in invoke
messages = await chat_completion_service.get_chat_message_contents(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "c:\Users\<snip>\Projects\semantic_kernel_agents\.venv\Lib\site-packages\semantic_kernel\connectors\ai\chat_completion_client_base.py", line 142, in get_chat_message_contents
return await self._inner_get_chat_message_contents(chat_history, settings)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "c:\Users\<snip>\Projects\semantic_kernel_agents\.venv\Lib\site-packages\semantic_kernel\utils\telemetry\model_diagnostics\decorators.py", line 83, in wrapper_decorator
return await completion_func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "c:\Users\<snip>\Projects\semantic_kernel_agents\.venv\Lib\site-packages\semantic_kernel\connectors\ai\open_ai\services\open_ai_chat_completion_base.py", line 88, in _inner_get_chat_message_contents
response = await self._send_request(settings)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "c:\Users\<snip>\Projects\semantic_kernel_agents\.venv\Lib\site-packages\semantic_kernel\connectors\ai\open_ai\services\open_ai_handler.py", line 59, in _send_request
return await self._send_completion_request(settings)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "c:\Users\<snip>\Projects\semantic_kernel_agents\.venv\Lib\site-packages\semantic_kernel\connectors\ai\open_ai\services\open_ai_handler.py", line 99, in _send_completion_request
raise ServiceResponseException(
semantic_kernel.exceptions.service_exceptions.ServiceResponseException: ("<class 'semantic_kernel.connectors.ai.open_ai.services.azure_chat_completion.AzureChatCompletion'> service failed to complete the prompt", BadRequestError('Error code: 400 - {\'error\': {\'message\': "An assistant message with \'tool_calls\' must be followed by tool messages responding to each \'tool_call_id\'. The following tool_call_ids did not have response messages: call_74vVFw3smVjsnsoCwcbrUNaN", \'type\': \'invalid_request_error\', \'param\': \'messages.[3].role\', \'code\': None}}'))