Does OpenAI's API support the name field? Do we need to change the "role" of other agents' words to "user"? #516
Description
agentscope\src\agentscope\models\openai_model.py
The call function directly sends messages to the OpenAI API:
kwargs.update(
    {
        "model": self.model_name,
        "messages": messages,
        "stream": stream,
    },
)
if stream:
    kwargs["stream_options"] = {"include_usage": True}
response = self.client.chat.completions.create(**kwargs)
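For illustration, this is the kind of payload that ends up in kwargs["messages"] when several named agents have spoken, assuming other agents' turns arrive with role "assistant" as the questions below suggest. The agent names and contents here are invented, not produced by the library:

# Hypothetical messages payload of the shape built by static_format
# (quoted further down); names and contents are invented for illustration only.
messages = [
    {"role": "system", "name": "system", "content": "You are xiaoming."},
    {"role": "assistant", "name": "xiaoming", "content": "Nice to meet you."},
    {"role": "assistant", "name": "xiaohong", "content": "hello"},
]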
The messages built by the static_format function include a "name" field.
I checked and didn't see any documentation from OpenAI stating that it supports the name field.
Do we need to handle this and put the name field into the "content"?
Does this agent think that what other agents (assistants) say was said by the agent itself?
Do we need to change the "role" of other agents' words to "user"?
And should we inform the agent in the system prompt that it is having a conversation with multiple people, and explain the format of the content, such as "name: speech content", e.g. "xiaohong: hello"? (A sketch of this alternative is included after the static_format snippet below.)
Here is the relevant part of static_format:
messages = []
for arg in args:
    if arg is None:
        continue
    if isinstance(arg, Msg):
        if arg.url is not None:
            # Format the message according to the model type
            # (vision/non-vision)
            formatted_msg = OpenAIChatWrapper._format_msg_with_url(
                arg,
                model_name,
            )
            messages.append(formatted_msg)
        else:
            messages.append(
                {
                    "role": arg.role,
                    "name": arg.name,
                    "content": _convert_to_str(arg.content),
                },
            )
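For reference, here is a minimal sketch of the alternative formatting described above. The function and message shapes are hypothetical and not part of agentscope: other agents' turns are mapped to role "user" with the speaker name folded into the content, and a system prompt explains the multi-party setting.

def format_multi_agent(msgs: list[dict], self_name: str) -> list[dict]:
    """Hypothetical sketch: fold speaker names into the content and map
    other agents' turns to the "user" role."""
    system = {
        "role": "system",
        "content": (
            "You are in a conversation with multiple participants. "
            'Each user message is formatted as "name: speech content", '
            'e.g. "xiaohong: hello".'
        ),
    }
    formatted = [system]
    for msg in msgs:
        if msg["name"] == self_name:
            # The agent's own earlier turns stay as "assistant".
            formatted.append({"role": "assistant", "content": msg["content"]})
        else:
            # Everyone else's turns become "user", prefixed with the speaker name.
            formatted.append(
                {"role": "user", "content": f'{msg["name"]}: {msg["content"]}'},
            )
    return formatted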
Edit (one day later):
I just realized that the function ModelWrapperBase.format_for_common_chat_models(*args) in the code below does exactly what I mentioned earlier; it seems I was overthinking it.
if self.model_name.startswith("gpt-"):
    return OpenAIChatWrapper.static_format(
        *args,
        model_name=self.model_name,
    )
else:
    # The OpenAI library maybe re-used to support other models
    return ModelWrapperBase.format_for_common_chat_models(*args)
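To make the branch above concrete, a small standalone illustration of which formatter gets selected (the model names here are examples only):

def selected_formatter(model_name: str) -> str:
    # Mirrors the branch above, for illustration only.
    if model_name.startswith("gpt-"):
        return "OpenAIChatWrapper.static_format"  # keeps the per-message "name" field
    return "ModelWrapperBase.format_for_common_chat_models"  # folds names into the content

print(selected_formatter("gpt-4o"))       # -> OpenAIChatWrapper.static_format
print(selected_formatter("other-model"))  # -> ModelWrapperBase.format_for_common_chat_models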