Manage and store messages in a conversation
messages = messageHistory
messages = messageHistory creates an empty messageHistory object.
{} (default) | cell array of struct
This property is read-only.
Messages in the message history. Each element of the cell array is a structure array corresponding to one message. You can add and remove messages using the object functions. The content of the structure array depends on the type of message.
addSystemMessage — Add system message to message history
addUserMessage — Add user message to message history
addUserMessageWithImages — Add user message with images to message history
addToolMessage — Add tool message to message history
addResponseMessage — Add response message to message history
removeMessage — Remove message from message history
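For example, you can delete an entry from the message history using the removeMessage function. This is a minimal sketch that assumes removeMessage takes the message history and the index of the message to remove, matching the object functions listed above:

```matlab
% Initialize a message history and add two user messages.
messages = messageHistory;
messages = addUserMessage(messages,"First message.");
messages = addUserMessage(messages,"Second message.");

% Remove the first message by index
% (assumed signature: removeMessage(messages,messageIndex)).
messages = removeMessage(messages,1);

% The remaining entry is now the first element of the cell array.
messages.Messages{1}
```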
messages = messageHistory
messages =
messageHistory with properties:
Messages: {}
messages = addSystemMessage(messages,"example_user","Hello, how are you?");
messages = addUserMessage(messages,"I am well, thank you for asking.");
messages.Messages{1}
ans = struct with fields:
role: "system"
name: "example_user"
content: "Hello, how are you?"
messages.Messages{2}
ans = struct with fields:
role: "user"
content: "I am well, thank you for asking."
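The addUserMessageWithImages function works similarly but also attaches image files to the user message, for use with vision-capable models. A sketch, assuming the function takes the message text followed by a string array of image file paths ("peppers.png" here is a placeholder file name, not part of the original example):

```matlab
% Add a user message with an attached image
% (assumed signature: addUserMessageWithImages(messages,content,images)).
messages = messageHistory;
messages = addUserMessageWithImages(messages, ...
    "What is in this image?",["peppers.png"]);
```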
First, specify the OpenAI® API key as an environment variable and save it to a file called ".env". Next, load the environment file using the loadenv function.
loadenv(".env")
Connect to the OpenAI Chat Completion API. Use a system prompt to instruct the model.
model = openAIChat("You are a helpful assistant who judges whether two English words rhyme. You answer either yes or no.");
Initialize the message history.
messages = messageHistory;
Add example messages to the message history. When you pass the message history to the model, this example conversation further instructs the model on the output you want it to generate.
messages = addSystemMessage(messages,"example_user","House and mouse?");
messages = addSystemMessage(messages,"example_assistant","Yes");
messages = addSystemMessage(messages,"example_user","Thought and brought?");
messages = addSystemMessage(messages,"example_assistant","Yes");
messages = addSystemMessage(messages,"example_user","Tough and though?");
messages = addSystemMessage(messages,"example_assistant","No");
Add a user message to the message history. When you pass the message history to the model, the system messages act as an extension of the system prompt, and the user message acts as the prompt.
messages = addUserMessage(messages,"Love and move?");
Generate a response from the message history.
generate(model,messages)
ans = "No"
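To continue the conversation over multiple turns, capture the complete output from generate, record it in the history with addResponseMessage, and then add the next user prompt. A sketch that continues the example above (the follow-up word pair is illustrative):

```matlab
% Record the assistant's reply, then ask a follow-up question.
[text,completeOutput] = generate(model,messages);
messages = addResponseMessage(messages,completeOutput);
messages = addUserMessage(messages,"Through and threw?");
generate(model,messages)
```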
First, specify the OpenAI® API key as an environment variable and save it to a file called ".env". Next, load the environment file using the loadenv function.
loadenv(".env")
Create an openAIFunction object that represents the sind function. The sind function has a single input argument, x, representing the input angle in degrees.
f = openAIFunction("sind","Sine of argument in degrees");
f = addParameter(f,"x",type="number",description="Angle in degrees");
Connect to the OpenAI Chat Completion API. Pass the openAIFunction object f as an input argument.
model = openAIChat("You are a helpful assistant.",Tools=f);
Initialize the message history. Add a user message to the message history.
messages = messageHistory;
messages = addUserMessage(messages,"What is the sine of thirty?");
Generate a response based on the message history.
[~,completeOutput] = generate(model,messages)
completeOutput = struct with fields:
role: 'assistant'
content: []
tool_calls: [1x1 struct]
refusal: []
The model has not generated any text. Instead, it has created a function call, completeOutput.tool_calls.
Add the response to the message history.
messages = addResponseMessage(messages,completeOutput);
Extract the tool call ID and the name of the called function.
toolCallID = string(completeOutput.tool_calls.id)
toolCallID = "call_HW11K1FFmOPun9ouXScMcanR"
functionCalled = string(completeOutput.tool_calls.function.name)
functionCalled = "sind"
Make sure that the model is calling the correct function. Even with only a single function, large language models can hallucinate function calls to fictitious functions.
Extract the input argument values from the complete output using the jsondecode function. Compute the sine of the generated argument value and add the result to the message history using the addToolMessage function.
if functionCalled == "sind"
args = jsondecode(completeOutput.tool_calls.function.arguments);
result = sind(args.x)
messages = addToolMessage(messages,toolCallID,functionCalled,"x="+result);
end
result = 0.5000
Finally, generate a natural language response.
generatedText = generate(model,messages)
generatedText = "The sine of 30 degrees is 0.5."
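In practice, the model does not always respond with a tool call, so checking the complete output before dispatching makes the workflow robust. A sketch of this guard, assuming the same completeOutput structure shown above:

```matlab
% Dispatch only when the model actually requested a tool call.
[text,completeOutput] = generate(model,messages);
if isfield(completeOutput,'tool_calls') && ~isempty(completeOutput.tool_calls)
    % Handle the function call as in the example above.
else
    disp(text)   % The model answered directly in natural language.
end
```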
generate
| openAIChat
| ollamaChat
| azureChat
| addSystemMessage
| addUserMessage
| addUserMessageWithImages
| addToolMessage
| addResponseMessage
- Create Simple Chatbot
- Create Simple Ollama Chatbot
- Describe Images Using ChatGPT
- Analyze Text Data Using Parallel Function Calls with ChatGPT
Copyright 2024 The MathWorks, Inc.