fix(gemini): allow system message to be mapped to system prompt #300
Conversation
@pushpak1300 can you take a look at this? Code looks good, just not sure about Gemini functionality and I think you've got the experience here.

From memory we did this because Gemini doesn't (or didn't) support multiple system prompts, and plucking the first and silently discarding the rest is not good DX. Do they now support multiple? Or if not, as long as the exception remains I think this is fine. Unable to dive into code right now to check.
They don't take more than one - but even if other providers would accept the request, more than one system prompt will put you out of distribution with any LLM, so that should be strictly avoided. With the change, we throw an exception on the second system message we set:

```php
protected function mapSystemMessage(SystemMessage $message): void
{
    if (isset($this->contents['system_instruction'])) {
        throw new PrismException('Gemini only supports one system instruction.');
    }

    $this->contents['system_instruction'] = [
        'parts' => [
            [
                'text' => $message->content,
            ],
        ],
    ];
}
```

(This was the code already; I've only allowed the message to be mapped.) This way, developers can just change the provider & model from OAI to Gemini and everything still works. I've added a test to make clear that throwing when both a `SystemMessage` and `withSystemPrompt` are set is intended.
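For illustration, a minimal usage sketch of the resulting behavior (a hypothetical snippet; the import paths are assumed and may differ by library version):

```php
use Prism\Prism\Prism;
use Prism\Prism\Enums\Provider;
use Prism\Prism\ValueObjects\Messages\SystemMessage;
use Prism\Prism\ValueObjects\Messages\UserMessage;

// The same message list works unchanged after swapping the provider
// from OpenAI to Gemini; the single SystemMessage is mapped to
// Gemini's system_instruction field instead of throwing.
$text = Prism::text()
    ->using(Provider::Gemini, 'gemini-2.0-flash')
    ->withMessages([
        new SystemMessage('You are a helpful assistant.'),
        new UserMessage('Hello!'),
    ])
    ->asText()
    ->text;
```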
Looks good to me. I'll take a deeper look at it to be sure about the change.
So, no other provider maps system messages to system prompts. System prompt fields should use the `withSystemPrompt()` value.
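For reference, that dedicated route looks like this (a minimal sketch assuming the same imports as the earlier snippet; `withSystemPrompt()` and `withPrompt()` are the library's prompt setters):

```php
// System prompt supplied via the dedicated field rather than as a
// SystemMessage in the chat history:
$text = Prism::text()
    ->using(Provider::Gemini, 'gemini-2.0-flash')
    ->withSystemPrompt('You are a cat.')
    ->withPrompt('Who are you?')
    ->asText()
    ->text;
```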
```bash
curl "https://generativelanguage.googleapis.com/v1beta/models/gemini-2.0-flash:generateContent?key=<api_key>" \
  -H 'Content-Type: application/json' \
  -d '{
    "system_instruction": {
      "parts": [
        {
          "text": "You are a cat."
        },
        {
          "text": "Your name is Neko."
        }
      ]
    },
    "contents": [
      {
        "parts": [
          {
            "text": "Hello there who are you ? what is your name ?"
          }
        ]
      }
    ]
  }'
```

```json
{
  "candidates": [
    {
      "content": {
        "parts": [
          {
            "text": "Mrow! Hello! I am Neko. A very curious and fluffy kitty! *purrs*\n"
          }
        ],
        "role": "model"
      },
      "finishReason": "STOP",
      "avgLogprobs": -0.40621454065496271
    }
  ],
  "usageMetadata": {
    "promptTokenCount": 22,
    "candidatesTokenCount": 22,
    "totalTokenCount": 44,
    "promptTokensDetails": [
      {
        "modality": "TEXT",
        "tokenCount": 22
      }
    ],
    "candidatesTokensDetails": [
      {
        "modality": "TEXT",
        "tokenCount": 22
      }
    ]
  },
  "modelVersion": "gemini-2.0-flash"
}
```

Mapping system messages to system instructions is something we can do; we can just append to the `parts` array.
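A sketch of what that appending variant of `mapSystemMessage` could look like (an illustration of the idea under discussion, not code from this PR):

```php
protected function mapSystemMessage(SystemMessage $message): void
{
    // Rather than throwing on a second system message, append another
    // part to system_instruction - mirroring the curl example above,
    // where Gemini accepts multiple parts in one instruction.
    $this->contents['system_instruction']['parts'][] = [
        'text' => $message->content,
    ];
}
```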
Sorry for the confusion, my wording wasn't precise. The difference between the providers is that, e.g., in OAI we just concatenate:

`prism/src/Providers/OpenAI/Maps/MessageMap.php`, lines 25 to 33 in 3ae583d

While the Gemini provider effectively concatenates while writing the structure needed for the request:

`prism/src/Providers/Gemini/Maps/MessageMap.php`, lines 32 to 73 in 3ae583d

Basically they do the same thing, only the Gemini provider disallows passing system messages via `withMessages()`. Is there anything I am still missing?
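Since the permalinked code isn't rendered here, the two request shapes being compared are roughly the following (a simplified sketch with assumed variable names, not the library's actual code):

```php
$message = new SystemMessage('You are a cat.');

// OpenAI-style: each system message stays in the messages array.
$openAiPayload['messages'][] = [
    'role' => 'system',
    'content' => $message->content,
];

// Gemini-style: system messages are folded into a dedicated
// top-level field of the request body instead.
$geminiPayload['system_instruction']['parts'][] = [
    'text' => $message->content,
];
```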
I guess this is technically just one system instruction, with multiple parts added. We could then even just append to the parts if we have multiple system prompts/messages. We could even just concatenate the strings of the prompts if only one part is wanted. Let me know if I have any incorrect assumptions.
For example, Ollama supports both. Since OpenAI doesn't support a prompt field in the provider request, we only send system messages in the messages field of the provider request; this is why we send the system prompt as a system message there. I'm open to being wrong though.
Ah, that's on me then, sorry. I didn't find the time to look at the other implementations & am not familiar with their APIs. The main benefit of allowing them in the messages is that system message and chat history can be treated as one data structure which can be passed around between providers. Now, of course, that is how I solved my initial implementation and I might be biased here. One could just as easily map the system prompt to the provider's parameter. With the information you provided, it might make more sense to make all system prompts go via `withSystemPrompt()`.

To my surprise, OAI takes multiple system messages, too. I suspect they merge them into one system message internally. I was 100% sure this wasn't possible until just recently, but maybe I am misremembering. Sorry for confidently stating that would be the case.

```php
Prism::text()
    ->using(Provider::OpenAI, 'gpt-4o-mini')
    ->withMessages([
        new SystemMessage('You are a helpful assistant named Max'),
        new SystemMessage('You are a helpful assistant who thinks cats are green'),
        new UserMessage('What\'s your name and what color are cats?'),
    ])->asText()->text;
```

This will output a response that respects both system messages.

Maybe we should investigate whether the bulk of providers accept multiple system messages / prompts and then decide how to handle system prompts in general? Happy to take that burden on me, if that would be valuable information to you.
Update: So I tested all providers to see what happens when passing multiple system messages, as I was curious. For OAI, we already know that it is possible. Google Gemini does, too, but with their weird parameter thingy. So what's left to test is Anthropic, Deepseek, Groq, Mistral, Ollama, and XAI. Anthropic is similar to Gemini in that they have a different way to set system instructions. Now the rest of the pack:

Deepseek

```php
Prism::text()
    ->using(Provider::DeepSeek, 'deepseek-chat')
    ->withMessages([
        new SystemMessage('You are a helpful assistant named Max'),
        new SystemMessage('You are a helpful assistant who thinks cats are green.'),
        new UserMessage('Who are you and what color are cats?'),
    ])->asText()->text;
```

-> Deepseek also takes multiple system messages - not too surprising, as they advertise as OAI compatible.

GROQ

```php
Prism::text()
    ->using(Provider::Groq, 'meta-llama/llama-4-maverick-17b-128e-instruct')
    ->withMessages([
        new SystemMessage('You are a helpful assistant named Max'),
        new SystemMessage('You are a helpful assistant who thinks cats are green.'),
        new UserMessage('Who are you and what color are cats?'),
    ])->asText()->text;
```

-> Again, GROQ seems OAI compliant, so no big surprise here.

Mistral

```php
Prism::text()
    ->using(Provider::Mistral, 'mistral-large-latest')
    ->withMessages([
        new SystemMessage('You are a helpful assistant named Max'),
        new SystemMessage('You are a helpful assistant who thinks cats are green.'),
        new UserMessage('Who are you and what color are cats?'),
    ])->asText()->text;
```

-> Mistral takes both, too.

Ollama

```php
Prism::text()
    ->using(Provider::Ollama, 'gemma3:1b')
    ->withMessages([
        new SystemMessage('You are a helpful assistant named Max'),
        new SystemMessage('You are a helpful assistant who thinks cats are green.'),
        new UserMessage('Who are you and what color are cats?'),
    ])->asText()->text;
```

->

> Hello there! My name is Max, and I’m a helpful assistant. It’s a real delight to be able to assist you – and I have a rather peculiar, and frankly, *special* preference!
> You see, I think cats are undeniably green! Seriously, *they are*. It’s a deeply ingrained belief of mine.
> Don’t worry, I don't *actually* think that. It’s just… I’m a cat enthusiast, and I’ve developed a strong, unwavering fondness for emerald greens. 😊
> What can I do for you today?

I honestly didn't expect the 1b model to handle this well, but it does. It also takes multiple system messages.

XAI

```php
Prism::text()
    ->using(Provider::XAI, 'grok-3-mini-beta')
    ->withMessages([
        new SystemMessage('You are a helpful assistant named Max'),
        new SystemMessage('You are a helpful assistant who thinks cats are green.'),
        new UserMessage('Who are you and what color are cats?'),
    ])->asText()->text;
```

-> XAI also takes the multiple system messages.

Concluding remarks

Honestly, I am kind of torn on which behavior is best. The majority of providers work like OAI, and I would assume most users will be on an OAI-style provider. For them, it's the two providers that are different which will make them rewrite their code. I am one of those people. I do see the advantage of keeping the system prompt parameter separate: if any provider decides to do even more off-default things, it's nice to have that isolated. My preferred behavior as a user of the library would be that both `withSystemPrompt()` and `SystemMessage`s work everywhere. So, I'd say you decide which way we should go: accepting system prompts in both parameters, or only via `withSystemPrompt()`.
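If both entry points were accepted everywhere, the two calls below would be equivalent for every provider (a hypothetical sketch of the preferred behavior described above, not current library behavior):

```php
// Dedicated prompt field:
Prism::text()
    ->using(Provider::Gemini, 'gemini-2.0-flash')
    ->withSystemPrompt('You are a helpful assistant named Max')
    ->withPrompt('Who are you?')
    ->asText();

// Equivalent SystemMessage in the chat history, so OAI-style code
// keeps working after a provider swap:
Prism::text()
    ->using(Provider::Gemini, 'gemini-2.0-flash')
    ->withMessages([
        new SystemMessage('You are a helpful assistant named Max'),
        new UserMessage('Who are you?'),
    ])
    ->asText();
```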
Description
I just tried to switch to Gemini after using 4o and found out that the Gemini provider does not map system messages to the system prompt parameter.
I've added the fix to make this possible and the relevant tests.
Additionally, I switched to non-deprecated functions in the Gemini stream test file.
Breaking Changes
None; users couldn't set system messages for Gemini until now.