Summary
`generateTitle` silently falls back to `"Conversation"` when the agent model (or the `generateTitle.model` override) is a reasoning model such as `gpt-5-mini`, `o1-mini`, or `o3-mini`. The failure is swallowed by a try/catch that logs only at DEBUG level, making it very hard to diagnose.
Root Cause
`createConversationTitleGenerator` in `agent.ts` calls `generateText` with `temperature: 0` unconditionally:

```ts
const result = await context.traceContext.withSpan(llmSpan, () =>
  generateText({
    model: resolvedModel,
    messages,
    temperature: 0, // reasoning models reject this parameter
    maxOutputTokens,
  }),
);
```
Reasoning models (OpenAI o-series, `gpt-5-mini`, etc.) do not accept the `temperature` parameter. The AI SDK emits a warning, and the call either errors or returns no usable output. The surrounding try/catch swallows this silently:
```ts
} catch (error) {
  context.logger.debug("[Memory] Failed to generate conversation title", {
    error: safeStringify(error),
  });
}
return fallbackTitle; // always "Conversation"
```
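One possible fix is to detect reasoning models from the model ID and drop `temperature` before calling `generateText`. A minimal sketch, assuming ID-pattern matching is acceptable (`isReasoningModel` and `buildTitleOptions` are hypothetical helpers, not part of the current codebase, and the patterns below only cover the families named in this report):

```ts
// Hypothetical helper (assumption): classify reasoning models by ID pattern.
// Covers the o1/o3 and gpt-5 families mentioned in this report; real code
// should consult the provider's capability metadata where available.
function isReasoningModel(modelId: string): boolean {
  // Strip a "provider/" prefix such as "openai/" if present.
  const id = modelId.split("/").pop() ?? modelId;
  return /^(o[13](-|$)|gpt-5)/.test(id);
}

// Build the generateText options, omitting `temperature` for reasoning models.
function buildTitleOptions(modelId: string): {
  maxOutputTokens: number;
  temperature?: number;
} {
  const base = { maxOutputTokens: 24 };
  return isReasoningModel(modelId) ? base : { ...base, temperature: 0 };
}
```

With a guard like this, the call site could spread `...buildTitleOptions(...)` into the `generateText` options instead of hard-coding `temperature: 0`.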
Steps to Reproduce
- Create an agent with a reasoning model, e.g. `model: "openai/gpt-5-mini"`
- Configure `generateTitle: true` on the `Memory` instance
- Start a new conversation
- Observe: the conversation title is `"Conversation"` instead of an AI-generated title
- In the logs: `AI SDK Warning: The feature "temperature" is not supported for reasoning models`
Expected Behavior
- `generateTitle` should detect reasoning models and omit unsupported parameters (`temperature`)
- Alternatively, the error should be surfaced at `warn` level so users know why title generation failed
- The docs should note that reasoning models are not supported for `generateTitle`
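The second point could be addressed independently of the parameter fix. A sketch of surfacing the failure at `warn` level while keeping the fallback behavior (the `Logger` shape and `titleOrFallback` wrapper here are assumptions modeled on the snippet above, not the actual implementation):

```ts
// Assumed minimal logger interface, modeled on context.logger above.
type Logger = { warn: (msg: string, meta?: unknown) => void };

// Hypothetical wrapper: always return a usable title, but log loudly on failure.
async function titleOrFallback(
  generate: () => Promise<string>,
  logger: Logger,
  fallback = "Conversation",
): Promise<string> {
  try {
    return await generate();
  } catch (error) {
    logger.warn("[Memory] Failed to generate conversation title", {
      error: String(error),
    });
    return fallback;
  }
}
```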
Workaround
Explicitly override the model in the `generateTitle` config to use a non-reasoning model:

```ts
new Memory({
  storage: adapter,
  generateTitle: {
    enabled: true,
    model: "openai/gpt-4o-mini", // must be a non-reasoning model
    systemPrompt: "Generate a short title (max 6 words).",
    maxLength: 60,
    maxOutputTokens: 24,
  },
});
```
Environment
- `@voltagent/core` latest
- OpenAI provider, model `gpt-5-mini` (reasoning model)