
fix: prevent AttributeError in parse_vision_messages when llm is None#5023

Open
toller892 wants to merge 1 commit into mem0ai:main from toller892:fix/parse-vision-messages-null-llm

Conversation

@toller892

Problem

parse_vision_messages() defaults llm=None (vision disabled), but unconditionally calls get_image_description() → llm.generate_response() for list-typed and image_url-dict content. This crashes with:

AttributeError: 'NoneType' object has no attribute 'generate_response'

Triggered by any standard multimodal message format (as produced by OpenAI SDK, LangChain, etc.) when enable_vision=False (the default).
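For reference, these are the two standard multimodal content shapes (as in the OpenAI chat-completions format) that hit the broken code paths; the URL below is a placeholder:

```python
# Shape 1: list-typed content that contains no image at all --
# previously still routed into vision processing and crashed.
text_only = {
    "role": "user",
    "content": [{"type": "text", "text": "Hi"}],
}

# Shape 2: list-typed content with an image_url item -- the case
# vision processing is actually meant for.
with_image = {
    "role": "user",
    "content": [
        {"type": "text", "text": "Describe this"},
        {"type": "image_url", "image_url": {"url": "https://example.com/cat.png"}},
    ],
}
```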

Repro

from mem0.memory.utils import parse_vision_messages

messages = [{"role": "user", "content": [{"type": "text", "text": "Hi"}]}]
parse_vision_messages(messages)  # AttributeError

Fix

  • Fast path: return messages unchanged when llm is None — no vision processing needed
  • List content guard: for list-typed content, check if items actually contain image_url before calling vision processing; pass through text-only lists unchanged
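The two guards above can be sketched roughly as follows. This is a minimal illustration, not the actual patch: `_describe_images` is a hypothetical placeholder for the real vision path in mem0.memory.utils, and the signature is assumed.

```python
def _describe_images(msg, llm, vision_details):
    # Placeholder for the real vision processing (image description
    # via llm.generate_response); returns the message unchanged here.
    return msg

def parse_vision_messages(messages, llm=None, vision_details="auto"):
    # Fast path: vision disabled, so return messages unchanged
    # instead of dereferencing a None llm.
    if llm is None:
        return messages

    parsed = []
    for msg in messages:
        content = msg.get("content")
        # List content guard: only call vision processing when the list
        # actually contains an image_url item; text-only lists pass through.
        if isinstance(content, list) and any(
            isinstance(item, dict) and item.get("type") == "image_url"
            for item in content
        ):
            msg = _describe_images(msg, llm, vision_details)
        parsed.append(msg)
    return parsed
```

With this shape, the repro above returns its input unchanged instead of raising AttributeError.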

Related

Fixes #4799

@CLAassistant

CLAassistant commented Apr 29, 2026

CLA assistant check
All committers have signed the CLA.

parse_vision_messages() defaults llm=None (vision disabled), but
unconditionally calls get_image_description() → llm.generate_response()
for list-typed and image_url-dict content. This crashes with
AttributeError: 'NoneType' object has no attribute 'generate_response'
when any message uses the standard multimodal content format.

Fix:
- Fast-path return when llm is None (no vision processing needed)
- For list-typed content, check if items actually contain image_url
  before calling vision processing; pass through text-only lists

Fixes mem0ai#4799
toller892 force-pushed the fix/parse-vision-messages-null-llm branch from dd8f9b5 to 345c2e8 on April 30, 2026 at 08:01


Development

Successfully merging this pull request may close these issues.

fix: AttributeError crash in Memory.add() when message content is a list and vision is disabled
