fix(ai-chat): add null check to model name #645
hestela wants to merge 1 commit into Crosstalk-Solutions:dev from
Conversation
When the OpenAI-compatible fallback (/v1/models) is used, models are mapped as { name: m.id, size: 0 } with no details field. Accessing model.details.parameter_size throws TypeError: Cannot read properties of undefined, which crashes the React render and causes the entire page to go blank.
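For reference, a minimal sketch of the failing access and the one-character optional-chaining fix. The interface and variable names here are illustrative assumptions; only model.details.parameter_size and the fallback shape { name: m.id, size: 0 } come from the diff:

```typescript
// Model entries from the native Ollama API include a details object;
// the OpenAI-compatible fallback (/v1/models) omits it entirely.
interface ModelInfo {
  name: string;
  size: number;
  details?: { parameter_size: string }; // absent in the fallback shape
}

// Fallback mapping: { name: m.id, size: 0 }, no details field.
const model: ModelInfo = { name: "my-remote-model", size: 0 };

// Before: model.details.parameter_size throws
// "TypeError: Cannot read properties of undefined" and blanks the page.

// After: optional chaining short-circuits to undefined instead of throwing.
const paramSize = model.details?.parameter_size;

// The render can then fall back to a placeholder string.
console.log(paramSize ?? "unknown");
```

With optional chaining, a model lacking details renders a placeholder instead of crashing the whole React tree.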
This seems to be new code from #612. I found this with LM Studio as well as a custom OpenAI API interface.
chriscrosstalk
left a comment
Reviewed the diff -- clean one-character fix for a real crash. When using the remote OpenAI-compatible fallback (/v1/models), models come back without a details field, so model.details.parameter_size throws a TypeError and blanks the entire AI settings page. Optional chaining is the right fix.
This is a regression from PR #612 (Installed Models section) that shipped in v1.31.0. Recommend including in a v1.31.1 hotfix.
chriscrosstalk
left a comment
Textbook hotfix. One-character optional-chaining addition that prevents a full-page crash when any remote OpenAI-compatible backend returns models without a details field. Zero regression risk — optional chaining is strictly safer than the current unchecked access.
This also directly addresses the crash symptom I diagnosed in #679 (empty Models & Settings page with remote LLM). It won't make unsupported backends like LM Studio fully work, but it restores the page and prevents the blank-screen experience for anyone using a remote Ollama host too.
Strong v1.31.1 candidate. Approving.