Description
Reproduction
https://stackblitz.com/edit/github-2kri2mpb?file=app%2Froutes%2Fdemo.tsx
You'll have to copy this demo and run it locally to view the document contents in the network tab, since that wasn't possible in StackBlitz.
You'll see that the loading fallback is rendered in the HTML even though I'm forcing the isbot flag to true; this worked differently before React 19.2.
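For context, this is roughly the bot branch in the server entry (a minimal sketch, close to the default framework-mode entry.server.tsx rather than the exact repro code; in the repro the isbot check is simply hardcoded to true):

```tsx
// entry.server.tsx (abridged sketch of a typical framework-mode server entry)
import { PassThrough } from "node:stream";
import { createReadableStreamFromReadable } from "@react-router/node";
import { ServerRouter, type EntryContext } from "react-router";
import { isbot } from "isbot";
import { renderToPipeableStream } from "react-dom/server";

export default function handleRequest(
  request: Request,
  responseStatusCode: number,
  responseHeaders: Headers,
  routerContext: EntryContext
) {
  return new Promise<Response>((resolve, reject) => {
    // Hardcoded to `true` in the repro to simulate a non-JS crawler.
    const userIsBot = isbot(request.headers.get("user-agent") || "");

    const { pipe, abort } = renderToPipeableStream(
      <ServerRouter context={routerContext} url={request.url} />,
      {
        // Bots wait for the whole document; browsers get the streamed shell.
        [userIsBot ? "onAllReady" : "onShellReady"]() {
          const body = new PassThrough();
          const stream = createReadableStreamFromReadable(body);
          responseHeaders.set("Content-Type", "text/html");
          resolve(
            new Response(stream, {
              headers: responseHeaders,
              status: responseStatusCode,
            })
          );
          pipe(body);
        },
        onError(error: unknown) {
          reject(error);
        },
      }
    );

    setTimeout(abort, 5_000);
  });
}
```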
System Info
OS: macOS (any)
Node: 20.x
React: 19.2.x
React DOM: 19.2.x
React Router: v7.x (framework mode, streaming SSR)
Browser: Chrome (behavior also affects non-JS crawlers and bots)
User Agents Affected:
LLM crawlers / AI bots (no JS execution)
Link preview bots
Internal indexing bots
Used Package Manager
pnpm
Expected Behavior
When React Router detects a bot user agent and switches its rendering path to onAllReady, the generated HTML should contain all resolved loader data, matching pre–React 19.2 behavior.
Specifically:
All loader promises are awaited
The final HTML contains fully rendered route content
No client-side JavaScript execution is required to see resolved data
This behavior is important for crawlers and bots that do not execute JavaScript or process streamed chunks.
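To make the expectation concrete, here is a hypothetical route in the spirit of the repro (names are illustrative, not the exact demo code): the loader returns an unresolved promise and the component renders it through Suspense/Await. With onAllReady, a bot should receive HTML containing the resolved value rather than the fallback.

```tsx
// app/routes/demo.tsx (illustrative sketch, not the exact repro)
import { Suspense } from "react";
import { Await, useLoaderData } from "react-router";

export function loader() {
  // Returned unawaited, so React Router streams it and React suspends on it.
  const slowData = new Promise<string>((resolve) =>
    setTimeout(() => resolve("resolved loader data"), 1000)
  );
  return { slowData };
}

export default function Demo() {
  const { slowData } = useLoaderData<typeof loader>();
  return (
    <Suspense fallback={<p>Loading…</p>}>
      <Await resolve={slowData}>{(value) => <p>{value}</p>}</Await>
    </Suspense>
  );
}
```

Before React 19.2, fetching this kind of route with a bot user agent produced the resolved paragraph in the served document; that is the behavior expected here.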
Actual Behavior
After upgrading to React 19.2, SSR HTML generated by React Router v7 no longer includes all resolved loader data, even when:
A bot user agent is detected
The server waits for all loaders to resolve
Instead:
The initial HTML still contains Suspense fallbacks or incomplete content
Resolved data only appears after client-side hydration and streamed updates
Crawlers that do not execute JavaScript observe incomplete or empty content
Setting progressiveChunkSize: Infinity in the server entry restores the previous behavior, but this feels like a React-internal workaround and changes React Router’s default SSR semantics.
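For reference, the workaround is just an extra option on the existing renderToPipeableStream call in entry.server.tsx (sketch only; progressiveChunkSize is a low-level React option, not something React Router documents or defaults):

```tsx
const { pipe, abort } = renderToPipeableStream(
  <ServerRouter context={routerContext} url={request.url} />,
  {
    // Workaround: an effectively unbounded chunk size appears to make React
    // 19.2 inline resolved Suspense content in place instead of emitting
    // fallbacks plus out-of-order replacement chunks.
    progressiveChunkSize: Infinity,
    onAllReady() {
      // ...resolve the Response and pipe, as in the entry sketch above...
    },
    onError(error: unknown) {
      console.error(error);
    },
  }
);
```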