Description
Environment
- node: 20.15.1
- "@sentry/vite-plugin": "^3.2.0",
- "vite": "^6.1.1",
Steps to Reproduce
I know there are already a few issues about this, but the one with the most information is locked and I can't upload any memory profiles to it.
Context:
- We are making the switch from webpack -> vite.
- ~13,000 frontend files
Scenario 1 (webpack w/ Sentry plugin):
- Everything works great. Builds, uploads, all good.
Scenario 2 (Vite, no sentry plugin installed):
- Again, all great, no build issues
Scenario 3 (Vite, Sentry plugin installed but disabled):
- Yet again, all great, no build issues
Scenario 4 (Vite, Sentry plugin enabled):
- The module transform completes, but chunk rendering stalls at the same place every time.
- The build does succeed if I bump max_old_space_size to 8192.
We can live with the max_old_space_size bump for now, but I would like to understand what the plugin is doing that causes the OOM. I was under the impression it just uploads the source maps to Sentry. If there are other "enhancements" enabled by default, I would like to (a) know about them and (b) disable all of them to see whether the build succeeds without the larger heap (a stripped-down variant I intend to test is sketched after the config below).
Vite Config:
plugins: [
  tsConfigPaths(),
  react(),
  sentryVitePlugin({
    org: 'REDACTED',
    project: 'REDACTED',
    authToken: process.env.SENTRY_AUTH_TOKEN,
    disable: !process.env.SENTRY_AUTH_TOKEN, // Redundant, as the plugin automatically disables itself when no authToken is provided
    debug: true,
    telemetry: false,
    errorHandler: err => {
      console.warn(err);
    },
    sourcemaps: {
      filesToDeleteAfterUpload: ['./dist/*.map'],
    },
    release: {
      name: `dashboard-frontend@${process.env.APP_VERSION}`,
    },
  }),
],
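For reference, this is the stripped-down plugin config I plan to try next, keeping only the source map upload. It is only a sketch based on my reading of the plugin's option docs; release.inject, release.create, and release.finalize are assumptions about what ^3.2.0 lets me switch off, not options I have verified against this exact version.

sentryVitePlugin({
  org: 'REDACTED',
  project: 'REDACTED',
  authToken: process.env.SENTRY_AUTH_TOKEN,
  disable: !process.env.SENTRY_AUTH_TOKEN,
  debug: true,
  telemetry: false,
  errorHandler: err => {
    console.warn(err);
  },
  // sourcemaps.filesToDeleteAfterUpload is intentionally omitted here; the
  // .map files can be cleaned up in a separate build step instead.
  release: {
    name: `dashboard-frontend@${process.env.APP_VERSION}`,
    // Assumed flags (see note above): keep the artifact upload, but turn off
    // release injection into the bundles and release creation/finalization.
    inject: false,
    create: false,
    finalize: false,
  },
}),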
Logs:
vite v6.1.1 building for production...
✓ 13249 modules transformed.
rendering chunks (136)...
<--- Last few GCs --->
[33540:0x140008000] 79113 ms: Scavenge (reduce) 4052.3 (4142.9) -> 4052.1 (4143.4) MB, 1876.67 / 0.00 ms (average mu = 0.506, current mu = 0.514) allocation failure;
[33540:0x140008000] 83672 ms: Mark-Compact (reduce) 4052.7 (4143.4) -> 4052.1 (4144.2) MB, 4548.21 / 0.00 ms (average mu = 0.393, current mu = 0.324) allocation failure; scavenge might not succeed
<--- JS stacktrace --->
FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory
----- Native stack trace -----
Expected Result
The plugin only uploads the source maps, and the build completes with the same memory footprint as it does without the plugin.
Actual Result
Something the plugin does during the build causes an OOM error while rendering chunks.