Conversation
/crush_fast AI review started.
Advanced AI Review
Code Review: Enhanced protection from memory leaks

Issue 1: SharedArrayBuffer check logic is inverted
🔴 High importance / High confidence
Location:
```ts
if (ArrayBuffer.isView(value) && !(typeof SharedArrayBuffer !== 'function' || value.buffer instanceof SharedArrayBuffer)) {
```
The logic for excluding SharedArrayBuffer-backed views is inverted. When `SharedArrayBuffer` is not available, `typeof SharedArrayBuffer !== 'function'` evaluates to `true`. This means the entire condition becomes `false`, so no ArrayBuffer views are processed at all on such runtimes.
Suggested fix:
```ts
if (ArrayBuffer.isView(value) && !(typeof SharedArrayBuffer === 'function' && value.buffer instanceof SharedArrayBuffer)) {
```
This correctly excludes only SharedArrayBuffer-backed views when SharedArrayBuffer is available, while processing all other ArrayBuffer views normally.

Issue 2: Revokables added in async callback are never returned
🔴 High importance / High confidence
Location:
```ts
mc.port1.onmessage = async (ev) => {
  try {
    try {
      const result = await value(...deserializer(ev.data[1]) as unknown[])
      const { data, transferables, revokables: rr } = serializer(result)
      ev.data[0].postMessage([true, data], transferables)
      rr.forEach(r => revokables.add(r)) // <-- This line
    } catch (e) {
      const { data, transferables, revokables: rr } = serializer(e, true)
      ev.data[0].postMessage([false, data], transferables)
      rr.forEach(r => revokables.add(r)) // <-- This line
    }
  } catch (e) {
    console.error('Async error on onmessage handler', e)
  }
}
```
The revokables collected inside this async handler are added to the set only after `serializer` has already returned its `revokables` array to the caller, so these entries are never surfaced for cleanup. This defeats the purpose of tracking these revokables for cleanup.
Suggested fix: This is architecturally challenging because the function call happens asynchronously. Consider:
Issue 3: Revokables ignored in deserializer's function handler
🟡 Medium importance / High confidence
Location:
```ts
const { data, transferables } = serializer(args)
```
When serializing function arguments in the deserializer's function handler, the `revokables` returned by `serializer` are silently discarded.
Suggested fix:
```ts
const { data, transferables, revokables } = serializer(args)
// Either track these revokables somewhere for cleanup, or close them
// after the response is received
```
Note that this has the same fundamental timing issue as Issue 2 - knowing when to revoke requires knowing when the remote side is done with the ports.

Issue 4: Mutation of Error.cause during serialization
🟡 Medium importance / High confidence
Location:
```ts
if (value.cause) {
  value.cause = serializer(value.cause, true).data
}
```
This mutates the original Error object's `cause` property: after serialization, the caller's Error is left holding the serialized representation rather than the original cause.
Suggested fix: Create a copy of the cause rather than mutating the original:
```ts
if (value.cause) {
  const serializedCause = serializer(value.cause, true).data
  return rawResult(rawResultSet, ['_', '_err', rawResult(rawResultSet, ['_', '_ref', pos]), value.name, serializedCause])
}
```
Then deserialize it from the array element rather than from the Error object itself.

Issue 5: Memory leaks acknowledged but not fully addressed
⚪ Lower importance
Location:
```ts
// TODO FIXME: This could still leave some ports open on functions
// that return functions that return functions

// TODO FIXME: Potential memory leak if any of the args is a
// function that gets called. This level of nesting isn't
// that common. A more comprehensive solution should nevertheless
// be devised.
```
The PR adds

Summary
The most critical issues are:
The memory leak improvements in this PR appear to be incomplete and may give a false sense of security while introducing new bugs.
Review generated using
/crush_fast AI review started.
Advanced AI Review
Code Review: Enhanced protection from memory leaks

Issue 1: Typo in error message
⚪️
```ts
rcvPort.onmessageerror = () => {
  reject(new Error('Messge error'))
  rcvPort.close()
}
```
Suggested fix:
```ts
rcvPort.onmessageerror = () => {
  reject(new Error('Message error'))
  rcvPort.close()
}
```

Issue 2: Typo in comment
⚪️
```ts
// Automatric destruction (handled on the deserializing end):
// When a returned proxy function goes out of scope, close its associated port
```
Suggested fix:
```ts
// Automatic destruction (handled on the deserializing end):
// When a returned proxy function goes out of scope, close its associated port
```

Issue 3: Potential memory leak - missing try-catch around postMessage
🟡
```ts
const fn = (...args: unknown[]) => {
  return new Promise((resolve, reject) => {
    const mc = new MessageChannel()
    const { data, transferables } = serializer(args)
    const rcvPort = mc.port1
    const sendingPort = mc.port2
    rcvPort.onmessage = (ev) => {
      if (ev.data[0]) {
        resolve(deserializer(ev.data[1]))
      } else {
        reject(deserializer(ev.data[1]))
      }
      rcvPort.close()
    }
    rcvPort.onmessageerror = () => {
      reject(new Error('Messge error'))
      rcvPort.close()
    }
    mp.postMessage([sendingPort, data], [sendingPort, ...transferables])
  })
}
```
If `postMessage` throws synchronously (for example, when a transferable has already been detached), the promise never settles and `rcvPort` is never closed.
Suggested fix:
```ts
const fn = (...args: unknown[]) => {
  return new Promise((resolve, reject) => {
    const mc = new MessageChannel()
    const { data, transferables } = serializer(args)
    const rcvPort = mc.port1
    const sendingPort = mc.port2
    rcvPort.onmessage = (ev) => {
      if (ev.data[0]) {
        resolve(deserializer(ev.data[1]))
      } else {
        reject(deserializer(ev.data[1]))
      }
      rcvPort.close()
    }
    rcvPort.onmessageerror = () => {
      reject(new Error('Message error'))
      rcvPort.close()
    }
    try {
      mp.postMessage([sendingPort, data], [sendingPort, ...transferables])
    } catch (e) {
      rcvPort.close()
      reject(e)
    }
  })
}
```

Issue 4: SharedArrayBuffer detection logic appears inverted
🟡
```ts
if (ArrayBuffer.isView(value) && !(typeof SharedArrayBuffer !== 'function' && value.buffer instanceof SharedArrayBuffer)) {
```
Breaking this down: the negated inner expression, `typeof SharedArrayBuffer !== 'function' && value.buffer instanceof SharedArrayBuffer`, means the condition passes when SharedArrayBuffer exists OR when the buffer is not a SharedArrayBuffer. The intent seems to be to exclude SharedArrayBuffer from transferables (since you cannot transfer them), but this logic doesn't achieve that.
Suggested fix:
```ts
if (ArrayBuffer.isView(value) && !(typeof SharedArrayBuffer === 'function' && value.buffer instanceof SharedArrayBuffer)) {
```
This correctly excludes the value when SharedArrayBuffer is available AND the buffer is a SharedArrayBuffer.

Issue 5: Test doesn't verify memory leak fix
⚪️ Additionally, line 29 has a fire-and-forget promise that could cause race conditions in tests:
```ts
callbackFactory().then(callback => callback(() => 'foo'))
return callbackFactory().then(callback => callback(() => 'bar'))
```
The first line is not awaited, meaning if the test framework completes before this promise resolves, results could be incomplete. The test happens to work because
Suggested improvement for a proper memory leak test:
```ts
it('should not leak memory', async () => {
  const weakRefs: WeakRef<object>[] = []
  const serialized = _.serializer((callbackFactory: () => Promise<(x: () => string) => Promise<void>>) => {
    callbackFactory().then(callback => callback(() => 'foo'))
    return callbackFactory().then(callback => callback(() => 'bar'))
  })
  const fn = _.deserializer(serialized.data) as (callback: () => (valueProvider: () => Promise<string>) => void) => void
  const results: [string, string][] = []
  weakRefs.push(new WeakRef(fn))
  await Promise.all([
    fn(() => (valueProvider) => valueProvider().then(value => results.push(['cb1', value]))),
    fn(() => (valueProvider) => valueProvider().then(value => results.push(['cb2', value])))
  ])
  assert.ok(results.length === 4)
  // Close revokables to allow cleanup
  serialized.revokables.forEach(port => port.close())
  // Force garbage collection if available
  if (global.gc) {
    global.gc()
    global.gc() // Call twice to ensure cleanup
  }
  // Verify the function can be garbage collected
  // Note: This is a heuristic - GC is not deterministic
  const collected = weakRefs.some(ref => ref.deref() === undefined)
  // In practice, we can't assert this reliably, but the test documents the expected behavior
})
```

Issue 6: Message data accessed without validation (security consideration)
⚪️
```ts
mc.port1.onmessage = async (ev) => {
  try {
    try {
      const result = await value(...deserializer(ev.data[1]) as unknown[])
      const { data, transferables } = serializer(result)
      ev.data[0].postMessage([true, data], transferables)
    } catch (e) {
```
Suggested improvement:
```ts
mc.port1.onmessage = async (ev) => {
  try {
    if (!Array.isArray(ev.data) || ev.data.length < 2) {
      console.error('Invalid message format')
      return
    }
    const [responsePort, argsData] = ev.data
    if (!(responsePort instanceof MessagePort)) {
      console.error('Invalid response port')
      return
    }
    try {
      const result = await value(...deserializer(argsData) as unknown[])
      const { data, transferables } = serializer(result)
      responsePort.postMessage([true, data], transferables)
    } catch (e) {
```
Summary
The most important issues to address are:
Review generated using
/crush_fast {{openrouter, google/gemini-3.1-pro-preview, google/gemini-3-flash-preview}} AI review started.
Advanced AI Review
Here is the review of the changes, focusing strictly on bugs, memory leak scenarios, and robust improvements.
1. Destructive mutation of application data (
/crush_fast {{openrouter, openai/gpt-5.3-codex, google/gemini-3-flash-preview}} AI review started.
Advanced AI Review
1) 🟡 Leak fix silently degrades to no-op on older runtimes (
/crush_fast {{openrouter, google/gemini-3.1-pro-preview, google/gemini-3-flash-preview}} AI review started.
Advanced AI Review
Here is my thorough review of the PR changes.
1. 🔴 TypedArray backed by SharedArrayBuffer is completely corrupted during serialization
In the serializer's replacer, the corrected condition now skips this entire block for SharedArrayBuffer-backed views. However, bypassing this block entirely means that such views are no longer stored in `verbatim`, so they get corrupted during serialization. You should decouple the specific check for transferability from the `verbatim` handling:
```ts
if (ArrayBuffer.isView(value) && !(typeof SharedArrayBuffer === 'function' && value.buffer instanceof SharedArrayBuffer)) {
  const pos = verbatim.length
  verbatim[verbatim.length] = value
  transferables.add(value.buffer as Transferables)
  return rawResult(rawResultSet, ['_', '_ref', pos])
}
```
Suggestion:
```ts
if (ArrayBuffer.isView(value)) {
  const pos = verbatim.length
  verbatim[verbatim.length] = value
  if (!(typeof SharedArrayBuffer === 'function' && value.buffer instanceof SharedArrayBuffer)) {
    transferables.add(value.buffer as Transferables)
  }
  return rawResult(rawResultSet, ['_', '_ref', pos])
}
```
2. 🔴
/crush_fast {{openrouter, google/gemini-3.1-pro-preview, google/gemini-3-flash-preview}} AI review started.
Advanced AI Review
Here is a review of the changes, focusing on potential bugs, security problems, and areas for improvement.
1. 🔴 High Importance:
/crush_fast {{openrouter, google/gemini-3.1-pro-preview, google/gemini-3-flash-preview}} AI review started.
Advanced AI Review
Overall, these changes bring a solid approach to cleaning up proxied functions. I've reviewed the code and found a couple of areas where error handling logic can be adjusted to preserve error information and prevent an edge-case port leak.
1. 🟡 Use of
/crush_fast {{openrouter, google/gemini-3.1-pro-preview, google/gemini-3-flash-preview}} AI review started.
Advanced AI Review
Here is a thorough review of the provided code changes, focusing on memory leaks, reliability, and code clarity. The issues have been ordered by importance from most to least critical.
1. 🔴 Memory leak if serialization throws an error
In `serializer`, revokable `MessagePort`s are created while the data is being traversed. The critical bug occurs if `JSON.stringify` throws partway through that traversal: any ports created up to that point are never closed and leak. Fix Idea: Place the JSON traversal inside a try/catch that closes the already-created revokables on failure:
```ts
export const serializer = (data: unknown, noFn?: boolean): {
  data: unknown,
  transferables: Transferables[],
  revokables: Revokables[]
} => {
  const rawResultSet = new WeakSet()
  const verbatim: Verbatim[] = []
  const transferables = new Set<Transferables>()
  const revokables = new Set<Revokables>()
  try {
    // JSON.parse and JSON.stringify are called for their ability to do a deep
    // clone and calling a reviver / replacer.
    const result = JSON.parse(JSON.stringify(data, (_key: string, value: unknown) => {
      // Return already processed values without modifications
      if (value && typeof value === 'object' && rawResultSet.has(value)) return value
      // ... [rest of the existing replacer logic]
      return value
    }), (_key: string, value: unknown) => {
      // ... [rest of the existing reviver logic]
    })
    return {
      data: result,
      transferables: Array.from(transferables),
      revokables: Array.from(revokables)
    }
  } catch (err) {
    // Prevent memory leaks if stringify aborts cleanly mid-traversal
    revokables.forEach(port => port.close())
    throw err
  }
}
```
2. 🟡 Error fallback implementation loses important Error details (
/crush_fast {{openrouter, openai/gpt-5.4, google/gemini-3-flash-preview}} Note: this model is new and expensive - it should only be run after the other models stop finding things. I'm running it here as you've indicated the PR is ready to merge. AI review started.
Advanced AI Review
1. 🔴
It goes from 1 to 5, skipping 2, 3 & 4. Neither of the two issues seems too relevant.
taoeffect
left a comment
Fantastic work @corrideat !
@corrideat wrote:
Oops, this had to do with how copy/paste works in the GitHub Actions workflow. The UI doesn't actually render all of the lines, only the visible and near-visible ones. I've recovered them. Was there anything important we missed?
Recovered review:
2. 🔴 Revokables leak when a proxied function throws an error whose
Closes #1
The issue with the memory leak is as follows (this is the rationale behind the leak and behind the fix):
- `structuredClone` does not support passing in functions. It does allow passing in `MessagePort`s, which we can use to proxy functions.
- `MessagePort`s weren't being cleaned up.
- `serializer` returns a list of `revokables` that was being ignored (now they're closed in "Close unused message ports", group-income#3037).

Why things were the way they were
Normally and conventionally, JavaScript functions don't have a 'lifetime' associated with them, other than going out of scope (like most other JavaScript objects). Closing the port amounts to adding an explicit lifetime to the function.
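The idea above — a port standing in for a function, with `close()` acting as the function's explicit end of life — can be sketched as follows. This is a minimal illustration with assumed names (`proxyFunction`, `revoke`); it is not the library's actual API:

```typescript
// Hypothetical sketch: structuredClone cannot carry a function, but it can
// carry a MessagePort, so the function is replaced by a port. Closing that
// port is the "explicit lifetime" described above.
const proxyFunction = (fn: (...args: unknown[]) => unknown) => {
  const { port1, port2 } = new MessageChannel()
  // The local end services remote calls: each message carries a reply port
  // and an argument list.
  port1.onmessage = (ev: any) => {
    const [replyPort, args] = ev.data as [MessagePort, unknown[]]
    replyPort.postMessage(fn(...args))
    replyPort.close()
  }
  return {
    // port2 travels through postMessage in place of the function.
    port: port2,
    // Closing port1 ends the function's lifetime: it can no longer be
    // invoked from the remote side.
    revoke: () => port1.close()
  }
}
```

On the receiving side, the deserializer would wrap such a port back into an async function that posts its arguments and awaits the reply.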
Why the one-liner solution was rejected
The LLM suggestion of closing the port once the function call resolves is suboptimal because it'd mean that the function can only be called once.
What was done instead
In okTurtles/group-income#3037, the list of `revokables` is used to clean up ports after the main call succeeds. This still adds an explicit lifetime, but allows for proxied function arguments to be called multiple times.
In this PR:
- `revokables` is extended to include a few more message ports
- a check (the `noFn` flag passed to `serializer`) ensures that error serialization isn't sending, or shouldn't be sending, any functions
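As a hedged illustration of such a no-functions guard (the shape below is assumed; the real `serializer` replaces functions with MessagePort proxies and does much more), a `noFn` flag inside a `JSON.stringify` replacer can fail fast when a function shows up where none should be:

```typescript
// Minimal sketch of a noFn guard in a JSON.stringify replacer. In the real
// serializer the non-noFn branch would open a MessagePort proxy for the
// function; here it just drops it to keep the sketch self-contained.
const serializeSketch = (data: unknown, noFn?: boolean): string => {
  return JSON.stringify(data, (_key: string, value: unknown) => {
    if (typeof value === 'function') {
      if (noFn) {
        // Error payloads shouldn't contain functions; failing fast avoids
        // opening ports that nothing would ever close.
        throw new TypeError('Functions are not allowed in this context')
      }
      return undefined // placeholder for the MessagePort proxy
    }
    return value
  })
}
```

Failing fast here means an error-serialization path can never silently create a port that nothing is responsible for closing.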