
Enhanced protection from memory leaks #2

Merged
taoeffect merged 11 commits into main from 1-fix-memory-leak
Mar 6, 2026

Conversation

@corrideat
Member

@corrideat corrideat commented Mar 2, 2026

Closes #1

The issue with the memory leak is as follows (this is the rationale behind why there was a memory leak and the fix):

  1. It's convenient to have functions to be passed across boundaries (e.g., from the app to the SW). For example, a hook or a callback function.
  2. However, structuredClone does not support passing in functions. It does allow passing in MessagePorts, which we can use to proxy functions.
  3. These MessagePorts weren't being cleaned up. serializer returns a list of revokables that was being ignored (now they're closed in okTurtles/group-income#3037, "Close unused message ports").
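The mechanism in steps 1–3 can be sketched as follows. This is an illustrative standalone example, not this library's actual code; realFn, proxyFn, and the message shapes are made up for the sketch:

```typescript
// Minimal sketch of proxying a function over a MessagePort.
// structuredClone can't handle functions, but MessagePorts are
// transferable, so a port stands in for the function.
const { port1, port2 } = new MessageChannel()

// Serializing side: keep the real function and answer calls arriving on port1.
const realFn = (x: number) => x * 2
port1.onmessage = (ev: any) => {
  const [replyPort, arg] = ev.data
  replyPort.postMessage(realFn(arg))
  replyPort.close()
}

// Deserializing side: port2 is what actually crossed the boundary; wrap it
// in a proxy function that does one request/response round trip per call.
const proxyFn = (arg: number) => new Promise<number>((resolve) => {
  const mc = new MessageChannel()
  mc.port1.onmessage = (ev: any) => { resolve(ev.data); mc.port1.close() }
  port2.postMessage([mc.port2, arg], [mc.port2])
})

// Unless port1/port2 are eventually close()d, their listeners (and the
// closed-over realFn) stay reachable forever — the leak in step 3.
```

Calling proxyFn(21) resolves to 42; the leak is exactly the case where port1/port2 are never closed.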

Why things were the way they were

Conventionally, JavaScript functions don't have a 'lifetime' associated with them other than going out of scope (like most other JavaScript objects). Closing the port amounts to adding an explicit lifetime to the function.

Why the one-liner solution was rejected

The LLM suggestion of closing the port once the function call resolves is suboptimal because it'd mean that the function can only be called once.

What was done instead

In okTurtles/group-income#3037, the list of revokables is used to clean up ports after the main call succeeds. This still adds an explicit lifetime, but allows for proxied function arguments to be called multiple times.

In this PR:

  • The revokables list is extended to include a few more message ports
  • Added an argument to disable function proxying entirely. This avoids leaks before they can even happen, and is a good way to prevent them in cases where the caller (the one calling serializer) isn't sending or shouldn't be sending any functions.
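The cleanup strategy above can be sketched as follows. stubSerializer, callWithCleanup, and the data shapes are illustrative stand-ins assumed from this description, not the library's real serializer:

```typescript
// Sketch of "close the revokables after the main call settles".
type Serialized = { data: unknown[], revokables: MessagePort[] }

// Stub: replace each function argument with a proxy MessagePort and record
// the local end in `revokables`, the way the description says serializer does.
function stubSerializer (args: unknown[]): Serialized {
  const revokables: MessagePort[] = []
  const data = args.map((a) => {
    if (typeof a !== 'function') return a
    const mc = new MessageChannel()
    mc.port1.onmessage = () => { /* would invoke `a` here */ }
    revokables.push(mc.port1)
    return mc.port2
  })
  return { data, revokables }
}

// The group-income#3037 approach in miniature: proxied function arguments
// stay callable for the whole duration of the main call, and their ports are
// closed (their explicit lifetime ends) once that call settles.
async function callWithCleanup (
  args: unknown[],
  mainCall: (data: unknown[]) => Promise<unknown>
): Promise<unknown> {
  const { data, revokables } = stubSerializer(args)
  try {
    return await mainCall(data)
  } finally {
    revokables.forEach((p) => p.close())
  }
}
```

Unlike the rejected one-liner, this keeps proxied functions callable any number of times while the main call is in flight.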

@corrideat corrideat self-assigned this Mar 2, 2026
@taoeffect
Member

taoeffect commented Mar 2, 2026

/crush_fast


AI review started.

@okTurtles okTurtles deleted a comment from github-actions bot Mar 2, 2026
@github-actions

github-actions bot commented Mar 2, 2026

Advanced AI Review

  • Type: Agentic (crush)
  • Model: glm-5

Code Review: Enhanced protection from memory leaks

Issue 1: SharedArrayBuffer check logic is inverted


🔴 High importance / High confidence

Location: src/index.ts:94

if (ArrayBuffer.isView(value) && !(typeof SharedArrayBuffer !== 'function' || value.buffer instanceof SharedArrayBuffer)) {

The logic for excluding SharedArrayBuffer-backed views is inverted. When SharedArrayBuffer is not defined in the environment (which is common, as it requires specific CORS headers), the condition evaluates as follows:

  • typeof SharedArrayBuffer !== 'function' → true (since it's 'undefined')
  • true || anything → true (short-circuit)
  • !(true) → false

This means the entire condition becomes ArrayBuffer.isView(value) && false, causing ALL ArrayBuffer views (TypedArrays, DataViews) to be skipped when SharedArrayBuffer is unavailable.

Suggested fix:

if (ArrayBuffer.isView(value) && !(typeof SharedArrayBuffer === 'function' && value.buffer instanceof SharedArrayBuffer)) {

This correctly excludes only SharedArrayBuffer-backed views when SharedArrayBuffer is available, while processing all other ArrayBuffer views normally.


Issue 2: Revokables added in async callback are never returned


🔴 High importance / High confidence

Location: src/index.ts:112, 116

mc.port1.onmessage = async (ev) => {
  try {
    try {
      const result = await value(...deserializer(ev.data[1]) as unknown[])
      const { data, transferables, revokables: rr } = serializer(result)
      ev.data[0].postMessage([true, data], transferables)
      rr.forEach(r => revokables.add(r))  // <-- This line
    } catch (e) {
      const { data, transferables, revokables: rr } = serializer(e, true)
      ev.data[0].postMessage([false, data], transferables)
      rr.forEach(r => revokables.add(r))  // <-- This line
    }
  } catch (e) {
    console.error('Async error on onmessage handler', e)
  }
}

The revokables set is being modified inside an async onmessage callback that fires after the serializer function has already returned. The returned revokables array (line 144) captures the state of the set at return time, so any revokables added later in the callback are never included in the return value.

This defeats the purpose of tracking these revokables for cleanup.

Suggested fix:

This is architecturally challenging because the function call happens asynchronously. Consider:

  1. Having the caller manage revokable ports via a separate cleanup mechanism
  2. Closing mc.port1 after a single use if the function is single-call
  3. Using a FinalizationRegistry to clean up ports when the function reference is garbage collected

Issue 3: Revokables ignored in deserializer's function handler


🟡 Medium importance / High confidence

Location: src/index.ts:211

const { data, transferables } = serializer(args)

When serializing function arguments in the deserializer's _fn case, the revokables from the serializer result are destructured but never used. If any argument is itself a function (creating nested MessagePorts), those ports will never be tracked for cleanup.

Suggested fix:

const { data, transferables, revokables } = serializer(args)
// Either track these revokables somewhere for cleanup, or close them
// after the response is received

Note that this has the same fundamental timing issue as Issue 2 - knowing when to revoke requires knowing when the remote side is done with the ports.


Issue 4: Mutation of Error.cause during serialization

  • Addressed
  • Dismissed

🟡 Medium importance / High confidence

Location: src/index.ts:82-84

if (value.cause) {
  value.cause = serializer(value.cause, true).data
}

This mutates the original Error object's cause property during serialization. If the same Error object is:

  1. Serialized multiple times
  2. Referenced from multiple places in the same serialization

...the cause will be transformed repeatedly, potentially leading to corruption. Additionally, the original cause value is lost after serialization.

Suggested fix:

Create a copy of the cause rather than mutating the original:

if (value.cause) {
  const serializedCause = serializer(value.cause, true).data
  return rawResult(rawResultSet, ['_', '_err', rawResult(rawResultSet, ['_', '_ref', pos]), value.name, serializedCause])
}

Then deserialize it from the array element rather than from the Error object itself.


Issue 5: Memory leaks acknowledged but not fully addressed


Lower importance

Location: src/index.ts:108-109, 207-210

// TODO FIXME: This could still leave some ports open on functions
// that return functions that return functions
// TODO FIXME: Potential memory leak if any of the args is a
// function that gets called. This level of nesting isn't
// that common. A more comprehensive solution should nevertheless
// be devised.

The PR adds noFn and revokables tracking as partial mitigations, but the TODOs acknowledge that memory leaks remain. Consider whether:

  1. A more comprehensive solution should be implemented as part of this PR
  2. The limitations should be documented in the API
  3. Consumers should have a way to explicitly clean up resources

Summary

The most critical issues are:

  1. The SharedArrayBuffer check bug - will break ArrayBuffer view serialization in most environments
  2. The revokables timing bug - the core mechanism for tracking ports doesn't actually work

The memory leak improvements in this PR appear to be incomplete and may give a false sense of security while introducing new bugs.


Review generated using glm-5 via Z.AI. Comment /review to re-run.

devin-ai-integration[bot]

This comment was marked as resolved.

@taoeffect
Member

taoeffect commented Mar 2, 2026

/crush_fast


AI review started.

devin-ai-integration[bot]

This comment was marked as resolved.

@github-actions

github-actions bot commented Mar 2, 2026

Advanced AI Review

  • Type: Agentic (crush)
  • Model: glm-5

Code Review: Enhanced protection from memory leaks

Issue 1: Typo in error message


⚪️ src/index.ts:238 - Typo: "Messge error" should be "Message error"

rcvPort.onmessageerror = () => {
  reject(new Error('Messge error'))
  rcvPort.close()
}

Suggested fix:

rcvPort.onmessageerror = () => {
  reject(new Error('Message error'))
  rcvPort.close()
}

Issue 2: Typo in comment


⚪️ src/index.ts:38 - Typo: "Automatric" should be "Automatic"

  // Automatric destruction (handled on the deserializing end):
  // When a returned proxy function goes out of scope, close its associated port

Suggested fix:

  // Automatic destruction (handled on the deserializing end):
  // When a returned proxy function goes out of scope, close its associated port

Issue 3: Potential memory leak - missing try-catch around postMessage


🟡 src/index.ts:223-242 - If mp.postMessage throws (e.g., if the remote port is already closed), the Promise will never settle and rcvPort will never be closed, causing a memory leak.

const fn = (...args: unknown[]) => {
  return new Promise((resolve, reject) => {
    const mc = new MessageChannel()
    const { data, transferables } = serializer(args)
    const rcvPort = mc.port1
    const sendingPort = mc.port2
    rcvPort.onmessage = (ev) => {
      if (ev.data[0]) {
        resolve(deserializer(ev.data[1]))
      } else {
        reject(deserializer(ev.data[1]))
      }
      rcvPort.close()
    }
    rcvPort.onmessageerror = () => {
      reject(new Error('Messge error'))
      rcvPort.close()
    }
    mp.postMessage([sendingPort, data], [sendingPort, ...transferables])
  })
}

Suggested fix:

const fn = (...args: unknown[]) => {
  return new Promise((resolve, reject) => {
    const mc = new MessageChannel()
    const { data, transferables } = serializer(args)
    const rcvPort = mc.port1
    const sendingPort = mc.port2
    rcvPort.onmessage = (ev) => {
      if (ev.data[0]) {
        resolve(deserializer(ev.data[1]))
      } else {
        reject(deserializer(ev.data[1]))
      }
      rcvPort.close()
    }
    rcvPort.onmessageerror = () => {
      reject(new Error('Message error'))
      rcvPort.close()
    }
    try {
      mp.postMessage([sendingPort, data], [sendingPort, ...transferables])
    } catch (e) {
      rcvPort.close()
      reject(e)
    }
  })
}

Issue 4: SharedArrayBuffer detection logic appears inverted


🟡 src/index.ts:114 - The condition logic for excluding SharedArrayBuffer from transferables appears incorrect:

if (ArrayBuffer.isView(value) && !(typeof SharedArrayBuffer !== 'function' && value.buffer instanceof SharedArrayBuffer)) {

Breaking this down:

  • !(typeof SharedArrayBuffer !== 'function' && value.buffer instanceof SharedArrayBuffer)
  • = !(!hasSharedArrayBuffer && isSharedArrayBuffer)
  • = hasSharedArrayBuffer || !isSharedArrayBuffer

This means the condition passes when SharedArrayBuffer exists OR when the buffer is not a SharedArrayBuffer. The intent seems to be to exclude SharedArrayBuffer from transferables (since you cannot transfer them), but this logic doesn't achieve that.

Suggested fix:

if (ArrayBuffer.isView(value) && !(typeof SharedArrayBuffer === 'function' && value.buffer instanceof SharedArrayBuffer)) {

This correctly excludes the value when SharedArrayBuffer is available AND the buffer is a SharedArrayBuffer.


Issue 5: Test doesn't verify memory leak fix


⚪️ src/index.test.ts:26-46 - The test "should not leak memory" tests functionality but doesn't actually verify that memory leaks are fixed. A proper memory leak test would use WeakRef to verify objects are garbage collected.

Additionally, line 29 has a fire-and-forget promise that could cause race conditions in tests:

callbackFactory().then(callback => callback(() => 'foo'))
return callbackFactory().then(callback => callback(() => 'bar'))

The first line is not awaited, meaning if the test framework completes before this promise resolves, results could be incomplete. The test happens to work because Promise.all waits for the outer promises, but it's fragile.

Suggested improvement for a proper memory leak test:

it('should not leak memory', async () => {
  const weakRefs: WeakRef<object>[] = []
  
  const serialized = _.serializer((callbackFactory: () => Promise<(x: () => string) => Promise<void>>) => {
    callbackFactory().then(callback => callback(() => 'foo'))
    return callbackFactory().then(callback => callback(() => 'bar'))
  })
  
  const fn = _.deserializer(serialized.data) as (callback: () => (valueProvider: () => Promise<string>) => void) => void
  
  const results: [string, string][] = []
  weakRefs.push(new WeakRef(fn))
  
  await Promise.all([
    fn(() => (valueProvider) => valueProvider().then(value => results.push(['cb1', value]))),
    fn(() => (valueProvider) => valueProvider().then(value => results.push(['cb2', value])))
  ])
  
  assert.ok(results.length === 4)
  
  // Close revokables to allow cleanup
  serialized.revokables.forEach(port => port.close())
  
  // Force garbage collection if available
  if (global.gc) {
    global.gc()
    global.gc() // Call twice to ensure cleanup
  }
  
  // Verify the function can be garbage collected
  // Note: This is a heuristic - GC is not deterministic
  const collected = weakRefs.some(ref => ref.deref() === undefined)
  // In practice, we can't assert this reliably, but the test documents the expected behavior
})

Issue 6: Message data accessed without validation (security consideration)


⚪️ src/index.ts:126-136 - The onmessage handler accesses ev.data[0] and ev.data[1] without validating the structure. While this is an internal communication channel, if a malicious message were somehow injected, it could cause unexpected behavior.

mc.port1.onmessage = async (ev) => {
  try {
    try {
      const result = await value(...deserializer(ev.data[1]) as unknown[])
      const { data, transferables } = serializer(result)
      ev.data[0].postMessage([true, data], transferables)
    } catch (e) {

Suggested improvement:

mc.port1.onmessage = async (ev) => {
  try {
    if (!Array.isArray(ev.data) || ev.data.length < 2) {
      console.error('Invalid message format')
      return
    }
    const [responsePort, argsData] = ev.data
    if (!(responsePort instanceof MessagePort)) {
      console.error('Invalid response port')
      return
    }
    try {
      const result = await value(...deserializer(argsData) as unknown[])
      const { data, transferables } = serializer(result)
      responsePort.postMessage([true, data], transferables)
    } catch (e) {

Summary

Priority Count
🔴 High 0
🟡 Medium 2
⚪️ Low 4

The most important issues to address are:

  1. Issue 3: Missing error handling around postMessage which could cause memory leaks
  2. Issue 4: The SharedArrayBuffer detection logic appears to be inverted

Review generated using glm-5 via Z.AI. Comment /review to re-run.

@corrideat corrideat requested a review from taoeffect March 2, 2026 20:50
@taoeffect
Member

taoeffect commented Mar 2, 2026

/crush_fast {{openrouter, google/gemini-3.1-pro-preview, google/gemini-3-flash-preview}}


AI review started.

@github-actions

github-actions bot commented Mar 2, 2026

Advanced AI Review

  • Type: Agentic (crush)
  • Model: google/gemini-3.1-pro-preview

Here is the review of the changes, focusing strictly on bugs, memory leak scenarios, and robust improvements.

1. Destructive mutation of application data (value.cause) during serialization (🔴 High)


In src/index.ts:96-103, the original Error object passed to the serializer is permanently mutated. JSON.stringify does not clone objects before passing them to its replacer function, so the application's Error.cause is overwritten with the library's internal serialized array representation. Furthermore, any transferables produced while serializing the cause are discarded, because only .data is taken from the recursive call.

    // src/index.ts:96-103
    if (value instanceof Error) {
      const pos = verbatim.length
      verbatim[verbatim.length] = value
      // We need to also serialize `Error.cause` recursively
      if (value.cause) {
        value.cause = serializer(value.cause, true).data // ❌ Mutates and abandons transferables!
      }
      return rawResult(rawResultSet, ['_', '_err', rawResult(rawResultSet, ['_', '_ref', pos]), value.name])
    }

Suggestion:
Pass the serialized cause as an optional fifth element of the serialized representation rather than backfilling it onto value.cause, and propagate the nested transferables upward.

    if (value instanceof Error) {
      const pos = verbatim.length
      verbatim[verbatim.length] = value
      
      let serializedCause = undefined
      // We need to also serialize `Error.cause` recursively
      if (value.cause) {
        const result = serializer(value.cause, true)
        serializedCause = result.data
        result.transferables.forEach(t => transferables.add(t))
      }
      // Store the serialized cause at index 4
      return rawResult(rawResultSet, ['_', '_err', rawResult(rawResultSet, ['_', '_ref', pos]), value.name, serializedCause])
    }

Then in deserializer:

        // src/index.ts:208-216
        case '_err': {
          const err = value[2]
          if (err.name !== value[3]) {
            err.name = value[3]
          }
          if (value[4] !== undefined) {
            err.cause = deserializer(value[4])
          }
          return err
        }

2. Orphaned proxy memory leak when postMessage fails natively (🔴 High)


When serializer(args) or serializer(result) run recursively, ignoring their revokables is generally safe, because the new portDestructor handles cleanup once the payload reaches the remote listener context. However, if the underlying postMessage throws synchronously (e.g., a DataCloneError), the payload never arrives and that cleanup never attaches. The local MessagePort receivers (mc.port1) are then kept alive indefinitely by their newly attached .onmessage listeners, causing an immediate, irretrievable memory leak.

Suggestion A: Clean up locally mapped nested proxy callbacks when sending args fails

              // src/index.ts:223-238
              const mc = new MessageChannel()
              const { data, transferables, revokables } = serializer(args)
              const rcvPort = mc.port1
              const sendingPort = mc.port2
              
              // ...
              try {
                mp.postMessage([sendingPort, data], [sendingPort, ...transferables])
              } catch (e) {
                rcvPort.close()
                revokables.forEach(port => port.close()) // ✅ Must close stranded inner receivers
                reject(e)
              }

Suggestion B: Clean up remote callbacks trapped locally when resolving responses natively fails

          // src/index.ts:133-145 
          try {
            const result = await value(...deserializer(ev.data[1]) as unknown[])
            const { data, transferables, revokables } = serializer(result)
            try {
              ev.data[0].postMessage([true, data], transferables)
            } catch (err) {
              revokables.forEach(port => port.close()) // ✅ Close stranded references
              throw err
            }
          } catch (e) {
            const { data, transferables } = serializer(e, true)
            ev.data[0].postMessage([false, data], transferables)
          }

3. Simplify and harden portDestructor logic (🟡 Medium)


In portDestructor, wrapping the MessagePort in a WeakRef adds unnecessary closure overhead and an unneeded abstraction. FinalizationRegistry.register(target, heldValue) already guarantees that heldValue is held only as long as target is alive, and heldValue does not keep target alive. Since the proxy fn already holds the port in scope, port can be passed directly as the held value without creating reference cycles or premature closures.

  // src/index.ts:38-46
  return (fn: object, port: MessagePort) => {
    // Using a weak reference to prevent leaking memory ❌ Incorrect assumption
    const portWR = new WeakRef(port)
    registry.register(fn, () => {
      const port = portWR.deref()
      port?.close()
    })
  }

Suggestion:
This is architecturally safer and removes the window in which the GC could clear the WeakRef before the teardown callback runs:

const portDestructor = (() => {
  if (typeof FinalizationRegistry !== 'function') {
    return () => {}
  }
  const registry = new FinalizationRegistry((port: MessagePort) => {
    port.close()
  })

  // Automatic destruction (handled on the deserializing end):
  return (fn: object, port: MessagePort) => {
    registry.register(fn, port)
  }
})()

4. Improve robust Deno/Node test validation checks (⚪️ Low)

  • Addressed
  • Dismissed

The GC check wraps the hook registration in a try/catch to work around older Deno quirks and relies on the Node-specific global binding. The blanket catch can also swallow genuine suite-level errors.

  // src/index.test.ts:4-13
  // Deno doesn't seem to have implemented `afterEach`
  try {
    afterEach(() => {
      if (typeof global?.gc === 'function') {
        global.gc()
      }
    })
  } catch {}

Suggestion:
Use a typeof guard for the hook registration, and the standards-compliant globalThis binding instead of global:

  if (typeof afterEach === 'function') {
    afterEach(() => {
      // Explicitly call the garbage collector on Node. This makes the tests finish faster
      if (typeof (globalThis as any).gc === 'function') {
        (globalThis as any).gc()
      }
    })
  }

Review generated using google/gemini-3.1-pro-preview via OpenRouter. Comment /review to re-run.

@taoeffect
Member

taoeffect commented Mar 2, 2026

/crush_fast {{openrouter, openai/gpt-5.3-codex, google/gemini-3-flash-preview}}


AI review started.

@github-actions

github-actions bot commented Mar 2, 2026

Advanced AI Review

  • Type: Agentic (crush)
  • Model: openai/gpt-5.3-codex

1) 🟡 Leak fix silently degrades to no-op on older runtimes (FinalizationRegistry/WeakRef missing)


Where: src/index.ts:30-33

if (typeof FinalizationRegistry !== 'function' || typeof WeakRef !== 'function') {
  return () => {}
}

If either API is unavailable, proxy ports are never auto-closed, so the memory-leak fix effectively disappears without warning.

Suggested fix (deterministic fallback):

type ProxiedFn = ((...args: unknown[]) => Promise<unknown>) & { revoke?: () => void }

// The registry must outlive individual calls: a registry that is itself
// unreachable is not guaranteed to ever run its cleanup callbacks.
const cleanupRegistry = typeof FinalizationRegistry === 'function' && typeof WeakRef === 'function'
  ? new FinalizationRegistry((heldValue: () => void) => heldValue())
  : undefined

const attachPortCleanup = (fn: ProxiedFn, port: MessagePort): void => {
  fn.revoke = () => port.close()

  if (cleanupRegistry) {
    const portWR = new WeakRef(port)
    cleanupRegistry.register(fn, () => { portWR.deref()?.close() })
  }
}

Use this in _fn so callers always have an explicit cleanup path even when GC hooks are unavailable.


2) 🟡 “should not leak memory” test does not assert memory/port cleanup

  • Addressed
  • Dismissed

Where: src/index.test.ts:29-50

it('should not leak memory', async () => {
  ...
  assert.ok(results.length === 4)
  ...
})

This verifies callback behavior, not leak behavior. It will pass even if ports stay open forever.

Suggested fix (assert cleanup signal):

it('should close function proxy ports', async () => {
  let closed = 0
  const serialized = _.serializer(() => 'ok')
  for (const p of serialized.revokables) {
    p.addEventListener('close', () => { closed++ }, { once: true })
  }

  const fn = _.deserializer(serialized.data) as () => Promise<string>
  await fn()

  serialized.revokables.forEach(p => p.close())
  assert.equal(closed, serialized.revokables.length)
})

This is a direct, deterministic assertion that cleanup actually occurs.


3) ⚪️ Broad catch {} masks test-environment regressions

  • Addressed
  • Dismissed

Where: src/index.test.ts:7-15

try {
  afterEach(() => { ... })
} catch {}

Any unexpected error in hook registration is swallowed, which can hide real failures and make CI behavior inconsistent.

Suggested simplification:

if (typeof afterEach === 'function') {
  afterEach(() => {
    if (typeof global?.gc === 'function') global.gc()
  })
}

Review generated using openai/gpt-5.3-codex via OpenRouter. Comment /review to re-run.

@taoeffect
Copy link
Member

taoeffect commented Mar 2, 2026

/crush_fast {{openrouter, google/gemini-3.1-pro-preview, google/gemini-3-flash-preview}}


AI review started.

devin-ai-integration[bot]

This comment was marked as resolved.

@github-actions

github-actions bot commented Mar 2, 2026

Advanced AI Review

  • Type: Agentic (crush)
  • Model: google/gemini-3.1-pro-preview

Here is my thorough review of the PR changes.

1. 🔴 TypedArray backed by SharedArrayBuffer is completely corrupted during serialization


In the ArrayBuffer.isView(value) check, the condition has been updated to explicitly bypass the block if the array is backed by a SharedArrayBuffer in order to avoid throwing on transferables.add().

However, bypassing this block entirely means that verbatim never records the ArrayBuffer view, so the TypedArray falls through to the JSON.stringify fallback. JSON.stringify turns a TypedArray into a plain object (e.g., {"0": 1, "1": 2}), losing its typed-array semantics entirely on deserialization.

You should decouple the specific check for transferables.add from the general verbatim registration.

src/index.ts:114-119

    if (ArrayBuffer.isView(value) && !(typeof SharedArrayBuffer === 'function' && value.buffer instanceof SharedArrayBuffer)) {
      const pos = verbatim.length
      verbatim[verbatim.length] = value
      transferables.add(value.buffer as Transferables)
      return rawResult(rawResultSet, ['_', '_ref', pos])
    }

Suggestion:

    if (ArrayBuffer.isView(value)) {
      const pos = verbatim.length
      verbatim[verbatim.length] = value
      if (!(typeof SharedArrayBuffer === 'function' && value.buffer instanceof SharedArrayBuffer)) {
        transferables.add(value.buffer as Transferables)
      }
      return rawResult(rawResultSet, ['_', '_ref', pos])
    }

2. 🔴 revokables leak memory on deserializer proxy function arguments

  • Addressed
  • Dismissed

The PR correctly addresses the core bug by establishing an explicit lifetime for functions converted into MessagePorts; cleaning up the revokables list is the intentional fix for the memory leak.

However, when a codebase passes a proxy function as an argument through a deserialized proxy, the hidden serializer(args) call inside deserializer generates a list of revokables containing these proxy ports. Because they are never iterated to invoke .close() when the proxy promise finally settles, they recreate the same memory leak out of sight. Any proxy function call should clean up its implicit revokables on success or rejection.

src/index.ts:241-252

              rcvPort.onmessage = (ev) => {
                if (ev.data[0]) {
                  resolve(deserializer(ev.data[1]))
                } else {
                  reject(deserializer(ev.data[1]))
                }
                rcvPort.close()
              }
              rcvPort.onmessageerror = () => {
                reject(new Error('Message error'))
                rcvPort.close()
              }

Suggestion:

              rcvPort.onmessage = (ev) => {
                try {
                  if (ev.data[0]) {
                    resolve(deserializer(ev.data[1]))
                  } else {
                    reject(deserializer(ev.data[1]))
                  }
                } finally {
                  rcvPort.close()
                  revokables.forEach(port => port.close())
                }
              }
              rcvPort.onmessageerror = () => {
                try {
                  reject(new Error('Message error'))
                } finally {
                  rcvPort.close()
                  revokables.forEach(port => port.close())
                }
              }

3. 🟡 Receiving side channels hang indefinitely if serialization of a runtime Error throws

  • Addressed
  • Dismissed

In the message handler standing in for the serialized function, if the runtime logic throws an error, the catch block issues ev.data[0].postMessage([false, data], transferables).

If this attempt to postMessage the rejected value itself throws (e.g., a DataCloneError caused by un-clonable references deep inside an unexpected error), it cascades into the outermost catch block. When that happens, ev.data[0].close() is skipped: the port leaks permanently and the sender's promise hangs indefinitely, never learning that its remote counterpart failed.

Moving ev.data[0].close() into a dedicated finally ensures the port is closed even if the fallback postMessage fails.

src/index.ts:133-146

          } catch (e) {
            const { data, transferables } = serializer(e, true)
            ev.data[0].postMessage([false, data], transferables)
          }
          ev.data[0].close()
        } catch (e) {
          console.error('Async error on onmessage handler', e)
        }

Suggestion:

          } catch (e) {
            const { data, transferables } = serializer(e, true)
            ev.data[0].postMessage([false, data], transferables)
          }
        } catch (e) {
          console.error('Async error on onmessage handler', e)
        } finally {
          ev.data[0].close()
        }

4. ⚪️ Formatting: Extra space introduced in NPM scripts

  • Addressed
  • Dismissed

A minor syntax blemish: there is an unintended double space inserted right after node in your package's test command block.

package.json:23-23

    "test": "npm run lint && node  --expose-gc --import 'data:text/javascript,import { register } from \"node:module\"; import { pathToFileURL } from \"node:url\"; register(\"ts-node/esm\", pathToFileURL(\"./\"));' src/index.test.ts",

Suggestion:

    "test": "npm run lint && node --expose-gc --import 'data:text/javascript,import { register } from \"node:module\"; import { pathToFileURL } from \"node:url\"; register(\"ts-node/esm\", pathToFileURL(\"./\"));' src/index.test.ts",

Review generated using google/gemini-3.1-pro-preview via OpenRouter. Comment /crush_fast {{openrouter, google/gemini-3.1-pro-preview, google/gemini-3-flash-preview}} to re-run.

@corrideat
Member Author

corrideat commented Mar 6, 2026

/crush_fast {{openrouter, google/gemini-3.1-pro-preview, google/gemini-3-flash-preview}}


AI review started.

@github-actions

github-actions bot commented Mar 6, 2026

Advanced AI Review

  • Type: Agentic (crush)
  • Model: google/gemini-3.1-pro-preview

Here is a review of the changes, focusing on potential bugs, security problems, and areas for improvement.

1. 🔴 High Importance: Error.cause serialization drops transferables and may crash structuredClone


In src/index.ts:89-109, the approach used to serialize nested Error.cause creates two major issues:

  1. It calls serializer(value.cause, true).data, completely ignoring and discarding the transferables and revokables arrays returned from the recursive call. If .cause happens to contain a MessagePort or an ArrayBuffer, it won't be appended to the root transferables list.
  2. It then invokes structuredClone(value). When structuredClone processes the MessagePort left locally inside the nested verbatim cause structure—without an explicit transfer list—it will instantly throw a DOMException: DataCloneError.
    if (value instanceof Error) {
      const obj = (() => {
        if (value.cause) {
          const causeCopy = value.cause
          try {
            // We need to also serialize `Error.cause` recursively
            // Do it on a copy so that the original object isn't destructively
            // modified
            value.cause = serializer(value.cause, true).data
            return structuredClone(value)
          } finally {
            value.cause = causeCopy
          }
        } else {
          return value
        }
      })()
      const pos = verbatim.length
      verbatim[verbatim.length] = obj
      return rawResult(rawResultSet, ['_', '_err', rawResult(rawResultSet, ['_', '_ref', pos]), value.name])
    }

Improvement: You can avoid mutating the original error, prevent clone errors, and automatically retain downstream transferables by returning the .cause as a new fifth element in the same format array. Because JSON.stringify automatically traverses elements of arrays returned by replacers, .cause will seamlessly be processed by the outer serializer pass.

Update the serializer like so:

    if (value instanceof Error) {
      const pos = verbatim.length
      verbatim[verbatim.length] = value
      const arr = ['_', '_err', rawResult(rawResultSet, ['_', '_ref', pos]), value.name]
      if ('cause' in value) {
        arr.push(value.cause)
      }
      return rawResult(rawResultSet, arr)
    }

This also requires an update to the correlating deserializer on src/index.ts:229-237. The JSON.parse reviver evaluates "bottom-up," meaning by the time _err is intercepted, value[4] is already fully deserialized. Add an else if to ensure old serdes data works correctly:

        case '_err': {
          if (value[2].name !== value[3]) {
            value[2].name = value[3]
          }
          if (value.length > 4) {
            value[2].cause = value[4]
          } else if (value[2].cause) {
            value[2].cause = deserializer(value[2].cause)
          }
          return value[2]
        }

2. 🔴 High Importance: Payload errors permanently hang the deserializer Promise


In src/index.ts:248-259, there is a scenario where deserializer can throw a synchronous error inside the onmessage handling block. For example, if serializing the original error threw and the sender fell back to ev.data[0].postMessage([false]), then ev.data[1] on the receiver end is undefined and deserializing it throws.

              rcvPort.onmessage = (ev) => {
                try {
                  if (ev.data[0]) {
                    resolve(deserializer(ev.data[1]))
                  } else {
                    reject(deserializer(ev.data[1]))
                  }
                } finally {
                  rcvPort.close()
                  revokables.forEach(port => port.close())
                }
              }

If deserializer(ev.data[1]) throws during evaluation, neither resolve nor reject ever runs. While the finally correctly closes the ports, the awaited Promise on the upstream caller's side just sits permanently "pending", resulting in UI hangs or memory leaks.
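To make the hang concrete, here is a small self-contained sketch (not code from this PR; fakeDeserializer is a hypothetical stand-in) in which a throw before resolve() leaves the Promise pending forever:

```typescript
// Sketch of the hang: when the callback throws before resolve()/reject()
// is reached and the error is swallowed, the Promise never settles, so an
// awaiting caller waits forever.
const fakeDeserializer = (_data: unknown): string => {
  throw new Error('malformed payload')
}

const hang = new Promise<string>((resolve) => {
  queueMicrotask(() => {
    try {
      resolve(fakeDeserializer(undefined)) // throws; resolve() never runs
    } catch {
      // swallowed (stands in for the error escaping the onmessage handler)
    } finally {
      // port cleanup would run here, but the Promise is already stuck
    }
  })
})

// Demonstrate the stuck state with a race against a short timer.
const outcome: Promise<string> = Promise.race([
  hang,
  new Promise<string>((resolve) => setTimeout(() => resolve('still pending'), 100)),
])
outcome.then((result) => console.log(result)) // "still pending"
```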

Improvement: Protect the un-wrapping operations so that if anything throws or misses indices, it gracefully fails the promise.

              rcvPort.onmessage = (ev) => {
                try {
                  if (ev.data[0]) {
                    resolve(deserializer(ev.data[1]))
                  } else {
                    reject(ev.data.length > 1 ? deserializer(ev.data[1]) : new Error('Message error'))
                  }
                } catch (err) {
                  reject(err)
                } finally {
                  rcvPort.close()
                  revokables.forEach(port => port.close())
                }
              }

3. 🟡 Medium Importance: Sequential structure could skip .close() and leak the MessagePort


In src/index.ts:130-154, ev.data[0].close() relies on reaching the end of the try block safely.

      mc.port1.onmessage = async (ev) => {
        try {
          try {
            // [...] normal execution
          } catch (e) {
            try {
              const { data, transferables } = serializer(e, true)
              ev.data[0].postMessage([false, data], transferables)
            } catch (e) {
              console.error('Error on onmessage handler trying to transmit error', e)
              ev.data[0].postMessage([false])
            }
          }
          ev.data[0].close()
        } catch (e) {
          console.error('Async error on onmessage handler', e)
        }
      }

If the fallback invocation ev.data[0].postMessage([false]) throws (e.g. because the channel died or payload constraint errors surfaced), control jumps straight to the outer catch (e) block on line 151, skipping ev.data[0].close() entirely and leaking the port on a recurring error path.

Improvement: Move .close() into a finally block to guarantee cleanup:

      mc.port1.onmessage = async (ev) => {
        try {
          try {
            const result = await value(...deserializer(ev.data[1]) as unknown[])
            const { data, transferables, revokables } = serializer(result)
            try {
              ev.data[0].postMessage([true, data], transferables)
            } catch (e) {
              revokables.forEach(port => port.close())
              throw e
            }
          } catch (e) {
            try {
              const { data, transferables } = serializer(e, true)
              ev.data[0].postMessage([false, data], transferables)
            } catch (err) {
              console.error('Error on onmessage handler trying to transmit error', err)
              ev.data[0]?.postMessage([false])
            }
          } finally {
            ev.data[0]?.close()
          }
        } catch (e) {
          console.error('Async error on onmessage handler', e)
        }
      }

4. ⚪️ Low Importance: Pointless WeakRef check


In src/index.ts:32-34 the initialization of portDestructor tests for the existence of WeakRef:

const portDestructor = (() => {
  if (typeof FinalizationRegistry !== 'function' || typeof WeakRef !== 'function') {
    return () => {}
  }

WeakRef is never used in this code. FinalizationRegistry works without WeakRef, so dropping the extra check makes the code slightly cleaner.

Improvement: Use only what's required:

const portDestructor = (() => {
  if (typeof FinalizationRegistry !== 'function') {
    return () => {}
  }

Review generated using google/gemini-3.1-pro-preview via OpenRouter. Comment /crush_fast {{openrouter, google/gemini-3.1-pro-preview, google/gemini-3-flash-preview}} to re-run.

devin-ai-integration[bot]

This comment was marked as resolved.

@corrideat
Member Author

corrideat commented Mar 6, 2026

/crush_fast {{openrouter, google/gemini-3.1-pro-preview, google/gemini-3-flash-preview}}


AI review started.

@github-actions

github-actions bot commented Mar 6, 2026

Advanced AI Review

  • Type: Agentic (crush)
  • Model: google/gemini-3.1-pro-preview

Overall, these changes bring a solid approach to cleaning up proxied MessagePorts in the serdes library while allowing structured payloads. The explicit execution lifetimes and FinalizationRegistry implementation safely stop memory from growing unbounded.

I've reviewed the code and found a couple of areas where error handling logic can be adjusted to preserve error information and prevent an edge-case port leak.

1. 🟡 Use of Error('Error') hides actual error information


In your Error serialization logic (src/index.ts:88-113), if structuredClone(value) throws an error (e.g., if there happens to be an unclonable property attached to the error), the code catches the exception but doesn't return anything. This causes the IIFE to evaluate to undefined, which forces the fallback to a generic Error('Error').

Using this generic fallback entirely loses the original error's message, name, and stack trace, making debugging extremely difficult on the receiving end. You can improve this gracefully by constructing a fallback Error inside the catch block that preserves these essential details.

// src/index.ts:88-117
      const obj = (() => {
        if (value.cause) {
          const causeCopy = value.cause
          let serialized: ReturnType<typeof serializer> | undefined
          try {
            // We need to also serialize `Error.cause` recursively
            // Do it on a copy so that the original object isn't destructively modified
            serialized = serializer(value.cause, true)
            value.cause = serialized.data
            const copy = structuredClone(value)

            serialized.transferables.forEach(t => transferables.add(t))
            serialized.revokables.forEach(r => revokables.add(r))

            return copy
          } catch (e) {
            console.error('Error serializing error cause', e)
            serialized?.revokables.forEach(r => r.close())
            
            // Add a fallback that preserves the error's key details
            const fallback = new Error(value.message)
            fallback.name = value.name
            fallback.stack = value.stack
            fallback.cause = serialized ? serialized.data : String(value.cause)
            return fallback
          } finally {
            value.cause = causeCopy
          }
        } else {
          return value
        }
      })()
      const pos = verbatim.length

2. 🟡 Missing finally block in mc.port1.onmessage can cause leaked ports


In the function handler created within serializer (src/index.ts:133-157), you're rightly closing the sender port with ev.data[0].close(). However, this call is positioned at the end of the try block. If ev.data[0].postMessage([false]) throws an exception inside the nested catch (for instance, if the port channel gets unexpectedly dropped or closed elsewhere), execution will jump straight to the outer catch block, entirely skipping ev.data[0].close().

To guarantee that the port is securely shut down even when transmitting the error fails, execution should be moved to a finally block. (I also suggest renaming the inner caught error to err to avoid shadowing).

// src/index.ts:133-157
    if (typeof value === 'function' && !noFn) {
      const mc = new MessageChannel()
      mc.port1.onmessage = async (ev) => {
        try {
          try {
            const result = await value(...deserializer(ev.data[1]) as unknown[])
            const { data, transferables, revokables } = serializer(result)
            try {
              ev.data[0].postMessage([true, data], transferables)
            } catch (e) {
              revokables.forEach(port => port.close())
              throw e
            }
          } catch (e) {
            try {
              const { data, transferables } = serializer(e, true)
              ev.data[0].postMessage([false, data], transferables)
            } catch (err) {
              console.error('Error on onmessage handler trying to transmit error', err)
              // This postMessage could also throw, making the finally block crucial
              ev.data[0].postMessage([false])
            }
          }
        } catch (e) {
          console.error('Async error on onmessage handler', e)
        } finally {
          ev.data[0]?.close()
        }
      }

Review generated using google/gemini-3.1-pro-preview via OpenRouter. Comment /crush_fast {{openrouter, google/gemini-3.1-pro-preview, google/gemini-3-flash-preview}} to re-run.

devin-ai-integration[bot]

This comment was marked as resolved.

@corrideat
Member Author

corrideat commented Mar 6, 2026

/crush_fast {{openrouter, google/gemini-3.1-pro-preview, google/gemini-3-flash-preview}}


AI review started.

@github-actions

github-actions bot commented Mar 6, 2026

Advanced AI Review

  • Type: Agentic (crush)
  • Model: google/gemini-3.1-pro-preview

Here is a thorough review of the provided code changes, focusing on memory leaks, reliability, and code clarity. The issues have been ordered by importance from most to least critical.

1. 🔴 Memory leak if serialization throws an error


In src/index.ts:51-143, the serializer traverses the user's data recursively using JSON.stringify. As part of this traversal, whenever a function is encountered, a new MessageChannel proxy is eagerly created and added to the revokables set.

The critical bug occurs if JSON.stringify throws an error further down the tree (for instance, due to a throwing object getter, a circular reference, or a max call stack limit). If the traversal aborts by throwing, serializer catches nothing and simply bubbles the error up. However, the MessagePorts that were already created and added to revokables remain open. Because they have .onmessage listeners attached, the engine considers them "active," effectively leaking them in memory permanently.
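The abort-mid-traversal behavior is easy to demonstrate in isolation. In this standalone sketch (not code from this PR), the visited array stands in for the eagerly created ports:

```typescript
// Sketch: a throwing getter stops JSON.stringify partway through, after
// the replacer has already run (and, in the real serializer, already
// opened MessageChannel ports) for earlier keys.
const visited: string[] = []
const data = {
  a: 1,
  get b(): number {
    throw new Error('getter exploded')
  },
}

let thrown: unknown
try {
  JSON.stringify(data, (key, value) => {
    visited.push(key) // side effect per visited key, like opening a port
    return value
  })
} catch (e) {
  thrown = e
}

console.log((thrown as Error).message) // "getter exploded"
console.log(visited) // [ '', 'a' ] — work done before the abort is not undone
```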

Fix Idea: Place the JSON traversal inside a try...catch block to guarantee that any accumulated revokables get explicitly closed if the traversal fails prematurely.

export const serializer = (data: unknown, noFn?: boolean): {
    data: unknown,
    transferables: Transferables[],
    revokables: Revokables[]
} => {
  const rawResultSet = new WeakSet()
  const verbatim: Verbatim[] = []
  const transferables = new Set<Transferables>()
  const revokables = new Set<Revokables>()
  
  try {
    // JSON.parse and JSON.stringify are called for their ability to do a deep
    // clone and calling a reviver / replacer.
    const result = JSON.parse(JSON.stringify(data, (_key: string, value: unknown) => {
      // Return already processed values without modifications
      if (value && typeof value === 'object' && rawResultSet.has(value)) return value
      // ... [rest of the existing replacer logic]
      return value
    }), (_key: string, value: unknown) => {
      // ... [rest of the existing reviver logic]
    })

    return {
      data: result,
      transferables: Array.from(transferables),
      revokables: Array.from(revokables)
    }
  } catch (err) {
    // Prevent memory leaks if stringify aborts mid-traversal
    revokables.forEach(port => port.close())
    throw err
  }
}

2. 🟡 Error fallback implementation loses important Error details (cause, stack, name)


In src/index.ts:110-116, the implementation gracefully prevents crashes caused by un-clonable Error properties:

          } catch (e) {
            console.error('Error serializing error cause', e)
            serialized?.revokables.forEach(r => r.close())

            // Add a fallback that preserves the error's key details
            const fallback = new Error(value.message)
            return fallback
          } finally {

However, the fallback implementation throws away vital debugging information. Specifically, it drops value.name and the backtrace value.stack.

Furthermore, since serializer(value.cause, true) might have succeeded right before structuredClone(value) failed, there is no reason to completely drop the successfully serialized cause! Rather than closing the revokables and throwing away the cause, we should append the serialized cause to our fallback and appropriately merge its resulting transferables and revokables.

Fix Idea:

          } catch (e) {
            console.error('Error serializing error cause', e)

            // Add a fallback that preserves the error's key details
            const fallback = new Error(value.message)
            fallback.name = value.name
            fallback.stack = value.stack
            
            // If the cause was successfully serialized before it threw, carry it over
            if (serialized) {
              fallback.cause = serialized.data
              serialized.transferables.forEach(t => transferables.add(t))
              serialized.revokables.forEach(r => revokables.add(r))
            }
            return fallback
          } finally {
            value.cause = causeCopy
          }

3. ⚪️ Shadowed inner catch variable


In src/index.ts:98-103, you have a catch wrapper that shadows the outer exception variable e:

          } catch (e) {
            try {
              const { data, transferables } = serializer(e, true)
              ev.data[0].postMessage([false, data], transferables)
            } catch (e) {
              console.error('Error on onmessage handler trying to transmit error', e)
              ev.data[0].postMessage([false])
            }
          }

While this is functionally fine, the inner error masks the originally thrown exception's context and commonly triggers no-shadow linter warnings. It is much cleaner to rename the inner catch parameter.

Fix Idea:
Change the inner catch (e) to catch (transmitError).

4. ⚪️ Unnecessary try/catch wrapper for afterEach functionality check


In src/index.test.ts:4-13, the fallback checking for afterEach in Deno shims executes the function speculatively and suppresses TypeError upon invocation:

  // Deno doesn't seem to have implemented `afterEach`
  try {
    afterEach(() => {
      // Explicitly call the garbage collector on Node. This makes the tests
// ...
  } catch {
    // if afterEach fails, it's not a critical error
  }

Fix Idea:
Since you are importing afterEach directly, checking its type explicitly is simpler and more direct in environments where the imported definition may be stubbed as undefined.

  // Deno doesn't seem to have implemented `afterEach`
  if (typeof afterEach === 'function') {
    afterEach(() => {
      // Explicitly call the garbage collector on Node. This makes the tests
      // finish faster (otherwise, they hang until garbage collection occurs)
      if (typeof gc === 'function') {
        // eslint-disable-next-line no-undef
        gc()
      }
    })
  }

Review generated using google/gemini-3.1-pro-preview via OpenRouter. Comment /crush_fast {{openrouter, google/gemini-3.1-pro-preview, google/gemini-3-flash-preview}} to re-run.

devin-ai-integration[bot]

This comment was marked as resolved.

@taoeffect
Member

taoeffect commented Mar 6, 2026

/crush_fast {{openrouter, openai/gpt-5.4, google/gemini-3-flash-preview}}

Note: this model is new and expensive - it should only be run after the other models stop finding things. I'm running it here as you've indicated the PR is ready to merge.


AI review started.

@github-actions

github-actions bot commented Mar 6, 2026

Advanced AI Review

  • Type: Agentic (crush)
  • Model: openai/gpt-5.4

1. 🔴 FinalizationRegistry can close a still-live shared proxy port nondeterministically


src/index.ts:30-42, src/index.ts:263-304

const registry = new FinalizationRegistry((heldValue: MessagePort) => {
  heldValue.close()
})
...
return (fn: object, port: MessagePort) => {
  registry.register(fn, port)
}
const mp = value[2]
      return
    }
    cleanup()
  }

  mc.port1.onmessageerror = () => {
    clearTimeout(timer)
    cleanup(new Error('Message error'))
  }

  try {
    mp.postMessage([mc.port2, data], [mc.port2, ...transferables])
  } catch (e) {
    clearTimeout(timer)
    cleanup(e)
  }
})

5. ⚪️ The new “memory leak” test does not actually assert on memory or port cleanup


src/index.test.ts:32-53, package.json:22

it('should not leak memory', async () => {
  ...
  assert.ok(results.length === 4)
  assert.ok(results.some(([cb, str]) => cb === 'cb1' && str === 'foo'))
  ...
})
"test": "npm run lint && node --expose-gc ... src/index.test.ts"

This test only checks functional behavior. It will still pass if every call leaks ports. Adding --expose-gc and calling gc() in afterEach therefore increases test complexity without validating the leak fix.

A better test is to assert cleanup behavior directly, e.g. by spying on MessagePort.prototype.close or counting finalizer-driven closures in a controlled harness:

it('closes revokable ports after proxied calls settle', async () => {
  let closeCount = 0
  const originalClose = MessagePort.prototype.close
  MessagePort.prototype.close = function () {
    closeCount++
    return originalClose.call(this)
  }

  try {
    const serialized = _.serializer((cb: (fn: () => string) => Promise<void>) => cb(() => 'ok'))
    const fn = _.deserializer(serialized.data) as (cb: (fn: () => Promise<string>) => Promise<void>) => Promise<void>

    await fn(async getValue => { await getValue() })

    assert.ok(closeCount > 0)
  } finally {
    MessagePort.prototype.close = originalClose
  }
})

npm test is also currently blocked in this checkout because ESLint is missing (sh: 1: eslint: not found).

Note

I had to manually grab the review from the workflow console output. For some reason the review extraction step (that was added because Crush used to include a large preamble in front of some reviews) stripped out the entire review.


Review generated using openai/gpt-5.4 via OpenRouter. Comment /crush_fast {{openrouter, openai/gpt-5.4, google/gemini-3-flash-preview}} to re-run.

@corrideat
Member Author

It goes from 1 to 5, skipping 2, 3 & 4. Neither of the two issues seems too relevant.

  • Issue about closing ports non-deterministically: Without a specific issue, this sounds like things working as intended. The point of this PR is to ensure that ports are closed.
  • Issue about test not actually checking for leaks: Without hooking into Node internals, an indirect test is the best we can do to check the process terminates.

Member

@taoeffect taoeffect left a comment


Fantastic work @corrideat !

@taoeffect taoeffect merged commit 953daf7 into main Mar 6, 2026
3 checks passed
@corrideat corrideat deleted the 1-fix-memory-leak branch March 6, 2026 17:35
@taoeffect
Member

taoeffect commented Mar 7, 2026

@corrideat wrote:

It goes from 1 to 5, skipping 2, 3 & 4

Oops, this had to do with how copy/paste works in the Github Actions workflow. The UI doesn't actually render all of the lines, only the visible and near-visible ones.

I've recovered them. Was there anything important we missed?

Recovered review:

2. 🔴 Revokables leak when a proxied function throws an error whose cause contains function proxies


src/index.ts:163-166

const { data, transferables } = serializer(e, true)
ev.data[0].postMessage([false, data], transferables)

serializer(e, true) can still create revokables while serializing Error.cause, because the error branch recursively calls serializer(value.cause, true) and that recursion may preserve ports created deeper in the structure. When the postMessage above succeeds, those revokables are ignored and never closed, so repeated failing calls can still leak ports on the sender side.

Close them after postMessage, just like the caller path already does:

const { data, transferables, revokables } = serializer(e, true)
try {
  ev.data[0].postMessage([false, data], transferables)
} finally {
  revokables.forEach(port => port.close())
}

3. 🟡 Falsy Error.cause values are silently dropped


src/index.ts:96-106, src/index.ts:256-257

if (value.cause) {
if (value[2].cause) {
  value[2].cause = deserializer(value[2].cause)
}

This changes behavior for valid causes like 0, false, or ''. Those are real cause values, but the truthiness checks skip them entirely, so the round-trip is lossy.
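A quick standalone sketch of why truthiness is the wrong test here (the cause property is assigned manually so the sketch does not depend on ES2022 Error-options typings):

```typescript
// Sketch: 0, false, and '' are real `cause` values, but a truthiness
// check cannot tell them apart from "no cause at all".
const withFalsyCause: Error & { cause?: unknown } = new Error('op failed')
withFalsyCause.cause = 0 // legitimate, falsy cause

const withoutCause: Error & { cause?: unknown } = new Error('op failed')

console.log('cause' in withFalsyCause) // true  — existence check keeps it
console.log(Boolean(withFalsyCause.cause)) // false — truthiness check drops it
console.log('cause' in withoutCause) // false
```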

Use an existence check instead of truthiness:

if ('cause' in value) {
  const causeCopy = value.cause
  ...
}
if ('cause' in value[2]) {
  value[2].cause = deserializer(value[2].cause)
}

4. 🟡 A hung or broken remote call can still leak per-call ports indefinitely


src/index.ts:265-299

return new Promise((resolve, reject) => {
  const mc = new MessageChannel()
  const { data, transferables, revokables } = serializer(args)
  const rcvPort = mc.port1
  const sendingPort = mc.port2
  rcvPort.onmessage = (ev) => {
    ...
    rcvPort.close()
    revokables.forEach(port => port.close())
  }
  rcvPort.onmessageerror = () => {
    ...
    rcvPort.close()
    revokables.forEach(port => port.close())
  }
  try {
    mp.postMessage([sendingPort, data], [sendingPort, ...transferables])
  } catch (e) {
    rcvPort.close()
    revokables.forEach(port => port.close())
    reject(e)
  }
})

Cleanup only happens on response, messageerror, or synchronous postMessage failure. If the remote side never responds because the backing port was already closed, a handler crashes before replying, or a finalizer races, this leaves rcvPort and all nested revokables open forever. That is still a denial-of-service vector.

A simple improvement is to add a one-shot cleanup helper and optional timeout/abort path:

return new Promise((resolve, reject) => {
  const mc = new MessageChannel()
  const { data, transferables, revokables } = serializer(args)

  const cleanup = (error?: unknown) => {
    mc.port1.close()
    revokables.forEach(port => port.close())
    if (error !== undefined) reject(error)
  }

  const timer = setTimeout(() => {
    cleanup(new Error('Timed out waiting for proxied function response'))
  }, 30000)

  mc.port1.onmessage = (ev) => {
    clearTimeout(timer)
    try {
      resolve(ev.data[0] ? deserializer(ev.data[1]) : Promise.reject(deserializer(ev.data[1])))
    } catch (e) {
      cleanup(e)
      return
    }
    cleanup()
  }

  mc.port1.onmessageerror = () => {
    clearTimeout(timer)
    cleanup(new Error('Message error'))
  }

  try {
    mp.postMessage([mc.port2, data], [mc.port2, ...transferables])
  } catch (e) {
    clearTimeout(timer)
    cleanup(e)
  }
})



Development

Successfully merging this pull request may close these issues.

Fix memory leak
