This section focuses on the parts of async JavaScript that break simplistic mental models.
Topics:
- Promise internals and resolution procedure
- Async/await desugaring and ordering
- Microtask starvation
- Backpressure
- Async iterators and stream-style pumping with cancellation
The goal is not syntax fluency; it is deterministic reasoning under scheduler pressure.
A promise is always in exactly one state:
- pending
- fulfilled (with a value)
- rejected (with a reason)
It transitions at most once from pending to settled.
Each .then(onFulfilled, onRejected) registers reactions.
Reactions run asynchronously in a microtask turn after settlement.
Important guarantee:
- Even when a promise is already fulfilled, .then(...) callbacks do not run synchronously (Zalgo avoidance).
When resolving with x, the Promise machinery must decide how to adopt x:
- If x is the same promise (self-resolution), reject with a TypeError.
- If x is an object or function with a callable then, treat it as a thenable: call then with resolve/reject wrappers.
- First call wins; later calls are ignored.
- If reading or calling then throws before settlement, reject.
This is why direct thenable.then(resolve, reject) without guards is unsafe.
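A guarded adoption helper can make the procedure concrete. The sketch below is simplified from the spec's resolution procedure (the self-resolution TypeError check is omitted, and resolve/reject stand for a promise's settling functions):

```js
'use strict';
// Sketch of guarded thenable adoption. Simplified: the
// self-resolution TypeError check is omitted.
function adopt(x, resolve, reject) {
  let called = false; // first call wins; later calls are ignored
  try {
    const then = x != null ? x.then : undefined; // read `then` exactly once
    if (typeof then === 'function') {
      then.call(
        x,
        (value) => { if (!called) { called = true; resolve(value); } },
        (reason) => { if (!called) { called = true; reject(reason); } }
      );
    } else {
      resolve(x); // not a thenable: fulfill directly
    }
  } catch (err) {
    // reading or calling `then` threw before settlement
    if (!called) { called = true; reject(err); }
  }
}
```

The guards neutralize misbehaving thenables that call both callbacks, call them twice, or throw from a `then` getter.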
Microtasks run after current JS stack and before timers/IO callbacks.
Zalgo avoidance means callbacks run predictably async:
```js
'use strict';
const p = Promise.resolve(1);
let sync = true;
p.then(() => {
  console.log(sync); // false
});
sync = false;
```

async function f() always returns a promise.
await expr is conceptually:
- evaluate expr
- convert to a promise (Promise.resolve(expr) semantics)
- suspend the function
- resume the continuation in a microtask when the promise settles
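Those steps can be sketched as a hand-desugaring (g is a hypothetical async helper, and the promise form is a conceptual sketch, not the exact spec algorithm):

```js
'use strict';
// async/await form:
async function f() {
  const v = await g();
  return v + 1;
}

// Roughly equivalent promise form: evaluate the operand synchronously,
// adopt it via Promise.resolve semantics, and resume the continuation
// in a microtask via .then.
function fDesugared() {
  return Promise.resolve(g()).then((v) => v + 1);
}
```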
```js
'use strict';
async function run() {
  try {
    const value = await mayReject();
    return value;
  } catch (err) {
    return 'recovered';
  }
}
```

catch handles a rejection from the awaited promise like a synchronous throw at the suspension point.
Sequential pattern:

```js
const a = await taskA();
const b = await taskB();
```

taskB starts only after taskA resolves.
Parallel pattern:

```js
const pa = taskA();
const pb = taskB();
const [a, b] = await Promise.all([pa, pb]);
```

Both tasks start immediately, before the await.
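The difference is observable with a hypothetical delay helper (delay, sequential, and parallel are illustration names, not library APIs):

```js
'use strict';
// delay(ms, value): a promise that fulfills with `value` after ~ms.
const delay = (ms, value) =>
  new Promise((resolve) => setTimeout(() => resolve(value), ms));

async function sequential() {
  const a = await delay(50, 'a'); // ~50ms
  const b = await delay(50, 'b'); // ~100ms total
  return [a, b];
}

async function parallel() {
  const pa = delay(50, 'a'); // both timers start now
  const pb = delay(50, 'b');
  return Promise.all([pa, pb]); // ~50ms total
}
```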
```js
'use strict';
console.log('sync-1');
Promise.resolve().then(() => console.log('promise-then'));
queueMicrotask(() => console.log('queueMicrotask'));
setTimeout(() => console.log('timeout'), 0);
setImmediate(() => console.log('immediate'));
console.log('sync-2');
```

Deterministic parts:
- sync-1 and sync-2 print first
- microtasks run before the timer/immediate callbacks
Relative ordering between setTimeout(0) and setImmediate can differ by phase/context; do not hardcode broad claims.
```js
'use strict';
async function demo() {
  console.log('A');
  await null;
  console.log('B');
}
demo();
Promise.resolve().then(() => console.log('C'));
console.log('D');
```

Output ordering:
- A and D print first (synchronous stack)
- then the microtasks: the relative order of B and C depends on enqueue order within this turn (engine/spec mechanics)

The safe statement: the await continuation is microtask-scheduled, never synchronous.
Microtasks include:
- Promise reactions (.then / .catch / .finally)
- queueMicrotask callbacks

Macrotask-like phases include timers, poll/IO callbacks, and check (setImmediate).
If code recursively schedules microtasks without yielding, it can starve timers/IO.
```js
'use strict';
let count = 0;
function loop() {
  count++;
  if (count < 1_000_000) queueMicrotask(loop);
}
loop();
setImmediate(() => console.log('late')); // may be delayed until the microtask chain ends
```

Symptoms:
- Timer/IO callbacks delayed while CPU active
- Event loop lag spikes
Mitigation pattern:
- Yield to the macrotask queue every N iterations (setImmediate or setTimeout(0)):

```js
if (i % yieldEvery === 0) {
  await new Promise((resolve) => setImmediate(resolve));
}
```

Backpressure is flow control: the producer should not outrun the consumer.
Without backpressure:
- buffers grow
- memory rises
- latency and GC pressure worsen
In writable-like flows, write(chunk) returning false means:
- the internal buffer is at/above the high water mark
- the producer must pause
- the producer resumes when 'drain' is emitted
Push-based:
- source emits regardless of downstream state
- must negotiate pause/resume or fail on overflow
Pull-based:
- consumer asks for next item
- naturally applies demand control
A disciplined design combines:
- bounded queues (highWaterMark)
- an explicit stop signal (push returns false)
- an awaitable drain before resuming
- cancellation via AbortSignal
for await...of consumes async iterables sequentially, with an implicit await on each next():

```js
for await (const chunk of iterable) {
  // process chunk
}
```

An async generator is the canonical pull-based source:

```js
async function* generate() {
  yield 1;
  yield 2;
}
```

Push source shape:
```js
const unsubscribe = subscribe((value) => {
  // called when the source emits
});
```

Bridging concerns:
- buffer management
- overflow policy
- cleanup in return()
For this module's exercises, the overflow policy is an explicit error:
- if the buffer is full and the source pushes another value, the iterator enters an errored state with Backpressure overflow
- the iterator unsubscribes exactly once
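Under those rules, a bridge might be sketched like this (toAsyncIterator, subscribe, and the highWaterMark option name are illustration assumptions; the source is assumed to emit asynchronously, after subscribe returns):

```js
'use strict';
// Sketch: bridge a push source (subscribe returns an unsubscribe
// function) to a pull-based async iterator with a bounded buffer.
function toAsyncIterator(subscribe, { highWaterMark = 16 } = {}) {
  const buffer = [];
  let wake = null;     // resolver for a pending next()
  let failure = null;  // set on overflow
  let finished = false;
  let unsubscribed = false;
  let unsubscribe = () => {};

  const stop = () => {
    if (!unsubscribed) {
      unsubscribed = true;
      unsubscribe(); // unsubscribe exactly once
    }
  };

  unsubscribe = subscribe((value) => {
    if (finished || failure) return;
    if (buffer.length >= highWaterMark) {
      failure = new Error('Backpressure overflow'); // explicit overflow policy
      stop();
    } else {
      buffer.push(value);
    }
    if (wake) { wake(); wake = null; }
  });

  return {
    async next() {
      for (;;) {
        if (failure) throw failure; // errored state
        if (buffer.length > 0) return { value: buffer.shift(), done: false };
        if (finished) return { value: undefined, done: true };
        await new Promise((resolve) => { wake = resolve; });
      }
    },
    async return() {
      finished = true;
      stop(); // cleanup in return()
      if (wake) { wake(); wake = null; }
      return { value: undefined, done: true };
    },
    [Symbol.asyncIterator]() { return this; },
  };
}
```

Breaking out of a for await loop calls return(), so unsubscription happens on both normal exit and early exit.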
Pump loop concerns:
- write backpressure (write returns false => await 'drain')
- cancellation (AbortSignal)
- cleanup (close())
Common misconceptions:
- "await makes code multithreaded."
- "Promise callbacks can run synchronously if already resolved."
- "Microtasks are always harmless and tiny."
- "If memory grows in an async pipeline, it is always a leak." (it could be temporary buffering/backpressure)
- "Promise.all runs tasks one by one." (it does not)
When asked about ordering:
- Separate synchronous stack, microtasks, and macrotasks/phases.
- Explain enqueue points (where continuation is scheduled).
- State which ordering guarantees are strict vs environment-dependent.
When asked about starvation:
- repeated microtasks can delay timers/IO
- mitigation is explicit yielding to macrotask queue
When asked about backpressure:
- describe the producer signal (a false return, pause, or overflow error)
- describe bounded buffering, the resume protocol, and cancellation
What not to claim:
- "await is parallel by default"
- "timer vs immediate ordering is globally fixed"
- "GC/backpressure details are identical in all JS runtimes"