0.16.2 step 4: native wasm-gc record/replay in the playground #18
Merged
Driving feature of 0.16.2. Trace capture and replay used to bounce
through the VM-in-wasm32 bridge (`comp.aver_run_record` /
`comp.aver_replay_run`); the playground now compiles user source to
wasm-gc bytes via the existing `aver_compile_project` and drives
`--record` / `--replay` on the WebWorker, with effect outcomes
flowing through a JS-side mirror of `aver::replay::EffectReplayState`.
Layout:
- `tools/website/playground/replay_state.js` (new, ~225 lines).
`EffectReplayState` with Normal / Recording / Replaying modes,
per-effect args / outcome trace, structural-scope (`?!` / `!`)
group / branch / effect_occurrence tracking, `replayEffect` /
`recordEffect` / `ensureReplayConsumed` API. Shape and JSON
contract match the Rust EffectReplayState exactly so traces
cross between the playground and the CLI by file copy alone.
- `tools/website/playground/wasm_host.js` extended.
`AverBrowserHost` carries an `EffectReplayState` instance.
Every host import (Args, Console, Random, Time, Terminal, plus
the three structural-scope markers `record_enter_group` /
`record_set_branch` / `record_exit_group`) routes through a
shared `recordOrDispatch(name, args, realCall, decodeOutcome,
encodeOutcome)` helper:
- In Recording mode: real call runs, outcome is appended to the
trace via `encodeOutcome`.
- In Replay mode: trace outcome is decoded back into a wasm-gc
Val (via `decodeOutcome` + per-type factory exports like
`__rt_result_string_string_ok`, `__rt_option_string_some`,
`__rt_record_terminal_size_make`) instead of touching the
real I/O — same `try_replay` contract the CLI host enforces.
- In Normal mode: passthrough (zero overhead beyond the mode
check).
- `tools/website/playground/worker.js` gains `record` and `replay`
message handlers. `recordModule(wasmBytes, …)` calls
`host.recorder.startRecording()`, instantiates + runs the wasm-gc
module, posts the recorded trace back to the main thread.
`replayModule(wasmBytes, recording, checkArgs)` primes the
recorder with the recording's effects, runs the module, asserts
`ensureReplayConsumed`, posts replay progress + args-diff count
back. The pre-existing `run` message stays unchanged.
- `tools/website/playground/app.js` `doRecord` / `doReplay` rewritten.
Compiles via `comp.aver_compile_project` then post-messages the
WebWorker; awaits `record-finished` / `replay-finished` on a
Promise primed against `state.recordResolve` /
`state.replayResolve`. Replay summary string covers `matched` /
`prefix` / `diverge` and surfaces args-diff warnings when
non-zero. Per-fn `--expr` recording (`Record fn…`) still falls
back to the VM-in-wasm32 path until the compiler exposes a
wasm-bindgen entry-call wrapper — flagged in the code with a
comment.
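The Normal / Recording / Replaying dispatch of `recordOrDispatch` described above can be sketched as a plain state machine. This is a minimal illustration, not the playground code: `MiniReplayState` and JSON-valued outcomes are stand-ins for the real host types, so `decodeOutcome` / `encodeOutcome` collapse to the identity here.

```javascript
// Minimal sketch of the three-mode dispatch. Names (Mode,
// MiniReplayState) are illustrative, not the real playground API.
const Mode = { Normal: 0, Recording: 1, Replaying: 2 };

class MiniReplayState {
  constructor() { this.mode = Mode.Normal; this.trace = []; this.cursor = 0; }
  startRecording() { this.mode = Mode.Recording; this.trace = []; }
  startReplaying(trace) { this.mode = Mode.Replaying; this.trace = trace; this.cursor = 0; }

  recordOrDispatch(name, args, realCall) {
    switch (this.mode) {
      case Mode.Recording: {            // real call runs, outcome appended
        const outcome = realCall(...args);
        this.trace.push({ effect: name, args, outcome });
        return outcome;
      }
      case Mode.Replaying: {            // trace outcome stands in for real I/O
        const rec = this.trace[this.cursor++];
        if (!rec || rec.effect !== name) throw new Error(`trace diverged at ${name}`);
        return rec.outcome;
      }
      default:                          // Normal: passthrough
        return realCall(...args);
    }
  }
}

// Record one effect, then replay it without touching the "real" call.
const state = new MiniReplayState();
state.startRecording();
const recorded = state.recordOrDispatch('Random.int', [10], () => 7);
state.startReplaying(state.trace);
let realCalled = false;
const replayed = state.recordOrDispatch('Random.int', [10], () => { realCalled = true; return 99; });
```

The replay branch never invokes `realCall`, which is the `try_replay` contract: a replayed run must be byte-identical regardless of what the live environment would have returned.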
JS syntax sanity (`node --check`): clean across all four files.
End-to-end browser testing requires a live playground tab and is
deferred to the manual smoke pass post-merge.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Deploying with

| Status | Name | Latest Commit | Preview URL | Updated (UTC) |
|---|---|---|---|---|
| ✅ Deployment successful! View logs | aver | f7aba1e | Commit Preview URL | May 06 2026, 08:09 AM |
…replay

`aver_compile_project` returns a `Uint8Array` (wasm-bindgen `Vec<u8>`), and `worker.postMessage(msg, transferList)` only accepts `ArrayBuffer` items in the transfer list — passing the typed-array view itself raises `invalid transferable array for structured clone`. Pass `wasmBytes.buffer` instead in `doRecord` / `doReplay`. The existing `Run` callsite already does the right thing because its `wasmBytes` originates from `file.arrayBuffer()` and is an `ArrayBuffer` outright.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
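The view-vs-buffer distinction behind that fix can be shown in a few lines (the `worker.postMessage` call is quoted as a comment since no worker exists in this sketch):

```javascript
// A wasm-bindgen Vec<u8> arrives in JS as a Uint8Array view, e.g.:
const wasmBytes = new Uint8Array([0x00, 0x61, 0x73, 0x6d]); // "\0asm" header

// The typed-array view is NOT itself an ArrayBuffer…
const viewIsBuffer = wasmBytes instanceof ArrayBuffer;          // false
// …but its .buffer is, and that's what the transfer list wants:
const backingIsBuffer = wasmBytes.buffer instanceof ArrayBuffer; // true

// So the worker call becomes:
//   worker.postMessage({ type: 'record', wasmBytes }, [wasmBytes.buffer]);
// instead of the failing:
//   worker.postMessage({ type: 'record', wasmBytes }, [wasmBytes]);
```

After the transfer the original view is detached on the sending side, which is fine here: the worker owns the bytes for the rest of the run.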
…yground

Followup on user feedback: clicking Record on an interactive program (checkers, snake, tetris, …) was a one-way trip — the worker ran the game, the recording grew inside the worker, and there was no way for the user to end the session on their own terms. Older versions had a hard cap that gave up after N effects; the wasm-gc playground path didn't (rightly: caps are an arbitrary cliff), but that left infinite-loop programs without an exit.

UX target: Record (main) starts a real, playable game with the recorder live in the background. Stop (or any natural game end) finalises the trace.

Plumbing:
- `EffectReplayState.recordEffect` now returns the freshly-pushed record so callers can stream it. Worker-side: every effect lands on the main thread as a `trace-effect` postMessage right after `recordEffect` appends. The trace lives in two places — the worker's `recordedEffects` (full `record-finished` payload) and a main-thread mirror (`state.recordingBuffer`) that survives a `worker.terminate`.
- `doRecord` flips `runButton` / `stopButton` like Run does, sets up the live mirror + meta (program_file, module_root), and surfaces the per-effect counter in the status line: "Recording: N effect(s)… (click Stop to finish)". The user sees the trace grow in real time as the game plays.
- `stopRun` is now dual-purpose: when called mid-recording it builds a session JSON from the main-thread mirror (instead of going through `state.recordResolve`'s `record-finished` path), finalises the trace, and surfaces "Stopped recording at N effect(s)." For non-recording sessions the old "Run stopped." flow stays unchanged.
- `record-finished` (worker reaches end of `main` on its own) still works — it overrides the buffered finalisation with the worker's full record_finished payload, so a clean game-end produces an identical trace to a cap-style run.

Net result: programs without an exit (game loops, REPLs, infinite recursion-with-effects) can be recorded for as long as the user wants, then closed by clicking Stop. The trace is whole, the JSON shape is identical to a CLI `aver run --record`, and the UX finally matches what `Record` always promised.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
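The two-copies-of-the-trace flow above can be sketched with a plain callback standing in for the `trace-effect` postMessage; `makeRecorder` and the field names are illustrative, not the real worker API:

```javascript
// Worker side: recordEffect returns the pushed record so the caller
// can stream it to the main thread right away.
function makeRecorder(onEffect) {
  const recordedEffects = [];            // worker-side full trace
  return {
    recordedEffects,
    recordEffect(effect, args, outcome) {
      const rec = { effect, args, outcome };
      recordedEffects.push(rec);
      onEffect(rec);                     // stands in for a 'trace-effect' postMessage
      return rec;
    },
  };
}

// Main-thread side: the mirror grows one record per message and
// outlives a worker.terminate(), so Stop can always finalise.
const recordingBuffer = [];
const recorder = makeRecorder((rec) => recordingBuffer.push(rec));

recorder.recordEffect('Console.print', ['hi'], null);
recorder.recordEffect('Time.unixMs', [], 1700000000000);

// Simulate Stop after worker.terminate(): only the mirror is needed.
const finalised = JSON.stringify({ effects: recordingBuffer });
```

The cost of the duplication is one extra postMessage per effect; the benefit is that killing the worker mid-game loses nothing.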
…resize ship
Both `doRecord` and `doReplay` were creating workers raw (`new
Worker(...)` + manual `worker.onmessage`) and then immediately
dispatching the `record`/`replay` task. That skipped the two init
messages `spawnWorker` sends along with worker creation:
- `init-input` — ships the SharedArrayBuffer key + line queues so
`Terminal.readKey` can wait on the main thread's keypresses
(without it games never see a single key, just sit there).
- `resize` — reports the actual playground terminal cols/rows so
`terminal_size` reflects the rendered surface (without it
every game stays at 80×35 default and nothing draws).
Switching to `spawnWorker()` reuses the same init sequence Run uses;
the `record`/`replay` postMessage lands after init-input + resize in
the worker's onmessage queue, so by the time `instantiateAndCallEntry`
runs the host already has its key/line buffers and the right
terminal dimensions. Plus the worker's `setTerminalSize` call (via
the `resize` handler) keeps the game's redraw consistent with the
playground's real layout.
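The ordering guarantee being relied on — init messages posted before the task are handled before it — can be shown with a FIFO simulation (handler names are illustrative; a real worker's onmessage queue provides the same in-order delivery):

```javascript
// Messages posted to a worker are delivered in post order, so by the
// time the 'record' handler runs, init-input and resize already have.
const handled = [];
const handlers = {
  'init-input': () => handled.push('keys-ready'),
  'resize':     () => handled.push('size-set'),
  'record':     () => handled.push(`run(${handled.join('+')})`),
};

const queue = [];
const postMessage = (msg) => queue.push(msg);

postMessage({ type: 'init-input' });   // sent by spawnWorker
postMessage({ type: 'resize' });       // sent by spawnWorker
postMessage({ type: 'record' });       // the task, posted last

while (queue.length) handlers[queue.shift().type]();
```

Creating the worker raw and posting only `record` skips the first two entries, which is exactly the no-keys / default-80×35 failure mode described above.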
End-to-end: clicking Record on a TUI program now lights up the
terminal surface immediately, accepts keypresses, and the live
"Recording: N effect(s)…" counter ticks as the game plays. Stop
finalises from the main-thread mirror as before.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Trace records used to stamp `caller_fn: "main"` on every effect
captured under the wasm-gc backend regardless of which Aver fn
actually emitted the call — wasm-gc executes natively and the host
had no way to recover the originating fn (VM and self-host get it
from their interpreter stack frames). The playground recording
panel surfaced this as a column of identical "in main" labels even
when effects flowed across helper fns, which was the immediate
trigger here.
Fix: every host import declares a trailing `caller_fn: any_ref`
String-ref param. The codegen pushes the current fn name as a
String literal right before each `Call(effect_idx)`, the host
extracts it once at the top of `dispatch_aver_import`, slices it
off the param list, and threads it through per-namespace dispatch
into `record_effect_if_recording`. The recorder stamps the value
into the EffectRecord and the playground panel finally shows
real per-fn labels.
Wiring:
- `effects.rs::params` wraps each namespace's existing param list
with a trailing `any_ref_ty()`. New `params_without_caller`
carries the old logic; the public `params` is a thin shim that
appends the caller-fn slot.
- `types.rs::build` pre-registers every `fn def`'s name as a
passive String literal segment so `emit_string_literal_bytes`
has an idx to reach for, and force-allocates
`string_array_type_idx` whenever the program has any fn defs
(programs without a single user String literal — e.g.
`fn main() -> Int { _ = Time.unixMs(); 42 }` — used to skip
the slot, which broke caller-fn emission).
- `body/builtins.rs` effect emit now pushes the current fn name
via `emit_string_literal_bytes` before `Call(idx)`. The
`?!` / `!` group markers in `body/emit.rs::emit_group_call` /
`emit_branch_marker` do the same so their signatures still
match the trailing-arg ABI (the host ignores it for group
ops).
- `imports.rs::dispatch_aver_import` decodes `params.last()` via
the LM transport and forwards both the sliced real params and
the caller_fn through the per-namespace dispatch chain. Each
per-namespace `dispatch(name, …, caller_fn: &str)` now
passes the caller_fn to every `record_effect_if_recording`
call inside its arms.
- `replay_glue.rs::record_effect_if_recording` takes `caller_fn`
as an explicit argument (no host-state, no thread-locals) and
forwards it to `EffectReplayState::record_effect`.
- Playground side mirrors the same shape: `wasm_host.js`
callbacks all take a trailing `callerRef` arg and forward
`this.averToJs(callerRef)` as the 6th arg to
`recordOrDispatch`. Pure imports (`float_*`) and the group
markers ignore it. JS `EffectReplayState.recordEffect`
accepts the value through its existing `callerFn` parameter.
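The JS side of the trailing-arg ABI can be sketched as a wrapper that slices `callerRef` off before the real handler runs. `averToJs` here is a stand-in for the host's stringref decoding, and `makeImport` is illustrative, not the `wasm_host.js` API:

```javascript
// Every effect import receives the caller fn name as its LAST argument;
// the shared dispatch peels it off so the real call sees only its params.
const averToJs = (ref) => String(ref);   // real host decodes a wasm-gc string ref

function makeImport(name, realCall, record) {
  return (...rawArgs) => {
    const callerRef = rawArgs[rawArgs.length - 1]; // trailing caller_fn slot
    const args = rawArgs.slice(0, -1);             // real params only
    const outcome = realCall(...args);
    record({ effect: name, args, caller_fn: averToJs(callerRef) });
    return outcome;
  };
}

const trace = [];
const print = makeImport('Console.print', (s) => s.length, (r) => trace.push(r));

// Codegen pushes the current fn name as the final argument:
const outcome = print('hello', 'helper');
```

Pure imports and the group markers get the same trailing slot for signature uniformity and simply ignore it, matching the host-side behaviour described above.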
Verified on the CLI:
fn helper(n: Int) -> Int ! [Console.print] ...
fn main() -> Unit ! [Console.print, helper] ...
`aver run --wasm-gc --record …` produces:
"caller_fn": "helper"
"caller_fn": "main"
(was "main" / "main" before this change). 10/10 record-replay
smokes still pass.
ABI break: 0.16.2 wasm-gc binaries pass an extra String arg to
every effect import. Programs compiled under 0.16.1 won't
instantiate against the 0.16.2 host — a release-time concern,
documented in the CHANGELOG entry for this release.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
CI Clippy (wasm) flagged the two `if let Err(_) = …` guards added around the new caller_fn string-literal emit in `emit_group_call` and `emit_branch_marker`. Switch to `.is_err()` per the lint hint. Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Rework caller_fn delivery so the hot path stops paying an
`array.new_data` per effect call site.
Before this change every effect import got a fresh
`(array i8)` materialised at the call site:
i32.const 0
i32.const N
array.new_data $string $segment
call $effect
That meant one GC alloc per effect call plus ~6 bytes of
emit per site, for a value that's identical for every call
inside the same fn.
Now: each fn that emits caller_fn (a dotted call or an `?!`/
`!` independent product) gets one immutable-ish
`(mut (ref null $string))` global. A wasm-level start fn
runs once at instantiation and `array.new_data → global.set`s
each one. The hot path collapses to:
global.get $caller_fn_<idx>
call $effect
Three bytes per site, zero alloc per effect call, O(N_fns)
allocs total at instantiation.
Selectivity: only fns whose body contains a dotted call or
an independent product get a global. Pure fns and plain
forwarders that only call other user fns don't allocate a
slot — the new walker `fn_body_emits_effect_call` in
`types.rs` runs the same shape as `fn_body_calls_int_mod`
and is conservative on dotted (counts builtin namespace
calls like `List.length` too — false-positive cost is one
segment + one global, false-negative would crash wasm
validation).
Wasm-level start vs `_start` export: the host invokes
`main` directly when both are exported (see `run_wasm_gc.rs`
~L347), bypassing `_start`. Putting init in `_start` left
globals null and `lm_string_to_host` decoded
`Val::AnyRef(None)` → fallback "main" stamped on every
event. Wasm-level start runs at instantiation regardless of
which export the host calls, which is what we need.
Verified with `examples/core/effects_explicit.av`:
- before: caller_fn = "main" / "main"
- after: caller_fn = "greet" / "farewell"
10/10 record-replay smokes pass.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
CI Format step rejected the inline `StartSection { function_index: idx }`
in `module.rs::module.section(&...)`.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Documents the per-fn label feature for the release notes — covers the user-visible benefit (real per-fn labels in playground trace panel + CLI dumps), the codegen mechanism (shared globals + wasm-level start init), and explicitly mentions the rejected per-call alloc approach so the hot path trade-off is in the record. Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Playground per-fn recordings (`-e 'fn(args)'`) used to dive
through the VM-in-wasm32 bridge — `aver_run_record_entry`
compiled to wasm32 and ran the program under the embedded
VM, which gave ~10× slower exec than native wasm-gc plus
all the gc-tracing it carries with it. The motivating user
case (CLI `aver run --wasm-gc -e 'helper(7)'` with a real
caller_fn label) had no playground equivalent.
Solution suggested by the user, which turns out to land
*more* code than it deletes: instead of writing a JS-side
arg encoder that mirrors `decode::encode_entry_args_for_wasm_gc`
(per-Aver-type Value → wasmtime::Val mapping, painful for
compound shapes — Lists, Tuples, Variants, Records), the
compiler synthesises a no-arg wrapper fn whose body calls
the user expression with literal args:
fn __entry__() -> ReturnT
! [target_fn]
target_fn(7, 35)
Each `Value` arg lowers to its matching `Expr::Literal`;
type-checker, resolver, and codegen treat the synth fn
identically to a hand-written one. The wasm-gc `_start`
synthesis prefers `__entry__` over `main` when both are
present, so `instance.exports._start()` runs the user
expression via the same path `main` recordings use — no
new worker message shape, no JS-side encoding, single
source of truth in the codegen.
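The wrapper-synthesis idea can be illustrated in source-text terms (the real compiler builds an AST `FnDef` in Rust, not a string; `valueToLiteral` / `synthEntry` are hypothetical names and this only shows the shape):

```javascript
// Lower each already-parsed literal arg back to Aver source.
function valueToLiteral(v) {
  switch (typeof v) {
    case 'number':  return String(v);              // Int / Float
    case 'boolean': return v ? 'true' : 'false';
    case 'string':  return JSON.stringify(v);      // Str, with escaping
    default: throw new Error('compound --expr literals are a follow-up');
  }
}

// Wrap the target call in a no-arg entry fn carrying the target's effects.
function synthEntry(targetFn, args, returnType) {
  const lits = args.map(valueToLiteral).join(', ');
  return [
    `fn __entry__() -> ${returnType}`,
    `  ! [${targetFn}]`,
    `  ${targetFn}(${lits})`,
  ].join('\n');
}

const src = synthEntry('target_fn', [7, 35], 'Int');
```

Because the wrapper is ordinary program text as far as the pipeline is concerned, type checking, effect checking, and codegen need no special cases — which is the "lands more code than it deletes but keeps one source of truth" trade-off described above.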
Wiring:
- `src/playground.rs::compile_project_to_wasm_with_entry`
parses the expression with `replay::parse_entry_call`,
scans `entry_items` and `loaded` deps for the target's
signature (return_type + effects), builds the synthetic
FnDef, runs the standard pipeline. Returns
`(wasm_bytes, target_fn_name)` so the JS host can label
the recording with the user-facing name.
- `value_to_literal_expr` covers Int / Float / Bool / Str /
Unit. Compound shapes fail with a clear message —
`--expr` literal grammar already restricts to those,
but compound `--expr` extensions (List, Tuple, Variant
literals from `parse_entry_call`) are a follow-up if
user code hits them.
- `src/codegen/wasm_gc/module.rs::main_idx` first checks
for `__entry__`, falls back to `main`. `_start` is the
thin `call entry; drop?` shim it always was; no other
changes to the entry pipeline.
- wasm-bindgen exposes `aver_compile_project_with_entry`
(returns wasm bytes) and `aver_parse_entry_target`
(returns the target fn name for trace labelling) — JS
picks both up, posts to the existing `record` worker
message with `entryLabel` so `recording.entry_fn`
reflects the target.
- `tools/website/playground/app.js::doRecord` collapses
the entry-expr branch into the same compile + spawnWorker
path the `main` flow uses; the `aver_run_record_entry`
call (VM-in-wasm32) is gone.
- `tools/website/playground/worker.js::recordModule`
threads `entryLabel` into the recording metadata; `app.js
::stopRun` mirrors it from `state.recordingMeta.entry_fn`
for mid-recording stops.
CHANGELOG bullet rewritten — the previous "still falls
back to VM-in-wasm32" caveat is gone.
10/10 record-replay smokes still pass. Playground rebuild
clean.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Two fixes bundled:
- `cargo fmt` clean: long signature on `value_to_literal_expr`, body shape of `lookup_fn_signature`'s `scan` closure, and the one-liner under `aver_compile_project_with_entry` all get reformatted to whatever the CI rustfmt wants.
- Playground Download wasm now mirrors the Run / Record / Replay entry shape. If `state.lastEntryExpr` is set the download is the same synth-`__entry__` build that recording captured; otherwise it's the regular `main`-entry binary. Without this, recording `add(7, 35)` and downloading the wasm gave the user a binary whose `_start` pointed at `main` while the trace's `entry_fn` said `add` — the two stopped matching mid-flow.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Misread the question — user asked which target the playground's Download button produces (answer: wasm-gc, via `compile_to_wasm_gc`), not a behavioural request. Restore the plain `aver_compile_project` / `aver_compile` chain so Download keeps producing the regular `main`-entry binary regardless of whether the user is mid-`-e` session. Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
The browser host has been wasm-gc-only since 0.16.0 (engine GC,
no `aver_runtime.wasm` sidecar to fetch and cache, factory
exports for structured effect returns) — the legacy NaN-boxed
emitter was only reachable through `aver_runtime_wasm` /
`build_aver_runtime_wasm`, neither of which any playground
JS / HTML calls anymore. ~9.4 kLoC of `codegen::wasm` plus the
embedded runtime blob were riding along inside `aver_bg.wasm`
for nothing.
Split: new `wasm-legacy` Cargo feature gates `pub mod wasm;`
in `codegen/mod.rs`. The CLI `wasm` feature pulls it in so
`aver compile --target wasm`, `aver compile --bridge {wasip1,
fetch}`, and `aver wasm-runtime` keep building. The
`playground` feature does not — both `compile_to_wasm` and
`aver_compile_project*` only reach into `codegen::wasm_gc`,
and the now-dead `aver_runtime_wasm` binding is gone with
its `build_aver_runtime_wasm` host wrapper.
Net: `aver_bg.wasm` 4811 KiB → 4646 KiB after `wasm-opt -Oz`
(-165 KiB on the first-load path that ships to every
playground visitor).
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Two related fixes after the Args.get inline path tripped wasm validation on checkers.av:
- `body/builtins.rs::emit_args_get_inline` was emitting raw `Call(args_len_idx)` and `Call(args_get_idx)` without first pushing the caller_fn String ref the new ABI requires. Validation failed with `expected anyref but nothing on stack` at the inline path's first call site (offset 0x350e on checkers — discovered by reading the dumped invalid bytes from `/tmp/aver_wasm_gc_invalid.wasm`). Inserted `super::emit::emit_caller_fn_global(...)?` ahead of both `Call`s. Args.get is the only inline path that calls effect imports directly today; the regular `FnCall` lowering already pushes caller_fn at every effect call site.
- `types.rs::fn_body_emits_effect_call` walker is restored after the previous "alloc a global per fn def" regression. Predicate loosened to flag any `FnCall(Attr(_, _), …)` callee — covers nested-module shape `Module.Sub.fn(args)` too — instead of the earlier `Attr(Ident(_), _)` shape that false-negatived two-level qualified calls. Pure fns and forwarders still get skipped.

Game wasm sizes drop back closer to baseline (full → selective):
- snake: 4896 → 4506 B
- checkers: 26315 → 23969 B
- rogue: 31800 → 28666 B
- doom: 25096 → 23197 B

10/10 record-replay smokes pass; all 7 prebuilt games compile + validate under the fresh wasm-gc compiler.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
- `cargo fmt` reformat: trailing blank line in `playground.rs`, `types.rs::fn_body_emits_effect_call` doc shape.
- `clippy::doc-lazy-continuation` triggered on the walker's multi-line doc — collapsed the "false positives ... false negatives ..." sentence into a single line so the lint stops reading it as an unindented paragraph break.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Summary
Driving feature of 0.16.2 — last piece. Playground trace capture / replay used to bounce through the VM-in-wasm32 bridge (`comp.aver_run_record` / `comp.aver_replay_run`); now compiles user source to wasm-gc bytes via the existing `aver_compile_project` and drives `--record` / `--replay` natively on the WebWorker. Effect outcomes flow through a JS-side mirror of `aver::replay::EffectReplayState`.

Trace JSON is byte-compatible with `aver run --record` (any backend), so a playground-recorded `.replay.json` replays under the CLI replayer and vice versa. Independent-product (`?!`) markers from Step 2 are wired in the JS-side recorder via the new `record_enter_group` / `record_set_branch` / `record_exit_group` host imports the wasm-gc compiler emits.

Layout
- `replay_state.js` (new) — `EffectReplayState`: Normal / Recording / Replay modes, group / branch tracking, `replayEffect` / `recordEffect` / `ensureReplayConsumed`
- `wasm_host.js` — `AverBrowserHost` extended; each host import routes through `recordOrDispatch(name, args, realCall, decodeOutcome, encodeOutcome)`
- `worker.js` — `record` / `replay` message handlers
- `app.js` — `doRecord` / `doReplay` rewritten to compile + post to worker
- `CHANGELOG.md`

Out of scope (deferred to a follow-up)
- `main` returns need a compiler-injected `__rt_main_to_lm_json` per main return type. Until then `recording.output` stays `null` on the playground side and MATCH is determined by effect-sequence + outcomes (same as the wasm-gc CLI before phase 4d).
- `Record fn…` (per-`--expr` entry). Still goes through the VM-in-wasm32 path until the compiler exposes a wasm-bindgen entry-call wrapper. Flagged in `doRecord` with an explicit fallback branch.

Test plan
- JS syntax (`node --check`) clean across all four files
- `## 0.16.2 (unreleased)`
- `examples/services/console_demo.av` in the playground, download `.replay.json`, replay through `aver replay --wasm-gc`

0.16.2 status after this merge
- `imports.rs` per-namespace
- `?!`
- cross-backend

Ready for `release.py 0.16.2` after merge.

🤖 Generated with Claude Code