
Add autoencoder configs and update defaults#328

Merged
sgreenbury merged 8 commits into main from 2026-04-16/ae
Apr 18, 2026

Conversation

@sgreenbury
Contributor

@sgreenbury sgreenbury commented Apr 17, 2026

Summary

Adds four dataset-specific autoencoder experiment configs under local_hydra/ and updates three defaults in src/autocast/configs/ to match the right choices for the runs we are currently doing across datasets.

New local experiment configs (local_hydra/local_experiment/ae/<dataset>/ae_dc_large.yaml)

  • advection_diffusion
  • conditioned_navier_stokes
  • gpe_laser_wake_only
  • gray_scott

Default changes in src/autocast/configs/ (documented inline)

| Config | Change | Restore previous behaviour |
| --- | --- | --- |
| `autoencoder.yaml` | `float32_matmul_precision: null` → `high` | `float32_matmul_precision=null` |
| `datamodule/gpe_laser_only_wake.yaml` | add `channel_idxs: [1, 2]` (real + imaginary parts of the wave function) | `~datamodule.channel_idxs` |
| `trainer/default.yaml` | add `best-val-{epoch}-{val_loss}` `ModelCheckpoint` (top-1, monitors `val_loss`) | `~trainer.callbacks.1` |

Each change has a short inline comment dated 2026-04-17 noting what changed and how to opt out. A TODO in trainer/default.yaml flags a follow-up to refactor callbacks into composable config groups so the best-val opt-out does not rely on a list index.
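For reference, a minimal sketch of what the new best-val checkpoint entry in trainer/default.yaml could look like, assuming Hydra's `_target_` instantiation of Lightning's `ModelCheckpoint`; the surrounding keys and list position are illustrative, not copied from the PR diff:

```yaml
callbacks:
  # ... existing callback at index 0 ...
  # 2026-04-17: added best-val checkpoint; opt out with ~trainer.callbacks.1
  - _target_: lightning.pytorch.callbacks.ModelCheckpoint
    filename: best-val-{epoch}-{val_loss}
    monitor: val_loss
    mode: min
    save_top_k: 1   # keep only the single best checkpoint by val_loss
```

Because the opt-out (`~trainer.callbacks.1`) addresses the callback by its position in this list, the TODO above proposes refactoring callbacks into composable config groups so they can be toggled by name instead.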

Note on dataset channel choices

The new channel_idxs: [1, 2] on gpe_laser_only_wake keeps the real and imaginary parts of the wavefunction (full state that the GPE simulator evolves).
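A hedged sketch of the corresponding addition to datamodule/gpe_laser_only_wake.yaml; the comment wording is illustrative, and the channel-index meanings are taken from the note above:

```yaml
# 2026-04-17: keep only the wavefunction channels (full state evolved by the
# GPE simulator); opt out with ~datamodule.channel_idxs
channel_idxs: [1, 2]  # 1 = real part, 2 = imaginary part of the wavefunction
```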

Test plan

  • CI passes
  • Spot-check that the four new `ae_dc_large.yaml` configs resolve via `uv run autocast ae --dry-run local_experiment=ae/<dataset>/ae_dc_large`

@sgreenbury sgreenbury changed the title 2026 04 16/ae Add autoencoder configs and update defaults Apr 17, 2026
@sgreenbury sgreenbury merged commit 20f633e into main Apr 18, 2026
3 checks passed
@sgreenbury sgreenbury deleted the 2026-04-16/ae branch April 18, 2026 06:48
