
Refactor loss structure (WIP) #305

Draft

sgreenbury wants to merge 4 commits into main from refactor-loss-structure

Conversation

@sgreenbury
Contributor

No description provided.

Split the loss into ambient and latent parts so that they can be composed
more easily; a sketch of how the two parts might combine is given below.
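
A minimal sketch of such a composition, assuming simple MSE terms; the function and tensor names are hypothetical, while ambient_loss_weight and latent_loss_weight correspond to the weights scheduled by the callback described further down.

```python
import torch
import torch.nn.functional as F


def composed_loss(
    ambient_pred: torch.Tensor,
    ambient_target: torch.Tensor,
    latent_pred: torch.Tensor,
    latent_target: torch.Tensor,
    ambient_loss_weight: float = 1.0,
    latent_loss_weight: float = 1.0,
) -> torch.Tensor:
    # Compute each part independently so either can be reused or reweighted on its own.
    ambient_loss = F.mse_loss(ambient_pred, ambient_target)
    latent_loss = F.mse_loss(latent_pred, latent_target)
    return ambient_loss_weight * ambient_loss + latent_loss_weight * latent_loss
```
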
Adds a latent_noise_injector parameter to EncoderProcessorDecoder that
applies noise in latent space (after encoding) rather than in ambient space.
This mirrors the existing pattern in ProcessorModel and enables latent
regularization independently of ambient data augmentation (see the sketch
after the list below).

- Add _apply_latent_noise method and wire it into forward() and _latent_loss()
- Update the ensemble _latent_loss to also apply latent noise
- Refactor _resolve_input_noise_injector into a generic _resolve_noise_injector
- Add latent_noise_injector config group and model config default
- Add tests for additive, concatenated, and None latent noise injection
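
As a rough sketch (not the PR's actual implementation), the wiring of _apply_latent_noise into forward() might look as follows; the constructor and class names beyond latent_noise_injector, _apply_latent_noise, and forward are illustrative assumptions.

```python
from typing import Callable, Optional

import torch
from torch import nn

# Assumed injector contract: map a latent tensor to a noised latent tensor.
NoiseInjector = Callable[[torch.Tensor], torch.Tensor]


class EncoderProcessorDecoderSketch(nn.Module):
    """Illustrative stand-in for EncoderProcessorDecoder with latent noise injection."""

    def __init__(
        self,
        encoder: nn.Module,
        processor: nn.Module,
        decoder: nn.Module,
        latent_noise_injector: Optional[NoiseInjector] = None,
    ) -> None:
        super().__init__()
        self.encoder = encoder
        self.processor = processor
        self.decoder = decoder
        self.latent_noise_injector = latent_noise_injector

    def _apply_latent_noise(self, latent: torch.Tensor) -> torch.Tensor:
        # Noise is applied after encoding, so it acts in latent rather than ambient
        # space; None leaves the latent untouched.
        if self.latent_noise_injector is None:
            return latent
        return self.latent_noise_injector(latent)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        latent = self.encoder(x)
        latent = self._apply_latent_noise(latent)
        latent = self.processor(latent)
        return self.decoder(latent)
```

An additive injector could be as simple as `lambda z: z + 0.1 * torch.randn_like(z)`, while a concatenated one would append noise channels and typically change the latent width.
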
Adds a Lightning Callback that dynamically adjusts ambient_loss_weight
and latent_loss_weight during training based on configurable schedules.
This enables curriculum-learning patterns such as training diffusion models
primarily with the latent loss and then gradually ramping in the ambient
loss (see the sketch after the list below).

- LossWeightScheduleCallback with step-based and epoch-based progress
- Built-in schedules: LinearRampSchedule, CosineSchedule, ConstantSchedule
- Handles encoder/decoder unfreezing when ambient weight becomes positive
- Checkpoint state_dict support for resume
- Comprehensive tests for schedules and callback behavior
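
A minimal sketch of the scheduling idea, assuming a Lightning module that exposes ambient_loss_weight and latent_loss_weight as plain attributes; the class bodies here are illustrative rather than the PR's code, and the unfreezing and checkpoint state_dict logic is omitted.

```python
import lightning.pytorch as pl


class LinearRampSchedule:
    """Linearly ramp a weight from `start` to `end` over `total_steps`."""

    def __init__(self, start: float, end: float, total_steps: int) -> None:
        self.start = start
        self.end = end
        self.total_steps = total_steps

    def __call__(self, step: int) -> float:
        progress = min(max(step / max(self.total_steps, 1), 0.0), 1.0)
        return self.start + (self.end - self.start) * progress


class LossWeightScheduleCallbackSketch(pl.Callback):
    """Set the module's loss weights from the schedules at each training step."""

    def __init__(self, ambient_schedule, latent_schedule) -> None:
        self.ambient_schedule = ambient_schedule
        self.latent_schedule = latent_schedule

    def on_train_batch_start(self, trainer, pl_module, batch, batch_idx) -> None:
        step = trainer.global_step
        pl_module.ambient_loss_weight = self.ambient_schedule(step)
        pl_module.latent_loss_weight = self.latent_schedule(step)


# Example curriculum: latent loss only at first, then ramp in the ambient loss.
callback = LossWeightScheduleCallbackSketch(
    ambient_schedule=LinearRampSchedule(start=0.0, end=1.0, total_steps=10_000),
    latent_schedule=LinearRampSchedule(start=1.0, end=1.0, total_steps=1),
)
```
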
