This repository provides workflows for efficiently performing Hyperparameter Optimization (HPO) on HPC clusters. Designed for large-scale ML experiments, it leverages wandb sweep, Ray Tune and Optuna while integrating seamlessly with Slurm.
Repository: AaltoRSE/hpo-on-hpc