This is a template repository for running Optuna optimization on Google Cloud Batch and GCS.
Requirements: `python>=3.11`, `docker`, `make`, `yq`, and `gcloud`.
- Create two service accounts, SA1 and SA2. SA1 is for the optimizer and SA2 is for the batch jobs. Grant `roles/storage.objectUser`, `roles/batch.jobsEditor`, `roles/artifactregistry.reader`, and `roles/logging.logWriter` to SA1, and `roles/storage.admin`, `roles/artifactregistry.reader`, `roles/batch.agentReporter`, and `roles/logging.logWriter` to SA2 (see the `gcloud` sketch after this list).
- Edit `optunabatch/custom.py` to define your objective function and study settings.
- Edit `config.yaml` to define your Cloud Batch settings. The service account for the job instances (SA2) should be specified in the `service_account` field (see the `yq` example after this list).
- Run `make build` to build the Docker images.
- Run `make push` to push the Docker images to the Artifact Registry.
- Create a GCE instance. You can use `make create-instance SERVICE_ACCOUNT={SA1}` to create a VM. The service account should have the `Storage Object User`, `Batch Job Editor`, and `Logs Writer` roles.
- Run `make update-container` to deploy the Docker images to GCP.
- Trials will be stored in the GCS bucket specified in `config.yaml`.
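For the service-account step above, a sketch of the corresponding `gcloud` commands; the project ID and account names below are placeholders, not part of this repository:

```bash
# Assumed placeholders: replace with your own project ID and account names.
PROJECT_ID=my-project
SA1=optuna-optimizer   # runs the optimizer container
SA2=optuna-batch-jobs  # attached to the Batch job instances

# Create the two service accounts.
gcloud iam service-accounts create "$SA1" --project="$PROJECT_ID"
gcloud iam service-accounts create "$SA2" --project="$PROJECT_ID"

# Roles for SA1 (optimizer).
for role in roles/storage.objectUser roles/batch.jobsEditor \
            roles/artifactregistry.reader roles/logging.logWriter; do
  gcloud projects add-iam-policy-binding "$PROJECT_ID" \
    --member="serviceAccount:${SA1}@${PROJECT_ID}.iam.gserviceaccount.com" \
    --role="$role"
done

# Roles for SA2 (batch jobs).
for role in roles/storage.admin roles/artifactregistry.reader \
            roles/batch.agentReporter roles/logging.logWriter; do
  gcloud projects add-iam-policy-binding "$PROJECT_ID" \
    --member="serviceAccount:${SA2}@${PROJECT_ID}.iam.gserviceaccount.com" \
    --role="$role"
done
```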
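For the `config.yaml` step, the SA2 email can be written into the `service_account` field with `yq`. This assumes `service_account` is a top-level key; check the template's actual layout before running it:

```bash
# Assumes service_account is a top-level key in config.yaml.
yq -i '.service_account = "optuna-batch-jobs@my-project.iam.gserviceaccount.com"' config.yaml
```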
The command `make run-optimizer` is for debugging in the local environment. Note that it uses credentials stored in `~/.config/gcloud/`.
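One way (an assumption, not the only option) to make those credentials available locally is to log in with Application Default Credentials before running the target:

```bash
# Writes Application Default Credentials under ~/.config/gcloud/.
gcloud auth application-default login
make run-optimizer
```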