Commit 617356d

Merge pull request #414 from WenkelF/explore_finetuning
Finetuning pipeline
2 parents 9515ad6 + 6625e0c commit 617356d

53 files changed: +2165 additions, −836 deletions

README.md

Lines changed: 9 additions & 4 deletions
@@ -85,21 +85,26 @@ If you are not familiar with [PyTorch](https://pytorch.org/docs) or [PyTorch-Lig

## Running an experiment

We have set up Graphium with `hydra` for managing config files. To run an experiment, go to the `expts/` folder. For example, to benchmark a GCN on the ToyMix dataset, run

```bash
-python main_run_multitask.py dataset=toymix model=gcn
+graphium-train dataset=toymix model=gcn
```

To change parameters specific to this experiment, like switching from `fp16` to `fp32` precision, you can either override them directly in the CLI via

```bash
-python main_run_multitask.py dataset=toymix model=gcn trainer.trainer.precision=32
+graphium-train dataset=toymix model=gcn trainer.trainer.precision=32
```

or change them permanently in the dedicated experiment config under `expts/hydra-configs/toymix_gcn.yaml`.

Integrating `hydra` also allows you to quickly switch between accelerators. E.g., running

```bash
-python main_run_multitask.py dataset=toymix model=gcn accelerator=gpu
+graphium-train dataset=toymix model=gcn accelerator=gpu
```

automatically selects the correct configs to run the experiment on GPU.

+Finally, you can also run a fine-tuning loop:
+
+```bash
+graphium-train +finetuning=admet
+```
+

To use a config file you built from scratch, you can run

```bash
-python main_run_multitask.py --config-path [PATH] --config-name [CONFIG]
+graphium-train --config-path [PATH] --config-name [CONFIG]
```

Thanks to the modular nature of `hydra` you can reuse many of our config settings for your own experiments with Graphium.
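As an aside on how these `hydra` overrides compose, below is a minimal Python sketch using Hydra's Compose API to build the same configuration programmatically. The config directory and top-level config name (`expts/hydra-configs`, `main`) are assumptions for illustration only; `graphium-train` performs this wiring itself, and the actual config name in the repository may differ.

```python
# Minimal sketch, not part of this commit: composing the overrides shown above with
# Hydra's Compose API. Assumes Hydra >= 1.2 and a (hypothetical) top-level config
# named "main.yaml" inside expts/hydra-configs.
from hydra import compose, initialize
from omegaconf import OmegaConf

with initialize(version_base=None, config_path="expts/hydra-configs"):
    cfg = compose(
        config_name="main",
        overrides=[
            "dataset=toymix",
            "model=gcn",
            "accelerator=gpu",
            "trainer.trainer.precision=32",
        ],
    )

# Inspect the fully resolved configuration before launching a run.
print(OmegaConf.to_yaml(cfg))
```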

docs/cli_references.md

Lines changed: 2 additions & 1 deletion
@@ -5,4 +5,5 @@ This page provides documentation for our command line tools.

::: mkdocs-click
    :module: graphium.cli
    :command: main_cli
-   :command: data_cli
+   :style: table
+   :prog_name: graphium

docs/tutorials/model_training/running-multitask-ipu.ipynb

Lines changed: 8 additions & 1 deletion
@@ -420,7 +420,14 @@
     "logger.info(metrics)\n",
     "\n",
     "predictor = load_predictor(\n",
-    "    cfg, model_class, model_kwargs, metrics, accelerator_type, datamodule.task_norms\n",
+    "    cfg,\n",
+    "    model_class,\n",
+    "    model_kwargs,\n",
+    "    metrics,\n",
+    "    datamodule.get_task_levels(),\n",
+    "    accelerator_type,\n",
+    "    datamodule.featurization,\n",
+    "    datamodule.task_norms\n",
     ")\n",
     "logger.info(predictor.model)\n",
     "logger.info(ModelSummary(predictor, max_depth=4))"
