Commit ac14df1

Merge pull request #500 from datamol-io/new_banner_color_logo
Updated readme `graphium-train` to newer version
2 parents 45f5016 + 70133a4 commit ac14df1

2 files changed (+11 / -11 lines)


README.md

Lines changed: 3 additions & 3 deletions
````diff
@@ -79,16 +79,16 @@ If you are not familiar with [PyTorch](https://pytorch.org/docs) or [PyTorch-Lig
 ## Running an experiment
 We have setup Graphium with `hydra` for managing config files. To run an experiment go to the `expts/` folder. For example, to benchmark a GCN on the ToyMix dataset run
 ```bash
-graphium-train dataset=toymix model=gcn
+graphium-train architecture=toymix tasks=toymix training=toymix model=gcn
 ```
 To change parameters specific to this experiment like switching from `fp16` to `fp32` precision, you can either override them directly in the CLI via
 ```bash
-graphium-train dataset=toymix model=gcn trainer.trainer.precision=32
+graphium-train architecture=toymix tasks=toymix training=toymix model=gcn trainer.trainer.precision=32
 ```
 or change them permanently in the dedicated experiment config under `expts/hydra-configs/toymix_gcn.yaml`.
 Integrating `hydra` also allows you to quickly switch between accelerators. E.g., running
 ```bash
-graphium-train dataset=toymix model=gcn accelerator=gpu
+graphium-train architecture=toymix tasks=toymix training=toymix model=gcn accelerator=gpu
 ```
 automatically selects the correct configs to run the experiment on GPU.
 Finally, you can also run a fine-tuning loop:
````
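The updated command replaces the single `dataset=toymix` group with separate `architecture`, `tasks`, and `training` config groups. As a rough sketch of how these Hydra overrides compose on the command line (using only flags that already appear in the examples above), a GPU run at `fp32` precision could be launched as:

```bash
# Combine the ToyMix config groups with the accelerator and precision
# overrides shown in the diff above; no new flags are introduced here.
graphium-train architecture=toymix tasks=toymix training=toymix model=gcn \
    accelerator=gpu trainer.trainer.precision=32
```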

docs/cli/graphium-train.md

Lines changed: 8 additions & 8 deletions
````diff
@@ -1,28 +1,28 @@
 # `graphium-train`
 
-To support advanced configuration, Graphium uses [`hydra`](https://hydra.cc/) to manage and write config files. A limitation of `hydra`, is that it is designed to function as the main entrypoint for a CLI application and does not easily support subcommands. For that reason, we introduced the `graphium-train` command in addition to the [`graphium`](./graphium.md) command.
+To support advanced configuration, Graphium uses [`hydra`](https://hydra.cc/) to manage and write config files. A limitation of `hydra`, is that it is designed to function as the main entrypoint for a CLI application and does not easily support subcommands. For that reason, we introduced the `graphium-train` command in addition to the [`graphium`](./graphium.md) command.
 
 !!! info "Curious about the configs?"
     If you would like to learn more about the configs, please visit the docs [here](https://github.com/datamol-io/graphium/tree/main/expts/hydra-configs).
 
 This page documents `graphium-train`.
 
 ## Running an experiment
-To run an experiment go to the `expts/hydra-configs` folder for all available configurations. For example, to benchmark a GCN on the ToyMix dataset run
+We have setup Graphium with `hydra` for managing config files. To run an experiment go to the `expts/` folder. For example, to benchmark a GCN on the ToyMix dataset run
 ```bash
-graphium-train dataset=toymix model=gcn
+graphium-train architecture=toymix tasks=toymix training=toymix model=gcn
 ```
 To change parameters specific to this experiment like switching from `fp16` to `fp32` precision, you can either override them directly in the CLI via
 ```bash
-graphium-train dataset=toymix model=gcn trainer.trainer.precision=32
+graphium-train architecture=toymix tasks=toymix training=toymix model=gcn trainer.trainer.precision=32
 ```
-or change them permamently in the dedicated experiment config under `expts/hydra-configs/toymix_gcn.yaml`.
+or change them permanently in the dedicated experiment config under `expts/hydra-configs/toymix_gcn.yaml`.
 Integrating `hydra` also allows you to quickly switch between accelerators. E.g., running
 ```bash
-graphium-train dataset=toymix model=gcn accelerator=gpu
+graphium-train architecture=toymix tasks=toymix training=toymix model=gcn accelerator=gpu
 ```
 automatically selects the correct configs to run the experiment on GPU.
-Finally, you can also run a fine-tuning loop:
+Finally, you can also run a fine-tuning loop:
 ```bash
 graphium-train +finetuning=admet
 ```
@@ -50,5 +50,5 @@ graphium-train [...] datamodule.args.processed_graph_data_path=[path_to_cached_d
 ??? note "Config vs. Override"
     As with any configuration, note that `datamodule.args.processed_graph_data_path` can also be specified in the configs at `expts/hydra_configs/`.
 
-??? note "Featurization"
+??? note "Featurization"
     Every time the configs of `datamodule.args.featurization` change, you will need to run a new data preparation, which will automatically be saved in a separate directory that uses a hash unique to the configs.
````
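The second hunk's context line shows `datamodule.args.processed_graph_data_path` being set from the CLI to reuse a prepared dataset cache. A minimal sketch of such an invocation, assuming the ToyMix configs from above and a hypothetical cache directory:

```bash
# Point the datamodule at an existing processed-data cache; the directory
# below is a hypothetical placeholder, not a path shipped with Graphium.
graphium-train architecture=toymix tasks=toymix training=toymix model=gcn \
    datamodule.args.processed_graph_data_path=./cache/toymix
```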
