Config Guide (incl. Integration from Parameters to Config)
Author: Michael Baumgartner (@mibaumgartner)
The current Delira master branch introduced a new `Config` class to interact with Delira and to store all kinds of settings for your experiments. This wiki page goes over the new functionality and shows how to use the new config.
Delira's `Parameters` was an object holding the settings needed for training and for your model. The parameters were split into four categories: `fixed model`, `fixed training`, `variable model` and `variable training`. The `model` and `training` parts specified the destination of the parameters, while `variable` parameters are intended for the hyperparameter search (coming soon :) ).
`DeliraConfig` is the new config class that holds all the information needed for your experiment, including parameters for data loading, augmentation and so on. It is located in `delira.utils`, or more specifically in `delira.utils.config.py`. Furthermore, all old functions were reimplemented to remove the `trixi` and `trixi-slim` dependency. This enables a completely backend-agnostic framework.
`DeliraConfig` works similarly to the old `Parameters` class with some slight modifications. You can save every setting of your parameters inside the config and use it to reproduce your experiment later (a timestamp and the current Delira version are saved when `dump` or `dumps` is called). Your model and training parameters from the old parameters object need to be located at `fixed_model`, `fixed_training`, `variable_model` and `variable_training`, respectively. Additionally, new setter and getter functions were implemented for easy access to these parameters:
Example for setting fixed parameters:
```python
from delira.utils import DeliraConfig
import torch
from sklearn.metrics import mean_absolute_error

config = DeliraConfig()
config.fixed_params = {
    "model": {
        "input_channels": 1,
        "output_channels": 2,
    },
    "training": {
        "n_epochs": 100,
    },
}
```
Equivalently, it is possible to set model parameters by using `model_params` and passing a dict-like object with `fixed` and `variable` keys. The same principles also apply to `training_params` and `variable_params`. The getter works the same way as the setter:
```python
# returns a config with model and training
print(config.fixed_params)
```
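The setter/getter pair above can be pictured as a property that splits a dict into `model` and `training` sub-sections and recombines them on read. The toy class below is only a sketch of that idea in plain Python, not Delira's actual implementation:

```python
# Toy illustration (NOT Delira's actual implementation) of a fixed_params
# property that distributes settings into "model" and "training" sections.
class ToyConfig:
    def __init__(self):
        self.fixed_model = {}
        self.fixed_training = {}

    @property
    def fixed_params(self):
        # getter: recombine both sub-sections into one dict
        return {"model": self.fixed_model, "training": self.fixed_training}

    @fixed_params.setter
    def fixed_params(self, params):
        # setter: distribute the "model" and "training" entries
        self.fixed_model.update(params.get("model", {}))
        self.fixed_training.update(params.get("training", {}))

config = ToyConfig()
config.fixed_params = {"model": {"input_channels": 1},
                       "training": {"n_epochs": 100}}
print(config.fixed_model["input_channels"])  # 1
```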
The main features of the old parameters class are still available.
Nested access with `'.'` inside keys (warning: only use string keys, because the type of the objects is lost when using this method):
```python
print(config['fixed_model.output_channels'])
```
All keys are accessible as attributes, too:
```python
print(config.fixed_model.input_channels)
```
Nested get still works like a charm:

```python
print(config.nested_get("n_epochs"))
```
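Conceptually, a nested get searches the whole hierarchy for the first occurrence of a key, regardless of depth. The function below is a plain-dict sketch of that idea, not Delira's actual implementation:

```python
# Sketch of the nested-get idea on plain dicts: return the first value
# found for `key` anywhere in the nested structure, else raise KeyError.
def nested_get(d, key):
    if key in d:
        return d[key]
    for value in d.values():
        if isinstance(value, dict):
            try:
                return nested_get(value, key)
            except KeyError:
                pass  # keep searching the remaining branches
    raise KeyError(key)

settings = {"fixed_training": {"n_epochs": 100}}
print(nested_get(settings, "n_epochs"))  # 100
```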
Update the config with dict-like objects:
```python
update_dict = {'fixed_training': {
    'new_param0': 0,
    'new_param1': 1,
}}
config.update(update_dict)

# update of a nested dict (overwrite is enabled because the keys already exist inside the config)
# if overwrite is not enabled and there are duplicate keys, a KeyError is raised
update_dict = {
    'new_param0': 10,
    'new_param1': 11,
}
config.fixed_training.update(update_dict, overwrite=True)
```
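The overwrite semantics described in the comments above can be sketched on plain dicts; this is only an illustration of the behaviour, not Delira's implementation:

```python
# Sketch of guarded updates: without overwrite, a duplicate key raises
# a KeyError instead of silently replacing the existing value.
def safe_update(target, updates, overwrite=False):
    for key, value in updates.items():
        if key in target and not overwrite:
            raise KeyError(f"duplicate key: {key}")
        target[key] = value

params = {"new_param0": 0}
safe_update(params, {"new_param0": 10}, overwrite=True)
print(params["new_param0"])  # 10
```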
- Adjust the new import path, e.g. `from delira.utils import DeliraConfig`
- Set the dict used to initialize the `Parameters` object as `fixed_params` inside the config.
Old API (parameters)
```python
params = Parameters(fixed_params={
    "model": {},
    "training": {
        "losses": {
            "L1": torch.nn.BCEWithLogitsLoss()},
        "optimizer_cls": torch.optim.Adam,
        "optimizer_params": {},
        "num_epochs": 2,
        "val_metrics": {"mae": mean_absolute_error},
        "lr_sched_cls": None,
        "lr_sched_params": {}}})
```
New API (config)
```python
config = DeliraConfig()
config.fixed_params = {
    "model": {},
    "training": {
        "losses": {
            "L1": torch.nn.BCEWithLogitsLoss()},
        "optimizer_cls": torch.optim.Adam,
        "optimizer_params": {},
        "num_epochs": 2,
        "val_metrics": {"mae": mean_absolute_error},
        "lr_sched_cls": None,
        "lr_sched_params": {}}
}
```
`in` is now supported for nested keys:

```python
print('training.num_epochs' in config)
```
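Under the hood, such a nested containment check can be pictured as walking the dotted key one level at a time. A plain-dict sketch of the idea (not Delira's actual `__contains__`):

```python
# Sketch: resolve a dotted key level by level; any missing level
# means the key is not contained.
def nested_contains(d, dotted_key):
    current = d
    for part in dotted_key.split('.'):
        if not isinstance(current, dict) or part not in current:
            return False
        current = current[part]
    return True

settings = {"training": {"num_epochs": 2}}
print(nested_contains(settings, "training.num_epochs"))  # True
```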
Save and load configs from files (the config can contain any object you want and includes support for almost all serialisation standards, including but not limited to `yaml`, `json` and `pickle`):
```python
# by default yaml is used
config.dump('.my_config')

# json
import json
config.dump('.my_config', formatter=json.dump, indent=4)
```
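The `formatter` argument above follows the `json.dump`-style calling convention `formatter(obj, file_handle, **kwargs)`. The hypothetical `dump_config` helper below sketches this pattern; it is not Delira's actual `dump` implementation:

```python
import json
import os
import tempfile

# Sketch of the formatter pattern: any callable with a json.dump-like
# signature can be plugged in to control serialisation.
def dump_config(settings, path, formatter=json.dump, **kwargs):
    with open(path, "w") as f:
        formatter(settings, f, **kwargs)

path = os.path.join(tempfile.mkdtemp(), "my_config.json")
dump_config({"n_epochs": 100}, path, formatter=json.dump, indent=4)
```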
Create configs from dicts, argument parsers, files or strings with `create_from_dict`, `create_from_argparse`, `create_from_file` and `create_from_str`, respectively.
You can easily log your config inside your logging environment with `log_as_string`.
Furthermore, it is now possible to decode arbitrary objects, which can be specified by you. Here is an example of a yaml file:
```yaml
my_array:
  __classargs__:
    module: "numpy"
    name: "ndarray"
    args: [[1, 2, 3]]
my_function:
  __functionargs__:
    module: "numpy"
    name: "min"
    kwargs: {"axis": (1, 2)}
```
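Such entries can be resolved with `importlib`: the `module`/`name` pair locates the class or function, and `args`/`kwargs` are applied on construction. The decoder below is a sketch of this convention (using `fractions.Fraction` from the stdlib instead of numpy so it stays self-contained), not Delira's actual implementation:

```python
import importlib

# Sketch of a decoder for the __classargs__ / __functionargs__ convention.
def decode_entry(entry):
    for marker in ("__classargs__", "__functionargs__"):
        if isinstance(entry, dict) and marker in entry:
            spec = entry[marker]
            obj = getattr(importlib.import_module(spec["module"]), spec["name"])
            if marker == "__classargs__":
                # instantiate the class with the stored args/kwargs
                return obj(*spec.get("args", ()), **spec.get("kwargs", {}))
            # for functions, return the resolved callable itself
            # (kwargs could be bound with functools.partial if desired)
            return obj
    return entry

entry = {"__classargs__": {"module": "fractions",
                           "name": "Fraction",
                           "args": [1, 3]}}
print(decode_entry(entry))  # 1/3
```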
Happy experimenting :D
If there are any questions left, feel free to contact us. The best way to do so is via our Slack community or by opening an issue in this repo.