The ``examples`` directory `on the NetPyNE github <https://github.com/suny-downstate-medical-center/netpyne/tree/batch/netpyne/batchtools/examples/rosenbrock>`_ contains multiple methods of performing an automatic parameter search of a 2-dimensional Rosenbrock function. These examples are meant to quickly demonstrate some of the functionality of batch communications, rather than the full process of running parameter searches on a detailed NEURON simulation (see 7. Performing parameter optimization searches (CA3 example)), and therefore only contain a ``batch.py`` file with the script detailing the parameter space and search method, and a ``rosenbrock.py`` file with the function to explore and the appropriate declarations and calls for batch automation and communication (rather than the traditional ``cfg.py``, ``netParams.py``, and ``init.py`` files).
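For reference, the 2-dimensional Rosenbrock function explored in these examples has the standard form ``f(x, y) = (1 - x)^2 + 100 * (y - x^2)^2``, with a global minimum of ``0`` at ``(1, 1)``. A minimal sketch (independent of the ``rosenbrock.py`` in the repository):

.. code-block:: python

    def rosenbrock(x, y):
        """2-dimensional Rosenbrock function; global minimum of 0 at (x, y) = (1, 1)."""
        return (1 - x) ** 2 + 100 * (y - x ** 2) ** 2

Because the minimum sits in a long, narrow curved valley, this function is a common stress test for parameter-search algorithms.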
The ``examples`` directory `on the NetPyNE github <https://github.com/suny-downstate-medical-center/netpyne/tree/batch/netpyne/batchtools/examples>`_ shows both a ``grid``-based search and an ``optuna``-based optimization.
In the ``CA3`` example, we tune the ``PYR->BC`` ``NMDA`` and ``AMPA`` synaptic weights, as well as the ``BC->PYR`` ``GABA`` synaptic weight.
For the optuna-based parameter optimization, the upper and lower bounds of the search space are defined in ``optuna_search.py`` as:
.. code-block:: python

    ...
    'gaba.BC->PYR' : [0.4e-3, 1.0e-3],
    }
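The format above maps each parameter name to a ``[lower, upper]`` pair, and the optimizer proposes candidate values inside those bounds. As a pure-Python illustration of what such bounds imply (a sketch only; the actual sampling is handled by the search backend, and only the ``GABA`` bound is taken from the example above):

.. code-block:: python

    import random

    # [lower, upper] bounds, as in optuna_search.py (only the GABA entry shown above)
    bounds = {'gaba.BC->PYR': [0.4e-3, 1.0e-3]}

    def sample(bounds, rng=random):
        """Draw one candidate configuration uniformly from the given bounds."""
        return {name: rng.uniform(lo, hi) for name, (lo, hi) in bounds.items()}

    candidate = sample(bounds)
    assert all(lo <= candidate[k] <= hi for k, (lo, hi) in bounds.items())

Each candidate configuration would then be handed to a simulation run, whose reported score steers the next proposal.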
In contrast, for the grid-based parameter explorations, the ``3x3x3`` specific values to search over are defined in ``grid_search.py`` as:
.. code-block:: python

    ...
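A grid search enumerates every combination of the listed values; with three candidate values for each of the three weights, that yields 3 × 3 × 3 = 27 configurations. A sketch of that expansion (the parameter values below are illustrative placeholders, not the ones in ``grid_search.py``):

.. code-block:: python

    from itertools import product

    # Illustrative placeholder values: three candidates per parameter
    grid = {
        'nmda.PYR->BC': [1.0e-3, 1.4e-3, 1.8e-3],
        'ampa.PYR->BC': [0.2e-3, 0.35e-3, 0.5e-3],
        'gaba.BC->PYR': [0.4e-3, 0.7e-3, 1.0e-3],
    }

    # Expand into one configuration dict per simulation run
    configs = [dict(zip(grid, values)) for values in product(*grid.values())]
    assert len(configs) == 27  # 3 x 3 x 3

The search backend performs this expansion internally; the sketch only shows why the example is described as a ``3x3x3`` search.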
Note that the ``metric`` sets a specific ``string`` (``loss``) to report and optimize around. This value is generated and sent by the ``init.py`` simulation:
.. code-block:: python

    ...
9. Ray Checkpointing and Resuming Interrupted Searches
A new feature in this beta release is the checkpointing and saving of search progress via the ``ray`` backend. This data is saved in the ``checkpoint_path`` directory specified in the ``search`` function (which defaults to a newly created ``checkpoint`` folder within the source directory), and the default behavior of ``search`` is to automatically attempt a restore if the batch job is interrupted.
Upon successful completion of the search, the default behavior is to delete these checkpoint files. If the user manually ends the search due to a coding error and wishes to restart it, the ``checkpoint_path`` directory should be deleted first.
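A sketch of how the checkpoint directory might be passed (only ``checkpoint_path`` is taken from the description above; the remaining arguments are elided rather than guessed at, as they depend on the full ``search`` signature):

.. code-block:: python

    # Hypothetical call; only checkpoint_path is taken from the text above.
    search(
        ...,                             # job/search configuration (elided)
        checkpoint_path='./checkpoint',  # progress saved here; restored on interruption
    )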
10. Parameter Importance Evaluation Using fANOVA (unstable)
Another new feature in this beta release is the ability to evaluate parameter importance using a functional-ANOVA-inspired algorithm via the ``Optuna`` and ``scikit-learn`` libraries.
(See `the original Hutter paper <http://proceedings.mlr.press/v32/hutter14.pdf>`_ and its `citation <https://automl.github.io/fanova/cite.html>`_)
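As a rough, pure-Python illustration of what such an importance score captures (this is not the ``Analyzer`` API, which wraps an fANOVA-style estimator): the main effect of a parameter can be estimated as the variance, across its candidate values, of the mean score at each value.

.. code-block:: python

    from statistics import mean, pvariance

    def main_effect(rows, param, score='loss'):
        """Variance of the per-value mean score for one parameter (fANOVA-style main effect)."""
        groups = {}
        for row in rows:
            groups.setdefault(row[param], []).append(row[score])
        return pvariance([mean(v) for v in groups.values()])

    # Toy grid results: parameter 'a' drives the loss, 'b' does not
    rows = [{'a': 0, 'b': 0, 'loss': 0.0}, {'a': 0, 'b': 1, 'loss': 0.0},
            {'a': 1, 'b': 0, 'loss': 2.0}, {'a': 1, 'b': 1, 'loss': 2.0}]
    assert main_effect(rows, 'a') > main_effect(rows, 'b') == 0.0

A parameter whose value strongly shifts the mean score receives a large main effect, while one with no influence scores zero, which is the intuition behind the per-parameter ``importance`` values reported below.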
Currently, only unpaired single-parameter importance to a single metric score is supported, through the ``NetPyNE.batchtools.analysis`` ``Analyzer`` object; an example of its usage follows.
To run the example, generate an output ``grid.csv`` using ``batch.py``, then load that ``grid.csv`` into the ``Analyzer`` object. Finally, calling ``run_analysis`` will generate, per parameter, a single score indicative of the estimated ``importance`` of that parameter: that is, its estimated effect on the total variance of the model within the given bounds: