Add support for axonal delays #2989

Open: wants to merge 105 commits into base: master

Changes from 99 commits (of 105 total)

Commits
101a5cc
Handle axonal delay in original stdp-pl synapse
suku248 Mar 31, 2022
bb37f53
Add first version of framework for STDP with long axonal delay
Mar 4, 2022
0bbfc27
Use ring buffer to temporarily store spike data for retrospective cor…
suku248 May 23, 2022
7362c26
Added stdp-pl python test and fixed postsynaptic trace history cleanu…
JanVogelsang Jul 25, 2022
1e73b4c
Adding python test for stdp with axonal delays and fixing correction-…
JanVogelsang Sep 22, 2022
811f18e
Applied clang-format v13
JanVogelsang Oct 6, 2022
92a0ca0
Made stdp test runnable
JanVogelsang Mar 2, 2023
e69c17c
Improved stdp pl synapse hom test
JanVogelsang Mar 8, 2023
78aadfa
Merged master
JanVogelsang Mar 10, 2023
9d1729e
Update stdp_pl_synapse_hom_ax_delay.h
JanVogelsang Mar 10, 2023
dee2589
Fixed formatting
JanVogelsang Mar 10, 2023
cd6887b
Merge branch 'stdp_long_axonal_delays' of github.com:JanVogelsang/nes…
JanVogelsang Mar 10, 2023
b624ad5
Removed old tests
JanVogelsang Mar 10, 2023
d76f654
Made whole kernel axonal-delay aware
JanVogelsang Mar 14, 2023
1e171b0
Fixed issues with min and max delays
JanVogelsang Mar 31, 2023
780915b
Fixed default delays
JanVogelsang Apr 4, 2023
0f91f45
Improved BadDelay exception
JanVogelsang Apr 4, 2023
c2dbd84
Improved stdp bugfix
JanVogelsang Apr 17, 2023
bf0f6fc
Fixed issues with correction-based STDP
JanVogelsang Apr 24, 2023
75a870c
Fixed issues with correction-based STDP
JanVogelsang Apr 24, 2023
c9be932
Fixed issues with correction-based STDP
JanVogelsang Apr 24, 2023
cf704e6
Added detailed timers
JanVogelsang May 9, 2023
5f60ede
Added stopwatch for time spent for correction
JanVogelsang May 23, 2023
68d3a93
Added ignore and fire neuron and collecting number of corrections
JanVogelsang Jun 7, 2023
9731013
Added ignore and fire neuron
JanVogelsang Jun 7, 2023
c447b9f
Fixed ignore and fire for corrections
JanVogelsang Jun 13, 2023
ed5147e
Reducing correction entry vector size when clearing
JanVogelsang Jun 13, 2023
cb83c25
Fixing detailed timers
JanVogelsang Jun 19, 2023
0f540fb
Fixing detailed timers
JanVogelsang Jun 19, 2023
16e68d4
Fixing detailed timers
JanVogelsang Jun 19, 2023
e90d973
Fixing detailed timers
JanVogelsang Jun 19, 2023
4f70a06
Merge remote-tracking branch 'nest-master/master' into stdp_long_axon…
JanVogelsang Oct 13, 2023
eec3e7b
Fixing bugs related to the addition of axonal delays into the kernel
JanVogelsang Oct 24, 2023
f0e183b
Fixing bugs
JanVogelsang Nov 13, 2023
4222534
Fixed remaining bugs
JanVogelsang Nov 15, 2023
4fb5665
Merge remote-tracking branch 'nest-master/master' into stdp_long_axon…
JanVogelsang Nov 15, 2023
bd77090
Merging master and fixing format
JanVogelsang Nov 15, 2023
be79118
Fixed python formatting
JanVogelsang Nov 15, 2023
7531a9e
Fixed models
JanVogelsang Nov 15, 2023
5ef7af8
Fixed remaining issues in models
JanVogelsang Nov 15, 2023
fc5677c
Fixed remaining issues in models
JanVogelsang Nov 15, 2023
d086729
Fixed sonata
JanVogelsang Dec 5, 2023
adabcaf
Merge remote-tracking branch 'nest-master/master' into stdp_long_axon…
JanVogelsang Dec 5, 2023
7144093
Merged master
JanVogelsang Dec 5, 2023
fc5474d
Fixed black fomatting
JanVogelsang Dec 6, 2023
fbec6c2
Modified archiving node to work with NESTML
JanVogelsang Jan 19, 2024
371029a
Added stdp power-law synapse with homogeneous axonal delays
JanVogelsang Feb 2, 2024
3b84bf1
Added many-to-one example to generate results for the paper
JanVogelsang Feb 2, 2024
71c3a48
Added many-to-one example to generate results for the paper and some …
JanVogelsang Feb 13, 2024
76f47db
Removed grayscale versions of figures
JanVogelsang Feb 13, 2024
227a0bb
Merge remote-tracking branch 'nest-master/master' into stdp_long_axon…
JanVogelsang Feb 16, 2024
0a8a2f4
Added more results
JanVogelsang Feb 16, 2024
2f0fa6d
Merged master
JanVogelsang Feb 16, 2024
0806b1f
Merged master
JanVogelsang Feb 16, 2024
96c082c
Merged master
JanVogelsang Feb 16, 2024
b32ceea
Removed SpikeData from CorrectionEntry class
JanVogelsang Feb 29, 2024
f6000fb
Removed SpikeData from CorrectionEntry class
JanVogelsang Mar 6, 2024
9b50af3
New results
JanVogelsang Mar 21, 2024
c9c9391
New results
JanVogelsang Mar 26, 2024
7b38dde
Fixed detailed timers
JanVogelsang Apr 4, 2024
73ac032
Generated plots for presentation
JanVogelsang Jun 19, 2024
66a019a
Cleanup
JanVogelsang Jun 19, 2024
0bbf711
Added axonal delays as a member to connection class activated by dema…
JanVogelsang Jun 24, 2024
ebaa796
Updated plots
JanVogelsang Jun 24, 2024
343c290
Merge branch 'refs/heads/stdp_long_axonal_delays_paper' into stdp_lon…
JanVogelsang Jun 24, 2024
0a1fee5
Cleanup
JanVogelsang Jun 24, 2024
5564296
Cleanup
JanVogelsang Jun 24, 2024
e5223ef
Merge remote-tracking branch 'refs/remotes/nest-master/master' into s…
JanVogelsang Jun 24, 2024
5e88d27
Merging master
JanVogelsang Jun 24, 2024
bd9ef84
Merge branch 'refs/heads/nest-master' into stdp_long_axonal_delays
JanVogelsang Jun 24, 2024
120bc78
Cleanup
JanVogelsang Jun 24, 2024
c9b7df7
Black formatting
JanVogelsang Jun 24, 2024
d58bf7d
Fixed layer implementation
JanVogelsang Jun 25, 2024
ed26325
Fixing tests
JanVogelsang Jun 28, 2024
bf8a30a
Fixed formatting
JanVogelsang Jun 28, 2024
4d84321
Removed debugging files
JanVogelsang Jun 28, 2024
edefa0a
Bugfixes
JanVogelsang Jun 28, 2024
18b74b5
Bugfixes
JanVogelsang Jul 1, 2024
b050da0
Bugfixes
JanVogelsang Jul 1, 2024
cca3d98
Now throwing an exception if connections change during simulation and…
JanVogelsang Aug 13, 2024
bda2f0b
Merge branch 'stdp_long_axonal_delays' of github.com:JanVogelsang/nes…
JanVogelsang Aug 13, 2024
e137c86
Added docstring for sizes of newly introduces bitfields
JanVogelsang Sep 11, 2024
5fdd66a
Applied suggestions
JanVogelsang Feb 14, 2025
79304c2
Merge remote-tracking branch 'nest-master/master' into stdp_long_axon…
JanVogelsang Feb 14, 2025
7d3a2a8
Applied suggestions
JanVogelsang Feb 14, 2025
24a1005
Added documentation, example, additional tests, and improved user int…
JanVogelsang Mar 17, 2025
81b913e
Formatting
JanVogelsang Mar 17, 2025
7446f15
Formatting
JanVogelsang Mar 18, 2025
5fbc81d
Merge remote-tracking branch 'nest-master/master' into stdp_long_axon…
JanVogelsang Mar 18, 2025
e4cc119
Merged master
JanVogelsang Mar 18, 2025
73bdc9a
Bugfixes
JanVogelsang Mar 18, 2025
4480839
add modified content
jessica-mitchell Mar 19, 2025
7860494
add indexing to new files
jessica-mitchell Mar 19, 2025
4ea9718
rm notebook
jessica-mitchell Mar 19, 2025
9091464
update image
jessica-mitchell Mar 19, 2025
4b6e7d2
Bugfixes
JanVogelsang Mar 19, 2025
b2ae907
Merge branch 'stdp_long_axonal_delays' of github.com:jessica-mitchell…
JanVogelsang Mar 19, 2025
9b38135
Merge branch 'jessica-mitchell-stdp_long_axonal_delays' into stdp_lon…
JanVogelsang Mar 19, 2025
edaea29
Adjusted docs and now supporting ax-delay synapse and arbitrary neuro…
JanVogelsang Mar 19, 2025
bfb3f61
Fixed spatial implementation
JanVogelsang Mar 20, 2025
c57da34
Merge branch 'master' into stdp_long_axonal_delays
JanVogelsang Mar 20, 2025
d016563
Formatting
JanVogelsang Mar 20, 2025
6532f98
Merge branch 'stdp_long_axonal_delays' of github.com:JanVogelsang/nes…
JanVogelsang Mar 20, 2025
6958b3a
Bugfix
JanVogelsang Mar 20, 2025
f2b004a
Adjusted SP tests
JanVogelsang Mar 20, 2025
86 changes: 86 additions & 0 deletions doc/htmldoc/developer_space/axonal_delays.rst
@@ -0,0 +1,86 @@
.. _axonal_delays_dev:

Axonal Delays
=============

Adding axonal delays to NEST is non-trivial when it comes to their interaction with spike-timing dependent plasticity (STDP).
Axonal delays smaller than their dendritic counterpart are unproblematic; larger axonal delays, however, cause causality
issues due to how and when pre- and post-synaptic spikes are processed by synapses implementing STDP weight dynamics in NEST.

When a pre-synaptic spike is processed at a synapse, the synapse also processes all post-synaptic spikes that reached it
between the previous and the current pre-synaptic spike. Weight changes due to facilitation (a post-synaptic spike following a
pre-synaptic one) or depression (a pre-synaptic spike following a post-synaptic one) are only relevant at the times when pre-synaptic spikes
reach the synapse, as these are the only points in time when the exact weight matters. Post-synaptic spikes can
therefore be archived in the post-synaptic neuron until the next pre-synaptic spike is processed by the synapse.
Because all pre-synaptic spikes are delivered to their target synapse and neuron right after they have been communicated,
they might be processed before they would actually reach the synapse once axonal delays are taken into account.
If the axonal delay is larger than the dendritic delay, post-synaptic
spikes occurring at time `t` reach the synapse before pre-synaptic spikes occurring before `t`,
but might not be taken into account by such a pre-synaptic spike if it was already communicated,
and thus delivered, before `t`. Each pre-synaptic spike sent over a connection
with a predominantly axonal delay must therefore also account for post-synaptic spikes that have not yet occurred,
but could still be emitted before the pre-synaptic spike arrives at the synapse. Several implementations were
benchmarked before settling on the one described here.
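
The following self-contained sketch (illustrative values and variable names, not NEST code) shows the timing
condition under which such a retrospective correction becomes necessary:

.. code-block:: cpp

    #include <iostream>

    int main()
    {
      const double axonal_delay = 2.0;    // ms, pre-synaptic spike travels along the axon
      const double dendritic_delay = 0.1; // ms, post-synaptic spike travels along the dendrite

      const double t_pre = 10.0;  // pre-synaptic spike time (communicated and delivered immediately)
      const double t_post = 11.0; // post-synaptic spike time, occurring after t_pre

      const double pre_arrival = t_pre + axonal_delay;      // 12.0 ms: arrival at the synapse
      const double post_arrival = t_post + dendritic_delay; // 11.1 ms: arrival at the synapse

      // The post-synaptic spike reaches the synapse before the pre-synaptic one, but the
      // pre-synaptic spike was already processed at delivery time, i.e., before t_post.
      // Its STDP update therefore missed this post-synaptic spike and must be corrected.
      if ( t_post > t_pre and post_arrival < pre_arrival )
      {
        std::cout << "retrospective correction required\n";
      }
    }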

The implementation of axonal delays in NEST builds on the fact that neurons only emit a few spikes per second.
It should thus be rare that a post-synaptic spike occurs in the critical window right after a pre-synaptic spike
has been processed but before it actually reaches the synapse. In typical networks, causality will most likely
only become an issue on few occasions. To still guarantee correct synaptic weights,
incorrect STDP weight changes are rolled back and re-calculated, and the weight of the pre-synaptic spike, which has already reached
the target neuron's ring buffer, is corrected. Undoing and re-calculating the STDP weight changes comes
at a cost; however, as only few such occurrences are expected, this solution is more efficient than restructuring
the kernel to make sure axonal delays are always handled correctly (see Alternative implementations).

Changes to the kernel and neuron models
---------------------------------------

Introducing axonal delays changes how the min- and max-delays must be calculated, as they are now a combination of
dendritic and axonal delays. The default value of ``delay``, which now refers to the dendritic delay, remains 1,
while the default value of ``axonal_delay`` is 0. By default, the delay is therefore assumed to be purely dendritic.

The ``ArchivingNode`` was made axonal-delay-aware. Each pre-synaptic spike after which a correction could potentially follow
is archived in the post-synaptic neuron in a dynamic ring-buffer-like structure. Post-synaptic spikes then
trigger a correction for all relevant pre-synaptic spikes in this buffer. The way spikes are received at a neuron is
model-dependent, as the implementation of spike accumulation and buffering until processing might vary between
neuron models. Neuron models therefore also have to handle the correction of previously handled spikes differently.
In the simplest case, all incoming spikes to a neuron are accumulated in a single scalar value per time slot.
Correcting a previously handled spike then amounts to subtracting the previous, wrong weight and adding the new,
corrected weight; sending another spike carrying the difference of the old and new weight is sufficient in this case.
However, some neuron models keep separate buffers for spikes arriving via inhibitory and excitatory connections,
distinguished by the sign of the weight. If a spike is now sent to correct an old one, its sign might be negative even
though both the old and the new weight were positive and the new weight is merely smaller. In such a case, the
correction would be accumulated in the wrong buffer, as sketched below.
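
The following sketch shows this failure mode for a hypothetical model that routes incoming spikes into
excitatory and inhibitory buffers based on the sign of the weight (illustrative code, not part of NEST):

.. code-block:: cpp

    #include <iostream>

    int main()
    {
      double excitatory_buffer = 0.0;
      double inhibitory_buffer = 0.0;

      const double wrong_weight = 0.8;     // weight originally delivered (excitatory)
      const double corrected_weight = 0.7; // still excitatory, just smaller

      excitatory_buffer += wrong_weight; // original delivery

      const double correction = corrected_weight - wrong_weight; // -0.1, negative although excitatory
      if ( correction >= 0.0 )
      {
        excitatory_buffer += correction;
      }
      else
      {
        inhibitory_buffer += correction; // the correction ends up in the wrong buffer
      }

      std::cout << "excitatory input: " << excitatory_buffer << " (should be " << corrected_weight << ")\n";
    }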

Instead of a regular ``SpikeEvent``, a ``CorrectionSpikeEvent`` is sent to signal a correction. Overloading the ``handle``
function allows each model to process the correction in the way that matches its implementation.
Furthermore, neuron models must call ``ArchivingNode::pre_run_hook_()`` in their derived ``pre_run_hook`` implementation
and call ``reset_correction_entries_stdp_ax_delay_()`` at the end of their ``update`` implementation, as sketched below.
Currently, only the ``iaf_psc_alpha`` neuron model supports STDP with axonal delays.
All other neurons act as if the delay of incoming connections were purely dendritic.
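
The toy model below shows where these calls go; the stub base class and event type only mimic the NEST
interfaces by name and are not the real implementations:

.. code-block:: cpp

    #include <vector>

    struct CorrectionSpikeEvent
    {
      double weight_difference; // corrected minus previously delivered weight
      long slot;                // ring-buffer slot of the original spike
    };

    struct ArchivingNode
    {
    protected:
      void pre_run_hook_() {}                           // would prepare the correction-entry buffer
      void reset_correction_entries_stdp_ax_delay_() {} // would drop entries that can no longer change
    };

    class ToyNeuron : public ArchivingNode
    {
    public:
      ToyNeuron() : ring_buffer_( 16, 0.0 ) {}

      void pre_run_hook()
      {
        ArchivingNode::pre_run_hook_(); // 1. prepare correction bookkeeping before the run
      }

      void update()
      {
        // ... propagate the state, consume ring_buffer_ ...
        reset_correction_entries_stdp_ax_delay_(); // 2. clear correction entries at the end of update
      }

      void handle( CorrectionSpikeEvent& e )
      {
        // 3. model-specific correction: adjust the input that was already accumulated
        ring_buffer_[ e.slot ] += e.weight_difference;
      }

    private:
      std::vector< double > ring_buffer_;
    };

    int main()
    {
      ToyNeuron neuron;
      neuron.pre_run_hook();
      CorrectionSpikeEvent e{ -0.05, 3 };
      neuron.handle( e );
      neuron.update();
    }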

Synapse models only support dendritic delays by default. If axonal delays are required, the synapse model must be derived
from ``AxonalDelayConnection`` instead of ``Connection``. ``AxonalDelayConnection`` derives from ``Connection`` and adds a single
double-precision member for the axonal delay. The main differences compared to synapses with purely dendritic delays are
the different handling of delays inside the ``send`` function and the addition of ``correct_synapse_stdp_ax_delay``, which is
called by the ``ConnectionManager`` whenever a synapse needs to re-calculate its weight given a new post-synaptic spike and a previous pre-synaptic one.
Currently, only the ``stdp_pl_synapse_hom_ax_delay`` synapse model supports axonal delays.
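
The stand-in classes below sketch this layout; they mirror the description above, not the actual NEST headers:

.. code-block:: cpp

    struct Connection
    {
      double dendritic_delay = 1.0;
      double weight = 1.0;
    };

    struct AxonalDelayConnection : Connection
    {
      double axonal_delay = 0.0; // the single additional double-precision member
    };

    struct toy_stdp_ax_delay_synapse : AxonalDelayConnection
    {
      // In NEST, the ConnectionManager calls a function of this kind when a post-synaptic spike
      // arrives that an already delivered pre-synaptic spike could not have taken into account.
      double
      correct_synapse_stdp_ax_delay( const double /* t_post_arrival */ )
      {
        // roll back the previous STDP update, re-apply it including the new post-synaptic spike,
        // and return the corrected weight (details omitted in this sketch)
        return weight;
      }
    };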

Changes to the python interface
-------------------------------

In general, the kernel was made axonal-delay-aware, and this is reflected in the user interface: it is now possible
to set ``names::dendritic_delay`` and ``names::axonal_delay`` for each synapse (given that the synapse model is
derived from ``AxonalDelayConnection``).

Remaining work
---------------


Currently, only one neuron model and one synapse model support axonal delays. All neuron models that support STDP could
also support axonal delays without sacrificing performance, changing their behavior, or requiring more memory, but they need
to be adapted slightly (i.e., implement ``handle`` for ``CorrectionSpikeEvent``, call ``ArchivingNode::pre_run_hook_``, and call
``reset_correction_entries_stdp_ax_delay_``).

Existing STDP synapse models need one version with and one without axonal delays. Alternatively, synapse models could
be templatized to use either only dendritic delays or both dendritic and axonal delays. This branching should be resolved
at compile time so that it does not negatively impact performance; a possible sketch follows.
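
A possible sketch of such a templatized synapse, using a hypothetical template parameter (no such class exists
in NEST yet):

.. code-block:: cpp

    #include <type_traits>

    template < bool has_axonal_delay >
    struct toy_stdp_synapse
    {
      double dendritic_delay = 1.0;

      double
      total_delay() const
      {
        if constexpr ( has_axonal_delay )
        {
          return dendritic_delay + axonal_delay_;
        }
        else
        {
          return dendritic_delay;
        }
      }

    private:
      struct empty_t
      {
      };
      // Present only when axonal delays are requested; the branch above is resolved at compile
      // time, so the purely dendritic variant pays no runtime cost for the distinction.
      std::conditional_t< has_axonal_delay, double, empty_t > axonal_delay_{};
    };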
7 changes: 4 additions & 3 deletions doc/htmldoc/developer_space/index.rst
@@ -6,9 +6,9 @@ Developer space
Here is all documentation pertaining to the development of NEST.
It is documentation for anyone needing to touch the code or documentation.

.. grid:: 3
.. grid:: 3

.. grid-item-card::
.. grid-item-card::
:link-type: ref
:link: dev_install
:class-card: sd-bg-success sd-text-white
@@ -112,7 +112,7 @@ Developer guides

.. toctree::
:maxdepth: 1
:hidden:
:hidden:
:glob:

workflows/*
@@ -122,4 +122,5 @@ Developer guides
guidelines/styleguide/vim_support_sli
templates/*
sli_docs/index
axonal_delays
cppcomments
5 changes: 5 additions & 0 deletions doc/htmldoc/examples/index.rst
@@ -219,6 +219,10 @@ PyNEST examples
* :doc:`/auto_examples/eprop_plasticity/eprop_supervised_regression_sine-waves`
* :doc:`/auto_examples/eprop_plasticity/eprop_supervised_classification_neuromorphic_mnist`

.. grid-item-card:: Axonal delays
:img-top: ../static/img/nest_logo-faded.png

* :doc:`/auto_examples/axonal_delays`

.. grid:: 1 1 2 3

@@ -375,3 +379,4 @@ PyNEST examples
../auto_examples/pong/run_simulations
../auto_examples/pong/pong
../auto_examples/pong/generate_gif
../auto_examples/axonal_delays
1 change: 1 addition & 0 deletions doc/htmldoc/get-started_index.rst
@@ -159,6 +159,7 @@ More topics

* :ref:`sim_gap_junctions`
* :ref:`weight_normalization`
* :ref:`delays`



73 changes: 73 additions & 0 deletions doc/htmldoc/synapses/delays.rst
@@ -0,0 +1,73 @@
.. _delays:

Delays
======


In NEST, transmission delays are specified with the ``delay`` parameter.
Delays are considered fully dendritic by all built-in models; the ``delay`` parameter is therefore still used by
most models.

Since NEST 3.9, it is also possible to specify explicit, heterogeneous axonal and dendritic delays for models
that support this feature. This is useful for STDP synapses and other models that rely on the timing of spike arrival at the
synapse.

Currently, only ``stdp_pl_synapse_hom_ax_delay`` supports explicitly specifying axonal and dendritic delays via the
``axonal_delay`` and ``dendritic_delay`` parameters. For STDP with predominantly axonal delays, neuron models must be
adjusted to handle these delays correctly. At this point, only ``iaf_psc_alpha`` supports STDP with predominantly axonal
delays.

When using ``stdp_pl_synapse_hom_ax_delay``:

- The parameter ``delay`` is no longer valid. This prevents ambiguity between the two types of delays.
- The parameters ``dendritic_delay`` and ``axonal_delay`` have to be used instead to specify the delays.
- If these parameters are not explicitly provided, the default values are used:

``dendritic_delay: 1.0`` and ``axonal_delay: 0.0``.
- If only an axonal delay is provided and no dendritic delay, the dendritic delay is assumed to be 0, and vice versa.


``axonal_delay`` and ``dendritic_delay`` are used in the same way as ``delay``:


**Using syn_spec**

.. code-block:: python

neuron = nest.Create("iaf_psc_alpha")
nest.Connect(neuron, neuron, syn_spec={"synapse_model": "stdp_pl_synapse_hom", "delay": 1.0})

.. code-block:: python

neuron = nest.Create("iaf_psc_alpha")
nest.Connect(neuron, neuron, syn_spec=
{"synapse_model": "stdp_pl_synapse_hom_ax_delay", "axonal_delay": 1.0, "dendritic_delay": 1.0})

**Using SetStatus**

.. code-block:: python

nest.Connect(neuron, neuron, syn_spec={"synapse_model": "stdp_pl_synapse_hom"})
conn = nest.GetConnections()
nest.SetStatus(conn, {"delay": 1.0})

.. code-block:: python

nest.Connect(neuron, neuron, syn_spec={"synapse_model": "stdp_pl_synapse_hom_ax_delay"})
conn = nest.GetConnections()
nest.SetStatus(conn, {"axonal_delay": 1.0, "dendritic_delay": 1.0})

**Using SetDefaults**

.. code-block:: python

nest.SetDefaults("stdp_pl_synapse_hom", {"delay": 1.0})

.. code-block:: python

nest.SetDefaults("stdp_pl_synapse_hom_ax_delay", {"axonal_delay": 1.0, "dendritic_delay": 1.0})


.. seealso::

:doc:`Example using axonal delays </auto_examples/axonal_delays>`

For details on further developments see :ref:`axonal_delays_dev`.
8 changes: 7 additions & 1 deletion libnestutil/nest_types.h
@@ -91,9 +91,15 @@ constexpr uint8_t NUM_BITS_LCID = 27U;
constexpr uint8_t NUM_BITS_PROCESSED_FLAG = 1U;
constexpr uint8_t NUM_BITS_MARKER_SPIKE_DATA = 2U;
constexpr uint8_t NUM_BITS_LAG = 14U;
constexpr uint8_t NUM_BITS_DELAY = 21U;
constexpr uint8_t NUM_BITS_NODE_ID = 62U;

// These types are used in delay_types.h and denote the space available for the dendritic and axonal portions of the
// total transmission delay. The delay is only split into two parts for selected synapse types.
// Given that axonal delays can be much larger than dendritic/backpropagation delays, they require more bits.
constexpr uint8_t NUM_BITS_DENDRITIC_DELAY = 14U;
constexpr uint8_t NUM_BITS_AXONAL_DELAY = sizeof( unsigned int ) * 8 - NUM_BITS_DENDRITIC_DELAY;
Contributor:

I do not entirely understand the logic here, in part because the old code lacks comments. First of all, which bitfields belong together, i.e., should fit into some given space such as 4 or 8 bytes? Furthermore, we had 21 bits for delay in the past, now we go for an implementation-defined size (number of bytes in an unsigned int). Is there any specific reason for choosing this size? If so, comment on it. If not, wouldn't it make more sense to use a well defined size, e.g., always 32 bits? That would make for 18:14 split, which seems generous towards the dendritic delays.

Contributor Author:

Right now, in case of axonal+dendritic delays we use an 18:14 split, which is a lot and could be reduced by a few bits in case they were required for anything else. But right now we simply don't need more bits for anything, so this should be fine. However, I am not sure how to best guarantee that the underlying data type (here: unsigned int) always has exactly 32 bits. For the single-delay-value case (which represents total delay, but will also be interpreted as pure dendritic delay by some models) we could also just use 21 bits as before, but we don't need those remaining 11 bits for anything else, so there is no reason to use a bitfield here (which might add a tiny bit of unnecessary computational overhead). However, one must definitely ensure that we always have at least 21 bits for this value, i.e., enforce 32 bits.

According to StackOverflow, this would be a solution:

You can use exact-width integer types int8_t, int16_t, int32_t, int64_t declared in <cstdint>. This way the sizes are fixed on all the platforms.

I think we should therefore use these types in all locations where we want to specify the size explicitly. Then we could also remove the StaticAssert calls, which become redundant.
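
A minimal sketch of the fixed-width variant suggested above (illustrative only; the actual definitions live in nest_types.h and delay_types.h):

#include <cstdint>

constexpr uint8_t NUM_BITS_DENDRITIC_DELAY = 14U;
constexpr uint8_t NUM_BITS_AXONAL_DELAY = 32U - NUM_BITS_DENDRITIC_DELAY; // fixed 18:14 split

struct TotalDelaySketch
{
  uint32_t dendritic_delay : NUM_BITS_DENDRITIC_DELAY;
  uint32_t axonal_delay : NUM_BITS_AXONAL_DELAY;
};

// With exact-width underlying types this becomes a pure sanity check:
static_assert( sizeof( TotalDelaySketch ) == sizeof( uint32_t ), "delay bitfields must fit into 32 bits" );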



// Maximally allowed values for bitfields

constexpr uint64_t MAX_LCID = generate_max_value( NUM_BITS_LCID );
39 changes: 20 additions & 19 deletions models/ac_generator.cpp
@@ -56,30 +56,29 @@ RecordablesMap< ac_generator >::create()
{
insert_( Name( names::I ), &ac_generator::get_I_ );
}
}

/* ----------------------------------------------------------------
* Default constructors defining default parameters and state
* ---------------------------------------------------------------- */

nest::ac_generator::Parameters_::Parameters_()
ac_generator::Parameters_::Parameters_()
: amp_( 0.0 ) // pA
, offset_( 0.0 ) // pA
, freq_( 0.0 ) // Hz
, phi_deg_( 0.0 ) // degree
{
}

nest::ac_generator::Parameters_::Parameters_( const Parameters_& p )
ac_generator::Parameters_::Parameters_( const Parameters_& p )
: amp_( p.amp_ )
, offset_( p.offset_ )
, freq_( p.freq_ )
, phi_deg_( p.phi_deg_ )
{
}

nest::ac_generator::Parameters_&
nest::ac_generator::Parameters_::operator=( const Parameters_& p )
ac_generator::Parameters_&
ac_generator::Parameters_::operator=( const Parameters_& p )
{
if ( this == &p )
{
@@ -94,19 +93,19 @@ nest::ac_generator::Parameters_::operator=( const Parameters_& p )
return *this;
}

nest::ac_generator::State_::State_()
ac_generator::State_::State_()
: y_0_( 0.0 )
, y_1_( 0.0 ) // pA
, I_( 0.0 ) // pA
{
}

nest::ac_generator::Buffers_::Buffers_( ac_generator& n )
ac_generator::Buffers_::Buffers_( ac_generator& n )
: logger_( n )
{
}

nest::ac_generator::Buffers_::Buffers_( const Buffers_&, ac_generator& n )
ac_generator::Buffers_::Buffers_( const Buffers_&, ac_generator& n )
: logger_( n )
{
}
@@ -116,7 +115,7 @@ nest::ac_generator::Buffers_::Buffers_( const Buffers_&, ac_generator& n )
* ---------------------------------------------------------------- */

void
nest::ac_generator::Parameters_::get( DictionaryDatum& d ) const
ac_generator::Parameters_::get( DictionaryDatum& d ) const
{
( *d )[ names::amplitude ] = amp_;
( *d )[ names::offset ] = offset_;
@@ -125,14 +124,14 @@ nest::ac_generator::Parameters_::get( DictionaryDatum& d ) const
}

void
nest::ac_generator::State_::get( DictionaryDatum& d ) const
ac_generator::State_::get( DictionaryDatum& d ) const
{
( *d )[ names::y_0 ] = y_0_;
( *d )[ names::y_1 ] = y_1_;
}

void
nest::ac_generator::Parameters_::set( const DictionaryDatum& d, Node* node )
ac_generator::Parameters_::set( const DictionaryDatum& d, Node* node )
{
updateValueParam< double >( d, names::amplitude, amp_, node );
updateValueParam< double >( d, names::offset, offset_, node );
@@ -145,7 +144,7 @@ nest::ac_generator::Parameters_::set( const DictionaryDatum& d, Node* node )
* Default and copy constructor for node
* ---------------------------------------------------------------- */

nest::ac_generator::ac_generator()
ac_generator::ac_generator()
: StimulationDevice()
, P_()
, S_()
@@ -154,7 +153,7 @@ nest::ac_generator::ac_generator()
recordablesMap_.create();
}

nest::ac_generator::ac_generator( const ac_generator& n )
ac_generator::ac_generator( const ac_generator& n )
: StimulationDevice( n )
, P_( n.P_ )
, S_( n.S_ )
@@ -168,20 +167,20 @@ nest::ac_generator::ac_generator( const ac_generator& n )
* ---------------------------------------------------------------- */

void
nest::ac_generator::init_state_()
ac_generator::init_state_()
{
StimulationDevice::init_state();
}

void
nest::ac_generator::init_buffers_()
ac_generator::init_buffers_()
{
StimulationDevice::init_buffers();
B_.logger_.reset();
}

void
nest::ac_generator::pre_run_hook()
ac_generator::pre_run_hook()
{
B_.logger_.init();

@@ -206,7 +205,7 @@ nest::ac_generator::pre_run_hook()
}

void
nest::ac_generator::update( Time const& origin, const long from, const long to )
ac_generator::update( Time const& origin, const long from, const long to )
{
long start = origin.get_steps();

@@ -231,7 +230,7 @@ nest::ac_generator::update( Time const& origin, const long from, const long to )
}

void
nest::ac_generator::handle( DataLoggingRequest& e )
ac_generator::handle( DataLoggingRequest& e )
{
B_.logger_.handle( e );
}
Expand All @@ -241,7 +240,7 @@ nest::ac_generator::handle( DataLoggingRequest& e )
* ---------------------------------------------------------------- */

void
nest::ac_generator::set_data_from_stimulation_backend( std::vector< double >& input_param )
ac_generator::set_data_from_stimulation_backend( std::vector< double >& input_param )
{
Parameters_ ptmp = P_; // temporary copy in case of errors

@@ -264,3 +263,5 @@ nest::ac_generator::set_data_from_stimulation_backend( std::vector< double >& in
// if we get here, temporary contains consistent set of properties
P_ = ptmp;
}

} // namespace nest
2 changes: 2 additions & 0 deletions models/aeif_cond_alpha.cpp
@@ -426,6 +426,8 @@ nest::aeif_cond_alpha::init_buffers_()
void
nest::aeif_cond_alpha::pre_run_hook()
{
ArchivingNode::pre_run_hook_();

// ensures initialization in case mm connected after Simulate
B_.logger_.init();
