Changes from 82 commits
Commits (129)
101a5cc
Handle axonal delay in original stdp-pl synapse
suku248 Mar 31, 2022
bb37f53
Add first version of framework for STDP with long axonal delay
Mar 4, 2022
0bbfc27
Use ring buffer to temporarily store spike data for retrospective cor…
suku248 May 23, 2022
7362c26
Added stdp-pl python test and fixed postsynaptic trace history cleanu…
JanVogelsang Jul 25, 2022
1e73b4c
Adding python test for stdp with axonal delays and fixing correction-…
JanVogelsang Sep 22, 2022
811f18e
Applied clang-format v13
JanVogelsang Oct 6, 2022
92a0ca0
Made stdp test runnable
JanVogelsang Mar 2, 2023
e69c17c
Improved stdp pl synapse hom test
JanVogelsang Mar 8, 2023
78aadfa
Merged master
JanVogelsang Mar 10, 2023
9d1729e
Update stdp_pl_synapse_hom_ax_delay.h
JanVogelsang Mar 10, 2023
dee2589
Fixed formatting
JanVogelsang Mar 10, 2023
cd6887b
Merge branch 'stdp_long_axonal_delays' of github.com:JanVogelsang/nes…
JanVogelsang Mar 10, 2023
b624ad5
Removed old tests
JanVogelsang Mar 10, 2023
d76f654
Made whole kernel axonal-delay aware
JanVogelsang Mar 14, 2023
1e171b0
Fixed issues with min and max delays
JanVogelsang Mar 31, 2023
780915b
Fixed default delays
JanVogelsang Apr 4, 2023
0f91f45
Improved BadDelay exception
JanVogelsang Apr 4, 2023
c2dbd84
Improved stdp bugfix
JanVogelsang Apr 17, 2023
bf0f6fc
Fixed issues with correction-based STDP
JanVogelsang Apr 24, 2023
75a870c
Fixed issues with correction-based STDP
JanVogelsang Apr 24, 2023
c9be932
Fixed issues with correction-based STDP
JanVogelsang Apr 24, 2023
cf704e6
Added detailed timers
JanVogelsang May 9, 2023
5f60ede
Added stopwatch for time spent for correction
JanVogelsang May 23, 2023
68d3a93
Added ignore and fire neuron and collecting number of corrections
JanVogelsang Jun 7, 2023
9731013
Added ignore and fire neuron
JanVogelsang Jun 7, 2023
c447b9f
Fixed ignore and fire for corrections
JanVogelsang Jun 13, 2023
ed5147e
Reducing correction entry vector size when clearing
JanVogelsang Jun 13, 2023
cb83c25
Fixing detailed timers
JanVogelsang Jun 19, 2023
0f540fb
Fixing detailed timers
JanVogelsang Jun 19, 2023
16e68d4
Fixing detailed timers
JanVogelsang Jun 19, 2023
e90d973
Fixing detailed timers
JanVogelsang Jun 19, 2023
4f70a06
Merge remote-tracking branch 'nest-master/master' into stdp_long_axon…
JanVogelsang Oct 13, 2023
eec3e7b
Fixing bugs related to the addition of axonal delays into the kernel
JanVogelsang Oct 24, 2023
f0e183b
Fixing bugs
JanVogelsang Nov 13, 2023
4222534
Fixed remaining bugs
JanVogelsang Nov 15, 2023
4fb5665
Merge remote-tracking branch 'nest-master/master' into stdp_long_axon…
JanVogelsang Nov 15, 2023
bd77090
Merging master and fixing format
JanVogelsang Nov 15, 2023
be79118
Fixed python formatting
JanVogelsang Nov 15, 2023
7531a9e
Fixed models
JanVogelsang Nov 15, 2023
5ef7af8
Fixed remaining issues in models
JanVogelsang Nov 15, 2023
fc5677c
Fixed remaining issues in models
JanVogelsang Nov 15, 2023
d086729
Fixed sonata
JanVogelsang Dec 5, 2023
adabcaf
Merge remote-tracking branch 'nest-master/master' into stdp_long_axon…
JanVogelsang Dec 5, 2023
7144093
Merged master
JanVogelsang Dec 5, 2023
fc5474d
Fixed black formatting
JanVogelsang Dec 6, 2023
fbec6c2
Modified archiving node to work with NESTML
JanVogelsang Jan 19, 2024
371029a
Added stdp power-law synapse with homogeneous axonal delays
JanVogelsang Feb 2, 2024
3b84bf1
Added many-to-one example to generate results for the paper
JanVogelsang Feb 2, 2024
71c3a48
Added many-to-one example to generate results for the paper and some …
JanVogelsang Feb 13, 2024
76f47db
Removed grayscale versions of figures
JanVogelsang Feb 13, 2024
227a0bb
Merge remote-tracking branch 'nest-master/master' into stdp_long_axon…
JanVogelsang Feb 16, 2024
0a8a2f4
Added more results
JanVogelsang Feb 16, 2024
2f0fa6d
Merged master
JanVogelsang Feb 16, 2024
0806b1f
Merged master
JanVogelsang Feb 16, 2024
96c082c
Merged master
JanVogelsang Feb 16, 2024
b32ceea
Removed SpikeData from CorrectionEntry class
JanVogelsang Feb 29, 2024
f6000fb
Removed SpikeData from CorrectionEntry class
JanVogelsang Mar 6, 2024
9b50af3
New results
JanVogelsang Mar 21, 2024
c9c9391
New results
JanVogelsang Mar 26, 2024
7b38dde
Fixed detailed timers
JanVogelsang Apr 4, 2024
73ac032
Generated plots for presentation
JanVogelsang Jun 19, 2024
66a019a
Cleanup
JanVogelsang Jun 19, 2024
0bbf711
Added axonal delays as a member to connection class activated by dema…
JanVogelsang Jun 24, 2024
ebaa796
Updated plots
JanVogelsang Jun 24, 2024
343c290
Merge branch 'refs/heads/stdp_long_axonal_delays_paper' into stdp_lon…
JanVogelsang Jun 24, 2024
0a1fee5
Cleanup
JanVogelsang Jun 24, 2024
5564296
Cleanup
JanVogelsang Jun 24, 2024
e5223ef
Merge remote-tracking branch 'refs/remotes/nest-master/master' into s…
JanVogelsang Jun 24, 2024
5e88d27
Merging master
JanVogelsang Jun 24, 2024
bd9ef84
Merge branch 'refs/heads/nest-master' into stdp_long_axonal_delays
JanVogelsang Jun 24, 2024
120bc78
Cleanup
JanVogelsang Jun 24, 2024
c9b7df7
Black formatting
JanVogelsang Jun 24, 2024
d58bf7d
Fixed layer implementation
JanVogelsang Jun 25, 2024
ed26325
Fixing tests
JanVogelsang Jun 28, 2024
bf8a30a
Fixed formatting
JanVogelsang Jun 28, 2024
4d84321
Removed debugging files
JanVogelsang Jun 28, 2024
edefa0a
Bugfixes
JanVogelsang Jun 28, 2024
18b74b5
Bugfixes
JanVogelsang Jul 1, 2024
b050da0
Bugfixes
JanVogelsang Jul 1, 2024
cca3d98
Now throwing an exception if connections change during simulation and…
JanVogelsang Aug 13, 2024
bda2f0b
Merge branch 'stdp_long_axonal_delays' of github.com:JanVogelsang/nes…
JanVogelsang Aug 13, 2024
e137c86
Added docstring for sizes of newly introduced bitfields
JanVogelsang Sep 11, 2024
5fdd66a
Applied suggestions
JanVogelsang Feb 14, 2025
79304c2
Merge remote-tracking branch 'nest-master/master' into stdp_long_axon…
JanVogelsang Feb 14, 2025
7d3a2a8
Applied suggestions
JanVogelsang Feb 14, 2025
24a1005
Added documentation, example, additional tests, and improved user int…
JanVogelsang Mar 17, 2025
81b913e
Formatting
JanVogelsang Mar 17, 2025
7446f15
Formatting
JanVogelsang Mar 18, 2025
5fbc81d
Merge remote-tracking branch 'nest-master/master' into stdp_long_axon…
JanVogelsang Mar 18, 2025
e4cc119
Merged master
JanVogelsang Mar 18, 2025
73bdc9a
Bugfixes
JanVogelsang Mar 18, 2025
4480839
add modified content
jessica-mitchell Mar 19, 2025
7860494
add indexing to new files
jessica-mitchell Mar 19, 2025
4ea9718
rm notebook
jessica-mitchell Mar 19, 2025
9091464
update image
jessica-mitchell Mar 19, 2025
4b6e7d2
Bugfixes
JanVogelsang Mar 19, 2025
b2ae907
Merge branch 'stdp_long_axonal_delays' of github.com:jessica-mitchell…
JanVogelsang Mar 19, 2025
9b38135
Merge branch 'jessica-mitchell-stdp_long_axonal_delays' into stdp_lon…
JanVogelsang Mar 19, 2025
edaea29
Adjusted docs and now supporting ax-delay synapse and arbitrary neuro…
JanVogelsang Mar 19, 2025
bfb3f61
Fixed spatial implementation
JanVogelsang Mar 20, 2025
c57da34
Merge branch 'master' into stdp_long_axonal_delays
JanVogelsang Mar 20, 2025
d016563
Formatting
JanVogelsang Mar 20, 2025
6532f98
Merge branch 'stdp_long_axonal_delays' of github.com:JanVogelsang/nes…
JanVogelsang Mar 20, 2025
6958b3a
Bugfix
JanVogelsang Mar 20, 2025
f2b004a
Adjusted SP tests
JanVogelsang Mar 20, 2025
b435e30
Minor changes
JanVogelsang Jun 23, 2025
37e54a2
Fixed weight correction for stdp synapses with axonal delays for edge…
JanVogelsang Jul 23, 2025
17844c0
Improved STDP with axonal delays test
JanVogelsang Jul 23, 2025
a6efaeb
Improved STDP with axonal delays tests
JanVogelsang Jul 24, 2025
b91389f
Merge remote-tracking branch 'nest-master/master' into stdp_long_axon…
JanVogelsang Jul 25, 2025
c475fbd
Merge remote-tracking branch 'nest-master/master' into stdp_long_axon…
JanVogelsang Sep 26, 2025
c38085c
Fixed remaining edge case for axonal delay implementation
JanVogelsang Sep 30, 2025
2b8c9ae
Uncommented debugging output
JanVogelsang Oct 1, 2025
5302f09
Fixed STDP pl test comments
JanVogelsang Oct 1, 2025
223910f
Fixed remaining edge case for axonal delay implementation
JanVogelsang Oct 16, 2025
80ac0af
Added ax-delay support to iaf_psc_delta; added edge cases to stdp ax-…
JanVogelsang Oct 20, 2025
20bee12
Update doc/htmldoc/synapses/delays.rst
JanVogelsang Oct 27, 2025
1c1a9b3
Merge remote-tracking branch 'nest-master/master' into stdp_long_axon…
JanVogelsang Nov 10, 2025
f9b6d77
Generated new results
JanVogelsang Nov 10, 2025
21ef67f
Merge branch 'stdp_long_axonal_delays' of github.com:JanVogelsang/nes…
JanVogelsang Nov 10, 2025
fabfed3
Removed paper plots again
JanVogelsang Nov 11, 2025
d6fc2f7
Merge branch 'master' of https://github.com/nest/nest-simulator into …
JanVogelsang Nov 24, 2025
12403bf
Removed paper plots again
JanVogelsang Nov 24, 2025
4df4467
Made iaf_psc_exp axonal-delay compatible
JanVogelsang Nov 28, 2025
fd3d024
Merge remote-tracking branch 'nest-master/master' into stdp_long_axon…
JanVogelsang Dec 2, 2025
26a166b
Merge remote-tracking branch 'nest-master/master' into stdp_long_axon…
JanVogelsang Dec 2, 2025
77d75c6
Merge branch 'master' of https://github.com/nest/nest-simulator into …
JanVogelsang Dec 4, 2025
e837ccb
Removed iaf_psc_delta from stdp ax-delay test again
JanVogelsang Dec 4, 2025
01440e2
Formatting
JanVogelsang Dec 4, 2025
8 changes: 7 additions & 1 deletion libnestutil/nest_types.h
@@ -91,9 +91,15 @@ constexpr uint8_t NUM_BITS_LCID = 27U;
constexpr uint8_t NUM_BITS_PROCESSED_FLAG = 1U;
constexpr uint8_t NUM_BITS_MARKER_SPIKE_DATA = 2U;
constexpr uint8_t NUM_BITS_LAG = 14U;
constexpr uint8_t NUM_BITS_DELAY = 21U;
constexpr uint8_t NUM_BITS_NODE_ID = 62U;

// These types are used in delay_types.h and denote the space available for the dendritic and axonal portions of the
// total transmission delay. The delay is only split into two parts for selected synapse types.
// Given that axonal delays can be much larger than dendritic/backpropagation delays, they require more bits.
constexpr uint8_t NUM_BITS_DENDRITIC_DELAY = 14U;
constexpr uint8_t NUM_BITS_AXONAL_DELAY = sizeof( unsigned int ) * 8 - NUM_BITS_DENDRITIC_DELAY;
Contributor:
I do not entirely understand the logic here, in part because the old code lacks comments. First, which bitfields belong together, i.e., should fit into a given space such as 4 or 8 bytes? Furthermore, we had 21 bits for the delay in the past, whereas now we go for an implementation-defined size (the bit width of an unsigned int). Is there a specific reason for choosing this size? If so, comment on it. If not, wouldn't it make more sense to use a well-defined size, e.g., always 32 bits? That would make for an 18:14 split, which seems generous towards the dendritic delays.

Contributor Author:
Right now, in the case of combined axonal and dendritic delays, we use an 18:14 split, which is generous and could be reduced by a few bits if they were needed for anything else. But right now we simply don't need more bits for anything, so this should be fine. However, I am not sure how best to guarantee that the underlying data type (here: unsigned int) always has exactly 32 bits. For the single-delay-value case (which represents the total delay, but will also be interpreted as a pure dendritic delay by some models), we could also just use 21 bits as before; but since we don't need the remaining 11 bits for anything else, there is no reason to use a bitfield here (which might add a tiny bit of unnecessary computational overhead). However, we must definitely ensure that this value always has at least 21 bits, i.e., enforce 32 bits.

According to StackOverflow, this would be a solution:

You can use the exact-width integer types int8_t, int16_t, int32_t, int64_t declared in <cstdint>. This way the sizes are fixed on all platforms.

I think we should therefore use these values in all locations where we want to specify size explicitly. Then we could also remove the StaticAssert calls, which become redundant.
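For illustration, here is a minimal sketch of the fixed-width variant discussed above. It reuses the constant names from nest_types.h, but the SplitDelay struct is hypothetical (the actual bitfields live in delay_types.h), so this is a sketch of the idea rather than the code proposed in this PR:

#include <cstdint>

// An exact-width base type makes the 18:14 axonal/dendritic split platform-independent.
constexpr uint8_t NUM_BITS_DENDRITIC_DELAY = 14U;
constexpr uint8_t NUM_BITS_AXONAL_DELAY = 32U - NUM_BITS_DENDRITIC_DELAY; // 18 bits

// Hypothetical packed delay type for selected synapse types.
struct SplitDelay
{
  uint32_t dendritic_delay : NUM_BITS_DENDRITIC_DELAY; // steps of dendritic/backpropagation delay
  uint32_t axonal_delay : NUM_BITS_AXONAL_DELAY;       // steps of axonal delay
};

// With uint32_t the total width is guaranteed, so a separate static assertion on the
// size of the underlying type becomes redundant.
static_assert( sizeof( SplitDelay ) == sizeof( uint32_t ), "delay bitfields must pack into 32 bits" );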



// Maximally allowed values for bitfields

constexpr uint64_t MAX_LCID = generate_max_value( NUM_BITS_LCID );
39 changes: 20 additions & 19 deletions models/ac_generator.cpp
@@ -56,30 +56,29 @@ RecordablesMap< ac_generator >::create()
{
insert_( Name( names::I ), &ac_generator::get_I_ );
}
}

/* ----------------------------------------------------------------
* Default constructors defining default parameters and state
* ---------------------------------------------------------------- */

nest::ac_generator::Parameters_::Parameters_()
ac_generator::Parameters_::Parameters_()
: amp_( 0.0 ) // pA
, offset_( 0.0 ) // pA
, freq_( 0.0 ) // Hz
, phi_deg_( 0.0 ) // degree
{
}

nest::ac_generator::Parameters_::Parameters_( const Parameters_& p )
ac_generator::Parameters_::Parameters_( const Parameters_& p )
: amp_( p.amp_ )
, offset_( p.offset_ )
, freq_( p.freq_ )
, phi_deg_( p.phi_deg_ )
{
}

nest::ac_generator::Parameters_&
nest::ac_generator::Parameters_::operator=( const Parameters_& p )
ac_generator::Parameters_&
ac_generator::Parameters_::operator=( const Parameters_& p )
{
if ( this == &p )
{
@@ -94,19 +93,19 @@ nest::ac_generator::Parameters_::operator=( const Parameters_& p )
return *this;
}

nest::ac_generator::State_::State_()
ac_generator::State_::State_()
: y_0_( 0.0 )
, y_1_( 0.0 ) // pA
, I_( 0.0 ) // pA
{
}

nest::ac_generator::Buffers_::Buffers_( ac_generator& n )
ac_generator::Buffers_::Buffers_( ac_generator& n )
: logger_( n )
{
}

nest::ac_generator::Buffers_::Buffers_( const Buffers_&, ac_generator& n )
ac_generator::Buffers_::Buffers_( const Buffers_&, ac_generator& n )
: logger_( n )
{
}
@@ -116,7 +115,7 @@ nest::ac_generator::Buffers_::Buffers_( const Buffers_&, ac_generator& n )
* ---------------------------------------------------------------- */

void
nest::ac_generator::Parameters_::get( DictionaryDatum& d ) const
ac_generator::Parameters_::get( DictionaryDatum& d ) const
{
( *d )[ names::amplitude ] = amp_;
( *d )[ names::offset ] = offset_;
@@ -125,14 +124,14 @@ nest::ac_generator::Parameters_::get( DictionaryDatum& d ) const
}

void
nest::ac_generator::State_::get( DictionaryDatum& d ) const
ac_generator::State_::get( DictionaryDatum& d ) const
{
( *d )[ names::y_0 ] = y_0_;
( *d )[ names::y_1 ] = y_1_;
}

void
nest::ac_generator::Parameters_::set( const DictionaryDatum& d, Node* node )
ac_generator::Parameters_::set( const DictionaryDatum& d, Node* node )
{
updateValueParam< double >( d, names::amplitude, amp_, node );
updateValueParam< double >( d, names::offset, offset_, node );
@@ -145,7 +144,7 @@ nest::ac_generator::Parameters_::set( const DictionaryDatum& d, Node* node )
* Default and copy constructor for node
* ---------------------------------------------------------------- */

nest::ac_generator::ac_generator()
ac_generator::ac_generator()
: StimulationDevice()
, P_()
, S_()
@@ -154,7 +153,7 @@ nest::ac_generator::ac_generator()
recordablesMap_.create();
}

nest::ac_generator::ac_generator( const ac_generator& n )
ac_generator::ac_generator( const ac_generator& n )
: StimulationDevice( n )
, P_( n.P_ )
, S_( n.S_ )
@@ -168,20 +167,20 @@ nest::ac_generator::ac_generator( const ac_generator& n )
* ---------------------------------------------------------------- */

void
nest::ac_generator::init_state_()
ac_generator::init_state_()
{
StimulationDevice::init_state();
}

void
nest::ac_generator::init_buffers_()
ac_generator::init_buffers_()
{
StimulationDevice::init_buffers();
B_.logger_.reset();
}

void
nest::ac_generator::pre_run_hook()
ac_generator::pre_run_hook()
{
B_.logger_.init();

@@ -206,7 +205,7 @@ nest::ac_generator::pre_run_hook()
}

void
nest::ac_generator::update( Time const& origin, const long from, const long to )
ac_generator::update( Time const& origin, const long from, const long to )
{
long start = origin.get_steps();

@@ -231,7 +230,7 @@ nest::ac_generator::update( Time const& origin, const long from, const long to )
}

void
nest::ac_generator::handle( DataLoggingRequest& e )
ac_generator::handle( DataLoggingRequest& e )
{
B_.logger_.handle( e );
}
@@ -241,7 +240,7 @@ nest::ac_generator::handle( DataLoggingRequest& e )
* ---------------------------------------------------------------- */

void
nest::ac_generator::set_data_from_stimulation_backend( std::vector< double >& input_param )
ac_generator::set_data_from_stimulation_backend( std::vector< double >& input_param )
{
Parameters_ ptmp = P_; // temporary copy in case of errors

@@ -264,3 +263,5 @@ nest::ac_generator::set_data_from_stimulation_backend( std::vector< double >& in
// if we get here, temporary contains consistent set of properties
P_ = ptmp;
}

} // namespace nest
2 changes: 2 additions & 0 deletions models/aeif_cond_alpha.cpp
@@ -426,6 +426,8 @@ nest::aeif_cond_alpha::init_buffers_()
void
nest::aeif_cond_alpha::pre_run_hook()
{
ArchivingNode::pre_run_hook_();

// ensures initialization in case mm connected after Simulate
B_.logger_.init();

2 changes: 2 additions & 0 deletions models/aeif_cond_alpha_astro.cpp
@@ -429,6 +429,8 @@ nest::aeif_cond_alpha_astro::init_buffers_()
void
nest::aeif_cond_alpha_astro::pre_run_hook()
{
ArchivingNode::pre_run_hook_();

// ensures initialization in case mm connected after Simulate
B_.logger_.init();

2 changes: 2 additions & 0 deletions models/aeif_cond_alpha_multisynapse.cpp
@@ -441,6 +441,8 @@ aeif_cond_alpha_multisynapse::init_buffers_()
void
aeif_cond_alpha_multisynapse::pre_run_hook()
{
ArchivingNode::pre_run_hook_();

// ensures initialization in case mm connected after Simulate
B_.logger_.init();

2 changes: 2 additions & 0 deletions models/aeif_cond_beta_multisynapse.cpp
@@ -449,6 +449,8 @@ aeif_cond_beta_multisynapse::init_buffers_()
void
aeif_cond_beta_multisynapse::pre_run_hook()
{
ArchivingNode::pre_run_hook_();

// ensures initialization in case mm connected after Simulate
B_.logger_.init();

2 changes: 2 additions & 0 deletions models/aeif_cond_exp.cpp
@@ -421,6 +421,8 @@ nest::aeif_cond_exp::init_buffers_()
void
nest::aeif_cond_exp::pre_run_hook()
{
ArchivingNode::pre_run_hook_();

// ensures initialization in case mm connected after Simulate
B_.logger_.init();

2 changes: 2 additions & 0 deletions models/aeif_psc_alpha.cpp
@@ -416,6 +416,8 @@ nest::aeif_psc_alpha::init_buffers_()
void
nest::aeif_psc_alpha::pre_run_hook()
{
ArchivingNode::pre_run_hook_();

// ensures initialization in case mm connected after Simulate
B_.logger_.init();

2 changes: 2 additions & 0 deletions models/aeif_psc_delta.cpp
@@ -392,6 +392,8 @@ nest::aeif_psc_delta::init_buffers_()
void
nest::aeif_psc_delta::pre_run_hook()
{
ArchivingNode::pre_run_hook_();

// ensures initialization in case mm connected after Simulate
B_.logger_.init();

2 changes: 2 additions & 0 deletions models/aeif_psc_delta_clopath.cpp
@@ -460,6 +460,8 @@ nest::aeif_psc_delta_clopath::init_buffers_()
void
nest::aeif_psc_delta_clopath::pre_run_hook()
{
ArchivingNode::pre_run_hook_();

// ensures initialization in case mm connected after Simulate
B_.logger_.init();

2 changes: 2 additions & 0 deletions models/aeif_psc_exp.cpp
@@ -411,6 +411,8 @@ nest::aeif_psc_exp::init_buffers_()
void
nest::aeif_psc_exp::pre_run_hook()
{
ArchivingNode::pre_run_hook_();

// ensures initialization in case mm connected after Simulate
B_.logger_.init();

2 changes: 2 additions & 0 deletions models/amat2_psc_exp.cpp
@@ -264,6 +264,8 @@ nest::amat2_psc_exp::init_buffers_()
void
nest::amat2_psc_exp::pre_run_hook()
{
ArchivingNode::pre_run_hook_();

// ensures initialization in case mm connected after Simulate
B_.logger_.init();

4 changes: 2 additions & 2 deletions models/bernoulli_synapse.h
@@ -148,10 +148,10 @@ class bernoulli_synapse : public Connection< targetidentifierT >
};

void
check_connection( Node& s, Node& t, size_t receptor_type, const CommonPropertiesType& )
check_connection( Node& s, Node& t, const size_t receptor_type, const synindex syn_id, const CommonPropertiesType& )
{
ConnTestDummyNode dummy_target;
ConnectionBase::check_connection_( dummy_target, s, t, receptor_type );
ConnectionBase::check_connection_( dummy_target, s, t, syn_id, receptor_type );
}

bool
2 changes: 2 additions & 0 deletions models/binary_neuron.h
@@ -434,6 +434,8 @@ template < class TGainfunction >
void
binary_neuron< TGainfunction >::pre_run_hook()
{
ArchivingNode::pre_run_hook_();

// ensures initialization in case mm connected after Simulate
B_.logger_.init();
V_.rng_ = get_vp_specific_rng( get_thread() );
10 changes: 5 additions & 5 deletions models/clopath_synapse.h
@@ -147,7 +147,7 @@ class clopath_synapse : public Connection< targetidentifierT >
// ConnectionBase. This avoids explicit name prefixes in all places these
// functions are used. Since ConnectionBase depends on the template parameter,
// they are not automatically found in the base class.
using ConnectionBase::get_delay;
using ConnectionBase::get_delay_ms;
using ConnectionBase::get_delay_steps;
using ConnectionBase::get_rport;
using ConnectionBase::get_target;
@@ -184,13 +184,13 @@ class clopath_synapse : public Connection< targetidentifierT >
};

void
check_connection( Node& s, Node& t, size_t receptor_type, const CommonPropertiesType& )
check_connection( Node& s, Node& t, const size_t receptor_type, const synindex syn_id, const CommonPropertiesType& )
{
ConnTestDummyNode dummy_target;

ConnectionBase::check_connection_( dummy_target, s, t, receptor_type );
ConnectionBase::check_connection_( dummy_target, s, t, syn_id, receptor_type );

t.register_stdp_connection( t_lastspike_ - get_delay(), get_delay() );
t.register_stdp_connection( t_lastspike_ - get_delay_ms(), get_delay_ms(), 0 );
}

void
@@ -241,7 +241,7 @@ clopath_synapse< targetidentifierT >::send( Event& e, size_t t, const CommonSyna
// use accessor functions (inherited from Connection< >) to obtain delay and
// target
Node* target = get_target( t );
double dendritic_delay = get_delay();
double dendritic_delay = get_delay_ms();
Contributor:
Suggested change:
-    double dendritic_delay = get_delay_ms();
+    const double dendritic_delay = get_delay_ms();

Contributor Author:
Done!


// get spike history in relevant range (t1, t2] from postsynaptic neuron
std::deque< histentry_extended >::iterator start;
2 changes: 2 additions & 0 deletions models/cm_default.cpp
@@ -303,6 +303,8 @@ nest::cm_default::init_recordables_pointers_()
void
nest::cm_default::pre_run_hook()
{
ArchivingNode::pre_run_hook_();

logger_.init();

// initialize the pointers within the compartment tree