Output of npm list:
-- [email protected]
The weights at the second layer of the [2, 3, 1] network (XOR example) never change:
[ { bias: 1, weights: [ -0.4384047377812256, 0.71476953983941 ] },
{ bias: 1, weights: [ 0.836493930022522, 0.7845843352178457 ] },
{ bias: 1, weights: [ -0.8790343793408859, -1.049725288141738 ] } ]
These weights remain identical after 5 (or any other number of) training epochs.
Interestingly, the JavaScript version does not have this problem.
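For comparison, here is a minimal hand-rolled backpropagation sketch (independent of the library, written for this report) for the same [2, 3, 1] XOR setup. With a correct implementation, the second-layer (hidden) weights should differ from their initial values after even a few epochs:

```javascript
// Minimal 2-3-1 sigmoid MLP trained on XOR with plain backpropagation.
// Sanity check only, not the library's code: after a few epochs the
// hidden-layer weights should no longer equal their initial values.
function sigmoid(x) { return 1 / (1 + Math.exp(-x)); }

function makeLayer(inputs, outputs) {
  return Array.from({ length: outputs }, () => ({
    bias: 1,
    weights: Array.from({ length: inputs }, () => Math.random() * 2 - 1),
  }));
}

function forward(layer, input) {
  return layer.map(n =>
    sigmoid(n.bias + n.weights.reduce((s, w, i) => s + w * input[i], 0)));
}

function train(hidden, output, data, epochs, lr) {
  for (let e = 0; e < epochs; e++) {
    for (const [input, target] of data) {
      const h = forward(hidden, input);
      const o = forward(output, h);
      // Output-layer deltas: (target - out) * sigmoid'(net)
      const dOut = o.map((v, k) => (target[k] - v) * v * (1 - v));
      // Hidden-layer deltas: backpropagate through the output weights
      const dHid = h.map((v, j) =>
        v * (1 - v) * output.reduce((s, n, k) => s + dOut[k] * n.weights[j], 0));
      output.forEach((n, k) => {
        n.bias += lr * dOut[k];
        n.weights = n.weights.map((w, j) => w + lr * dOut[k] * h[j]);
      });
      hidden.forEach((n, j) => {
        n.bias += lr * dHid[j];
        n.weights = n.weights.map((w, i) => w + lr * dHid[j] * input[i]);
      });
    }
  }
}

const xor = [[[0, 0], [0]], [[0, 1], [1]], [[1, 0], [1]], [[1, 1], [0]]];
const hidden = makeLayer(2, 3); // the second layer that stays frozen in the report
const output = makeLayer(3, 1);
const before = JSON.stringify(hidden);
train(hidden, output, xor, 5, 0.5);
console.log(before !== JSON.stringify(hidden)); // → true: hidden weights changed
```

If the same check against the library's second-layer weights prints `false`, the hidden-layer gradient is likely never being applied.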