Cross entropy learning #135

@mikerabat

Description

I hope I'm not being too annoying, but you guys are the experts in this area, so I'd like to discuss another neat feature with you...

While browsing through "Neural Networks for Pattern Recognition" by C. M. Bishop I noticed that there is more than the standard error-backpropagation setup with a mean squared error objective: there is also the cross-entropy loss function. A few sources claim that this loss function indeed allows faster learning progress...
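As far as I understand it, the usual argument is that with a sigmoid output unit the cross-entropy gradient drops the sigmoid-derivative factor, so a unit that is saturated but badly wrong still gets a strong error signal, while with mean squared error that signal is damped toward zero. Here is a minimal NumPy sketch of that difference (just an illustration under my assumptions, not code from or for this library; the function names are hypothetical):

```python
# Minimal sketch (NumPy): output-layer error signal for MSE vs. cross-entropy
# with a single sigmoid output unit. All names here are illustrative only.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mse_delta(z, target):
    # d(MSE)/dz = (y - t) * y * (1 - y); the y*(1 - y) factor goes to zero
    # when the unit saturates, so learning stalls.
    y = sigmoid(z)
    return (y - target) * y * (1.0 - y)

def cross_entropy_delta(z, target):
    # d(CE)/dz = y - t; the sigmoid derivative cancels, so the error signal
    # stays proportional to how wrong the output is.
    y = sigmoid(z)
    return y - target

# A saturated unit that is completely wrong: output ~1 while the target is 0.
z, t = 8.0, 0.0
print("MSE delta:          ", mse_delta(z, t))            # ~0.000335 -> tiny update
print("cross-entropy delta:", cross_entropy_delta(z, t))  # ~0.9997   -> large update
```

That large gradient on badly misclassified examples is essentially where the "faster learning" claim comes from.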

What do you think? Would that be a viable feature for the library?

Metadata

Labels

documentation (Improvements or additions to documentation)
