
Add clipped relu to deepspeech2 #20

Open

delta2323 wants to merge 2 commits into master from fix-deepspeech2

Conversation


delta2323 commented Nov 8, 2016

  • Add the missing Clipped ReLU after the convolution layers and the bidirectional RNNs.
  • Concatenate the forward and reverse RNN outputs instead of adding them.

Currently at least one point still differs from the Torch implementation: we should add BatchNormalization to the RNN units (Reference). Doing so requires modifying the LSTM link and the GRU link, so these changes should be applied to Chainer itself.
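For reference, the two changes above can be sketched without Chainer. This is a minimal NumPy illustration (not the code in this PR): DeepSpeech2's clipped ReLU caps activations at 20, and concatenating forward/reverse RNN outputs doubles the feature dimension rather than keeping it fixed as addition does. The shapes below are hypothetical.

```python
import numpy as np

def clipped_relu(x, z=20.0):
    # min(max(x, 0), z): DeepSpeech2 clips activations at z=20
    # to keep them bounded during training.
    return np.minimum(np.maximum(x, 0.0), z)

# Hypothetical bidirectional RNN outputs with shape (time, features).
h_fwd = np.ones((4, 8))
h_rev = np.ones((4, 8))

# Concatenation along the feature axis gives (4, 16);
# element-wise addition would instead keep the shape (4, 8).
h = np.concatenate([h_fwd, h_rev], axis=1)
print(h.shape)                                     # (4, 16)
print(clipped_relu(np.array([-1.0, 5.0, 25.0])))   # [ 0.  5. 20.]
```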

