Activation Function #1648
Unanswered
KostasVog20 asked this question in Q&A
Greetings everyone, including @lululxvi!
I'm interested in learning more about activation functions and how they are handled in this package. Specifically, when we specify "tanh" as the activation function, is it applied to both the hidden layers and the output layer, or does the output layer default to a linear (identity) activation? Furthermore, is it possible to use "tanh" for all hidden layers while applying a "relu" activation only to the output layer, without modifying the existing code?
Replies: 1 comment

The latter. You need to implement the code yourself.
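A minimal sketch of one way to do this, assuming the DeepXDE `FNN` network and the PyTorch backend: since the built-in networks leave the output layer linear, an output transform can compose a ReLU onto the raw network output, which is equivalent to using ReLU as the output-layer activation. The layer sizes below are purely illustrative, not from the original discussion.

```python
import deepxde as dde
import torch

# Hypothetical architecture: "tanh" is applied to the hidden layers only;
# the final (output) layer of FNN is linear by default.
net = dde.nn.FNN([2] + [50] * 3 + [1], "tanh", "Glorot uniform")

# Apply ReLU on top of the linear output; x is the network input and
# y is the raw network output. With the TensorFlow backend, tf.nn.relu
# would be used instead of torch.relu.
net.apply_output_transform(lambda x, y: torch.relu(y))
```

Alternatively, one could subclass the network and change the output layer directly, but the output-transform approach avoids modifying the package code.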