
Conversation

@Freyr-Wings

The original code wrongly applies sigmoid twice, which is incorrect.
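
For context, here is a minimal sketch of a sigmoid backward step in which the sigmoid is evaluated only once, on the cached pre-activation `Z`. The helper names `sigmoid` and `sigmoid_backward` are assumptions for illustration and not necessarily the repository's exact functions.

```python
import numpy as np

def sigmoid(Z):
    # Standard logistic sigmoid, applied elementwise.
    return 1.0 / (1.0 + np.exp(-Z))

def sigmoid_backward(dA, Z):
    # Backward pass through a sigmoid unit: the sigmoid is applied
    # exactly once to the cached pre-activation Z, and the local
    # derivative is s * (1 - s).
    s = sigmoid(Z)
    return dA * s * (1 - s)
```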

@pangahn

pangahn commented Oct 15, 2018

There is an error in cell In [18]; please fix it as follows:
```python
dAL = - (np.divide(Y, AL) - np.divide(1 - Y, 1 - AL))  # derivative of the cost with respect to AL

grads["dA" + str(L-1)], grads["dW" + str(L)], grads["db" + str(L)] = linear_activation_backward(dAL, current_cache, "sigmoid")
```
