This challenge is to return WH_delta and WP_delta, given X, WH, WP, and EPY, using ReLU on the hidden layer and Softmax on the output layer. The test cases accumulate dWP and dWH to train neural nets for a Counter, a Subtractor, and a Mux; each test case has four output cases. ReLU performs well on multiple-output cases.
[dWP,dWH]=Neural_Back_Propagation_ReLU(X,WH,WP,EPY)
The MATLAB/LaTeX code for generating the back-propagation chart is included in the template.
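The problem statement does not fix the array orientations or the exact update equations, so the following is only a sketch of the standard backward pass it describes, written here in NumPy for illustration: ReLU on the hidden layer, Softmax on the output, and the combined softmax/cross-entropy gradient `Y - EPY` propagated back to produce dWP and dWH. The assumed layout (samples as columns, `H = relu(WH @ X)`, `Y = softmax(WP @ H)`, no bias terms) is a guess, not the problem's specification.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def softmax(z):
    # Column-wise softmax, shifted for numerical stability.
    e = np.exp(z - z.max(axis=0, keepdims=True))
    return e / e.sum(axis=0, keepdims=True)

def neural_back_propagation_relu(X, WH, WP, EPY):
    """Return (dWP, dWH) for one backward pass.

    Assumed layout (not specified by the problem statement):
    columns of X are samples, H = relu(WH @ X), Y = softmax(WP @ H),
    and EPY holds the expected outputs as one-hot columns.
    """
    H = relu(WH @ X)        # hidden-layer activations
    Y = softmax(WP @ H)     # output probabilities
    dZ2 = Y - EPY           # softmax + cross-entropy gradient
    dWP = dZ2 @ H.T         # output-layer weight gradient
    dH = WP.T @ dZ2         # backpropagate into the hidden layer
    dZ1 = dH * (H > 0)      # ReLU gate: gradient passes only where H > 0
    dWH = dZ1 @ X.T         # hidden-layer weight gradient
    return dWP, dWH
```

A quick sanity check of the convention: if EPY equals the network's own output Y, the error term `Y - EPY` vanishes and both gradients are zero.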