Back propagation neural network
Views: 7 (last 30 days)
I learnt that the activation functions logsig and tansig return values in [0, 1] and [-1, 1], respectively. What will happen if the target values are beyond these limits?
Comments: 2
Mohammad Sami
8 June 2020
One of the reasons is that larger target values can cause exploding gradients when training the network. More fundamentally, a bounded activation can never produce an output outside its range, so a target beyond [0, 1] (logsig) or [-1, 1] (tansig) is unreachable and the training error can never go to zero. The usual remedy is to rescale the targets into the activation's output range before training (and invert the scaling on the network's predictions).
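A small numerical sketch of the unreachable-target point above (written in Python for illustration; tansig is mathematically equivalent to tanh). With a target of 2.0, which lies outside tansig's (-1, 1) output range, the absolute error stays at or above 1.0 no matter how large the pre-activation input grows:

```python
import math

def tansig(n):
    # tansig(n) = 2/(1+exp(-2n)) - 1, which equals tanh(n); output is in (-1, 1)
    return math.tanh(n)

target = 2.0  # outside tansig's output range
# Even for very large inputs, the output saturates near 1.0,
# so the error is bounded below by target - 1 = 1.0
errors = [abs(target - tansig(n)) for n in (1.0, 10.0, 100.0)]
print(errors)
```

The same reasoning applies to logsig and targets outside [0, 1], which is why target normalization is a standard preprocessing step.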
Answers (0)