Backpropagation neural network
I learned that the activation functions logsig and tansig return output values in the ranges [0 1] and [-1 1], respectively. What will happen if the target values are beyond these limits?
2 Comments
Mohammad Sami
8 Jun 2020
One of the reasons is that larger target values can result in exploding gradients when training the network.
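A minimal MATLAB sketch of the point above (illustrative only; the variable names and the mapminmax rescaling step are my own example, assuming the Deep Learning Toolbox functions logsig, tansig, and mapminmax). Because both transfer functions saturate, an output neuron using them can never produce a value outside [0 1] or [-1 1], so targets beyond those limits can never be matched exactly and the training error cannot reach zero. A common remedy is to rescale the targets into the activation range before training and invert the mapping afterwards:

% logsig and tansig saturate: outputs stay inside [0 1] and [-1 1]
x = linspace(-10, 10, 21);
disp([min(logsig(x)) max(logsig(x))])   % bounded by [0 1]
disp([min(tansig(x)) max(tansig(x))])   % bounded by [-1 1]

% Illustrative workaround: rescale out-of-range targets into [-1 1]
t = 10 * rand(1, 50);                   % example targets well outside [0 1]
[tn, ts] = mapminmax(t, -1, 1);         % tn lies in [-1 1]; ts stores the mapping
% ... train the network on tn instead of t ...
% y = net(xTrain);                      % simulated output, also in [-1 1]
% yOrig = mapminmax('reverse', y, ts);  % map predictions back to the original scale

Note that networks created with feedforwardnet apply mapminmax to inputs and targets by default, so this rescaling is often handled automatically.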
Sivamani S
8 Jun 2020
Answers (0)