What is the parameter minimum performance gradient (trainParam.min_grad) of traingd?
I use the training function "traingd" to train a shallow neural network:
trainedNet = train(net,X,T)
For the training function "traingd": How is the parameter minimum performance gradient (net.trainParam.min_grad) defined?
Since the gradient used in gradient descent is usually a vector, but net.trainParam.min_grad is a scalar value, I am confused.
Is it the change in the performance (loss) between two iterations, and if so, does it refer to the training, validation, or testing error?
Thanks in advance!
I use MATLAB 2013 and 2015 with the Neural Network Toolbox.
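For reference, a minimal setup along these lines (the use of feedforwardnet and the hidden layer size are just illustrative assumptions, not my exact code) shows where the parameter appears:
net = feedforwardnet(10,'traingd');    % shallow network trained with gradient descent
net.trainParam                         % displays all traingd training parameters, including min_grad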
Accepted Answer
Rishabh Mishra
28 September 2020
Edited: Rishabh Mishra, 28 September 2020
Hi,
Based on your description of the issue, I would state a few points:
- I agree that the gradient is a vector quantity and points in the direction of maximum change of the cost function.
- ‘net.trainParam.min_grad’ is a scalar (numeric) quantity. The parameter ‘min_grad’ is the minimum magnitude (a scalar) of the gradient (a vector) at which training of the neural network terminates.
- When the magnitude of the gradient falls below ‘min_grad’, the network is considered optimized and further training stops. The criterion is therefore the size of the gradient itself, not the change in performance between two iterations.
Hope this helps.
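As a minimal sketch (the network size, dummy data, and parameter values below are assumptions chosen only for illustration), you can see min_grad acting as a stopping criterion in the training record returned by train:
X = rand(5,100);                       % dummy inputs: 5 features, 100 samples
T = rand(1,100);                       % dummy targets
net = feedforwardnet(10,'traingd');    % shallow network, gradient descent training
net.trainParam.min_grad = 1e-5;        % stop once the gradient magnitude drops below this threshold
net.trainParam.epochs = 1000;          % upper bound on the number of epochs
[trainedNet,tr] = train(net,X,T);
tr.stop                                % stop reason, e.g. 'Minimum gradient reached.'
tr.gradient(end)                       % gradient magnitude at the final epoch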