Effective number of parameters in a neural network

Views: 1 (last 30 days)
Jérôme on 19 May 2013
Hello,
I'm training a neural network using the Bayesian approach. In the documentation, I read the following: "One feature of this algorithm is that it provides a measure of how many network parameters (weights and biases) are being effectively used by the network."
But I don't quite understand something: once I know the number of effective parameters, what can I do with this information? For starters, how is it that some of the parameters are not used? Why are some weights inactive? Secondly, can knowing that help me prune the network and reduce the number of neurons, for example? If so, how? If not, what is the practical use of that piece of information?
Thanks in advance for your help!
J

Accepted Answer

Greg Heath on 19 May 2013
TRAINBR automatically chooses the regularization ratio that weights the sum of squared weights added to the sum of squared errors in the objective function. The choice depends on the effective number of weights.
I don't recall the formula; however, you should be able to find it in the source code, its references, or online.
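For reference, the formula usually quoted for the effective number of parameters comes from MacKay's evidence framework (the Bayesian-regularization references, MacKay 1992 and Foresee & Hagan 1997): with objective F = β·SSE + α·SSW and Gauss-Newton Hessian H = 2βJᵀJ + 2αI, the effective count is γ = N − 2α·tr(H⁻¹), which always lies between 0 and N. Below is a minimal NumPy sketch of that formula; the function name and toy Jacobian are illustrative, not the toolbox's internals:

```python
import numpy as np

def effective_params(J, alpha, beta):
    """gamma = N - 2*alpha*tr(H^-1), with Gauss-Newton Hessian
    H = 2*beta*J'J + 2*alpha*I (J = Jacobian of network errors)."""
    N = J.shape[1]                                      # total weights and biases
    H = 2.0 * beta * (J.T @ J) + 2.0 * alpha * np.eye(N)
    return N - 2.0 * alpha * np.trace(np.linalg.inv(H))

# Toy Jacobian: 5 training cases, 3 weights (illustrative data only)
rng = np.random.default_rng(0)
J = rng.standard_normal((5, 3))

g = effective_params(J, alpha=0.01, beta=1.0)
print(g)  # lies between 0 and 3: how many of the 3 weights are "used"
```

With α → 0 (no penalty) γ approaches N, i.e. all weights are used; with large α the penalty dominates and γ drops toward 0, which is why heavily regularized weights count as "inactive".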
The only way I can see you using it is if you use TRAINLM with the regularization option of the mse performance function. In that case, the user chooses the ratio. However, I don't know of a good reason to do that instead of using TRAINBR.
Hope this helps.
Thank you for formally accepting my answer.
Greg

More Answers (0)
