Zero-weights initialization in feedforward network

7 views (last 30 days)
Christoph on 21 Dec 2013
Answered: Christoph on 23 Dec 2013
Hello everybody, I've got a problem while programming a neural network.
r=xlsread('Juni_Test_Korrelation');
u=r(2:31,3);
u1=u';
net.inputweights{1,1}.initFcn='rands';
net.biases{1}.initFcn='rands';
net=init(net)
net.IW{1,1}
net.b{1}
net=newff(minmax(u1),[5,1],{'tansig','purelin'},'trainlm');
net.trainParam.show = 50;
net.trainParam.lr = 0.09;
net.trainParam.epochs = 120;
net.trainParam.goal = 1;
At the beginning I set the initFcn for the weights and biases to 'rands'. After that I init the net and want to have a look at the weights and biases, but I always get the same values. The only thing that changes is a minus sign appearing randomly in front of the values. So I get these values for the weights:
-0.0319
-0.0319
0.0319
-0.0319
0.0319
and these for the biases:
15.8047
-12.3047
8.8047
5.3047
-1.8047
and this happens every time. Even if I set the initFcn to 'initzero', the weights and biases remain the same, so I don't get any reproducible conditions. Can someone please tell me what to do so that I can get constant values for the initialization, or just zeros? (I know I could write the values for the weights and biases manually, like net.IW{1,1}=[0;0;...], but this would take too long for these matrices.)
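(Written out for this net with one input, five hidden nodes and one output, that manual assignment would be roughly the sketch below, and I'd like to avoid maintaining something like this by hand:)
net.IW{1,1} = zeros(5,1);   % input-to-hidden weights: 5 hidden nodes, 1 input
net.LW{2,1} = zeros(1,5);   % hidden-to-output weights
net.b{1}    = zeros(5,1);   % hidden-layer biases
net.b{2}    = 0;            % output bias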
Thanks for any helpful advice!

Accepted Answer

Greg Heath on 22 Dec 2013
Edited: Greg Heath on 22 Dec 2013
You have to define a net via net = newff before you can assign any properties or values.
With OBSOLETE functions like newff, the nets are automatically initialized with random weights.
In order to get a different set of initial weights, initialize the RNG, e.g.,
rng(0)
For details see
help rng
doc rng
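A minimal sketch of that order, reusing u1 and the layer sizes from the question (assuming a 2013-era toolbox where both newff and rng are available):
rng(0)                                                         % seed the RNG so the initialization is reproducible
net = newff(minmax(u1),[5,1],{'tansig','purelin'},'trainlm');  % newff creates AND randomly initializes the net
net.IW{1,1}   % same values on every run with the same seed
net.b{1}
rng(1)                                                         % a different seed ...
net = init(net);                                               % ... re-initializes to a different random set
net.IW{1,1}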
Also see some of my code examples. Search NEWSGROUP and ANSWERS using
greg newff rng
Hope this helps.
Thank you for formally accepting my answer
Greg
  1 Comment
Greg Heath on 22 Dec 2013
I recommend using all of the defaults except the number of hidden nodes. See
help newff
doc newff
for the basic examples.
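For instance, a sketch using the newer newff calling form, where t is a placeholder for the target data (not shown in the question) and only the number of hidden nodes H is chosen; everything else stays at its default:
H = 5;                   % the only non-default choice
net = newff(u1, t, H);   % default transfer fcns, training fcn, etc.
net = train(net, u1, t);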


More Answers (1)

Christoph on 23 Dec 2013
Thanks Greg, I actually could solve the problem by including one line of code:
net.layers{1}.initFcn='initwb'
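In context, the whole sequence now looks roughly like this (a sketch that combines the original code with that line, using 'initzero' for the first-layer weights and biases):
net = newff(minmax(u1),[5,1],{'tansig','purelin'},'trainlm');  % define the net first
net.layers{1}.initFcn = 'initwb';              % the added line: layer 1 now uses the per-weight init functions
net.inputWeights{1,1}.initFcn = 'initzero';    % zero the input weights ...
net.biases{1}.initFcn = 'initzero';            % ... and the layer-1 biases
net = init(net);
net.IW{1,1}   % all zeros, identical on every run
net.b{1}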
With this I can initialize the weights and biases with a zero vector and have a reproducible start. The problem I've got now is getting the performance down to the goal I set. If I'm right, there are several ways to address this, such as:
-training function
-layer nodes (hidden layer size)
-transfer function
-learning rate
etc.
but I don't really get which parameter will bring the biggest improvement. Do you have any advice?
Thanks for helping!
