
Weights don't initialize.

Noisy on 29 Oct 2011
I created the following network:
P = dataH;
T = dataXsm;
net=network;
net.numInputs = 1;
net.numLayers = 3;
net.biasConnect(1) = 1;
net.biasConnect(2) = 1;
net.biasConnect(3) = 1;
net.inputConnect = [1; 0; 0];
net.layerConnect = [0 0 0; 1 0 0; 0 1 0];
net.outputConnect = [0 0 1];
net.inputs{1}.size = 2;
net.layers{1}.size = 2;
net.layers{1}.transferFcn = 'hardlim';
net.layers{1}.initFcn = 'initnw';
net.layers{2}.size = 10;
net.layers{2}.transferFcn = 'hardlim';
net.layers{2}.initFcn = 'initnw';
net.layers{3}.size = 10;
net.layers{3}.initFcn = 'initnw';
net.layers{3}.transferFcn = 'hardlim';
net.initFcn = 'initlay';
net.IW{1,1}, net.IW{2,1},
net.LW{3,2}
net.b{1}, net.b{3}
net.trainFcn = 'trainc';
net.performFcn = 'sse';
net.adaptFcn = 'trains';
net.trainParam.goal=0.01;
net.trainParam.epochs=100;
net.trainParam.passes = 1;
net = init(net);
a = sim(net,P), e = T-a
net=train(net,P,T);
net.adaptParam.passes = 100;
[net,a,e] = adapt(net,P,T); e
twts = net.IW, tbiase = net.b
but it doesn't work: the weights don't initialize and it returns all 1s as the result:
twts =
    [2x2 double]
    []
    []
a =
    1 1 1 ... 1
    ...
    1 1 1 ... 1
Is something wrong with the layer connections? Or am I initializing something incorrectly?

Accepted Answer

Vito on 30 Oct 2011
No.
A multilayer perceptron doesn't use 'hardlim' ('hardlim' can only classify linearly separable sets, and the point of using two or more layers is to handle problems that are not linearly separable). Use 'logsig' instead.
The equivalent network is a multilayer perceptron:
P = [0 1 0 1; 0 0 1 1];
T = [0 0 0 1];
% Hidden layers of 2 and 10 neurons plus a 1-neuron output layer, all 'logsig'
net = newff(minmax(P), [2,10,1], {'logsig','logsig','logsig'}, 'trainbfg');
net.trainParam.epochs = 100;
net = init(net);
net.IW{1,1}, net.IW{2,1},   % display the initialized input weights
net.LW{3,2}                 % display the initialized layer weights
net.b{1}, net.b{3}          % display the initialized biases
net = train(net, P, T);
a = sim(net, P)
'trainbfg' is BFGS quasi-Newton backpropagation training.
In short, the error was in the network design.
1 Comment
Greg Heath on 31 Oct 2011
Typically, only one hidden layer is needed.
Use as many defaults as possible (help newff).
newff automatically initializes weights with initnw.
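For illustration, a minimal one-hidden-layer sketch that reuses the calling form and XOR-style data from Vito's answer (the hidden-layer size of 10 is an arbitrary choice, not a value from this thread):
% Minimal sketch: one hidden layer, weights initialized automatically by newff
P = [0 1 0 1; 0 0 1 1];
T = [0 0 0 1];
net = newff(minmax(P), [10,1], {'logsig','logsig'}, 'trainbfg');
net.trainParam.epochs = 100;
net = train(net, P, T);
a = sim(net, P)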
If [I N] = size(p) and [O N] = size(t), then there are Neq = N*O training equations and Nw = (I+1)*H + (H+1)*O unknown weights, where H is the number of hidden neurons. For accurate weight estimation it is desired that Neq >> Nw. Typically Neq >= 10*Nw is adequate; however, sometimes a larger ratio (e.g., > 30) is needed, and sometimes a smaller ratio (e.g., 2) will suffice.
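As a hedged sketch of checking that ratio in MATLAB, again using the XOR-style data from this thread (the candidate hidden-layer size H is an illustrative assumption):
% Count training equations vs. unknown weights for one hidden layer of size H;
% the data and H below are illustrative assumptions, not values from this thread.
p = [0 1 0 1; 0 0 1 1];      % I x N input matrix
t = [0 0 0 1];               % O x N target matrix
H = 10;                      % candidate number of hidden neurons
[I, N] = size(p);
[O, N] = size(t);
Neq = N*O                    % number of training equations
Nw  = (I+1)*H + (H+1)*O      % number of unknown weights
ratio = Neq/Nw               % Greg's rule of thumb: want Neq >= 10*Nw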
Hope this helps.
Greg


More Answers (0)
