problem with configuration of neural network
Hi, I have a problem using a neural network. Although I set all the parameters in 'trainParam', it uses its default settings during training. The way I set the parameters is shown below:
L = [4 8];
net = newcf(P, T, L);              % P and T are my input and target data (not shown)
net.trainParam.goal = 1e-5;
net.trainParam.min_grad = 1e-5;
net.trainParam.epochs = 200;
net.trainFcn = 'traincgf';
net.layers{1}.transferFcn = 'tansig';
net.layers{2}.transferFcn = 'tansig';
[net, tr] = train(net, P, T);
Accepted Answer
Greg Heath
7 Aug 2013
>> help newcf
Create a cascade-forward backpropagation network.
Obsoleted in R2010b NNET 7.0. Last used in R2010a NNET 6.0.4.
The recommended function is cascadeforwardnet.
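As an aside, a roughly equivalent setup with cascadeforwardnet would look something like the sketch below. This is only a sketch: it assumes your P and T from the question, the same [4 8] hidden-layer sizes, tansig hidden layers, and traincgf training; defaults may differ between releases.
% Sketch only: cascadeforwardnet equivalent of the newcf call above
net = cascadeforwardnet([4 8], 'traincgf');  % training function fixed at creation
net.layers{1}.transferFcn = 'tansig';        % tansig is already the hidden-layer default
net.layers{2}.transferFcn = 'tansig';
net.trainParam.goal     = 1e-5;              % set trainParam fields after trainFcn is chosen
net.trainParam.min_grad = 1e-5;
net.trainParam.epochs   = 200;
[net, tr] = train(net, P, T);
Y = net(P);                                  % network outputs for the training inputs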
Using the example data from the help text and rng(0) to initialize the random number generator, I get:
close all, clear all, clc
P = [0 1 2 3 4 5 6 7 8 9 10];
T = [0 1 2 3 4 3 2 1 2 3 4];
L=[4 8];
net=newcf(P,T,L);
% Defaults
net.trainParam.goal          % 0
net.trainParam.min_grad      % 1e-5
net.trainParam.epochs        % 1000
net.trainFcn                 % trainlm
net.layers{1}.transferFcn    % tansig
net.layers{2}.transferFcn    % tansig
% Assigned
net.trainParam.goal = 1e-5;
net.trainParam.min_grad = 1e-5;
net.trainParam.epochs = 200;
net.trainFcn = 'traincgf';
net.layers{1}.transferFcn = 'tansig';
net.layers{2}.transferFcn = 'tansig';
rng(0)
[net,tr,Y]=train(net,P,T);
NMSE = mse(T-Y)/var(T,1) % 0.2933
% Final
net.trainParam.goal          % => 0
net.trainParam.min_grad      % => 1e-10
net.trainParam.epochs        % => 1000
net.trainFcn % traincgf
net.layers{1}.transferFcn % tansig
net.layers{2}.transferFcn % tansig
This shows that goal, min_grad, and epochs have all changed from the assigned values. However, only goal and epochs reverted to their default values; min_grad ended up at 1e-10 rather than either the assigned 1e-5 or the 1e-5 default shown above.
Obviously, there are bugs.
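One likely contributing factor, worth checking on your own release: assigning net.trainFcn resets net.trainParam to the defaults of the newly assigned training function, so trainParam values set before that line are discarded. A minimal sketch with the same data and settings as above, reordered so the custom values are applied after trainFcn:
% Sketch only: set trainFcn before the trainParam fields
net = newcf(P, T, [4 8]);
net.trainFcn = 'traincgf';          % changing trainFcn resets trainParam to traincgf defaults
net.trainParam.goal     = 1e-5;     % so assign the custom values afterwards
net.trainParam.min_grad = 1e-5;
net.trainParam.epochs   = 200;
net.layers{1}.transferFcn = 'tansig';
net.layers{2}.transferFcn = 'tansig';
rng(0)
[net, tr] = train(net, P, T);
net.trainParam.goal                 % should now remain 1e-5 after training
If the displayed values match what was assigned, the earlier resets came from the trainFcn assignment rather than from train itself.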
The important thing is whether the normalized MSE is still acceptable when longer data sets are used.
Hope this helps.
Thank you for formally accepting my answer
Greg
More Answers (0)