
Backpropagation, multilayer perceptron, neural network

rajesh yakkundimath on 29 December 2011
Dear sir,
I am attaching MATLAB code in which I tried to train a network using feed-forward backpropagation. I am having difficulty with this instruction:
net_FFBP = createNet(inputsize, mimax, hneurons, fcnCELL, initflag, trainalgo, paramatrix, sameWEIGHT);
How can I save the parameters in net_FFBP? I have attached the code below.
function TrainingNet
load Feature.txt; %load the features
FeatureS = Feature'; %Convert to column array
load Outtype.txt; %load output type
OuttypeS = Outtype';
inputsize = size(FeatureS, 1);
min_data = min(min(FeatureS));
max_data = max(max(FeatureS));
mimax = [min_data max_data];
hneurons = 2000;
%initialize parameters for creating the MLP.
fcnCELL = {'logsig' 'logsig'};
initflag = [0 1];
trainalgo = 'gdm';
paramatrix = [10000 50 0.9 0.6]; % epochs = 100, show = 50, learning rate = 0.9, momentum term = 0.6
sameWEIGHT = [];
net_FFBP = creteNet(inputsize, mimax, hneurons, fcnCELL, initflag, trainalgo, paramatrix, sameWEIGHT);
net_FFBP = newff(FeatureS, OuttypeS, 39);
[net_FFBP] = train(net_FFBP, FeatureS, OuttypeS);
save net_FFBP net_FFBP;
disp('Done: Training Network');
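For reference, here is a minimal sketch (assuming the older Neural Network Toolbox newff/train interface used in the code above; variable names follow that code) of how the trained network object and its individual parameters could be saved and reloaded:
% Minimal sketch: saving/reloading the trained network and pulling out its parameters.
IW = net_FFBP.IW{1,1};               % input-to-hidden weight matrix
LW = net_FFBP.LW{2,1};               % hidden-to-output weight matrix
b  = net_FFBP.b;                     % bias vectors (cell array)
tp = net_FFBP.trainParam;            % training parameters (epochs, goal, lr, ...)
save('net_FFBP.mat', 'net_FFBP');    % save the whole network object
S = load('net_FFBP.mat');            % reload it later
Y = sim(S.net_FFBP, FeatureS);       % use the reloaded network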

Accepted Answer

Greg Heath on 29 December 2011
% function TrainingNet
% load Feature.txt; %load the features
% FeatureS = Feature'; %Convert to column array
% load Outtype.txt; %load output type
% OuttypeS = Outtype';
[I N ] = size(FeatureS)
[O N ] = size(OuttypeS)
minmaxF = minmax(FeatureS) % Is a matrix [I 2]
Neq = N*O % Number of training equations
% I-H-O node topology
% Nw = (I+1)*H+(H+1)*O % Number of unknown weights
% Want Neq >> Nw, i.e., H << Hub
Hub = (Neq-O)/(I+O+1) % Neq = Nw
r = 10 % Neq > r*Nw, ~2 < r < ~30
H = floor((Neq/r-O)/(I+O+1))
How did you get H = 2000 ???
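As a rough worked example of the sizing rule above (the numbers here are hypothetical, not taken from the attached data):
% Hypothetical sizing example: N = 500 samples, I = 20 features, O = 3 outputs.
N = 500; I = 20; O = 3;
Neq = N*O                          % 1500 training equations
Hub = (Neq-O)/(I+O+1)              % about 62; upper bound where Neq = Nw
r = 10;                            % want Neq >= r*Nw
H = floor((Neq/r - O)/(I+O+1))     % 6 hidden neurons, far fewer than 2000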
% %initialize parameters for creating the MLP.
% fcnCELL = {'logsig' 'logsig'};
% initflag = [0 1];
What does initflag do?
% trainalgo = 'gdm';
% paramatrix = [10000 50 0.9 0.6]; % epochs = 100, show = 50,
100 or 10,000?
% learning rate = 0.9, momentum term = 0.6
% sameWEIGHT = [];
I suggest first using the defaults in NEWFF
% net_FFBP = creteNet(inputsize, mimax, hneurons, fcnCELL, initflag, trainalgo, paramatrix, sameWEIGHT);
Is this supposed to be a replacement for NEWFF and net.Param.* ??
% net_FFBP = newff(FeatureS, OuttypeS, 39);
Now H = 39 ??
% [net_FFBP] = train(net_FFBP, FeatureS, OuttypeS);
% save net_FFBP net_FFBP;
% disp('Done: Training Network');
What is your question ??
Greg
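Following up on the suggestion above to start from the NEWFF defaults (tansig hidden layer, purelin output, trainlm training), here is a minimal sketch; the hidden-layer size H is illustrative only:
% Minimal sketch using NEWFF defaults; H chosen small as a starting point.
H = 10;
net = newff(FeatureS, OuttypeS, H);     % default transfer and training functions
net = train(net, FeatureS, OuttypeS);   % default trainParam settings
Y   = sim(net, FeatureS);               % check the fit on the training data
save('net_default.mat', 'net');         % keep the trained network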

More Answers (0)

