5-fold cross validation with neural networks (function approximation)

Views: 84 (last 30 days)
I have MATLAB code which implements hold-out cross-validation (attached). I am looking for help to perform 5-fold cross-validation on the same model architecture. Please help me figure this out. Thank you.
%% Data
X = x'; % input always stays the same
Y = yte'; % target
%% Model parameters
% Choose a training function ('trainlm', 'trainscg', 'traingdx')
trainFcn = 'trainlm';
% Choose the number of neurons in the hidden layer
hiddenLayerSize = 17;
% Create the network first, then overwrite its default properties
net = fitnet(hiddenLayerSize,trainFcn);
% view(net)
% Choose activation functions (logsig, tansig, purelin)
net.layers{1}.transferFcn = 'logsig'; % hidden layer
net.layers{2}.transferFcn = 'poslin'; % output layer
% Choose an evaluation metric (mae, mse)
net.performFcn = 'mse';
net.plotFcns = {'plotperform','plottrainstate','ploterrhist','plotregression','plotfit'};
%% Data pre-processing
net.input.processFcns = {'removeconstantrows','mapstd'}; % input: remove constant rows, standardize to zero mean / unit variance
net.output.processFcns = {'removeconstantrows','mapstd'}; % output: remove constant rows, standardize to zero mean / unit variance
%% Data split (0.70 / 0.15 / 0.15)
net.divideFcn = 'dividerand'; % divide the data randomly
net.divideMode = 'sample'; % each observation is a sample
net.divideParam.trainRatio = 70/100; % train
net.divideParam.valRatio = 15/100; % validation
net.divideParam.testRatio = 15/100; % test
%% Train a neural network
[net,tr] = train(net,X,Y);
% net - the trained model
% tr - the training record
%% network performance
figure(1), plotperform(tr) % Plot network performance
figure(2), plottrainstate(tr) % Plot training state values.
%% Error and R2
Ytest = net(X); % prediction on X
e = gsubtract(Y,Ytest); % error (Yactual - Ypred)
MSE = perform(net,Y,Ytest); % network performance using net.performFcn ('mse' here)
MAE = mae(net,Y,Ytest);
%% Regression performance
trOut = Ytest(tr.trainInd); % training output (predicted)
trTarg = Y(tr.trainInd); % training target (actual)
vOut = Ytest(tr.valInd); % validation output
vTarg = Y(tr.valInd); % validation target
tsOut = Ytest(tr.testInd); % test output
tsTarg = Y(tr.testInd); % test target
figure(4), plotregression(trTarg, trOut, 'Train', vTarg, vOut, 'Validation', tsTarg, tsOut, 'Testing',Y,Ytest,'All')
% R2
R2_Train= regression(trTarg, trOut)^2;
R2_Val= regression(vTarg, vOut)^2;
R2_Test= regression(tsTarg, tsOut)^2;
R2_all= regression(Y,Ytest)^2;
%figure(3), ploterrhist(e) % Plot error histogram
1 Comment
Chetan Badgujar on 5 Mar 2021
Here is what I did!!
%% row to column
x = X'; % input always stays the same
t = yte'; % target; change according to tr, te and Pn
%%
k = 10;
[I, N] = size(x)
[O, N] = size(t) % [ 1 94 ]
rng('default') % or substitute your lucky number
ind0 = randperm(N);
% ind0 = 1:N; % for debugging
M = floor(N/k)    % 9, length of valind and tstind
Ntrn = N - 2*M    % 76, length of trnind
Ntrneq = Ntrn*O   % 76, number of training equations
H = 17            % number of hidden nodes (default is 10)
%%
net = fitnet(H);
net.divideFcn = 'divideind';
%%
for i = 1:k
    rngstate(i) = rng;
    net = configure(net,x,t);
    valind = 1 + M*(i-1) : M*i;
    if i == k
        tstind = 1:M;
        trnind = [ M+1:M*(k-1) , M*k+1:N ];
    else
        tstind = valind + M;
        trnind = [ 1:valind(1)-1 , tstind(end)+1:N ];
    end
    trnInd = ind0(trnind); % note upper & lower case "i"
    valInd = ind0(valind);
    tstInd = ind0(tstind);
    net.divideParam.trainInd = trnInd;
    net.divideParam.valInd   = valInd;
    net.divideParam.testInd  = tstInd;
    [net, tr, Ytest] = train(net, x, t); % third output of train is the network output (prediction on x)
    stopcrit{i,1}  = tr.stop;
    bestepoch(i,1) = tr.best_epoch;
    % e = gsubtract(t,Ytest); % error (Yactual - Ypred)
    MSE(i,1) = perform(net,t,Ytest); % network performance using net.performFcn ('mse' by default)
    MAE(i,1) = mae(net,t,Ytest);
    trOut  = Ytest(tr.trainInd); % training output (predicted)
    trTarg = t(tr.trainInd);     % training target (actual)
    vOut   = Ytest(tr.valInd);   % validation output
    vTarg  = t(tr.valInd);       % validation target
    tsOut  = Ytest(tr.testInd);  % test output
    tsTarg = t(tr.testInd);      % test target
    R2_Train(i,1) = regression(trTarg, trOut)^2;
    R2_Val(i,1)   = regression(vTarg, vOut)^2;
    R2_Test(i,1)  = regression(tsTarg, tsOut)^2;
    R2_all(i,1)   = regression(t,Ytest)^2;
    % calculate weight products for sensitivity analysis
    W1 = net.IW{1};    % input-to-hidden weights (H x I)
    W2 = net.LW{2};    % hidden-to-output weights (O x H)
    u  = abs(W1.*W2'); % element-wise product (single output assumed)
    us = sum(u);
    Ri = us./sum(us)   % relative importance of each input
end
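To compare the folds afterwards, one way to summarize the stored metrics (just a sketch; it assumes the loop above has filled MSE, MAE and the R2_* column vectors) is:
% summarize the k folds (assumes MSE, MAE and R2_* were filled in the loop above)
foldResults = table((1:k)', MSE, MAE, R2_Train, R2_Val, R2_Test, ...
    'VariableNames',{'Fold','MSE','MAE','R2_Train','R2_Val','R2_Test'})
fprintf('Mean test R^2 over %d folds: %.3f (std %.3f)\n', k, mean(R2_Test), std(R2_Test));
fprintf('Mean MSE over %d folds: %.4f (std %.4f)\n', k, mean(MSE), std(MSE));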


Accepted Answer

Madhav Thakker on 15 Mar 2021
Hi Chetan,
The Neural Network Toolbox does not provide built-in cross-validation. However, you can take advantage of the Statistics and Machine Learning Toolbox for this purpose, if you have a license for it.
The code shown below performs the following tasks:
1. Creates a 10-fold cross-validation object for the given data, in this case the Fisher iris data set.
2. Transforms the data to suit neural networks. Neural networks take data as observations in columns, so a minor transformation is needed if the data has observations in rows.
3. Creates a network and sets the validation and test ratios to zero. This ensures that all of the training data is used for training.
4. Computes the loss (performance) on the data held out for testing in the current fold.
5. Saves all the networks and performance values, and returns the network with the minimum loss.
clear all;
close all;
clc;
load fisheriris;
y = species;
x = meas;
numFolds = 10;
c = cvpartition(y,'KFold',numFolds);
% storage for the per-fold results
netAry  = cell(numFolds,1);
perfAry = zeros(numFolds,1);
for i = 1:numFolds
    % get train and test data for this fold
    trIdx = c.training(i);
    teIdx = c.test(i);
    xTrain = x(trIdx,:); % index rows, not elements
    yTrain = y(trIdx);
    xTest  = x(teIdx,:);
    yTest  = y(teIdx);
    % transform data to columns as expected by neural nets
    xTrain = xTrain';
    xTest  = xTest';
    yTrain = dummyvar(grp2idx(yTrain))';
    yTest  = dummyvar(grp2idx(yTest))';
    % create the net and set the test and validation ratios to zero
    net = patternnet(10);
    net.divideParam.trainRatio = 1;
    net.divideParam.testRatio  = 0;
    net.divideParam.valRatio   = 0;
    % train the network
    net = train(net,xTrain,yTrain);
    yPred = net(xTest);
    perf = perform(net,yTest,yPred);
    disp(perf);
    % store results
    netAry{i}  = net;
    perfAry(i) = perf;
end
% take the network with the minimum loss value
[minPerf,minPerfId] = min(perfAry);
bestNet = netAry{minPerfId};
Hope this helps.
1 Comment
kasma saharuddin on 16 Feb 2022
Hi, this is definitely what I am looking for! Thank you for your answer. Can it be done for a multiple-output regression neural network?
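For a multiple-output regression network, the same fold loop should carry over if patternnet is replaced by fitnet and the targets are kept numeric instead of dummy-coded. A minimal sketch (xAll and tAll are placeholder matrices with observations in rows, not variables from the posts above):
% Sketch: k-fold cross-validation for a multi-output regression network.
% xAll (nObs x nFeatures) and tAll (nObs x nOutputs) are assumed numeric
% matrices with observations in rows; they are placeholders for your data.
numFolds = 5;
c = cvpartition(size(xAll,1),'KFold',numFolds); % partition by observation count
netAry  = cell(numFolds,1);
perfAry = zeros(numFolds,1);
for i = 1:numFolds
    trIdx = training(c,i);
    teIdx = test(c,i);
    % observations in columns, as the shallow network functions expect
    xTrain = xAll(trIdx,:)';   tTrain = tAll(trIdx,:)';
    xTest  = xAll(teIdx,:)';   tTest  = tAll(teIdx,:)';
    net = fitnet(17);                  % regression net, same hidden size as the question
    net.divideParam.trainRatio = 1;    % all training data used for training;
    net.divideParam.valRatio   = 0;    % the test data is held out by the fold itself
    net.divideParam.testRatio  = 0;
    net = train(net,xTrain,tTrain);
    tPred = net(xTest);
    perfAry(i) = perform(net,tTest,tPred); % mse over all outputs
    netAry{i}  = net;
end
[bestPerf,bestId] = min(perfAry);
bestNet = netAry{bestId};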


More Answers (0)
