Setting the best training / validation ratio in a Neural Network

7 views (last 30 days)
Jose Marques on 4 May 2018
Edited: Greg Heath on 16 Jan 2019
I am using a neural network for regression, with 10% of the data held out for testing. But how can I set the ratios of the training and validation datasets?
3 Comments
Greg Heath on 16 Jan 2019
FYI: The default ratios are 0.7/0.15/0.15.
Do you have a specific reason for not accepting them?
Greg
Jose Marques on 16 Jan 2019
Thanks, guys!
I am comparing different regression algorithms (Neural Networks, SVM, Regression Trees, Ensemble Trees), so I am now using 30% of the samples as the test set for all algorithms.
To subdivide the remaining 70% for the Neural Networks, I use 56% for training and 14% for cross-validation. Do you think this is a good option?
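One way to realize that 56/14/30 split in MATLAB is to hold out the 30% test set yourself and set the network's division ratios on the remaining 70% (a sketch, assuming fitnet; simplefit_dataset stands in for the real data):

```matlab
% Hold out 30% for testing outside the toolbox, then let train()
% split the remaining 70% as 80% train / 20% validation,
% which is 56% / 14% of the full data set.
[x, t] = simplefit_dataset;          % placeholder example data
n = numel(t);
idx = randperm(n);
nTest = round(0.30*n);
testIdx  = idx(1:nTest);             % 30% external test set
modelIdx = idx(nTest+1:end);         % 70% for design

net = fitnet(10);
net.divideParam.trainRatio = 0.80;   % 0.80 * 70% = 56% of all samples
net.divideParam.valRatio   = 0.20;   % 0.20 * 70% = 14% of all samples
net.divideParam.testRatio  = 0;      % test handled externally
net = train(net, x(:,modelIdx), t(:,modelIdx));

yTest = net(x(:,testIdx));           % evaluate on the held-out 30%
NMSEtest = mse(t(testIdx)-yTest)/mse(t(testIdx)-mean(t(testIdx)))
```

Setting testRatio to 0 keeps the toolbox from carving out a second, internal test set, so the same 30% holdout can be reused across all of the compared algorithms.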


Answers (1)

Greg Heath on 16 Jan 2019
Edited: Greg Heath on 16 Jan 2019
1. ALWAYS START WITH 10 DESIGNS USING THE MATLAB DEFAULT!
2. Then evaluate the results to determine what to modify.
3. For regression the default is FITNET. So, look at the codes in
help fitnet
and
doc fitnet
4. They are the same:
[ x, t ] = simplefit_dataset;
net = fitnet(H); % H = 10 hidden nodes
net = train(net,x,t);
view(net)
y=net(x);
perf = perform(net,t,y)
5. Since I don't trust "perform", I add a normalized mean-square error calculation, which typically ranges from 0 to 1:
NMSE = mse(t-y)/mse(t-mean(t)) % 0 <= NMSE <= 1
6. Search using
Greg NMSE
7. This is related to the familiar Rsquare (coefficient of determination) used in elementary statistics (see any encyclopedia):
Rsquare = 1 - NMSE
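As a quick numeric sanity check of that range (a hypothetical illustration, using the same mse error function as above):

```matlab
% A constant model that always predicts mean(t) explains no variance,
% so its NMSE is exactly 1 and its Rsquare is 0.
t = [1 2 3 4];
y = mean(t) * ones(size(t));              % constant predictor
NMSE    = mse(t - y) / mse(t - mean(t))   % = 1
Rsquare = 1 - NMSE                        % = 0
```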
8. If successful, the next step is to try to obtain good results with a smaller number of hidden nodes,
H < 10
9. Otherwise, increase H.
10. I have a jillion examples in both the NEWSGROUP and ANSWERS.
PS: This format sucks.
Greg
2 Comments
Jose Marques on 16 Jan 2019
Edited: Jose Marques on 16 Jan 2019
Greg,
thanks for your kindness. Your answers are always helpful.
I have some questions:
- Why don't you trust the function 'perform'? I just realized that I have my own function to calculate the MSE.
- I created a function to try to optimize these hyperparameters:
% Number of executions of the function 'calculate_error_NN'. In each execution,
% different samples are taken as train and test.
num_executions = 10;
% Creating variables to optimize
hidden1 = optimizableVariable('hidden1',[1,20],'Type','integer');
hidden2 = optimizableVariable('hidden2',[1,20],'Type','integer');
hidden3 = optimizableVariable('hidden3',[1,20],'Type','integer');
func_trein = optimizableVariable('func',{'trainlm' 'trainbfg' 'trainscg' 'traincgp'},'Type','categorical');
% Error function to be optimized by Bayesian optimization. The objective
% must use the variables passed in by bayesopt (a one-row table):
func = @(my_struct)calculate_error_NN(my_struct.hidden1,my_struct.hidden2,...
    my_struct.hidden3,char(my_struct.func),num_executions);
results = bayesopt(func,[hidden1,hidden2,hidden3,func_trein],...
'Verbose',1,...
'MaxObjectiveEvaluations',1000,...
'MaxTime',100000,...
'PlotFcn','all');
Do you think this is a good approach? Which hyperparameters should I optimize?
Thanks a lot!
Greg Heath on 16 Jan 2019
Edited: Greg Heath on 16 Jan 2019
I SEE NO REASON FOR ITS EXISTENCE!
My approach is as simple as possible. Typically, I accept all defaults except a double for loop over a non-overfitting number of Hidden nodes and 10 or (RARELY!) 20 sets of random initial weights for each value of H.
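A minimal sketch of that double loop (assuming fitnet and the NMSE measure from the answer above; Hmax = 10 and Ntrials = 10 are illustrative choices, not prescribed values):

```matlab
[x, t] = simplefit_dataset;      % placeholder example data
Hmax = 10;                       % candidate numbers of hidden nodes (illustrative)
Ntrials = 10;                    % random weight initializations per H
NMSE = zeros(Hmax, Ntrials);
for H = 1:Hmax
    for trial = 1:Ntrials
        net = fitnet(H);         % fresh net => new random initial weights
        net.trainParam.showWindow = false;
        net = train(net, x, t);
        y = net(x);
        NMSE(H, trial) = mse(t - y) / mse(t - mean(t));
    end
end
[bestNMSE, k] = min(NMSE(:));
[bestH, bestTrial] = ind2sub(size(NMSE), k)   % best design found
```

Keeping the full NMSE matrix also shows how sensitive each H is to the random initialization, which is useful when picking a non-overfitting size.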
I have posted jillions of examples in BOTH comp.soft-sys.matlab and ANSWERS.
HOPE THIS HELPS.
GREG
