Hyperparameter tuning for a neural network

4 views (last 30 days)
Ali on 18 October 2019
Answered: Sai Bhargav Avula on 23 October 2019
How can I tune the parameters below in a neural network?
training_functions = {'traingd','traingda','traingdm','traingdx'};
optimizers = {'SGD','RMSprop','Adam'};
activation_functions = {'ReLU','Dropout'};
transfer_functions = {'tansig','tanh'};
I am trying Bayesian optimization with bayesopt() using optimizableVariable, but it didn't work for me. How do I define the parameters above for the "vars" variable?
% Define a train/validation split to use inside the objective function
cv = cvpartition(numel(YTrain), 'Holdout', 1/3);
% Define the hyperparameters to optimize
vars = [optimizableVariable('hiddenLayerSize', [1,20], 'Type', 'integer');
        optimizableVariable('lr', [1e-3 1], 'Transform', 'log')];
% Optimize (kfoldLoss here is a user-defined objective function, not the
% Statistics and Machine Learning Toolbox method of the same name)
minfn = @(T) kfoldLoss(XTrain', YTrain', cv, T.hiddenLayerSize, T.lr);
results = bayesopt(minfn, vars, 'IsObjectiveDeterministic', false, ...
    'AcquisitionFunctionName', 'expected-improvement-plus');
T = bestPoint(results)

Answers (1)

Sai Bhargav Avula on 23 October 2019
Hi,
You cannot directly optimize over the parameters you mentioned using Bayesian optimization.
A possible workaround is to define a custom objective function that takes the given parameters as input, and to evaluate the candidate settings sequentially.
For example:
function rmse = optimizerLoss(x, y, cv, numHid, trainFcn, lr)
% Train a feedforward network with the given hidden layer size and
% training function (e.g. 'traingd', 'traingda', 'traingdm', 'traingdx');
% feedforwardnet's second argument is the training function name.
net = feedforwardnet(numHid, trainFcn);
net.trainParam.lr = lr;
net = train(net, x(:,cv.training), y(:,cv.training));
% Evaluate on the validation set and compute the RMSE
ypred = net(x(:,cv.test));
rmse = sqrt(mean((ypred - y(:,cv.test)).^2));
end
Loop over the training functions you mentioned and run the optimization for each one. Finally, note that dropout is not an activation function; it is a regularization technique.
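As a rough sketch of the sequential approach, the loop below runs bayesopt once per training function over the numeric variables only, keeping the best combination. It assumes the XTrain, YTrain, cv, and vars definitions from the question (with data arranged one observation per column) and the optimizerLoss helper above; the variable names bestLoss, bestFcn, and bestT are illustrative.

```
% Candidate training functions from the question
trainFcns = {'traingd','traingda','traingdm','traingdx'};
bestLoss = Inf;
for k = 1:numel(trainFcns)
    fcn = trainFcns{k};
    % Fix the training function; optimize only the numeric variables
    objFcn = @(T) optimizerLoss(XTrain', YTrain', cv, ...
        T.hiddenLayerSize, fcn, T.lr);
    res = bayesopt(objFcn, vars, ...
        'IsObjectiveDeterministic', false, ...
        'AcquisitionFunctionName', 'expected-improvement-plus');
    % Keep the best training function / hyperparameter combination
    if res.MinObjective < bestLoss
        bestLoss = res.MinObjective;
        bestFcn  = fcn;
        bestT    = bestPoint(res);
    end
end
```

This treats the training function as an outer, exhaustively searched dimension while bayesopt handles the continuous and integer variables inside each run.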
