hyperparameter optimization (deep learning) using bayesopt
Following the answer here, I am trying to select the best hyperparameters for my recurrent neural network (RNN).
I want to optimize the hyperparameters below in the given code using bayesopt().
How do I define the following parameters for bayesopt() using optimizableVariable?
training_function = {'traingd' 'traingda' 'traingdm' 'traingdx'}
optimizers= {'SGD', 'RMSprop', 'Adam'}
activation_functions= {'ReLU','Dropout'};
Transfer_functions= {'tansig','tanh'};
The complete code is:
% Make some data
Daten = rand(100, 3);
Daten(:,3) = Daten(:,1) + Daten(:,2) + .1*randn(100, 1); % Minimum asymptotic error is .1
[m,n] = size(Daten) ;
% Split into train and test
P = 0.7 ;
Training = Daten(1:round(P*m),:) ;
Testing = Daten(round(P*m)+1:end,:);
XTrain = Training(:,1:n-1);
YTrain = Training(:,n);
XTest = Testing(:,1:n-1);
YTest = Testing(:,n);
% Define a train/validation split to use inside the objective function
cv = cvpartition(numel(YTrain), 'Holdout', 1/3);
% Define hyperparameters to optimize
vars = [optimizableVariable('hiddenLayerSize', [1,20], 'Type', 'integer');
optimizableVariable('epochs', [20,200], 'Type', 'integer')
optimizableVariable('lr', [1e-3 1], 'Transform', 'log')];
% ----------------------------------
% ADD ABOVE HYPERPARAMETERS HERE
% ----------------------------------
% Optimize
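% kfoldLoss is the objective helper from the linked answer (it is not defined in this snippet)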
minfn = @(T)kfoldLoss(XTrain', YTrain', cv, T.hiddenLayerSize, T.lr);
results = bayesopt(minfn, vars,'IsObjectiveDeterministic', false,...
'AcquisitionFunctionName', 'expected-improvement-plus');
T = bestPoint(results)
Answers (1)
Sammit Jain
29 January 2020
Hello Ali,
It appears you're looking to create a BayesianOptimization object for your set of hyperparameters. The following link has some examples that will help you customize your code:
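As a rough sketch of the pattern (not code from the linked page): categorical choices such as the training and transfer functions can be declared by passing a cell array of names to optimizableVariable with 'Type','categorical', and then converted back to character vectors with char() inside the objective function. The helper name myRNNLoss below is hypothetical, fitnet only stands in for whatever network you actually build, and 'logsig' is shown in place of 'tanh' because the shallow-network hyperbolic tangent transfer function is 'tansig'. Note that 'SGD', 'RMSprop', 'Adam', 'ReLU', and 'Dropout' belong to the layer-based trainNetwork workflow rather than the shallow train() workflow used in your script, so they would need a different objective function.
% Sketch: declare categorical hyperparameters alongside the numeric ones
vars = [optimizableVariable('hiddenLayerSize', [1,20], 'Type', 'integer');
    optimizableVariable('epochs', [20,200], 'Type', 'integer');
    optimizableVariable('lr', [1e-3 1], 'Transform', 'log');
    optimizableVariable('trainFcn', {'traingd','traingda','traingdm','traingdx'}, 'Type', 'categorical');
    optimizableVariable('transferFcn', {'tansig','logsig'}, 'Type', 'categorical')];
% bayesopt passes one row of a table T; categorical entries arrive as
% categorical values, so convert them with char() before using them
minfn = @(T) myRNNLoss(XTrain', YTrain', cv, T);
results = bayesopt(minfn, vars, 'IsObjectiveDeterministic', false, ...
    'AcquisitionFunctionName', 'expected-improvement-plus');
T = bestPoint(results)
function loss = myRNNLoss(X, Y, cv, T)
% Hypothetical helper: save as myRNNLoss.m (or as a local function at the
% end of a script in R2016b or newer). fitnet is only a placeholder here.
net = fitnet(T.hiddenLayerSize, char(T.trainFcn));  % categorical -> char
net.layers{1}.transferFcn = char(T.transferFcn);
net.trainParam.lr = T.lr;
net.trainParam.epochs = T.epochs;
net.trainParam.showWindow = false;                  % suppress the training GUI
% Train on the training part of the holdout split, score on the held-out part
xtr = X(:, training(cv));  ytr = Y(:, training(cv));
xte = X(:, test(cv));      yte = Y(:, test(cv));
net = train(net, xtr, ytr);
loss = mse(net, yte, net(xte));
end
With this pattern, bestPoint(results) returns a table whose trainFcn and transferFcn entries are categorical, so apply char() to them in the same way when you retrain the final network.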