How can I leverage the Bayesian Optimization framework to find the optimal hyperparameters for a non-image training task?

I would like to leverage the Bayesian Optimization framework described in the following documentation page to find the optimal hyperparameters for training a network that approximates a nonlinear function y = f(x). How can I apply this framework to a non-image training task?

Accepted Answer

MathWorks Support Team on 30 October 2020
The script below leverages the Bayesian Optimization framework to find optimal hyperparameters (here, the number of epochs, the initial learning rate, and the number of neurons in an intermediate layer) for approximating the nonlinear function 'y(x) = x^3':

%% Definition of the function that needs to be approximated
fnc = @(x) x.^3;
%% Definition of the training data
xTrain = linspace(-1, 1, 80)';
yTrain = fnc(xTrain);
%% Definition of the validation data
numRand = 20;
xValidation = sort(2.*rand(numRand, 1) - 1);
yValidation = fnc(xValidation);
%% Definition of the design variables
optimVars = [
    optimizableVariable('epochs', [100 10000], 'Type', 'integer')
    optimizableVariable('InitialLearnRate', [1e-4 1], 'Transform', 'log')
    optimizableVariable('numberOfNeurons', [1 100], 'Type', 'integer')];
%% Objective function for the Bayesian optimization
ObjFcn = makeObjFcn(xTrain, yTrain, xValidation, yValidation);
%% Perform Bayesian optimization to find the optimal hyperparameters
BayesObject = bayesopt(ObjFcn, optimVars, ...
    'MaxTime', 14*60*60, ...
    'IsObjectiveDeterministic', false, ...
    'UseParallel', false);
%% Definition of the objective function
function ObjFcn = makeObjFcn(XTrain, YTrain, XValidation, YValidation)
    %% Assign the output of the objective function
    ObjFcn = @valErrorFun;
    
    %% Definition of the objective function
    function [valError, cons, fileName] = valErrorFun(optVars)
        %% Definition of the layer architecture as a function of the design variables
        layers = [ ...
            featureInputLayer(1, "Name", "myFeatureInputLayer", 'Normalization','rescale-symmetric')
            fullyConnectedLayer(optVars.numberOfNeurons, "Name", "myFullyConnectedLayer1")
            tanhLayer("Name", "myTanhLayer")
            fullyConnectedLayer(1, "Name", "myFullyConnectedLayer2")
            regressionLayer("Name", "myRegressionLayer")
        ];
        %% Definition of the training options as a function of the design variables
        options = trainingOptions('adam', ...
            'MaxEpochs', optVars.epochs, ... % first design parameter
            'InitialLearnRate', optVars.InitialLearnRate,... % second design parameter
            'Shuffle', 'every-epoch', ...
            'MiniBatchSize', 128, ...
            'Verbose', false); % 'Plots', 'training-progress', ...
        
        %% Train the network for the current optimization step
        [trainedNet, ~] = trainNetwork(XTrain, YTrain, layers, options);
        close(findall(groot, 'Tag', 'NNET_CNN_TRAININGPLOT_UIFIGURE'))
        
        %% Perform prediction on the provided validation data for the current optimization step
        YPredicted = predict(trainedNet, XValidation);
        
        %% Computation of the error between the expected and the predicted solution of the trained network using the validation data
        valError = norm(YPredicted - YValidation);
        
        %% Save the results of the current optimization step
        fileName = num2str(valError) + ".mat";
        save(fileName, 'trainedNet', 'valError', 'options')
        cons = [];
    end
end
This script serves only as a proof of concept and should not be considered the best possible configuration for the corresponding Bayesian optimization task.
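Once the optimization finishes, the best hyperparameters can be read back from the BayesianOptimization object that bayesopt returns. The snippet below is a minimal sketch based on the script above (it assumes the workspace variables BayesObject and xValidation still exist); bestPoint, MinObjective, and XAtMinObjective are standard parts of the bayesopt result object, and the file name is reconstructed the same way valErrorFun built it when saving:

```matlab
%% Inspect the result of the Bayesian optimization
% bestPoint returns the hyperparameter combination that bayesopt
% estimates to be best, as a one-row table of the optimizable variables.
bestVars = bestPoint(BayesObject);
disp(bestVars)

% The smallest observed objective value and its location are also
% stored directly on the result object.
minError = BayesObject.MinObjective;
xAtMin   = BayesObject.XAtMinObjective;

% valErrorFun saved each trained network to a .mat file named after its
% validation error, so the network with the lowest error can be reloaded
% by rebuilding that file name from MinObjective:
fileName = num2str(minError) + ".mat";
loaded = load(fileName, 'trainedNet');
yBest = predict(loaded.trainedNet, xValidation);
```

Note that bestPoint and XAtMinObjective can differ: the former uses the fitted Gaussian process model (useful for noisy objectives, as declared via 'IsObjectiveDeterministic', false), while the latter simply reports the evaluation with the lowest observed error.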
