optimal values for cell data

Ali on 1 Mar 2020
Commented: Rik on 10 Mar 2020
Following the answer here, I tried to optimize machine learning parameters using bayesopt. However, I am getting an error:
I
4 Comments
Walter Roberson on 10 Mar 2020
  • Every student claims that their code is "sensitive" on the grounds that other students might read the posting and copy from them.
  • There are no National Security considerations here.
  • If there are Trade Secret matters here, then legally speaking you destroyed the "secret" as soon as you posted the material (Trade Secret case law is really strict on that point. Like if a piece of paper with a Trade Secret blows out of your hand and someone finds it, then you have just lost Trade Secret status.)
  • You would have a difficult time convincing us that you are working on a Patent: people working on Patents know to hire consultants with Non-Disclosure Agreements.
When I look at your previous postings, the most generous reading I can come up with is that you might be working on a thesis. For a thesis, the important part is that the ideas are yours; it is permitted to seek assistance with implementation.
Rik on 10 Mar 2020
Text of the flag by Ali:
I want to hide or delete this question as it contains sensitive material (code) of my project. I am grateful to MATLAB that the issue has been resolved.


Accepted Answer

Nipun Katyal on 5 Mar 2020
Yes, there is a problem in how the cell arrays are handled. Here is the correct way to handle them:
% Make some data
Daten = rand(100, 3);
Daten(:,3) = Daten(:,1) + Daten(:,2) + .1*randn(100, 1); % Minimum asymptotic error is .1
[m, n] = size(Daten);
% Split into train and test
P = 0.7;
Training = Daten(1:round(P*m),:);
Testing = Daten(round(P*m)+1:end,:);
XTr = Training(:,1:n-1);
YTr = Training(:,n);
XTe = Testing(:,1:n-1);
YTe = Testing(:,n);
% Convert to cell arrays (one cell per sample) to mirror the asker's cell-data format;
% note that only the first predictor column is used for training
XTrain = num2cell(XTr(:,1));
YTrain = num2cell(YTr(:,1));
XTest  = num2cell(XTe);
YTest  = num2cell(YTe);
% Define a train/validation split to use inside the objective function
cv = cvpartition(numel(YTrain), 'Holdout', 1/3);
% Define hyperparameters to optimize
vars = [optimizableVariable('hiddenLayerSize', [1, 20], 'Type', 'integer');
        optimizableVariable('lr', [1e-3, 1], 'Transform', 'log')];
% Optimize
minfn = @(T)kfoldLoss(XTrain', YTrain', cv, T.hiddenLayerSize, T.lr);
results = bayesopt(minfn, vars,'IsObjectiveDeterministic', false,...
'AcquisitionFunctionName', 'expected-improvement-plus');
T = bestPoint(results);
function rmse = kfoldLoss(x, y, cv, numHid, lr)
% Train net.
net = feedforwardnet(numHid, 'traingd');
net.trainParam.lr = lr;
net = train(net, x(:,cv.training), y(:,cv.training));
% Evaluate on validation set and compute rmse
ypred = net(x(:, cv.test));
n = size(ypred);                 % size of the 1-by-N cell of predictions
pw = num2cell(2*ones(n));        % matching cell array of exponent 2s for cellfun
cMinus = cellfun(@minus, ypred, y(cv.test), 'UniformOutput', false);
cSquare = cellfun(@power, cMinus, pw, 'UniformOutput', false);
cSquareVect = cell2mat(cSquare);
cMean = mean(cSquareVect);
rmse = sqrt(cMean);
%rmse = sqrt(mean((ypred - y(cv.test)).^2));
end
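For reference, since ypred and the targets here are 1-by-N cell arrays of scalars, the same RMSE can be computed by converting back to numeric with cell2mat, matching the commented-out one-liner above (a minimal equivalent sketch, assuming scalar-valued cells):
% Equivalent RMSE without the cellfun pipeline
yp = cell2mat(ypred);          % 1-by-N double
yt = cell2mat(y(cv.test));     % 1-by-N double
rmse = sqrt(mean((yp - yt).^2));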
4 Comments
Nipun Katyal on 9 Mar 2020
This should do it:
% Make some data
Daten = rand(100, 3);
Daten(:,3) = Daten(:,1) + Daten(:,2) + .1*randn(100, 1); % Minimum asymptotic error is .1
[m, n] = size(Daten);
% Split into train and test
P = 0.7;
Training = Daten(1:round(P*m),:);
Testing = Daten(round(P*m)+1:end,:);
XTr = Training(:,1:n-1);
YTr = Training(:,n);
XTe = Testing(:,1:n-1);
YTe = Testing(:,n);
XTrain = num2cell(XTr(:,1));
YTrain = num2cell(YTr(:,1));
XTest  = num2cell(XTe);
YTest  = num2cell(YTe);
% Define a train/validation split to use inside the objective function
cv = cvpartition(numel(YTrain), 'Holdout', 1/3);
% Define hyperparameters to optimize
vars = [optimizableVariable('hiddenLayerSize', [1, 20], 'Type', 'integer');
        optimizableVariable('lr', [1e-3, 1e-1], 'Transform', 'log')];
% Optimize
minfn = @(T)kfoldLoss(XTrain', YTrain', cv, T.hiddenLayerSize, T.lr);
results = bayesopt(minfn, vars,'IsObjectiveDeterministic', false,...
'AcquisitionFunctionName', 'expected-improvement-plus');
T = bestPoint(results);
% Train final model on full training set using the best hyperparameters
net = layrecnet(1:2, T.hiddenLayerSize, 'traingd');
net.trainParam.lr = T.lr;
net = train(net, XTrain', YTrain');
% Evaluate on test set and compute final rmse
% ypred = net(XTest');
% finalrmse = sqrt(mean((ypred - YTest').^2))
% Evaluate on the test set and compute RMSE (first predictor column only, as in training)
ypred = net(XTest(:,1)');
n = size(ypred);
pw = num2cell(2*ones(n));        % cell array of exponent 2s for cellfun
cMinus = cellfun(@minus, ypred, YTest', 'UniformOutput', false);
cSquare = cellfun(@power, cMinus, pw, 'UniformOutput', false);
cSquareVect = cell2mat(cSquare);
cMean = mean(cSquareVect);
Rmse = sqrt(cMean)
function rmse = kfoldLoss(x, y, cv, numHid, lr)
% Train net.
net = feedforwardnet(numHid, 'traingd');
net.trainParam.lr = lr;
net = train(net, x(:,cv.training), y(:,cv.training));
% Evaluate on validation set and compute rmse
ypred = net(x(:, cv.test));
n = size(ypred);
pw = num2cell(2*ones(n));        % cell array of exponent 2s for cellfun
cMinus = cellfun(@minus, ypred, y(cv.test), 'UniformOutput', false);
cSquare = cellfun(@power, cMinus, pw, 'UniformOutput', false);
cSquareVect = cell2mat(cSquare);
cMean = mean(cSquareVect);
rmse = sqrt(cMean);
%rmse = sqrt(mean((ypred - y(cv.test)).^2));
end
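Side note: layrecnet(1:2, ...) is a layer-recurrent network with layer delays of 1 and 2, so for genuine sequence data the standard pattern is to run the sequences through preparets first, which shifts them and returns the initial delay states. A minimal sketch of that pattern (an assumption about the intended use, not a drop-in change to the code above):
net = layrecnet(1:2, T.hiddenLayerSize, 'traingd');
net.trainParam.lr = T.lr;
[Xs, Xi, Ai, Ts] = preparets(net, XTrain', YTrain');   % shift sequences for the 1:2 delays
net = train(net, Xs, Ts, Xi, Ai);
ypred = net(Xs, Xi, Ai);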
Ali on 10 Mar 2020
Thank you so much, Nipun!
I want to delete this question from MATLAB Central as it contains some sensitive information. Is it possible, as discussed here?


More Answers (1)

Nipun Katyal on 4 Mar 2020
In order to perform operations on cell arrays, use cellfun, as shown below (a short standalone cellfun example follows the code):
% Make some data
Daten = rand(100, 3);
Daten(:,3) = Daten(:,1) + Daten(:,2) + .1*randn(100, 1); % Minimum asymptotic error is .1
[m, n] = size(Daten);
% Split into train and test
P = 0.7;
Training = Daten(1:round(P*m),:);
Testing = Daten(round(P*m)+1:end,:);
XTr = Training(:,1:n-1);
YTr = Training(:,n);
XTe = Testing(:,1:n-1);
YTe = Testing(:,n);
XTrain = num2cell(XTr(:,1));
YTrain = num2cell(YTr(:,1));
XTest  = num2cell(XTe);
YTest  = num2cell(YTe);
% Define a train/validation split to use inside the objective function
cv = cvpartition(numel(YTrain), 'Holdout', 1/3);
% Define hyperparameters to optimize
vars = [optimizableVariable('hiddenLayerSize', [1, 20], 'Type', 'integer');
        optimizableVariable('lr', [1e-3, 1], 'Transform', 'log')];
% Optimize
minfn = @(T)kfoldLoss(XTrain', YTrain', cv, T.hiddenLayerSize, T.lr);
results = bayesopt(minfn, vars,'IsObjectiveDeterministic', false,...
'AcquisitionFunctionName', 'expected-improvement-plus');
T = bestPoint(results);
function rmse = kfoldLoss(x, y, cv, numHid, lr)
% Train net.
net = feedforwardnet(numHid, 'traingd');
net.trainParam.lr = lr;
net = train(net, x(:,cv.training), y(:,cv.training));
% Evaluate on validation set and compute rmse
ypred = net(x(:, cv.test));
cMinus = cellfun(@minus, ypred, y(cv.test), 'UniformOutput', false);
cSquare = cellfun(@(d) d.^2, cMinus, 'UniformOutput', false);   % the differences must be squared
rmse = sqrt(mean(cell2mat(cSquare)));                           % scalar objective, as bayesopt requires
%rmse = sqrt(mean((ypred - y(cv.test)).^2));
end
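For background, cellfun applies a function element-wise across cell arrays; with 'UniformOutput', false the results stay in a cell array, which is what lets @minus and @power chain above. A tiny standalone illustration (toy values, not from the question):
a = num2cell([1 2 3]);                               % {1, 2, 3}
b = num2cell([4 5 6]);                               % {4, 5, 6}
d = cellfun(@minus, a, b, 'UniformOutput', false);   % {-3, -3, -3}
s = cellfun(@(x) x.^2, d);                           % [9 9 9]: scalar outputs collapse to a numeric array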
1 Comment
Nipun Katyal on 4 Mar 2020
In your case, the RMSE function will be:
function rmse = kfoldLoss(x, y, cv, numHid, lr)
% Train net.
net = feedforwardnet(numHid, 'traingd');
net.trainParam.lr = lr;
net = train(net, x(:,cv.training), y(:,cv.training));
% Evaluate on validation set and compute rmse
ypred = net(x(:, cv.test));
n = size(ypred);                 % size of the 1-by-N cell of predictions
pw = num2cell(2*ones(n));        % matching cell array of exponent 2s
cMinus = cellfun(@minus, ypred, y(cv.test), 'UniformOutput', false);
cSquare = cellfun(@power, cMinus, pw, 'UniformOutput', false);
cMean = mean(cell2mat(cSquare)); % collapse to a scalar: bayesopt needs a scalar objective
rmse = sqrt(cMean);
%rmse = sqrt(mean((ypred - y(cv.test)).^2));
end
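As a sanity check, the cell-based pipeline should agree with the direct numeric formula in the commented-out line. A hedged sketch on toy data (yp and yt are illustrative names, not from the thread):
yp = rand(1, 10);  yt = rand(1, 10);                 % toy predictions and targets
rmseNumeric = sqrt(mean((yp - yt).^2));              % direct computation
cp = num2cell(yp);  ct = num2cell(yt);               % same data as cell arrays
cDiff = cellfun(@minus, cp, ct, 'UniformOutput', false);
cSq   = cellfun(@(d) d.^2, cDiff, 'UniformOutput', false);
rmseCell = sqrt(mean(cell2mat(cSq)));
assert(abs(rmseNumeric - rmseCell) < 1e-12)          % the two should agree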

