Multiple nested for loops for machine learning model hyperparameters

18 views (last 30 days)
Isabelle Museck on 2 Nov 2024 at 18:13
Commented: Isabelle Museck on 3 Nov 2024 at 17:57
I have a neural network and I am trying to build a nested loop to test multiple combinations of the following two hyperparameters: filterSize and numBlocks. The code calculates the RMSE for each trial using leave-one-out validation and then takes the overall average. I am trying to test filterSize values of 2, 3, and 4 against numBlocks values of 3 and 4, so there should be a total of 6 avgRMSE values output. However, I am getting an empty matrix with only 2 RMSE values. Any suggestions on what's wrong with my code and how this can be fixed?
nfilterSize = [2 3 4];
nnumBlocks = [3 4];
numFilters = 80;
droupoutFactor = 0.005;
numFeatures = 8
%Iterate each combination of hyperparameters
for j = 1:length(nfilterSize)
    filterSize = nfilterSize(j);
    for k = length(nnumBlocks)
        numBlocks = nnumBlocks(k);
        % Neural Network
        net = dlnetwork;
        layer = sequenceInputLayer(numFeatures,Normalization="rescale-symmetric",Name="input");
        net = addLayers(net,layer);
        outputName = layer.Name;
        for i = 1:numBlocks
            dilationFactor = 2^(i-1);
            layers = [
                convolution1dLayer(filterSize,numFilters,DilationFactor=dilationFactor,Padding="causal",Name="conv1_"+i)
                layerNormalizationLayer
                spatialDropoutLayer(Name="spat_drop_"+i,Probability=droupoutFactor)
            % Add and connect layers.
            net = addLayers(net,layers);
            net = connectLayers(net,outputName,"conv1_"+i);
        layers = [
            fullyConnectedLayer(1)];
        net = addLayers(net,layers);
        net = connectLayers(net,outputName,"fc");
        % Train the network
        RMSEtot = 0;
        for h = 1:length(table) %iterate over all data points
            validationdataX = table(h);
            validationdataY = velocity(h);
            %Exclude the current index (i) for training
            trainingIndices = setdiff(1:length(table),h);
            traningdataX = table(trainingIndices);
            trainingdataY = velocity(trainingIndices);
            options = trainingOptions("adam", ...
                'MaxEpochs', 60, ...
                'MiniBatchSize', 1, ...
                'InputDataFormat', "CTB", ...
                'Metrics', "rmse", ...
                'Verbose', 0);
            net = trainnet(traningdataX,trainingdataY,net,"mse",options);
            Predval = minibatchpredict(net,validationdataX,InputDataFormats="CTB");
            TrueVal = validationdataY;
            TrueValue = cell2mat(TrueVal);
            Predvalue = {Predval};
            PredictedValue = cell2mat(Predvalue);
            RMSE = rmse(PredictedValue,TrueValue)
            RMSEtot = RMSEtot + RMSE;
        end
        %take average of all iterations after leave-one-out validation
        SumRMSE = RMSEtot;
        AvgRMSE(j,k) = SumRMSE/(length(table))
    end
end
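For reference, the intended result is a 3-by-2 grid of average RMSE values, one per (filterSize, numBlocks) pair. A minimal sketch of just that bookkeeping, where trainAndScoreLOOCV is a hypothetical stand-in for the network-building and leave-one-out training code above:
nfilterSize = [2 3 4];
nnumBlocks = [3 4];
% Preallocate the 3-by-2 result grid (six entries in total).
AvgRMSE = zeros(length(nfilterSize), length(nnumBlocks));
for j = 1:length(nfilterSize)
    for k = 1:length(nnumBlocks)
        % trainAndScoreLOOCV is a hypothetical placeholder for the
        % build/train/validate code in the question; it would return the
        % average leave-one-out RMSE for one hyperparameter combination.
        AvgRMSE(j,k) = trainAndScoreLOOCV(nfilterSize(j), nnumBlocks(k));
    end
end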

Accepted Answer

Walter Roberson on 2 Nov 2024 at 18:31
for j = 1:length(nfilterSize)
    %j is active at this level
    for k = length(nnumBlocks)
        %j and k are active at this level
        for i = 1:numBlocks
            %j and k and i are active at this level
            for h = 1:length(table) %iterate over all data points
                %j and k and i and h are active at this level
            end
            %j and k and i are active at this level
        end
        %j and k are active at this level
    end
    %j is active at this level
You are missing an end matching the for j loop.
Possibly you have mismatched for/end structures in your actual code.
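One quick way to confirm this is to run the Code Analyzer on the script, which reports unmatched for/end pairs with line numbers; a minimal sketch, assuming the posted code is saved as a script named tune_hyperparams.m (a hypothetical file name):
% Collect Code Analyzer diagnostics for the script, including any
% unmatched for/end pairs, and print each message with its line number.
issues = checkcode('tune_hyperparams.m');   % hypothetical file name
for n = 1:numel(issues)
    fprintf('Line %d: %s\n', issues(n).line, issues(n).message);
end
The MATLAB Editor shows the same diagnostics as red markers next to the unmatched keyword.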
2 Comments
Isabelle Museck on 3 Nov 2024 at 13:36
I made sure that all the loops have matching end statements, and now I am getting a 3x2 matrix for the AvgRMSE, but the first column contains only zeros.
I think there is an issue with my numBlocks loop or the placement of the end statements; it is not calculating AvgRMSE values for the combinations of numBlocks = 3 with filterSize = 2, 3, and 4. Any thoughts on how to fix my code so that I get a 3x2 matrix with AvgRMSE outputs for all 6 combinations of hyperparameters?
nfilterSize = [2 3 4];
nnumBlocks = [3 4];
numFilters = 80;
droupoutFactor = 0.005;
numFeatures = 60
for j = 1:length(nfilterSize)
    filterSize = nfilterSize(j)
    for k = length(nnumBlocks)
        numBlocks = nnumBlocks(k)
        net = dlnetwork;
        layer = sequenceInputLayer(numFeatures,Normalization="rescale-symmetric",Name="input");
        net = addLayers(net,layer);
        outputName = layer.Name;
        for i = 1:numBlocks
            dilationFactor = 2^(i-1)
            layers = [
                convolution1dLayer(filterSize,numFilters,DilationFactor=dilationFactor,Padding="causal",Name="conv1_"+i)
                layerNormalizationLayer
                spatialDropoutLayer(Name="spat_drop_"+i,Probability=droupoutFactor)
                reluLayer
                additionLayer(2,Name="add_"+i)];
            % Add and connect layers.
            net = addLayers(net,layers);
            net = connectLayers(net,outputName,"conv1_"+i);
            % Skip connection.
            if i == 1
                % Include convolution in first skip connection.
                layer = convolution1dLayer(1,numFilters,Name="convSkip");
                net = addLayers(net,layer);
                net = connectLayers(net,outputName,"convSkip");
                net = connectLayers(net,"convSkip","add_" + i + "/in2");
            else
                net = connectLayers(net,outputName,"add_" + i + "/in2");
            end
            % Update layer output name.
            outputName = "add_" + i;
        end
        layers = [
            fullyConnectedLayer(1)];
        net = addLayers(net,layers);
        net = connectLayers(net,outputName,"fc");
        % Train the network
        RMSEtot = 0;
        for h = 1:length(table) %iterate over all data points
            validationdataX = table(h);
            validationdataY = velocity(h);
            %Exclude the current index (i) for training
            trainingIndices = setdiff(1:length(table),h);
            traningdataX = table(trainingIndices);
            trainingdataY = velocity(trainingIndices);
            options = trainingOptions("adam", ...
                'MaxEpochs', 60, ...
                'MiniBatchSize', 1, ...
                'InputDataFormat', "CTB", ...
                'Metrics', "rmse", ...
                'Verbose', 0);
            net = trainnet(traningdataX,trainingdataY,net,"mse",options);
            Predval = minibatchpredict(net,validationdataX,InputDataFormats="CTB");
            TrueVal = validationdataY;
            TrueValue = cell2mat(TrueVal);
            Predvalue = {Predval};
            PredictedValue = cell2mat(Predvalue);
            RMSE = rmse(PredictedValue,TrueValue)
            RMSEtot = RMSEtot + RMSE;
        end
        SumRMSE = RMSEtot;
        AvgRMSE(j,k) = SumRMSE/(length(IMU_table))
    end
end
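A likely cause, going by the code above: for k = length(nnumBlocks) gives k only the single value 2, so just the second column of AvgRMSE is ever written and the first column keeps its default zeros. A minimal sketch of the loop headers with the colon restored, plus a NaN preallocation so any unvisited combination stands out (variable names taken from the code above):
% Preallocate with NaN so any (j,k) pair that is never computed is obvious.
AvgRMSE = nan(length(nfilterSize), length(nnumBlocks));
for j = 1:length(nfilterSize)
    filterSize = nfilterSize(j);
    for k = 1:length(nnumBlocks)    % 1:length(...) visits k = 1 and k = 2
        numBlocks = nnumBlocks(k);
        % ... build, train, and evaluate the network exactly as above ...
        % AvgRMSE(j,k) = SumRMSE/length(table);
    end
end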


More Answers (0)
