Customized Regression output layer
Hello everyone, following the example at https://it.mathworks.com/help/deeplearning/ug/define-custom-regression-output-layer.html I tried to build a custom regression output layer that uses the MSE error. Starting from the provided script, I wrote this loss function:
function loss = forwardLoss(layer, Y, T)
loss = mse(Y,T);
end
But when I trained it on a dataset in MATLAB with net = trainNetwork(bodyfatInputs,bodyfatTargets,layers,options);
it gave me:
Error using trainNetwork (line 170)
Error using 'forwardLoss' in Layer mseRegressionLayer. The function threw an error and could not be executed.
I built the layers as:
layers = [
sequenceInputLayer(13)
lstmLayer(100)
fullyConnectedLayer(1)
mseRegressionLayer('mse')];
What did I do wrong?
Thanks for your help
7 Comments
Mohammad Sami
7 September 2020
Edited: Mohammad Sami, 7 September 2020
The layer definition itself looks valid, so something else may be wrong. Try the built-in regression layer (which also uses MSE) to verify that the rest of the setup works.
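As a quick isolation test, the custom layer can be swapped for the built-in regressionLayer while keeping the rest of the network from the question unchanged (a minimal sketch):

```matlab
% Same network as in the question, but with the built-in MSE regression layer
layers = [
    sequenceInputLayer(13)
    lstmLayer(100)
    fullyConnectedLayer(1)
    regressionLayer];   % built-in output layer, also minimizes MSE
```

If training now succeeds, the problem lies inside the custom layer; if it still fails, the data or the training options are at fault.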
Answers (1)
Uday Pradhan
10 September 2020
Hi,
I tried to implement your network on my end and found two problems. First, when using "mse" as the loss function, it is advisable to specify the 'DataFormat' argument as well; for example, see this page. So modify this line in your definition of 'mseRegressionLayer.m':
loss = mse(Y,T);
%change to
loss = mse(Y,T,'DataFormat','T'); %for sequences
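Put together, the full custom layer might look like this. This is a sketch following the structure of the linked documentation example; the constructor body and the Description string are assumptions, not taken from the original post:

```matlab
classdef mseRegressionLayer < nnet.layer.RegressionLayer
    % Custom regression output layer computing the MSE loss.

    methods
        function layer = mseRegressionLayer(name)
            % Set the layer name and a short description
            layer.Name = name;
            layer.Description = 'Mean squared error';
        end

        function loss = forwardLoss(layer, Y, T)
            % 'DataFormat' tells mse how to interpret the dimensions
            loss = mse(Y, T, 'DataFormat', 'T'); % for sequences
        end
    end
end
```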
Coming to the problem you are trying to solve:
The "bodyfat_dataset" consists of two important vectors: the inputs X, of size 13-by-252, and the targets T, of size 1-by-252. From my understanding, you would like to create an LSTM network that accepts a sequence of 13 features and predicts the body fat percentage. This is a sequence-to-one regression problem, and as advised here, I redesigned your network as follows:
layers = [
sequenceInputLayer(13)
lstmLayer(100,'OutputMode',"last") % Output the last time step of the sequence
fullyConnectedLayer(1)
mseRegressionLayer('mse')];
However, in this output mode the input must be supplied as a cell array. You can convert it as follows:
N = 240;                  % number of training sequences
cellArrTrain = cell(N,1);
for i = 1:N
    cellArrTrain{i} = xtrain(:,i);  % one 13-by-1 sequence per cell
end
% ------ FOR TRAINING PURPOSES -------%
net = trainNetwork(cellArrTrain,Ytrain,layers,options); %be cautious of the dimensions of
% cellArrTrain and Ytrain, they should match.
% Similarly convert the test data into a cell array too
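Following the same pattern, the test data can be converted and the trained network evaluated. The variable names xtest and Ytest below are assumptions mirroring xtrain and Ytrain above:

```matlab
M = size(xtest, 2);          % number of test sequences
cellArrTest = cell(M, 1);
for i = 1:M
    cellArrTest{i} = xtest(:, i);   % one 13-by-1 sequence per cell
end

YPred = predict(net, cellArrTest);  % predicted body fat percentages
% Simple error metric, assuming Ytest is a column vector of targets
rmse = sqrt(mean((YPred - Ytest).^2));
```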
Hope this helps!
5 Comments
Uday Pradhan
4 October 2020
Looks like the regression loss is too high. Try normalizing the loss in the custom layer like this:
loss = mse(Y,T,'DataFormat','T')/size(Y,2); % normalize over the mini-batch
Also, start with a smaller learning rate, around 0.001. An example of standard training options:
options = trainingOptions('adam', ...
'MaxEpochs',500, ...
'GradientThreshold',1, ...
'InitialLearnRate',0.001, ...
'LearnRateSchedule','piecewise', ...
'LearnRateDropPeriod',100, ...
'LearnRateDropFactor',0.1,...
'Verbose',1, ...
'Plots','training-progress');
You can play around with the number of layers, the number of LSTM nodes, the learning rate, and the number of epochs. It is also advisable to use a validation set, because overfitting becomes quite common as you increase the number of layers.
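To monitor overfitting during training, a held-out validation split can be passed through trainingOptions. The variables cellArrVal and Yval here are hypothetical, built from held-out columns the same way as the training cell array:

```matlab
options = trainingOptions('adam', ...
    'MaxEpochs', 500, ...
    'InitialLearnRate', 0.001, ...
    'ValidationData', {cellArrVal, Yval}, ... % hypothetical held-out split
    'ValidationFrequency', 25, ...            % validate every 25 iterations
    'Plots', 'training-progress');
% The training-progress plot then shows training and validation RMSE
% together, making divergence (overfitting) easy to spot.
```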