Time series prediction using LSTM

Mustafa Al-Nasser on 31 Oct 2019
Answered: AMMAR ATIF on 17 Aug 2022
Dear All,
I am trying to build an LSTM model to predict the response of a (deterministic) time series, but the results are not good at all. I have tried changing the parameters, but I still cannot get good results. Could you help me improve them?
The code is below, and I have attached the data.
data=Y;
figure (2)
plot(data)
xlabel("case")
ylabel("fouling")
title("fouling plot")
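% Partition the data: the first 95% of time steps for training, the remainder for testing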
numTimeStepsTrain = floor(0.95*numel(data));
dataTrain = data(1:numTimeStepsTrain+1);
dataTest = data(numTimeStepsTrain+1:end);
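% Standardize the training data to zero mean and unit variance for a more stable fit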
mu = mean(dataTrain);
sig = std(dataTrain);
dataTrainStandardized = (dataTrain - mu) / sig;
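% Predictors are the standardized series; responses are the same series shifted forward by one time step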
XTrain = dataTrainStandardized(1:end-1);
YTrain = dataTrainStandardized(2:end);
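% Define the network: a sequence input layer, one LSTM layer, and a fully connected regression output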
numFeatures = 1;
numResponses = 1;
numHiddenUnits = 100;
layers = [ ...
    sequenceInputLayer(numFeatures)
    lstmLayer(numHiddenUnits)
    fullyConnectedLayer(numResponses)
    regressionLayer];
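% Specify the training options (Adam, 250 epochs, piecewise learning-rate schedule)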
options = trainingOptions('adam', ...
    'MaxEpochs',250, ...
    'GradientThreshold',1, ...
    'InitialLearnRate',0.005, ...
    'LearnRateSchedule','piecewise', ...
    'LearnRateDropPeriod',125, ...
    'LearnRateDropFactor',0.2, ...
    'Verbose',false, ...
    'Plots','training-progress');
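% Train the LSTM network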
net = trainNetwork(XTrain,YTrain,layers,options);
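% Standardize the test data with the same mean and standard deviation as the training data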
dataTestStandardized = (dataTest - mu) / sig;
XTest = dataTestStandardized(1:end-1);
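% Prime the network state by predicting on the training data, make the first forecast from the
% last training response, then feed each forecast back in as the next input (closed-loop forecasting)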
net = predictAndUpdateState(net,XTrain);
[net,YPred] = predictAndUpdateState(net,YTrain(end));
numTimeStepsTest = numel(XTest);
for i = 2:numTimeStepsTest
    [net,YPred(:,i)] = predictAndUpdateState(net,YPred(:,i-1),'ExecutionEnvironment','cpu');
end
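% Unstandardize the predictions and compute the RMSE against the observed test data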
YPred = sig*YPred + mu;
YTest = dataTest(2:end);
rmse = sqrt(mean((YPred-YTest).^2))
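% Plot the training data together with the forecast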
figure
plot(dataTrain(1:end-1))
hold on
idx = numTimeStepsTrain:(numTimeStepsTrain+numTimeStepsTest);
plot(idx,[data(numTimeStepsTrain) YPred],'.-')
hold off
xlabel("Time")
ylabel("Fouling Factor")
title("Fouling Prediction")
legend(["Observed" "Forecast"])
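% Compare the forecast with the observed test values and plot the per-step forecast error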
figure
subplot(2,1,1)
plot(YTest)
hold on
plot(YPred,'.-')
hold off
legend(["Observed" "Forecast"])
ylabel("Fouling Factor")
title("Forecast")
subplot(2,1,2)
stem(YPred - YTest)
xlabel("Time")
ylabel("Error")
title("RMSE = " + rmse)

Answers (2)

Shashank Gupta on 11 Dec 2019
Hi,
While working with LSTMs, there is no final, definitive rule of thumb for how many layers, nodes, or hidden units to choose; these are all hyperparameters, and very often a trial-and-error approach gives considerably better results. The most common framework people use is k-fold cross-validation. Maybe you should consider looking at it.
Every LSTM layer should be accompanied by a dropout layer; it helps prevent overfitting. For the optimizer, adaptive moment estimation (Adam) works well. MATLAB also provides a way to find optimal hyperparameters for training models; maybe this link gives you an idea of how to approach the problem.
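For illustration, a minimal sketch of how a dropout layer could slot into the layer array from the question; the 0.2 dropout probability is only a placeholder to tune, not a recommended value:
numFeatures = 1;
numResponses = 1;
numHiddenUnits = 100;         % hyperparameter: tune by trial and error or cross-validation
layers = [ ...
    sequenceInputLayer(numFeatures)
    lstmLayer(numHiddenUnits)
    dropoutLayer(0.2)         % placeholder probability; randomly drops activations during training to reduce overfitting
    fullyConnectedLayer(numResponses)
    regressionLayer];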
Hope this helps.
1 Comment
lotus whit on 23 Oct 2021 (edited)
Hi
Could you please specify the minimum amount of data (rows) needed to get a good prediction result? I have 33 entries (a yearly time series from 1988 to 2012), but the results varied when I tried duplicating the values to get a better predictor.



AMMAR ATIF on 17 Aug 2022
Hi,
Reduce the LearnRateDropFactor (you can make it 0.1) and increase the number of epochs to 1000. The training time is still only about 2 minutes, and the resulting RMSE is 9.2668e-06, which is excellent!
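For reference, a sketch of the training options with those two changes applied, keeping the other settings from the original script unchanged:
options = trainingOptions('adam', ...
    'MaxEpochs',1000, ...              % more epochs, as suggested above
    'GradientThreshold',1, ...
    'InitialLearnRate',0.005, ...
    'LearnRateSchedule','piecewise', ...
    'LearnRateDropPeriod',125, ...     % carried over from the original script
    'LearnRateDropFactor',0.1, ...     % smaller drop factor, as suggested above
    'Verbose',false, ...
    'Plots','training-progress');
With 1000 epochs the learning rate drops several times under this piecewise schedule, so LearnRateDropPeriod may also be worth revisiting.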
