Multivariate Regression (in time and features) Using LSTM

2 views (last 30 days)
JORGE FILHO on 29 Jul 2021
Edited: JORGE FILHO on 29 Jul 2021
I am trying to feed an LSTM with different streamflow time series and their delayed sequences for gap filling. Let x be the initial matrix of selected predictors, one per column, so that size(x,1) is the number of time samples. To introduce time dependence, each predictor is interleaved with its delayed versions (delays of 0 to ndt-1 steps; j = 1 below is the undelayed series) as below:
x1 = zeros(size(x,1), ndt*size(x,2));   % preallocate the embedded matrix
for ii = 1:size(x,2)                    % loop over predictors (columns of x)
    for j = 1:ndt                       % column j of each block lags by j-1 steps
        x1(j:end, ndt*(ii-1)+j) = x(1:end-j+1, ii);
    end
end
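For concreteness, here is the same loop on toy data (one predictor, arbitrary values), showing that each predictor produces a block of ndt columns whose j-th column lags the original series by j-1 steps:
x   = (1:6)';   % one predictor, 6 time samples
ndt = 3;
x1  = zeros(size(x,1), ndt*size(x,2));
for ii = 1:size(x,2)
    for j = 1:ndt
        x1(j:end, ndt*(ii-1)+j) = x(1:end-j+1, ii);
    end
end
disp(x1)   % columns: delay 0, delay 1, delay 2 (leading entries left at zero)
% 1 0 0
% 2 1 0
% 3 2 1
% 4 3 2
% 5 4 3
% 6 5 4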
with the corresponding LSTM:
numFeatures    = size(xTrain,1);
numResponses   = size(yTrain,1);
numHiddenUnits = 300;
layers = [ ...
    sequenceInputLayer(numFeatures)
    lstmLayer(numHiddenUnits)
    fullyConnectedLayer(numResponses)
    regressionLayer];
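For context, such a network would be trained with something like the following (a minimal sketch; the option values are placeholders, and xTrain is assumed to be x1 transposed so that features are in rows and time steps in columns):
options = trainingOptions('adam', ...
    'MaxEpochs', 200, ...
    'InitialLearnRate', 0.005, ...
    'Shuffle', 'never', ...          % preserve the temporal order of the sequence
    'Plots', 'training-progress');
net   = trainNetwork(xTrain, yTrain, layers, options);   % sequence-to-sequence regression
yPred = predict(net, xTrain);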
The target is a row vector y. Is there a more effective arrangement to introduce time dependencies in an LSTM? I mean, I have tried to associate every y instance with a 3D array x2 containing the values of x (not of x1) from (t-ndt+1) to (t):
x2 = zeros(ndt, size(x,2), size(x,1));   % preallocate; slices before ii = ndt stay zero
for ii = ndt:size(x,1)
    x2(:,:,ii) = x(ii-ndt+1:ii, :);      % ndt-step window ending at time ii
end
But I don't know how to adapt the respective LSTM to this windowed arrangement; what I have in mind is the sketch below.
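A minimal, untested sketch of that idea: each ndt-step window becomes one observation in a cell array (features in rows, time steps in columns), and the LSTM returns only its last time step for a sequence-to-one regression. The names XCell, YWin and layers2 are just illustrative:
nWin  = size(x,1) - ndt + 1;
XCell = cell(nWin, 1);
for ii = ndt:size(x,1)
    XCell{ii-ndt+1} = x(ii-ndt+1:ii, :)';   % numFeatures-by-ndt sequence
end
YWin = y(ndt:end)';                          % one target per window

layers2 = [ ...
    sequenceInputLayer(size(x,2))
    lstmLayer(numHiddenUnits, 'OutputMode', 'last')   % keep only the final time step
    fullyConnectedLayer(1)
    regressionLayer];
net2 = trainNetwork(XCell, YWin, layers2, options);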
I know the "Sequence-to-Sequence Using Deep Learning" example, but it does not include explicit time dependencies.
Thanks.

Answers (0)
