NARX network cannot learn identity relation

Consider the simple network below, where the targets are simply the inputs. I was expecting the network to learn this relation perfectly, but unfortunately this is not the case.
Can someone please clarify why this doesn't work?
X = rand(1,1000);          % random input series
T = X;                     % target is the input itself (identity relation)
net = narxnet(1:2,1:2,10); % input delays 1:2, feedback delays 1:2, 10 hidden neurons
[x,xi,ai,t] = preparets(net,con2seq(X),{},con2seq(T));
[net,tr] = train(net,x,t,xi,ai);

Accepted Answer

Greg Heath on 5 Jan 2015
Edited: Greg Heath on 5 Jan 2015

1 vote

For meaningful examples, always use the MATLAB EXAMPLE DATA obtained from
help nndatasets
doc nndatasets
Your example is a case of a single target series without an exogenous input, so the appropriate time-series function would be NARNET.
For meaningful examples, always use data with a deterministic I/O relationship. A random time series is not correlated with anything but a zero-delay copy of itself. That would require a narnet with a zero feedback delay; however, for obvious reasons, zero feedback delays are not allowed in time series.
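The point about a random series being uncorrelated with its own delayed copies is easy to check numerically. Here is a quick sketch in plain Python rather than MATLAB, purely for illustration (the variable names are mine, not from the thread):

```python
import random

random.seed(0)
x = [random.gauss(0, 1) for _ in range(100000)]  # white-noise series, like rand/randn

def corr(a, b):
    """Pearson correlation of two equal-length series."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b)) / n
    va = sum((u - ma) ** 2 for u in a) / n
    vb = sum((v - mb) ** 2 for v in b) / n
    return cov / (va * vb) ** 0.5

lag0 = corr(x, x)            # zero-delay copy of itself: correlation 1
lag1 = corr(x[1:], x[:-1])   # one-step delayed copy: correlation near 0
print(lag0, lag1)
```

So the delayed values x(t-1), x(t-2) that the network is given carry essentially no information about the target x(t).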
Hope this helps
Thank you for formally accepting my answer
Greg

4 Comments

Thanks for the answer, but I can't see why a NAR or NARX network cannot learn this relation. A feedforwardnet learns it perfectly. I was expecting the network to simply learn to ignore the past values (the nonzero delays).
Consider another example, where X (the exogenous input) is a random force applied to a unit mass and T is the velocity of the mass (code below). This is a deterministic relationship with no noise, right? NARX also cannot learn this very well. I then changed the example to the identity relation to understand what is going on.
In this physics example, T is not a function of (T,X), but the change of T (the derivative of T) is a function of (T,X). Does this affect learning, and if so, how do I model it properly?
The code for this simple physics network (1000 samples × 100 timesteps):
num_series = 1000;
x_parts = cell(1, num_series);
t_parts = cell(1, num_series);
for i=1:num_series
x_ = randn(1,100); % force applied to unit mass
t_ = cumsum(x_); % velocity of the mass. simply the cumulated sum of forces
t_parts{i} = con2seq(t_);
x_parts{i} = con2seq(x_);
end
T = catsamples(t_parts{:}, 'pad');
X = catsamples(x_parts{:}, 'pad');
num_delays = 2;
net = narxnet(1:num_delays,1:num_delays,10);
net.divideFcn = 'divideind';
net.divideMode = 'sample';
[net.divideParam.trainInd, net.divideParam.valInd, net.divideParam.testInd] = divideblock(num_series, 0.6, 0.2, 0.2);
[x,xi,ai,t] = preparets(net,X,{},T);
[net,tr] = train(net,x,t,xi,ai);
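The velocity series above is a cumulative sum, so it satisfies the first-order recurrence t(k) = t(k-1) + x(k): predicting t(k) needs the *current* force x(k), i.e. a zero input delay, which ID = 1:2 excludes. A minimal check of that recurrence, sketched in plain Python rather than MATLAB purely for illustration:

```python
import random

random.seed(1)
x = [random.gauss(0, 1) for _ in range(100)]   # force applied to a unit mass

# cumulative sum of the forces = velocity, as in the MATLAB cumsum(x_) above
t = []
for xk in x:
    t.append((t[-1] if t else 0.0) + xk)

# velocity obeys t(k) = t(k-1) + x(k): past velocity plus the *current* force
ok = all(t[k] == t[k - 1] + x[k] for k in range(1, len(t)))
print(ok)
```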
Your net cannot learn
y(t) = x(t)
because you have imposed ID = 1:2, FD = 1:2, which constrains it to functions of the form
y(t) = f( x(t-1), x(t-2), y(t-1), y(t-2) ).
Try again using
ID = 0:2, FD = 1:2.
However, the result will not be an identity operator for any function other than this particular x(t).
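To see concretely why delayed values alone cannot reproduce y(t) = x(t) for white noise, one can fit the best *linear* predictor of x(t) from x(t-1) and x(t-2), which are the only distinct regressors available with ID = 1:2 (the feedback delays duplicate them, since T = X). A plain-Python sketch, with a linear least-squares fit standing in for the network (illustrative only):

```python
import random

random.seed(2)
x = [random.gauss(0, 1) for _ in range(50000)]

# Regress x(t) on its delayed copies x(t-1) and x(t-2).
X1 = x[1:-1]   # x(t-1)
X2 = x[:-2]    # x(t-2)
Y  = x[2:]     # x(t)

def dot(a, b):
    return sum(u * v for u, v in zip(a, b))

# Solve the 2x2 normal equations for the least-squares coefficients.
a11, a12, a22 = dot(X1, X1), dot(X1, X2), dot(X2, X2)
b1, b2 = dot(X1, Y), dot(X2, Y)
det = a11 * a22 - a12 * a12
w1 = (b1 * a22 - b2 * a12) / det
w2 = (a11 * b2 - a12 * b1) / det

mse = sum((y - w1 * u - w2 * v) ** 2 for y, u, v in zip(Y, X1, X2)) / len(Y)
var = dot(Y, Y) / len(Y)
print(w1, w2, mse / var)
```

The fitted coefficients come out near zero and the residual MSE stays near Var(x): the lagged regressors explain essentially nothing. With ID = 0:2 the regressor set contains x(t) itself, and the identity map gives zero error.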
Hope this helps.
Thank you for formally accepting my answer
Greg
Hakan on 6 Jan 2015
Ah, of course, thanks :)
A final question: the network now learns the simple physics system almost perfectly (the error is about 1e-11), but the error autocorrelation does not fall within the confidence interval, as can be seen in the attached image. Indeed, as the error decreases, the error autocorrelation spreads outside the confidence interval. How should I interpret this?
Greg Heath on 11 Jan 2015
Edited: Greg Heath on 11 Jan 2015
Great question!!!
Except for the fact that the value at zero lag, 11e-12, is the mean square error, I'm not sure how helpful this is.
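The point about the zero-lag value can be verified directly: the unnormalized sample autocorrelation of an error series at lag L is R(L) = (1/N) Σ e(k)·e(k−L), so R(0) is exactly the mean of e(k)², i.e. the MSE. A plain-Python check on a synthetic error series (the 1e-6-scale residuals below are made up for illustration, not taken from the thread):

```python
import random

random.seed(3)
e = [random.gauss(0, 1e-6) for _ in range(1000)]   # synthetic small residuals

def autocorr(e, lag):
    """Unnormalized sample autocorrelation at the given lag."""
    n = len(e)
    return sum(e[k] * e[k - lag] for k in range(lag, n)) / n

mse = sum(ek * ek for ek in e) / len(e)
print(autocorr(e, 0), mse)   # the lag-0 value and the MSE coincide
```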


More Answers (0)

Asked: 3 Jan 2015
Edited: 11 Jan 2015
