Which neural network is suitable for my problem?
Hi,
I am working on building a neural network for some data that I have. It's a time-series problem, but the output is only dependent on the previous time step:
y(t) = f(x1(t), x2(t), ..., x150(t))
So, for each output I have 150 inputs, and I have data for a number of timesteps.
The narxnet doesn't seem appropriate, as it seems to consider the output to be a function of all previous timesteps. The relationship is highly non-linear.
Please suggest what you think would be the best solution.
Thanks
Accepted Answer
More Answers (1)
Greg Heath
26 February 2014
NARNET is the appropriate function.
help narnet
doc narnet
Choose the row vector of positive feedback delays, FD, from the significant delays of the autocorrelation function. To find some examples in the NEWSGROUP and ANSWERS, search with
greg narnet nncorr
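A minimal sketch of that procedure, following the pattern in Greg's posts; the variable name t, the use of nncorr, and the 0.21 significance threshold are assumptions, not confirmed by this thread:

```matlab
% Sketch: choose feedback delays FD from significant autocorrelation lags.
% Assumes t is a 1-by-N numeric row vector of targets.
N  = length(t);
zt = zscore(t, 1);                             % normalize the target series
autocorrt = nncorr(zt, zt, N-1, 'biased');     % lags -(N-1) .. (N-1)
autocorrt = autocorrt(N:end);                  % keep lags 0 .. N-1
siglevel  = 0.21;                              % rough 95% significance level
FD = find(abs(autocorrt(2:end)) >= siglevel);  % significant positive lags
net = narnet(FD, 10);                          % 10 hidden units as a placeholder
```

The idea is simply that lags whose autocorrelation magnitude clears the significance level are the ones worth feeding back into the network.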
Hope this helps.
Thank you for formally accepting my answer
Greg
3 Comments
Shivaprasad
28 February 2014
Greg Heath
5 March 2014
> Inputs is a 250x30 matrix and Outputs is a 1x30 matrix.
Originally you said I = 150, not 250. My answer then was to first reduce input dimensionality. My answer now is
FIRST REDUCE INPUT DIMENSIONALITY!
If the 30 examples are linearly dependent, they span at most a 29-dimensional space. You should either
a. Get rid of most of your variables, or
b. Replace the original variables with a much smaller set of linear combinations.
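One way to implement option (b) is principal component analysis; a hedged sketch using pca from the Statistics Toolbox, where the 99% variance cutoff is an arbitrary assumption:

```matlab
% Sketch: replace the 250 input variables with a few principal components.
% Assumes inputs is 250-by-30 (variables-by-examples), as described above.
[coeff, score, latent] = pca(inputs');     % pca expects rows = observations
explained = cumsum(latent) / sum(latent);  % cumulative variance fraction
k = find(explained >= 0.99, 1);            % keep 99% of the variance
reducedInputs = score(:, 1:k)';            % k-by-30 reduced input matrix
```

Note that with only 30 examples, pca can return at most 29 components, consistent with the dimensionality argument above.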
>First row of inputs is the input for the first row of output.
Only if both are converted to columns by tonndata.
> function prediction = intnarnet(inputs,targets,predictioninputs,predictiontargets)
There is already a narnet function, so I suggest you rename this one.
> inputSeries = tonndata(inputs,true,false);
> targetSeries = tonndata(targets,true,false);
whos % you need to check types and dimensions
> % Create a Time Delay Network
> inputDelays = 0:0;
Incorrect: inputDelays = 0:0 specifies no delays at all. Is that what you intended?
> hiddenLayerSize = 5;
After reducing the input dimension, you should loop over many candidate values of H to find the smallest acceptable value.
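A sketch of such a loop; the delay range 0:2, the MSE goal, and the upper bound of H = 20 are all assumptions for illustration:

```matlab
% Sketch: find the smallest H whose training MSE meets a goal.
% Assumes inputSeries, targetSeries are the tonndata cell arrays and
% t is the numeric target matrix.
MSEgoal = 0.01 * mean(var(t', 1));     % 1% of the mean target variance
bestH = NaN;
for H = 1:20
    net = timedelaynet(0:2, H);
    net.divideFcn = 'dividetrain';     % avoid random data division
    [Xs, Xi, Ai, Ts] = preparets(net, inputSeries, targetSeries);
    [net, tr] = train(net, Xs, Ts, Xi, Ai);
    if tr.best_perf <= MSEgoal
        bestH = H;                     % smallest acceptable H
        break
    end
end
```

In practice you would also repeat each candidate H over several random weight initializations, since a single training run can get stuck in a poor local minimum.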
> net = timedelaynet(inputDelays,hiddenLayerSize);
> [inputs,inputStates,layerStates,targets] = preparets(net,inputSeries,targetSeries);
Use ... to continue statements on the next line.
You also need to prevent the default random data division, which will destroy the time-series correlations:
net.divideFcn = 'dividetrain';
> % Setup Division of Data for Training, Validation, Testing
> net.divideParam.trainRatio = 70/100;
> net.divideParam.valRatio = 15/100;
> net.divideParam.testRatio = 15/100;
Delete. These are the defaults.
> % Train the Network
> [net,tr] = train(net,inputs,targets,inputStates,layerStates);
> % Test the Network
> outputs = net(inputs,inputStates,layerStates);
> errors = gsubtract(targets,outputs);
> performance = perform(net,targets,outputs)
The last three statements are unnecessary, and the last one is pretty useless. See tr below:
[ net tr y e Xf Af ] = train(net, x, t, Xi, Ai); % e = t-y
tr contains performance values for the training, validation and test subsets. The final states Xf and Af can be used to continue the prediction given additional inputs.
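For example (newInputs is a hypothetical cell array of additional timesteps, not a variable from the thread):

```matlab
% Sketch: continue the prediction using the final states from training.
[net, tr, y, e, Xf, Af] = train(net, Xs, Ts, Xi, Ai);  % e = Ts - y
% Xf, Af prime the delay buffers, so no warm-up data is lost:
[ynew, Xf2, Af2] = net(newInputs, Xf, Af);
```

This avoids re-running preparets on the new data just to fill the initial delay states.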
> %Prediction
> inputSeries = tonndata(predictioninputs,true,false);
> targetSeries = tonndata(nan(size(targets)),true,false);
> [inputs,inputStates,layerStates,targets] = ...
preparets(net,inputSeries,targetSeries);
> prediction = net(inputs,inputStates,layerStates);
Shivaprasad
15 March 2014