Why did the network performance decrease?
Hi all,
I applied narxnet to predict a time series. The problem is that when I used a for loop to find the optimal number of hidden nodes (HN) and then trained a new network with the selected HN, the performance (R value) decreased, e.g. from 0.9833 to 0.9663. Why?
Thank you for your help.
Accepted Answer
Greg Heath
26 Apr 2016
For a given number of hidden nodes, different random weight initializations AND different random train/validation/test data divisions will yield a spread of results. The difference you describe is typical.
To keep things manageable, I typically do not train more than 100 nets at a time: numH = numel(Hmin:dH:Hmax) = 10 candidate values of H and Ntrials = 10 random trials for each H value. I display the 100 NMSE or Rsq = 1 - NMSE results in an Ntrials x numH matrix. Then I display the min, median, mean, std and max of Rsq in a 5 x numH summary matrix.
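A minimal sketch of that kind of search is below (my illustration, not Greg's exact code). It assumes the toolbox's simplenarx_dataset, input and feedback delays of 1:2, and the default random data division; Hmin, dH, Hmax, Ntrials, Rsq and summaryRsq are placeholder names introduced here.

% Ntrials random-initialization trials for each candidate hidden-node
% count H, collected in an Ntrials x numH matrix of Rsq = 1 - NMSE values.
[X, T] = simplenarx_dataset;          % toolbox example series (placeholder data)
ID = 1:2;  FD = 1:2;                  % input and feedback delays (assumed)
Hmin = 1;  dH = 1;  Hmax = 10;        % candidate hidden-node counts
Ntrials = 10;                         % random trials per H
Hvec = Hmin:dH:Hmax;
numH = numel(Hvec);
Rsq  = zeros(Ntrials, numH);

MSE00 = var(cell2mat(T), 1);          % reference MSE of the constant (mean) model

rng(0)                                % make the whole experiment repeatable
for j = 1:numH
    for i = 1:Ntrials
        net = narxnet(ID, FD, Hvec(j));
        net.trainParam.showWindow = false;    % suppress the training GUI
        [Xs, Xi, Ai, Ts] = preparets(net, X, {}, T);
        net = train(net, Xs, Ts, Xi, Ai);     % random init + random data division
        Ys  = net(Xs, Xi, Ai);
        MSE = perform(net, Ts, Ys);           % default performFcn is 'mse'
        Rsq(i, j) = 1 - MSE/MSE00;            % R-squared = 1 - NMSE
    end
end

% 5 x numH summary: min, median, mean, std and max of Rsq over the trials
summaryRsq = [min(Rsq); median(Rsq); mean(Rsq); std(Rsq); max(Rsq)]

Comparing the min, median and max rows shows how wide the spread for a single H can be, which is why one retrained net can easily land below the best value seen during the search.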
You would be surprised how disparate some results can be.
Searching the NEWSGROUP and ANSWERS using
greg Ntrials
should bring up enough examples.
Hope this helps.
Thank you for formally accepting my answer.
Greg
More Answers (0)