How can I set the parameters of the feedforward neural network?

3 views (last 30 days)
Xiaomin Li on 4 Jul 2017
Commented: Xiaomin Li on 13 Jul 2017
How can I set the parameters of a feedforward neural network? How can I find the optimal number of hidden layers and the number of nodes in each layer? Thanks a lot!

Accepted Answer

Greg Heath on 5 Jul 2017
For run-of-the-mill problems, you can use default settings except for
a. The number of hidden nodes (default is 10)
b. The initial weights and biases (the default, RANDOM, is best)
When
[ I N ] = size(input)
[ O N ] = size(target)
% Network topology is I - H - O
Nval = Ntst = round(0.15*N)
Ntrn = N - Nval - Ntst
% Number of training equations
Ntrneq = Ntrn*O % ~0.7*N*O
% No. of unknown weights and biases
% Nw = ( I + 1 )*H +( H + 1 )*O
Nw = O + (I + O + 1 )* H
% OVERFITTING (more unknowns than equations) occurs when
H > Hub = (Ntrneq - O)/(I + O + 1)
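These quantities can be computed directly in MATLAB. A minimal sketch, assuming input is an I-by-N matrix and target is an O-by-N matrix as in the size() calls above, with H = 10 used only as the default candidate:
[ I, N ] = size(input);                 % I inputs, N examples
[ O, ~ ] = size(target);                % O outputs
Ntst = round(0.15*N);                   % default 15% test split
Nval = round(0.15*N);                   % default 15% validation split
Ntrn = N - Nval - Ntst;                 % examples left for training
Ntrneq = Ntrn*O;                        % number of training equations
H  = 10;                                % candidate hidden-node count (default)
Nw = (I + 1)*H + (H + 1)*O;             % unknown weights and biases
Hub = floor((Ntrneq - O)/(I + O + 1));  % largest H with Nw <= Ntrneq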
To prevent overtraining an overfit net, which impairs its ability to perform well on nontraining data, implement one or a combination of the following (a short sketch follows the list):
a. H <= Hub % Don't overfit!
b. Train with VALIDATION STOPPING to prevent poor
performance on the validation subset and other
(e.g., testing and unseen) data
c. Use REGULARIZATION (see help/doc TRAINBR) to add
weighted sums of squared weights to the minimization function.
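A minimal sketch of options b and c using standard toolbox calls (fitnet, dividerand, trainbr); the split ratios shown are just the defaults, and input/target are the variables from above:
net = fitnet(H);                       % single hidden layer with H nodes
% b. Validation stopping: dividerand reserves a validation subset and
%    training stops when its error starts to rise (the default behavior).
net.divideFcn = 'dividerand';
net.divideParam.trainRatio = 0.70;
net.divideParam.valRatio   = 0.15;
net.divideParam.testRatio  = 0.15;
% c. Alternatively, Bayesian regularization penalizes the sum of squared
%    weights; trainbr is normally used without validation stopping.
% net.trainFcn = 'trainbr';
[net, tr] = train(net, input, target);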
I tend to use VALIDATION STOPPING and a double-loop approach to minimizing H by trial and error over random initial weights (a sketch of the loop appears below).
Search the NEWSGROUP and ANSWERS using
Hmin:dH:Hmax
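One possible form of that double loop (an illustrative sketch, not Greg's exact code): the outer loop runs over candidate sizes Hmin:dH:Hmax, the inner loop retries Ntrials random weight initializations, and the net with the lowest validation error is kept. Ntrials = 10 and the use of tr.best_vperf are assumptions.
Hmin = 1; dH = 1; Hmax = Hub;          % candidate hidden-node counts
Ntrials = 10;                          % random initializations per candidate
bestPerf = Inf;
rng(0)                                 % reproducible random initial weights
for H = Hmin:dH:Hmax
    for trial = 1:Ntrials
        net = fitnet(H);               % fresh net => new random weights/biases
        [net, tr] = train(net, input, target);
        if tr.best_vperf < bestPerf    % validation error at the stopping epoch
            bestPerf = tr.best_vperf;
            bestNet  = net;
            bestH    = H;
        end
    end
end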
Hope this helps
Thank you for formally accepting my answer
Greg

More Answers (0)
