WHAT MORE IS NECESSARY FOR A GOOD NEURAL NET DESIGN?

Greg Heath on 5 Nov 2018
Edited: Greg Heath on 5 Nov 2018
x = input;
t = target;
1. [I N] = size(x)
   [O N] = size(t)
2. The default (AUTOMATIC & RANDOM) data-division
procedure is
a. Train (70%)
b. Validate (15%): runs during training and stops
training if the validation error increases for
6 (default) consecutive epochs.
c. Test (15%)
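As a sketch, these defaults correspond to the following properties on a shallow fitting net (the property names are the standard Neural Network Toolbox ones; setting them explicitly just restates the defaults):

```matlab
% Default data division and early stopping for a shallow fitting net.
net = fitnet(10);                        % H = 10 hidden nodes (the default)
net.divideFcn = 'dividerand';            % random division (the default)
net.divideParam.trainRatio = 0.70;       % 70% training
net.divideParam.valRatio   = 0.15;       % 15% validation (used for early stopping)
net.divideParam.testRatio  = 0.15;       % 15% test
net.trainParam.max_fail    = 6;          % stop after 6 consecutive val-error increases
```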
3. The default normalization (0 mean/unit variance) is sufficient.
4. a. The (SUFFICIENT) default configuration only contains
ONE hidden layer with H = 10 nodes and (I+1)*H+(H+1)*O
randomly initialized weights.
b. If H is sufficiently large, ANY "reasonable"
input/output transformation can be approximated.
c. However, if H is too large, the phenomenon of
overfitting occurs and special steps have to be taken.
d. My approach is to find the minimum value of H that
yields an acceptable result.
e. Typically my training goal is
mse(target-output) <= 0.01*var(target,1)
NOTE: var(target,1) is the MSE of the naïve guess
output = mean(target)
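The weight count from 4a and the training goal from 4e can be computed directly. A sketch, assuming x and t are the I x N input and O x N target matrices from step 1 (the mean over outputs in MSEgoal is my assumption for the multi-output case):

```matlab
[I, N] = size(x);                        % I inputs,  N examples
[O, N] = size(t);                        % O outputs, N examples
H  = 10;                                 % default hidden-layer size
Nw = (I+1)*H + (H+1)*O;                  % number of randomly initialized weights
% Training goal: be 100x better than the naive constant model
% output = mean(t,2), whose MSE is the (biased) target variance.
MSEgoal = 0.01 * mean(var(t', 1));       % scalar goal, averaged over the O outputs
```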
5. Weights are automatically initialized randomly. So,
typically, all you have to do is
a. Start using the MATLAB sample code with the default
H = 10
b. Use a do loop to design a number of nets (e.g., 10 or
more) to find the best result from the random initial
weights
c. Search for the smallest value of H that will yield a
satisfactory solution.
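Steps a-c can be sketched as a double loop: an outer search for the smallest adequate H and an inner loop over random weight initializations. This assumes x, t, and MSEgoal as defined above; Ntrials = 10 and the H range 1:10 are arbitrary illustrative choices:

```matlab
Ntrials = 10;                            % nets per candidate H (step b)
bestH = NaN;  bestNet = [];
for H = 1:10                             % smallest-H search (step c)
    for trial = 1:Ntrials
        net = fitnet(H);                 % fresh random initial weights each trial
        net.trainParam.showWindow = false;
        [net, tr] = train(net, x, t);    % default 70/15/15 division, max_fail = 6
        y = net(x);
        if mse(t - y) <= MSEgoal         % acceptable solution found (step 4e)
            bestH = H;  bestNet = net;
            break
        end
    end
    if ~isnan(bestH), break, end
end
```

Keeping the best net over all trials (rather than stopping at the first acceptable one) is a common variant; the early break above simply implements the minimum-H criterion directly.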
6. Any questions?

Answers (0)
