prune hidden neurons in neural network

5 views (last 30 days)
Peter Mills
Peter Mills on 17 Oct 2017
Edited: Greg Heath on 7 Nov 2017
I am trying to use 'prune' to prune hidden neurons in the following neural network. To start off with I define 100 hidden neurons.
% For the Neural Network model:
BDataARtraining = [x1A y1A];    % create a 3rd-order data set by staggering observations
BDataARpredictor = [x2F];       % held out to test the forecast
dim = size(BDataARtraining, 2); % number of variables in the multivariate set
% Create a feedforward neural network with 3 inputs, 1 output and 100 hidden neurons
model = newfit(BDataARtraining(1:end, 1:dim-1)', BDataARtraining(1:end, dim)', 100);
Please could you explain why model2 still has 100 hidden nodes after pruning (see view(model2)), as I was expecting some to have been removed? I have tried starting with different numbers of hidden neurons and always end up with the same number after pruning as I started with.
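The hidden-layer size can be read directly off the network object, which makes the before/after comparison explicit. A minimal sketch (model and model2 follow the naming in the question; the pruning step itself is not shown in the post):

```matlab
% Layer 1 is the hidden layer in a two-layer fitting network
% created by newfit/fitnet
Hbefore = model.layers{1}.size   % configured as 100 above

% ... pruning step from the question (not shown in the post) gives model2 ...

% If prune removed nothing, this matches Hbefore:
% Hafter = model2.layers{1}.size
```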

Accepted Answer

Greg Heath
Greg Heath on 18 Oct 2017
Search on

 Ntrials Hmin Hmax

for an easier approach: find the smallest successful number of hidden nodes in the range [Hmin, Hmax] using Ntrials sets of random initial weights.
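The search described above can be sketched as a double loop over candidate hidden-layer sizes and random initializations. This is only an illustrative outline, assuming a training set (x, t); the bounds, trial count, and success criterion are made-up placeholders:

```matlab
% Find the smallest hidden-layer size H in [Hmin, Hmax] for which at least
% one of Ntrials random initializations reaches the performance goal
Hmin = 1; Hmax = 20; Ntrials = 10;
bestH = NaN;
for H = Hmin:Hmax
    for trial = 1:Ntrials
        rng(trial);                       % reproducible random initial weights
        net = fitnet(H);                  % newer equivalent of newfit
        net = train(net, x, t);
        y = net(x);
        NMSE = mean((t - y).^2) / var(t, 1);  % normalized mean squared error
        if NMSE <= 0.01                   % example success criterion
            bestH = H;
            break
        end
    end
    if ~isnan(bestH), break, end
end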
Hope this helps.
Thank you for formally accepting my answer
  4 Comments
Greg Heath
Greg Heath on 7 Nov 2017
Edited: Greg Heath on 7 Nov 2017
1. Solutions of equations having more unknowns than the number of equations are obviously unstable.
2. A net with one hidden layer is a universal approximator.
3. Reasons for using more than 1 hidden layer
a. Reduce the total number of weights
b. Additional information about the structure
of the transformation is known (e.g., existence of
classification subclasses).
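For point 3a, the weight counts can be compared directly. A small illustrative calculation (the layer sizes here are made-up numbers, not from the question):

```matlab
% Number of weights (including biases) for an I-H-O net:
%   Nw = (I+1)*H + (H+1)*O
I = 3; O = 1;
H  = 100;                                % single hidden layer
Nw1 = (I+1)*H + (H+1)*O                  % = 501

% Two smaller hidden layers with H1 and H2 nodes:
%   Nw = (I+1)*H1 + (H1+1)*H2 + (H2+1)*O
H1 = 20; H2 = 10;
Nw2 = (I+1)*H1 + (H1+1)*H2 + (H2+1)*O    % = 301
```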
In order to optimize both the number of layers and the number of nodes, you need an additional constraint.
What is yours?
Hope this helps.
>> help prune
--- help for network/prune ---

 PRUNE  Delete neural inputs, layers and outputs with sizes of ZERO!
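In other words, prune only deletes elements whose size is already zero; it does not decide which hidden neurons are redundant, which is why model2 keeps all 100 nodes. A hedged sketch of reducing the hidden layer manually and retraining (H = 10 is an arbitrary example size):

```matlab
% prune does not select neurons to delete; to reduce the hidden layer,
% set its size explicitly and retrain
net = fitnet(100);
net.layers{1}.size = 10;   % shrink the hidden layer from 100 to 10 neurons
% net = train(net, x, t);  % retrain with the smaller architecture
```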


More Answers (0)
