MATLAB neural network classification different results

Gediminas on 24 May 2014
Hello,
I used the MATLAB function "patternnet" to create a one-layer (10-neuron) neural network classifier that assigns data to 3 classes, with the default attributes (training function, initialization, etc.). Suppose I have an NxM matrix whose rows are observations and whose columns are classification features. I found that when I use different orderings of the same features, I get different classification results. For example, matrix(:,[1 2 3]) gives different classification results (and also different weight values throughout the network) than matrix(:,[1 3 2]).
Can somebody explain why this is so? Is this behavior fundamentally related to the neural network classification algorithm, or to some implementation detail?
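For reference, a minimal sketch of the setup being described (variable names are assumed: X is the NxM feature matrix with observations in rows, T is a 3xN one-hot target matrix; patternnet expects features in rows, hence the transposes):

  net = patternnet(10);                   % one hidden layer, 10 neurons, defaults
  net123 = train(net, X(:,[1 2 3])', T);  % features in order 1,2,3
  net132 = train(net, X(:,[1 3 2])', T);  % same features, reordered
  y123 = net123(X(:,[1 2 3])');           % the two trained nets give
  y132 = net132(X(:,[1 3 2])');           % different classifications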
1 Comment
Gediminas on 24 May 2014
Forgot to mention that when I use the same matrix several times with "patternnet" (e.g. matrix(:,[1 2 3])), I get identical weight and classification values.


Accepted Answer

Greg Heath on 26 May 2014
When making multiple designs in a loop, use rng to initialize the random number generator BEFORE the loop. If the training function uses batch learning, the results will be independent of the order of the columns.
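A minimal sketch of that suggestion, with placeholder data X (features in rows), targets T, and a trial count Ntrials:

  rng(0);                            % seed ONCE, before the loop, not inside it
  results = cell(1, Ntrials);
  for k = 1:Ntrials
      net = patternnet(10);
      [net, tr] = train(net, X, T); % each pass draws fresh RNG values,
      results{k} = net;             % but the whole run is reproducible
  end

Re-running this script reproduces the same Ntrials designs exactly, while the designs within the loop still differ from one another.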
3 Comments
Image Analyst on 26 May 2014
Gediminas's "Answer" moved here:
I found that the problem is related to the initialization of the neural network weights. I get identical results every time I use the same column configuration, but if I call rng('shuffle') each time I train and test the network, it gives different results even with the same column configuration. So my problem now is how to initialize all the weights randomly, without repeating previous values, without calling rng('shuffle') every time.
Greg Heath on 25 Apr 2015
When training in a loop, only initialize the RNG once: just before the outer loop.


More Answers (2)

Greg Heath on 1 Jun 2014
Remember that the state of the RNG changes every time it is called. Now,
1. Net creation differs between the obsolete functions newfit and newpr (which call newff) and the corresponding new functions fitnet and patternnet (which call feedforwardnet).
2. Obsolete
a. Random weight initialization occurs at net creation
b. Random data division occurs at the beginning of training
3. Current
a. Weights are no longer assigned at creation
b. Weights can be assigned before training using configure
c. Random data division occurs at the beginning of training
d. Train will only assign initial weights to a weightless net
Therefore, if you are training multiple nets with the current functions in a loop, you have to use configure to initialize the weights at the top of each pass.
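A minimal sketch of that loop, with placeholder data X (features in rows), targets T, and a trial count Ntrials:

  rng(0);                           % initialize the RNG once, before the loop
  for k = 1:Ntrials
      net = patternnet(10);         % current functions: net is still weightless
      net = configure(net, X, T);   % sizes the net and initializes its weights
      [net, tr] = train(net, X, T); % train keeps the configured weights
  end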
Hope this helps
Greg

Hamza on 23 Oct 2023
Hello everyone, I'm facing the same issue with a CNN. When I shuffle the features, I obtain different accuracy values, which ideally should remain consistent. Do you have any suggestions on how to resolve this? I am using MATLAB 2023.
