
MATLAB gives me a different output value every time I train a neural network. Why?

5 views (last 30 days)
I was building a multilayer neural network. Input data (3 inputs, 150 samples): 3x150; target: 1x150.
I did not specify the weights and biases. Is that the reason it returns a different output value every time I train the neural network?

Accepted Answer

Greg Heath on 2 Jul 2015
The default data division and weight initialization are both random.
To reproduce a design you have to know the state of the RNG before the net is configured with initial weights and the data is divided into training, validation and testing subsets.
When designing multiple nets in a double for loop (creation in the outer loop and training in the inner loop), you only have to initialize the RNG once: before the first loop. The RNG changes its state every time it is called. Therefore, for reproducibility, record the RNG state at the beginning of the inner loop.
Exactly when the RNG is called differs for the different generations of designs. For the obsolete NEWFF family and its special cases (e.g., NEWFIT, NEWPR and NEWFF), weights are initialized when the nets are created. For the current FEEDFORWARDNET family and its special cases (e.g., FITNET, PATTERNNET and FEEDFORWARDNET), weights can be initialized explicitly by the CONFIGURE function; otherwise, they will be initialized automatically by TRAIN.
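For example, here is a minimal sketch of seeding the RNG once and recording its state so a design can be reproduced. It assumes a FITNET with 10 hidden neurons and hypothetical variables x (3x150 input) and t (1x150 target) matching the question; none of these names come from the original post.

rng(0);                           % seed the RNG once, before any nets are designed
s = rng;                          % record the state right before designing this net
net = fitnet(10);                 % weights are not initialized yet for this family
[net, tr] = train(net, x, t);     % TRAIN divides the data and initializes the weights

% Later, to reproduce the identical design:
rng(s);
net2 = fitnet(10);
[net2, tr2] = train(net2, x, t);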
When I find out exactly where the data is divided, I will post in both the NEWSGROUP and ANSWERS.
Hope this helps.
Thank you for formally accepting my answer
Greg

More Answers (1)

Walter Roberson on 1 Jul 2015
The weights are initialized randomly unless you specifically initialize them.
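As a minimal sketch of initializing the weights yourself, you could size the net with CONFIGURE (which also initializes the weights and biases) and then save and restore them with GETWB/SETWB. The net, the 10 hidden neurons, and the variables x and t are assumptions, not taken from the question.

net = fitnet(10);                 % hypothetical net with 10 hidden neurons
net = configure(net, x, t);       % sizes the net and initializes weights/biases now
wb0 = getwb(net);                 % save the initial weights and biases as one vector
[net, tr] = train(net, x, t);

% Later: restart from exactly the same initial weights
net2 = fitnet(10);
net2 = configure(net2, x, t);
net2 = setwb(net2, wb0);
[net2, tr2] = train(net2, x, t);

Note that the data division is still random unless the RNG state is also controlled, as described in Greg's answer.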

