reproducibility of results using neural networks
I am using 'newff' to create a neural network, 'trainParam' to set its parameters, and 'train' to train it. The problem is that random initial weights are used each time the function is called, so I get different convergence results on different runs with the same data. How do I make the results reproducible?
0 Comments
Answers (3)
Walter Roberson
12 May 2011
Provide enough training data that the random initial weights have no impact. Or don't use random initial weights.
1 Comment
Greg Heath
26 Nov 2011
I assume that by reproducibility the OP means exactly the same weights and thresholds. There are many local minima in weight space.
For an I-H-O FFMLP, each solution is equivalent to 2^H * H! - 1 other solutions, obtained by changing weight signs (2^H) and/or reshuffling the order of the hidden nodes (H!).
Therefore, reproducibility requires using the same initial state of rand before creating the net via newff.
Hope this helps.
Greg
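The sign-flip symmetry Greg mentions is easy to verify directly. A minimal sketch (assuming a hand-coded 1-2-1 tanh network, not a toolbox object): flipping the sign of one hidden unit's input weight, bias, and output weight leaves the network function unchanged, because tanh(-x) = -tanh(x).

```matlab
% Hypothetical 1-2-1 tanh network, hand-coded for illustration
W1 = [0.5; -1.2]; b1 = [0.1; 0.3];   % input-to-hidden weights and biases
W2 = [2.0, -0.7]; b2 = 0.4;          % hidden-to-output weights and bias
x  = 0.8;
y1 = W2 * tanh(W1*x + b1) + b2;

% Flip the signs associated with hidden unit 1
W1f = W1; W1f(1) = -W1f(1);
b1f = b1; b1f(1) = -b1f(1);
W2f = W2; W2f(1) = -W2f(1);
y2 = W2f * tanh(W1f*x + b1f) + b2;

% y1 and y2 are identical: an equivalent solution at a different
% point in weight space, one of the 2^H * H! - 1 Greg describes
```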
Flo Trentini
23 Nov 2011
I am using 'initzero' for the input, layer, and bias weights, and then I call net = init(net) before training the network. Yet each run still gives me different results. How is that possible?
0 Comments
Greg Heath
26 Nov 2011
newff automatically uses rand and initnw.
Therefore, all you have to do is initialize rand before calling newff.
Hope this helps.
Greg
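Greg's advice can be sketched as follows, assuming the classic (pre-R2010b) newff calling convention; the variable names p (inputs), t (targets), and the layer sizes are placeholders. Fixing the RNG state before newff makes the random weight initialization, and hence the training trajectory, repeat exactly across runs.

```matlab
% Minimal sketch: seed the RNG before creating the network
rand('state', 0);                % older releases; rng(0) on newer ones
net = newff(minmax(p), [10 1]);  % 10 hidden neurons, 1 output (assumed sizes)
net.trainParam.epochs = 100;
net = train(net, p, t);          % same seed -> same initial weights -> same result
```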
0 Comments