In a FeedForward NNet, what exactly is one iteration?

Views: 1 (last 30 days)
Sam Speake on 23 May 2018
Commented: Greg Heath on 25 May 2018
When you train a feedforward neural net with the default settings, the training GUI shows "Epoch: 0 [ x iterations ] 1000". Does the x value represent the number of individual samples passed through the network (such as 1 image from a data set of images), or does it represent a full pass over the entire data set?

Accepted Answer

Majid Farzaneh on 24 May 2018
Hello. Every neural network is trained by an optimization algorithm that searches for optimal weights and biases, and such algorithms are usually iterative. Here, 1 epoch means one iteration of the optimization algorithm.
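To make the equivalence concrete: in batch training, each iteration of the optimizer evaluates the error over the entire training set before updating the weights, so one iteration is one full pass (one epoch). Below is a minimal sketch in plain Python, not MathWorks code; the tiny linear "network", learning rate, and data are illustrative assumptions.

```python
# Sketch of batch gradient descent: each optimizer iteration uses the
# ENTIRE training set, so 1 iteration == 1 epoch (one full data pass).

def train(xs, ts, lr=0.1, epochs=100):
    w, b = 0.0, 0.0
    for _ in range(epochs):  # one loop body = one iteration = one epoch
        n = len(xs)
        # Forward pass over ALL samples (full batch) to get the errors.
        errs = [(w * x + b) - t for x, t in zip(xs, ts)]
        # Batch gradient of the MSE with respect to w and b.
        gw = sum(2 * e * x for e, x in zip(errs, xs)) / n
        gb = sum(2 * e for e in errs) / n
        w -= lr * gw
        b -= lr * gb
    return w, b

xs = [0.0, 1.0, 2.0, 3.0]
ts = [1.0, 3.0, 5.0, 7.0]   # targets follow t = 2x + 1
w, b = train(xs, ts, epochs=2000)
```

With incremental (sample-by-sample) training the counting would differ, since the weights are then updated after each sample rather than after each full pass.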
3 Comments
Majid Farzaneh on 24 May 2018
Yes, that's true. Every time the weights change, the network must recompute the MSE, and computing the MSE requires running all of the training data through the network with the new weights.
Greg Heath on 25 May 2018
Optimization algorithms TRY to optimize the goal. Many/most times they do not achieve the goal.
Nevertheless, they are often considered successful if they just get close enough.
For example, I often design neural networks to yield an output target t, given an input x.
I take as a reference output
yref = mean(t')
The corresponding mean square error is
MSEref = mean(var(t',1))
My training goal is typically
MSEgoal = 0.01*MSEref
which preserves 99% of the target variance.
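The three MATLAB lines above can be translated to plain Python for the single-output case. This is an illustrative sketch with made-up target values; `var(t',1)` is the population variance (normalised by N, not N-1), which is what the second argument `1` selects in MATLAB.

```python
# Reference-MSE sketch mirroring: yref = mean(t'),
# MSEref = mean(var(t',1)), MSEgoal = 0.01*MSEref.
t = [1.0, 2.0, 4.0, 7.0]  # illustrative targets (single output)

yref = sum(t) / len(t)  # constant reference output: the target mean
# Population variance of the targets = MSE of always predicting yref.
MSEref = sum((ti - yref) ** 2 for ti in t) / len(t)
MSEgoal = 0.01 * MSEref  # reaching this explains 99% of target variance
```

A trained net whose MSE reaches MSEgoal therefore explains 99% of the variance that the naive constant predictor yref leaves unexplained.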


More Answers (0)
