CNN deep learning: data size vs. iterations per epoch
I need your help to understand why the data size affects the number of iterations per epoch. See figures A and B below.
[Figure A | Figure B: training-progress plots showing the iterations per epoch for the two training sets]
Figure A shows the iterations per epoch when the training data size is 3700 images; figure B shows the iterations per epoch when the training data size is 57000 images. I did not change any settings in my CNN network, and in both cases the input images had the same size. Can you please explain why increasing the data size increased the number of iterations per epoch? In other words, what is the relationship between data size and the number of iterations per epoch?
2 Comments
Ritu Panda
21 Sep 2020
The number of iterations per epoch depends on the number of training samples the model is trained on in each epoch.
In each epoch, the training data is divided into mini-batches (whose size is set by the MiniBatchSize option in trainingOptions). The model trains on one mini-batch per iteration and updates the weight parameters after each one.
Hence, iterations per epoch = number of training samples ÷ MiniBatchSize,
i.e., the number of forward and backward passes performed in one epoch while training the network.
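To make the relationship concrete, here is a minimal MATLAB sketch using the two data sizes from the question. The mini-batch size of 128 is an assumption (it is the trainingOptions default); substitute whatever value you actually set. Whether the count is rounded down or up depends on how the final, partial mini-batch is handled, so both are noted.

% Minimal sketch: why a larger training set yields more iterations per epoch.
% ASSUMPTION: miniBatchSize = 128 (the trainingOptions default); replace it
% with the value you actually used when training your network.
miniBatchSize = 128;

numSamplesA = 3700;    % training data size in case (A)
numSamplesB = 57000;   % training data size in case (B)

% One iteration processes one mini-batch (one forward/backward pass plus one
% weight update), so the iteration count per epoch scales with the data size.
iterPerEpochA = floor(numSamplesA / miniBatchSize)   % 28 if the last partial batch is dropped
iterPerEpochB = floor(numSamplesB / miniBatchSize)   % 445 if the last partial batch is dropped

% If training also uses the final, smaller mini-batch, round up instead:
% ceil(numSamplesA / miniBatchSize) = 29,  ceil(numSamplesB / miniBatchSize) = 446

Running this shows that with the same MiniBatchSize, going from 3700 to 57000 images multiplies the iterations per epoch by roughly the same factor as the data size, which is exactly the behavior seen in figures A and B.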
Answers (0)