Definition of MiniBatchSize in Matlab training options
331 views (last 30 days)
I am currently confused about the 'MiniBatchSize' option offered under trainingOptions in deep learning. I put a scenario below for a better understanding of the questions.
Dataset: 4500 samples (9 categories with 500 samples each)
MiniBatchSize : 10
- Does it mean that I would have 10 samples in every batch, or 500 samples in every batch (4500 samples / 10 batches)?
- Does having more samples in one batch increase the accuracy of the trained network (CNN), or vice versa?
I wish someone could help me clarify this confusion. Thank you very much.
Srivardhan Gadila, 13 June 2021
For the above example with a dataset of 4500 samples (9 categories with 500 samples each) and MiniBatchSize = 10, it means that each mini-batch contains 10 samples, which implies 4500/10 = 450 iterations, i.e., it takes 450 iterations with 10 samples per mini-batch to complete 1 epoch (a full pass over the dataset).
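As a concrete illustration, the arithmetic above maps onto trainingOptions like this (a minimal sketch; the solver, MaxEpochs value, and the layers/imds variables are placeholders, not part of the original question):

```matlab
% Assume imds is an imageDatastore holding the 4500 labeled samples.
miniBatchSize = 10;
numSamples = 4500;
iterationsPerEpoch = numSamples / miniBatchSize   % 450 iterations per epoch

options = trainingOptions('sgdm', ...
    'MiniBatchSize', miniBatchSize, ...
    'MaxEpochs', 30, ...                 % placeholder value
    'Plots', 'training-progress');

% net = trainNetwork(imds, layers, options);  % layers defined elsewhere
```

With these settings the training-progress plot would show 450 iterations per epoch, matching the calculation above.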
Regarding the impact of batch size on accuracy: in general it should not affect the final accuracy, although the number of epochs needed to reach that accuracy may vary with the batch size. The following are a few things you can consider w.r.t. batch size:
- If you have a GPU, then the training time decreases significantly by setting an appropriate batch size based on the available GPU memory. Refer to Deep Learning with Big Data on GPUs and in Parallel for more information.
- If your training data is too big to fit in the available memory, you can define a smaller batch size and make use of datastores. Refer to Datastores for Deep Learning for more information.
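The datastore approach mentioned above can be sketched as follows (the folder path 'pathToData' is a placeholder; it assumes one subfolder per category, as in the 9-category example):

```matlab
% Read images lazily from disk instead of loading all 4500 into memory;
% labels are taken from the subfolder names (one folder per category).
imds = imageDatastore('pathToData', ...
    'IncludeSubfolders', true, ...
    'LabelSource', 'foldernames');

% A smaller mini-batch keeps per-iteration memory use low; the datastore
% reads only one mini-batch of images at a time during training.
options = trainingOptions('sgdm', ...
    'MiniBatchSize', 10, ...
    'ExecutionEnvironment', 'auto');   % picks a GPU automatically if one is available

% net = trainNetwork(imds, layers, options);  % layers defined elsewhere
```

Because the datastore only reads one mini-batch at a time, this works even when the full dataset exceeds available memory.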