BatchNormalization layer with Dropout layer issue

I'm having issues with the BatchNormalization layer when training deep learning models (U-Net, SegNet), in both their 2-D and 3-D variants.
This layer consistently causes much lower validation accuracy and a large jump in the error values at the end of training, which leaves me unable to predict with the model. If I try to load a checkpoint instead, some of the values needed to use it are missing (the batch normalization mean, for example).
Is there a way to use both Dropout and BatchNormalization layers in the same model without running into this issue? I'm using MATLAB R2020a; is there perhaps a fix in later versions?
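For reference, a minimal sketch of the kind of layer stack in question (the layer sizes and dropout probability here are placeholders, not taken from the actual networks). One commonly suggested ordering is to place the dropout layer after the batch-normalization/activation block rather than before it, so the normalization statistics are not computed on dropped-out activations:

```matlab
% Minimal sketch (hypothetical sizes) of a 2-D block combining
% batchNormalizationLayer and dropoutLayer, with dropout placed
% AFTER the BN + ReLU pair.
layers = [
    imageInputLayer([64 64 1])
    convolution2dLayer(3, 16, 'Padding', 'same')
    batchNormalizationLayer          % TrainedMean/TrainedVariance are
    reluLayer                        % finalized at the end of training
    dropoutLayer(0.2)                % dropout after the BN block
    convolution2dLayer(3, 16, 'Padding', 'same')
    reluLayer
    fullyConnectedLayer(2)
    softmaxLayer
    classificationLayer];
```

Note that in R2020a a `batchNormalizationLayer`'s `TrainedMean` and `TrainedVariance` properties are only populated when training completes normally, which may be why checkpoints saved mid-training are missing those values.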

Answers (0)


Asked: 23 December 2020
