Inconsistent training loss curve when training UNet with and without convergence criterion

Views: 10 (last 30 days)
I want to train a semantic segmentation model built from UNet layers. I trained the model once without any convergence criterion and a maximum of 500 epochs, and a second time with a convergence criterion (validation patience of 10 and validation frequency of 25). The second training run converged at a lower epoch number than the first, and the second model is significantly less accurate than the first on the test slices. I also noticed that the training loss curve of the second run differs from the first one. My question is: why does defining a convergence criterion change the training loss curve? The learning rate is constant and equal to 1e-05. I would be grateful if anyone could help me understand why this is happening.
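For reference, the two runs differ only in the validation-based stopping settings, roughly as in the sketch below (the datastore and layer-graph variables `dsTrain`, `dsVal`, and `lgraph` are placeholders, and the solver name is illustrative):

```matlab
% Run 1: no convergence criterion, fixed maximum of 500 epochs
opts1 = trainingOptions("adam", ...
    InitialLearnRate=1e-5, ...
    MaxEpochs=500, ...
    Plots="training-progress");

% Run 2: same settings, plus early stopping - halt when the validation
% loss fails to improve 10 checks in a row, checking every 25 iterations
opts2 = trainingOptions("adam", ...
    InitialLearnRate=1e-5, ...
    MaxEpochs=500, ...
    ValidationData=dsVal, ...       % placeholder validation datastore
    ValidationFrequency=25, ...     % iterations between validation checks
    ValidationPatience=10, ...      % checks without improvement before stopping
    Plots="training-progress");

net2 = trainNetwork(dsTrain, lgraph, opts2);  % placeholders for data and layers
```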
Training curves without convergence criterion:
Training curves with convergence criterion:

Answers (1)

Matt J
Matt J on 22 Apr 2023
Edited: Matt J on 24 Apr 2023
There should be a change in the training loss curve: it should be shorter. By adding a stopping criterion, you cause training to halt after fewer iterations.
There will also be run-to-run differences in the training loss curves due to the stochastic nature of the SGD algorithm and the random parameter initialization.
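One way to rule out that run-to-run randomness when comparing the two curves is to reset the random seed immediately before each training call, as in this sketch (`dsTrain`, `lgraph`, `opts1`, and `opts2` are placeholder names; note that seeding alone may not make GPU training fully deterministic):

```matlab
rng(0, "twister");  % fix seed so weight initialization and shuffling match
net1 = trainNetwork(dsTrain, lgraph, opts1);  % run without stopping criterion

rng(0, "twister");  % reset to the same seed before the second run
net2 = trainNetwork(dsTrain, lgraph, opts2);  % run with validation patience
```

With matched seeds, the early part of the second curve should track the first one, making it easier to see what the stopping criterion itself changes.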
Comments: 3
Matt J
Matt J on 22 Apr 2023
Edited: Matt J on 22 Apr 2023
It's not clear to me how strong the differences are; the axis limits are not the same in the two plots.
Also, we have not been shown a third plot (on the same axes as the first two) with the training curve you get after reverting to the default settings.
Memo Remo
Memo Remo on 24 Apr 2023
Dear Matt,
I appreciate your attention.
For now, I have used another approach to train this model. I will look into this problem again as soon as I can and provide more information. Thank you.

