In a learning curve, training error increases with increasing training data size
I've learned and observed that training loss/error increases with training data size, as stated in Dr. Andrew Ng's ML course.

I've recently experienced an anomaly: both the training error and test error curves were decreasing while the training data size was increasing.

Is this normal?

Some posts suggest this can happen because of regularization. In my case I use trainbr (Bayesian regularization backpropagation).

Could that be the reason? Thank you.
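For reference, the textbook behavior described above (training error starting low and rising with training set size) can be reproduced with a toy model. The sketch below is a hypothetical illustration in plain Python, not the poster's trainbr setup: the simplest possible model, a constant predictor equal to the training-set mean, has a training MSE equal to the sample variance, which is zero for one sample and rises toward the noise variance as n grows.

```python
import random
import statistics

def training_mse(n):
    """Training MSE of the constant (mean) predictor fit to n noisy samples.

    Targets are pure Gaussian noise, so the best possible error is the
    noise variance (1.0 here). Small training sets are 'memorized'
    (low training error); larger ones cannot be, so the error rises.
    """
    y = [random.gauss(0.0, 1.0) for _ in range(n)]  # noisy targets
    mean = statistics.fmean(y)                      # the fitted "model"
    return sum((yi - mean) ** 2 for yi in y) / n

random.seed(0)
for n in (2, 10, 100, 1000):
    # Training error trends upward toward the noise variance as n grows.
    print(n, round(training_mse(n), 3))
```

With one training sample the model fits it exactly (MSE 0); with many samples the training MSE approaches the irreducible noise level, which is the rising training curve the course describes. Regularization shifts where these curves sit but does not by itself make a monotonically decreasing training curve impossible over a limited range of n.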