Injecting noise into a CNN through a customized training loop

Views: 8 (last 30 days)
MAHSA YOUSEFI on 4 Jan 2021
Edited: MAHSA YOUSEFI on 10 Jan 2021
Hi there.
I am using a customized training loop to train my CNN. For my network design, I need to inject Gaussian noise at each layer, but I could not find a noise layer or L2 regularization for this setting in the Deep Learning Toolbox. I need to know how I can add a Gaussian noise layer to my model (if one exists) and where exactly it should go in the layer ordering. Also, how can I define L2 regularization consistent with my customized training loop (with dlnetwork(lgraph))? I mean, for computing the loss (using cross-entropy) and the gradients (using dlfeval(@gradientmodel, ...)), should I just add 0.5*||w||^2 (the squared norm of the weights, i.e. dlnet.Learnables(i,:) where i refers only to the weight parameters) to the loss and the corresponding term to the gradients, or is there another approach?
Thanks for any help.

Accepted Answer

Shashank Gupta on 7 Jan 2021
Hi Mahsa,
There is no built-in layer in MATLAB for adding Gaussian noise to each layer, but you can create a custom one. Also check out this example; it describes a custom Gaussian noise layer that you can take as a starting point. It should definitely help you.
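For reference, a minimal sketch of such a custom layer might look like the following (the class name and the Sigma property are just placeholders to adapt to your design):

classdef gaussianNoiseLayer < nnet.layer.Layer
    % Custom layer that adds zero-mean Gaussian noise during training only.

    properties
        Sigma  % Standard deviation of the injected noise.
    end

    methods
        function layer = gaussianNoiseLayer(sigma, name)
            % Create a Gaussian noise layer with the given standard deviation.
            layer.Name = name;
            layer.Description = "Gaussian noise, sigma = " + sigma;
            layer.Sigma = sigma;
        end

        function Z = predict(layer, X)
            % At prediction time the input passes through unchanged.
            Z = X;
        end

        function Z = forward(layer, X)
            % At training time, add zero-mean Gaussian noise with
            % standard deviation Sigma.
            Z = X + layer.Sigma * randn(size(X), 'like', X);
        end
    end
end

Regarding where it goes in the layer ordering, a common choice is right after the input layer or after an activation, for example (layer sizes here are only illustrative):

layers = [
    imageInputLayer([28 28 1], 'Normalization', 'none')
    convolution2dLayer(3, 16, 'Padding', 'same')
    reluLayer
    gaussianNoiseLayer(0.1, 'noise1')
    fullyConnectedLayer(10)
    softmaxLayer];
dlnet = dlnetwork(layerGraph(layers));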
All of the parameters in trainingOptions, including L2 regularization, can also be implemented in a custom training loop. I suggest you follow this doc page; it gives a detailed explanation of how the different training options can be implemented when using a custom training loop.
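For example, a model gradients function with L2 regularization applied to the weight gradients might look roughly like this (following that doc page; the function and variable names are just placeholders):

function [gradients, loss] = modelGradients(dlnet, dlX, T, l2Regularization)
    % Forward pass through the dlnetwork (assumes the network ends in a softmax layer).
    dlYPred = forward(dlnet, dlX);

    % Unregularized cross-entropy loss against the one-hot targets T.
    loss = crossentropy(dlYPred, T);

    % Gradients of the loss with respect to all learnable parameters.
    gradients = dlgradient(loss, dlnet.Learnables);

    % Add the L2 (weight decay) term to the weight gradients only;
    % biases are usually left unregularized.
    idx = dlnet.Learnables.Parameter == "Weights";
    gradients(idx,:) = dlupdate(@(g,w) g + l2Regularization*w, ...
        gradients(idx,:), dlnet.Learnables(idx,:));
end

Inside the training loop this would be evaluated with something like [gradients, loss] = dlfeval(@modelGradients, dlnet, dlX, T, l2Regularization); before the sgdmupdate (or adamupdate) step.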
I hope this gives you a good head start.
Cheers.
1 Comment
MAHSA YOUSEFI on 10 Jan 2021
Edited: MAHSA YOUSEFI on 10 Jan 2021
Thank you for your help. I am following your suggestions. Just one more thing about L2 regularization: on the page you linked, there is an update only for the gradients, not for the loss. I think I have to first update the loss (adding the regularization term to the unregularized objective, loss(w) = loss(w) + l2Regularization/(2N) * ||w||^2) and then apply the gradient update as mentioned there. Am I right?
Also, in that link, "N" (the sample size used for computing the loss and gradients) is ignored. I think the update for the gradients should be as follows:
gradients(idx,:) = dlupdate(@(g,w) g + (l2Regularization./N)*w, gradients(idx,:), dlnet.Learnables(idx,:));
not
gradients(idx,:) = dlupdate(@(g,w) g + l2Regularization*w, gradients(idx,:), dlnet.Learnables(idx,:));
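Putting the two together, I imagine the model gradients function would look roughly like this (just a sketch of my understanding, with placeholder names):

function [gradients, loss] = modelGradients(dlnet, dlX, T, l2Regularization, N)
    % Forward pass and unregularized cross-entropy loss.
    dlYPred = forward(dlnet, dlX);
    loss = crossentropy(dlYPred, T);

    % Gradients of the unregularized loss.
    gradients = dlgradient(loss, dlnet.Learnables);

    idx = dlnet.Learnables.Parameter == "Weights";

    % Report the regularized objective: loss + l2Regularization/(2N) * ||w||^2,
    % summed over the weight parameters only. This is added after dlgradient,
    % so the penalty is not differentiated twice.
    weightValues = dlnet.Learnables.Value(idx);
    for k = 1:numel(weightValues)
        loss = loss + (l2Regularization/(2*N)) * sum(weightValues{k}.^2, 'all');
    end

    % Matching gradient term, (l2Regularization/N) * w, added manually.
    gradients(idx,:) = dlupdate(@(g,w) g + (l2Regularization/N)*w, ...
        gradients(idx,:), dlnet.Learnables(idx,:));
end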


More Answers (0)
