
Forward function with frozen batch normalization layers

2 views (last 30 days)
Imola Fodor on 26 Feb 2024
Answered: Shivansh on 2 Apr 2024
In my application I have both batch normalization and dropout, and I would like to perform MC dropout with the forward function. Ideally, I would freeze the TrainedMean and TrainedVariance parameters of the batch normalization layers, but I cannot work out whether that is possible. In my net, the BN layers come after the conv layers and the dropout comes after the recurrent layer. Thank you in advance.
1 Comment
Imola Fodor on 28 Feb 2024
Actually, another problem is that I get an error when using the forward function, simply passing in the same arguments as for the predict function.


Accepted Answer

Shivansh on 2 Apr 2024
Hi Imola!
It seems you are facing issues with freezing the parameters of the batch normalization layers and with using the forward function on the network. For the first issue, you can set the 'OffsetLearnRateFactor' and 'ScaleLearnRateFactor' properties of those layers to 0. This ensures that the "TrainedMean" and "TrainedVariance" are not updated during training. You can refer to the following related MATLAB Answer on freezing weights:
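As a minimal sketch of that setting (the layer name "bn1" here is an illustrative assumption, not taken from your network):
% Batch normalization layer whose learnable Offset and Scale are frozen:
% a learn rate factor of 0 disables gradient updates for those parameters.
bn = batchNormalizationLayer("Name","bn1", ...
    "OffsetLearnRateFactor",0,"ScaleLearnRateFactor",0);
% Equivalently, on an existing dlnetwork "net" that has a layer named "bn1":
net = setLearnRateFactor(net,"bn1","Offset",0);
net = setLearnRateFactor(net,"bn1","Scale",0);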
For the second query regarding the "forward" function: "forward" does not take the same inputs as the "predict" function. The "forward" function needs its input to be a formatted dlarray. You can convert the current input array to a formatted dlarray using the following command:
dlA = dlarray(input,"format");
A sample example using the wNet network:
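As a rough, self-contained sketch of this idea (the network variable "net", the input size, and the "SSCB" format are illustrative assumptions, not the wNet example itself):
% Wrap the raw numeric batch in a formatted dlarray so that forward
% knows which dimensions are spatial (S), channel (C), and batch (B).
X = rand(28,28,1,16,"single");   % hypothetical batch of 16 images
dlX = dlarray(X,"SSCB");
% forward runs the network in training mode, so dropout is applied.
% It also returns the updated state; discarding it (i.e. never assigning
% it back to net.State) leaves TrainedMean and TrainedVariance untouched.
[dlY,~] = forward(net,dlX);
For MC dropout, you could then call forward repeatedly on the same dlX and aggregate the resulting predictions, without writing the returned state back into net.State.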
You can refer to the following MATLAB documentation links for more information:
I hope it helps!

More Answers (0)

Release: R2022b
