Freeze specific weights in a custom neural network
Hi, I've made a custom neural network with 69 layers. I have 3 inputs, and the first input is either 1 or -1. What I need is for the connections from this input to the different layers to be scaled by a constant weight, so that training acts only on the other weights. Thank you for your help! This is the first time I've asked a community on the internet :)
Answers (1)
Sara Perez
12 September 2019
You can set the layer property 'WeightLearnRateFactor' to zero, so those weights won't be modified (learned) during training.
More info here:
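A minimal sketch of that idea, assuming Deep Learning Toolbox; the layer name 'fc_fixed' and the sizes numIn/numOut are placeholders, not from the original post. The layer is initialized with the desired constant weights, and both learn-rate factors are set to zero so the solver never updates them:

numIn  = 3;                               % example input size (assumption)
numOut = 8;                               % example output size (assumption)

fcFixed = fullyConnectedLayer(numOut, 'Name', 'fc_fixed');
fcFixed.Weights = ones(numOut, numIn);    % the constant scaling weights
fcFixed.Bias    = zeros(numOut, 1);
fcFixed.WeightLearnRateFactor = 0;        % freeze the weights
fcFixed.BiasLearnRateFactor   = 0;        % freeze the bias as well

With both factors at zero, the effective learning rate for this layer is zero, so the connection from that input keeps its constant scaling while the rest of the network trains normally.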