How can one use a dropout layer in a neural network during prediction?
15 views (last 30 days)
Dino Bellugi on 5 Feb 2020
Commented: Michael Phillips on 12 Mar 2021
I was hoping to use dropout layers at prediction time with an LSTM network in order to get confidence intervals.
Apparently, dropout layers only randomly set connections to 0 during training time.
From the dropout reference:
"A dropout layer randomly sets input elements to zero with a given probability. At training time, the layer randomly sets input elements to zero given by the dropout mask rand(size(X))<Probability, where X is the layer input and then scales the remaining elements by 1/(1-Probability). This operation effectively changes the underlying network architecture between iterations and helps prevent the network from overfitting. A higher number results in more elements being dropped during training. At prediction time, the output of the layer is equal to its input."
This explains why repeated calls to predict with the same input return the same output.
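The masking and scaling operation described in the documentation can be illustrated with a few lines of MATLAB (variable names here are illustrative):

```matlab
% Inverted dropout, as described in the dropoutLayer reference:
% zero elements where rand < Probability, scale survivors by 1/(1-Probability).
p = 0.5;                      % dropout probability
X = ones(3);                  % example input
mask = rand(size(X)) >= p;    % surviving elements
Z = (X .* mask) / (1 - p);    % scale so the expected value of Z equals X
```

Because the surviving elements are scaled by 1/(1-p), the expected magnitude of the activations is unchanged, which is why the layer can simply pass its input through at prediction time.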
Has anyone come up with a workaround?
Thank you for your help,
1 comment
Michael Phillips on 12 Mar 2021
Hi Dino - did you ever create a custom dropout layer that works during network testing? If so, would you be willing to share it? Thanks!
Sourav Bairagya on 10 Feb 2020
Usually, dropout layers are used during training to avoid overfitting of the neural network. Currently, the 'dropoutLayer' in Deep Learning Toolbox does not perform dropout during prediction. If you want to use dropout during prediction, you can write a custom dropout layer that applies dropout in both its 'forward' and 'predict' methods.
You can refer to this link to get an idea of how to write custom layers:
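A minimal sketch of such a custom layer is shown below. The class name and property are illustrative, not part of the toolbox. Since a custom layer's forward method defaults to predict when not defined, implementing dropout in predict keeps it active at both training and prediction time:

```matlab
classdef mcDropoutLayer < nnet.layer.Layer
    % Sketch of a dropout layer that stays active during prediction
    % (Monte Carlo dropout). Unlike the built-in dropoutLayer, repeated
    % calls to predict with the same input give stochastic outputs.
    properties
        Probability  % dropout probability
    end
    methods
        function layer = mcDropoutLayer(probability, name)
            layer.Name = name;
            layer.Probability = probability;
        end
        function Z = predict(layer, X)
            % Apply inverted dropout at prediction time as well.
            % (forward is not defined, so this also runs during training.)
            mask = rand(size(X), 'like', X) >= layer.Probability;
            Z = (X .* mask) ./ (1 - layer.Probability);
        end
    end
end
```

With such a layer in the network, you can call predict repeatedly on the same input and summarize the spread of the outputs (e.g. mean and standard deviation across runs) to form empirical confidence intervals.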
0 comments