How can one utilize a dropout layer in a neural network during prediction?
I was hoping to use dropout layers at prediction time with an LSTM network in order to get confidence intervals.
Apparently, dropout layers randomly set connections to zero only at training time.
From the dropout reference:
"A dropout layer randomly sets input elements to zero with a given probability. At training time, the layer randomly sets input elements to zero given by the dropout mask rand(size(X))<Probability, where X is the layer input and then scales the remaining elements by 1/(1-Probability). This operation effectively changes the underlying network architecture between iterations and helps prevent the network from overfitting. A higher number results in more elements being dropped during training. At prediction time, the output of the layer is equal to its input."
This explains why repeated calls to predictions with the same input result in the same output.
Has anyone come up with a workaround?
Thank you for your help,
Sourav Bairagya on 10 February 2020
Usually, dropout layers are used during training to avoid overfitting of the neural network. Currently, the 'dropoutLayer' in the Deep Learning Toolbox does not perform dropout during prediction. If you want to use dropout during prediction, you can write a custom dropout layer that performs dropout in both the 'forward' and 'predict' methods.
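A minimal sketch of such a custom layer is below, using MATLAB's custom layer API (subclassing nnet.layer.Layer). The class name 'mcDropoutLayer' is our own choice, and the code is an untested illustration rather than a drop-in replacement; note that when 'forward' is not defined, the framework falls back to 'predict', so dropout here applies at both training and prediction time:

```matlab
classdef mcDropoutLayer < nnet.layer.Layer
    % Custom dropout layer that keeps dropping out elements at
    % prediction time (Monte Carlo dropout). Sketch only.

    properties
        Probability  % dropout probability
    end

    methods
        function layer = mcDropoutLayer(probability, name)
            layer.Name = name;
            layer.Probability = probability;
            layer.Description = "MC dropout with probability " + probability;
        end

        function Z = predict(layer, X)
            % Unlike the built-in dropoutLayer, apply the dropout mask
            % during prediction too, with inverted-dropout scaling.
            mask = rand(size(X), 'like', X) >= layer.Probability;
            Z = (X .* mask) / (1 - layer.Probability);
        end
    end
end
```

With such a layer in place of 'dropoutLayer', repeated calls to predict on the same input give different outputs, and the spread of those outputs can be used to estimate confidence intervals.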
You can refer to this link to get an idea of how to write custom layers: