Greetings! I wanted to ask if/how it is possible to add a dropout layer to a narxnet to improve regularization. Unfortunately, I could not find any information elsewhere.
I have a narxnet that uses the last 3 lags of a time series and an exogenous input to forecast the next timestep, and I would like to introduce regularization measures to help with overfitting. Thanks in advance! My current code looks as follows:
forecast_horizon = 1;
neurons = [5 5];                           % two hidden layers of 5 neurons each
delays = 3;
inputDelays = 1:delays;                    % lags of the exogenous input
feedbackDelays = 1:delays;                 % lags of the target series
net = narxnet(inputDelays, feedbackDelays, neurons);
net.trainFcn = 'trainbr';                  % Bayesian regularization training
net.trainParam.epochs = 40;
net = removedelay(net, forecast_horizon);  % shift taps to predict one step ahead
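For context, the snippet stops before training. A typical continuation looks like this (a sketch only; `X` and `T` are assumed to be cell arrays holding the exogenous input series and the target series):

```matlab
% Prepare shifted inputs/targets and initial delay states for the open-loop net
[Xs, Xi, Ai, Ts] = preparets(net, X, {}, T);
net = train(net, Xs, Ts, Xi, Ai);   % train with the configured trainbr settings
yPred = net(Xs, Xi, Ai);            % one-step-ahead predictions
```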

Accepted Answer

Shashank Gupta on 29 March 2021

Hi Andriy,
There are some workarounds to add dropout in a narxnet: you can add dropout by defining a custom transfer function for one of the layers. Details on how to create a custom transfer function are shown in the link.
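A minimal sketch of that workaround is below. The function name `droptansig` and the dropout probability are assumptions, and a complete custom transfer function also has to answer the informational queries (e.g. `'name'`, derivative functions) described in the documentation; this only shows the core idea:

```matlab
function a = droptansig(n)
%DROPTANSIG tansig activation followed by inverted dropout (sketch).
    p = 0.2;                    % dropout probability (assumption)
    a = tansig(n);              % standard tanh-sigmoid activation
    mask = rand(size(a)) > p;   % randomly zero out units
    a = a .* mask ./ (1 - p);   % rescale so the expected activation is unchanged
end
```

You would then point one of the layers at it with `net.layers{1}.transferFcn = 'droptansig';`, with the function on the MATLAB path.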
Another convenient way is not to use a shallow network but to go for a deep network. There are some resources you can check; try this. Once you implement it, you can simply add a dropout layer. I would prefer the deep-network way: it is easier, more reliable, and more convenient to implement.
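As an illustration of the deep-network route, a NARX-style one-step-ahead regressor can be built from the lagged series as input features and trained with `trainNetwork`. The layer sizes mirror the shallow net above; the feature count and dropout probability are assumptions:

```matlab
numFeatures = 2*3;                 % 3 lags of the target + 3 lags of the exogenous input
layers = [
    featureInputLayer(numFeatures)
    fullyConnectedLayer(5)
    tanhLayer
    dropoutLayer(0.2)              % dropout for regularization
    fullyConnectedLayer(5)
    tanhLayer
    dropoutLayer(0.2)
    fullyConnectedLayer(1)
    regressionLayer];
opts = trainingOptions('adam', 'MaxEpochs', 40);
% XLagged: N-by-numFeatures matrix of lagged observations, Ynext: N-by-1 targets
% net = trainNetwork(XLagged, Ynext, layers, opts);
```

Dropout layers are active during training and automatically disabled at prediction time, which is the behavior the shallow-net workaround has to emulate by hand.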
I hope this helps.
Cheers.

Additional Answers (0)


Release

R2021a
