How to avoid getting negative values when training a neural network?

Is there any way to constrain the network outputs when training a feedforward neural network in MATLAB?
I am trying to train a supervised feedforward neural network on 100,000 observations. I have 5 continuous variables and 3 continuous responses (labels). All my values are positive (both labels and variables). However, when I train the network, it sometimes predicts negative results no matter what architecture I use. Negative results have no physical meaning and should not appear. Is there any way to constrain the network? I also tried a ReLU activation function for the last layer, but then the network cannot generalize well.
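For reference, a minimal sketch of this kind of setup with the shallow-network functions; the hidden layer sizes and training function below are just placeholders, not my actual architecture:

net = feedforwardnet([20 10], 'trainlm');  % two hidden layers; sizes are placeholders
net.layers{end}.transferFcn = 'poslin';    % 'poslin' is MATLAB's ReLU transfer function, here on the output layer
net = train(net, X, T);                    % X: 5-by-N inputs, T: 3-by-N positive targets
Y = net(X);                                % note: train() normalizes targets with 'mapminmax' by default,
                                           % so the output transfer function acts on the normalized scale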
Thanks

Accepted Answer

Mostafa Nakhaei on 30 Jan 2020
I found the answer to my problem. The main reason for getting negative results, even after training and testing on data with only positive values, was that the distribution of the new dataset was different from that of the training and test samples: the new data had more noise. In my case, the solution was not to change the activation function of the last layer (that led to physically meaningless results) but to add some synthetic random noise to my dataset. This made the model robust against the noise.
Thanks
Mostafa
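Roughly what I did, as a minimal sketch; the noise level below is only an illustration, not the value I actually used:

noiseLevel = 0.05;                                   % illustrative: noise std as a fraction of each feature's std
sigma  = std(X, 0, 2);                               % per-feature standard deviation (X is 5-by-N)
Xnoisy = X + noiseLevel * sigma .* randn(size(X));   % synthetic Gaussian noise added to the inputs
Xaug   = [X, Xnoisy];                                % original samples plus noisy copies
Taug   = [T, T];                                     % targets repeated for the noisy copies
net    = train(net, Xaug, Taug);                     % retrain on the augmented dataset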

More Answers (1)

Greg Heath on 18 Jan 2020
Use a sigmoid for the output layer.
Hope this helps
THANK YOU FOR FORMALLY ACCEPTING MY ANSWER
GREG
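A minimal sketch of this suggestion with the shallow-network functions; the hidden layer size is just a placeholder:

net = feedforwardnet(20);                 % hidden layer size is a placeholder
net.layers{end}.transferFcn = 'logsig';   % sigmoid output layer, bounded to (0,1) on the normalized scale
net = train(net, X, T);                   % default 'mapminmax' output processing maps predictions back to the original target scale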
1 Comment
Mostafa Nakhaei on 18 Jan 2020
Thanks Greg for the response.
This is a regression problem, and I guess a sigmoid would give negative results as well.
