neural network nprtool tansig vs logsig
4 views (last 30 days)
Hello,
I am a little confused about nprtool in the Neural Network Toolbox. It generates a two-layer feedforward network with a tansig activation on the output layer, yet it expects binary {0,1} targets, and it seems to work correctly.
I wonder why it doesn't use a logsig output activation if the outputs are supposed to be {0,1}. When I manually change the output activation to logsig, the generated outputs get compressed into the [0.5, 1] range, which is wrong.
I can't figure out what the problem is.
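Roughly what I am doing, as a simplified sketch (cancer_dataset just stands in for my data, and patternnet for the nprtool-generated script):
[x, t] = cancer_dataset;                 % built-in {0,1} classification targets
net = patternnet(10);                    % two-layer net, tansig output by default here
net = train(net, x, t);
y_tansig = net(x);                       % comes out in [0,1], as expected
net.layers{2}.transferFcn = 'logsig';    % my manual change, made after training
y_logsig = net(x);                       % now compressed into roughly [0.5, 1]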
Thanks
0 Comments
Accepted Answer
Greg Heath
1 May 2013
Most of these nets have a default mapminmax output-processing step that transforms the targets to the range (-1,1). For that range tansig is appropriate. There is also a corresponding reverse transformation that restores the original target range when the net is simulated.
My guess is that when you imposed 'logsig' this processing got mixed up.
Exactly what commands did you use to switch to 'logsig'?
Hope this helps.
Thank you for formally accepting my answer.
Greg
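A minimal sketch of the processing described above (based on the documented mapminmax interface; patternnet and the output index {2} are my assumptions about the nprtool-generated net):
net = patternnet(10);
net.outputs{2}.processFcns          % typically {'removeconstantrows','mapminmax'}
[tn, ps] = mapminmax([0 1 0 1])     % {0,1} targets rescaled to -1 1 -1 1 for training
mapminmax('reverse', [-1 1], ps)    % reverse mapping restores 0 and 1 at simulation time
mapminmax('reverse', [0 1], ps)     % feeding logsig's (0,1) range in returns 0.5 and 1,
                                    % i.e. exactly the [0.5,1] compression reported above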
0 Comments
More Answers (2)
Greg Heath
15 Apr 2013
Edited: Greg Heath, 15 Apr 2013
You cannot manually switch to logsig after the net has been trained with a different output transfer function (purelin, for example):
logsig(0:1) = 0.5000 0.7311
The net has to be trained with logsig from the start.
Hope this helps.
Thank you for formally accepting my answer
Greg
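If a logsig output is really wanted, here is a sketch of setting it up before training (the processFcns change and cancer_dataset are my assumptions, not something nprtool generates):
[x, t] = cancer_dataset;                              % any {0,1} classification dataset
net = patternnet(10);
net.layers{2}.transferFcn = 'logsig';                 % set before training
net.outputs{2}.processFcns = {'removeconstantrows'};  % drop mapminmax so targets stay in {0,1}
net = train(net, x, t);
y = net(x);                                           % outputs now lie directly in [0,1]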
2 Comments
Greg Heath
29 Apr 2013
Please post the script, the initial RNG state, and the results from using one of the MATLAB classification nndatasets.
Vito
29 Apr 2013
This is fuzzy logic: AND = min(a,b), OR = max(a,b). The binary operator S can represent the addition (OR) boundary conditions: S(1,1) = 1, S(a,0) = S(0,a) = a (logsig).
2 Comments
Vito
17 May 2013
Look at the theory, from classical logic through three-valued logic to fuzzy logic. The basis of fuzzy logic is the algebra of minima and maxima, which has the same properties as Boolean algebra. The MATLAB help covers this in general terms.
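A toy check of the min/max algebra mentioned above (my own illustration, not from the thread): on crisp {0,1} values it reduces to ordinary Boolean logic.
a = [0 0 1 1];  b = [0 1 0 1];
fuzzyAND = min(a, b)    % 0 0 0 1, same as a & b
fuzzyOR  = max(a, b)    % 0 1 1 1, same as a | b
% max also satisfies the S boundary conditions above: max(1,1) = 1, max(a,0) = a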