How to use a custom transfer function in neural net training
I want to use a function similar to tansig. I don't seem to be able to find a good example, and the tansig.apply method only allows me one line! I'm wrapped around this axle, and I suspect I'm missing something simple. Any ideas? I'm using 2012b.
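For reference, tansig computes a = 2./(1+exp(-2*n)) - 1 (numerically the same as tanh(n)) inside its package's apply function, and a custom variant only needs to change that one line. A minimal sketch of such a file, assuming the R2012b +package layout; the '+mytf' folder name and the Elliott-style formula are illustrative, not from the thread:

```matlab
function a = apply(n, param)
%APPLY Transfer function body for a hypothetical custom package +mytf.
% Save as +mytf/apply.m next to a top-level mytf.m on the MATLAB path.
%   n     - net input (matrix, one column per sample)
%   param - function parameters (unused here, but required by the interface)
% Elliott-style rational squashing curve, similar in shape to tansig:
a = n ./ (0.25 + abs(n));
end
```

The rest of the package (derivative, range, name functions) would still need to be copied from an existing transfer function such as +tansig and adjusted to match.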
Accepted Answer
More Answers (5)
Bob
27 March 2013
4 comments
Nn Sagita
29 August 2013
Bob, I modified the purelin transfer function and called it 'mtf'. I saved it in my working directory, trained the neural network, and got outputs. But I also got some messages like this:
Exception in thread "AWT-EventQueue-0" java.lang.NullPointerException at com.mathworks.toolbox.nnet.v6.diagram.nnTransfer.paint(nnTransfer.java:35) at com.mathworks.toolbox.nnet.v6.image.nnOffsetImage.paint(nnOffsetImage.java:49) at ....
Could you help me? What should I do?
kelvina
15 February 2014
Thanks Bob, it helped me.
But you can also do this directly by copying the file 'template transfer' from C:\Program Files (x86)\MATLAB\R2010a\toolbox\nnet\nnet\nncustom,
replacing its function a = apply_transfer(n,fp) with your own function, and then saving the file in your working directory. It will work.
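A sketch of what the edited subfunction might look like, assuming the apply_transfer(n,fp) signature quoted above (only this subfunction of the copied template is shown; the formula is an illustrative Elliott-style curve, not from the thread):

```matlab
function a = apply_transfer(n, fp)
%APPLY_TRANSFER Replace the template's body with your own function.
%   n  - net input (matrix, one column per sample)
%   fp - function-parameter struct supplied by the template's framework
% Example replacement: a rational approximation of the tanh shape.
a = n ./ (0.25 + abs(n));
end
```

The rest of the copied template file (derivative and info subfunctions) should be kept and adjusted so they stay consistent with the new apply_transfer.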
Mayank Gupta
4 May 2016
Can you please explain in detail how to save a custom training function to the nntool directory? I am using the Firefly algorithm for optimization.
Mehdi Jokar
16 July 2018
Bob, thank you for your instructions. But is apply the only function that needs to be modified, or do we also need to modify the backprop and forwardprop functions in the + folder?
Mehdi
Greg Heath
11 December 2012
Edited: DGM
23 February 2023
I cannot understand why you think y2 is better than y1:
x = -6:0.1:6;
y1 = x./(0.25+abs(x));
y2 = x.*(1 - 0.52*abs(x/2.6)); % (for -2.5 < x < 2.5)
figure
hold on
plot(x,y1)
plot(x,y2,'r')
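One way to compare the two candidates is to measure each against tanh (what tansig computes) on the interval quoted for y2. A quick sketch along the lines of the snippet above:

```matlab
% Compare both approximations to tanh on y2's stated interval.
x  = -2.5:0.01:2.5;
y1 = x ./ (0.25 + abs(x));          % rational (Elliott-style) candidate
y2 = x .* (1 - 0.52*abs(x/2.6));    % piecewise-polynomial candidate
e1 = max(abs(y1 - tanh(x)));        % worst-case deviation of y1
e2 = max(abs(y2 - tanh(x)));        % worst-case deviation of y2
fprintf('max|y1 - tanh| = %.3f, max|y2 - tanh| = %.3f\n', e1, e2);
```

Whichever deviation matters (worst-case, mean, or behavior near the origin) depends on the application, so the printed numbers are only one possible criterion.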
mladen
26 March 2013
Could anybody upload some examples of a modified tansig.m and +tansig folder? This would be very helpful for my project and for other people too. Thank you.
1 comment
Nn Sagita
29 August 2013
If you have some examples of how to modify a transfer function, please share them with me. Thank you.
mladen
29 March 2013
Thank you Bob. Nice trick with feedforwardnet.m (good for permanent use). I've managed to do this, but some new questions arise:
- How do I use param in apply(n,param)? (more info -> matlabcentral/answers/686)
- How do I use different transfer functions within the same layer?
- My apply function looks something like this:
function A = apply(n,param)
%....
A=a1.*a2;
end
Now I would like to use a1 and a2 to speed up the derivative computation in da_dn.m (this has already been done with tansig.m, but using the final value, A in my code)... is it possible?
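On the last point: the packaged functions only share data through their arguments, and the derivative function receives both the net input and the already computed output, which is how tansig reuses its result without recomputing it. If a1 and a2 are cheap to rebuild from n, one hedged sketch (the da_dn(n,a,param) signature is an assumption based on the +package interface, and the factors tanh(n) and exp(-n.^2) are purely illustrative stand-ins for a1 and a2):

```matlab
function d = da_dn(n, a, param)
%DA_DN Derivative of a product-form transfer function a = a1.*a2.
% Illustrative choice of factors: a1 = tanh(n), a2 = exp(-n.^2).
a1  = tanh(n);
a2  = exp(-n.^2);
da1 = 1 - a1.^2;               % d/dn tanh(n)
da2 = -2*n .* a2;              % d/dn exp(-n.^2)
d = da1 .* a2 + a1 .* da2;     % product rule: (a1*a2)' = a1'*a2 + a1*a2'
end
```

If recomputing a1 and a2 is expensive, they cannot simply be cached between apply and da_dn through this interface; they would have to be re-derived from n or, where possible, recovered from the passed-in a.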