How to compute the derivative of a neural network?
23 views (last 30 days)
Hi,
Once you have trained a neural network, is it possible to obtain its derivative? I have a trained network "net" stored in a structure, and I would like to know if there is a routine that provides the derivatives of net (the derivatives of its outputs with respect to its inputs).
It is probably not difficult: for a feedforward model there are just matrix multiplications and sigmoid functions. But it would be nice to have a routine that does this directly on "net".
Thanks!
0 Comments
Accepted Answer
Greg Heath
20 Oct 2012
Differentiate to obtain dyi/dxk:
y = b2 + LW*h
h = tanh(b1 + IW*x)
or, in tensor notation (i.e., summation over repeated indices),
yi = b2i + LWij*hj
hj = tanh(b1j + IWjk*xk)
Now just use the chain rule: dyi/dxk = LWij*(1 - hj^2)*IWjk.
Hope this helps.
Thank you for formally accepting my answer
Greg
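Greg's chain rule is easy to check numerically. A minimal numpy sketch (the names IW, LW, b1, b2 follow his notation, but the weight values here are made up for illustration), comparing the analytic Jacobian dyi/dxk = LWij*(1 - hj^2)*IWjk against central finite differences:

```python
import numpy as np

# Made-up weights for a 3-input, 4-hidden, 2-output tanh network
rng = np.random.default_rng(0)
IW = rng.standard_normal((4, 3))   # hidden-from-input weights
b1 = rng.standard_normal(4)
LW = rng.standard_normal((2, 4))   # output-from-hidden weights
b2 = rng.standard_normal(2)

def forward(x):
    h = np.tanh(b1 + IW @ x)       # h_j = tanh(b1_j + IW_jk x_k)
    return b2 + LW @ h             # y_i = b2_i + LW_ij h_j

def jacobian(x):
    h = np.tanh(b1 + IW @ x)
    # Chain rule: dy_i/dx_k = LW_ij * (1 - h_j^2) * IW_jk
    return LW @ (np.diag(1 - h**2) @ IW)

x = rng.standard_normal(3)
J = jacobian(x)

# Central finite-difference check, one input at a time
eps = 1e-6
J_fd = np.stack([(forward(x + eps * e) - forward(x - eps * e)) / (2 * eps)
                 for e in np.eye(3)], axis=1)
print(np.max(np.abs(J - J_fd)))    # agreement to roughly 1e-9
```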
2 Comments
Tittu Mathew
7 Dec 2018
Hi Greg and Filipe,
I am reaching out about your query from years ago on getting the partial derivatives of trained ANN outputs w.r.t. each of the input parameters using the MATLAB toolbox.
I am trying to represent a simple function P = f(X,Y,Z), where P is a scalar output and the input to the ANN is a vector with 3 elements, namely X, Y and Z. Using the MATLAB toolbox, I was able to train, test and validate a shallow feedforward ANN. It has 3 layers: input, hidden and output. So the ANN configuration in my case is of the form 3-x-1, where 'x' is the number of neurons in the hidden layer. Apart from the tanh() activation function used in the hidden layer, linear activation functions were used in both the input and output layers. mapminmax() was used to normalize both the inputs and the output of the ANN.
However, after successfully training the ANN, when it comes to calculating the first-order derivatives of the ANN output with respect to each of the inputs, my results differ by orders of magnitude from the derivatives obtained from the analytical equation. I tried to understand and implement the code from "Approximation of functions and their derivatives: A neural network implementation with applications", but in vain. The code I developed to do this is attached to this message (ANN_FOD.m).
I would greatly appreciate it if you could help me with the implementation of the code for ANN derivatives. Thank you for your time and kind regards,
Tittu V Mathew
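An orders-of-magnitude mismatch like this is often caused by the mapminmax normalization: the network is trained on normalized inputs and outputs, so differentiating it gives dnP/dnX, which must be rescaled by gain_in/gain_out to compare with the analytic derivative. A tiny numpy sketch of that scaling argument (the ranges, gains, and the stand-in function P = X^2 are all illustrative, not taken from ANN_FOD.m):

```python
# mapminmax maps v in [vmin, vmax] to [-1, 1]: n = (v - vmin)*gain - 1,
# with gain = 2/(vmax - vmin).  Ranges below are illustrative only.
gain_x = 2.0 / (4.0 - 0.0)    # input  X in [0, 4]  -> gain 0.5
gain_p = 2.0 / (16.0 - 0.0)   # output P in [0, 16] -> gain 0.125

# Take P = f(X) = X^2 as a stand-in for a perfectly trained net.
# In normalized coordinates nP(nX) = gain_p*f(X) - 1 with
# X = (nX + 1)/gain_x, so dnP/dnX = gain_p * f'(X) / gain_x.
X = 2.0
dnP_dnX = gain_p * (2 * X) / gain_x   # what differentiating the net yields

# Rescaling back to real units recovers the analytic derivative f'(X):
df_dX = dnP_dnX * gain_x / gain_p
print(df_dX)   # 4.0, matching d(X^2)/dX at X = 2
```

Without the final rescaling, the raw dnP/dnX here is 1.0 instead of 4.0, which is exactly the kind of discrepancy described above.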
More Answers (4)
Filipe
20 Oct 2012
1 Comment
Greg Heath
24 Oct 2012
You can try to make life easier by doing the pre- and post-processing yourself before and after training.
trevor
7 Nov 2013
Hi Filipe,
Could you possibly share your code for computing the partial derivative of the ANN, or provide some info on the steps you used? That would be immensely useful!
Thanks, Trevor
0 Comments
Muhammad Saif ur Rehman
5 Apr 2019
Hi Filipe,
Can you share your code for computing the partial derivative of the defined cost function w.r.t. the input?
Regards Saif
0 Comments
soo-choon kang
14 Aug 2021
net1 = fitnet(3);
net1 = train(net1,x',y');
% normalize x (mapminmax: nx = (x - xmin)*gain + ymin)
nx = (x-net1.input.processSettings{1,1}.xmin)*net1.input.processSettings{1,1}.gain+net1.input.processSettings{1,1}.ymin;
h = tanh(net1.b{1}+net1.IW{1}*nx'); % h = [3xn], IW{1} = [3x1], nx' = [1xn]
ny = net1.b{2}+net1.LW{2,1}*h; % ny = [1xn], LW{2,1} = [1x3]
% de-normalize y
ypredict = (ny-net1.output.processSettings{1,1}.ymin)/net1.output.processSettings{1,1}.gain+net1.output.processSettings{1,1}.xmin;
% ypredict above is equivalent to sim(net1,x')
% derivative of the network at the normalized scale:
% dny/dnx = sum_j LW_j * IW_j * (1 - h_j^2)
dnydnx = sum(net1.LW{2,1}'.*net1.IW{1}.*(1-h.*h),1)'; % dnydnx = [nx1]
% derivative at the real scale (chain rule through both mapminmax maps)
dydx = dnydnx*net1.input.processSettings{1,1}.gain/net1.output.processSettings{1,1}.gain;
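The same computation can be sketched outside MATLAB. A numpy version of the lines above (the 1-3-1 weights and the mapminmax gain/xmin/ymin values are made up, standing in for net1's trained values and processSettings), with a finite-difference check on the rescaled derivative:

```python
import numpy as np

rng = np.random.default_rng(1)
# Made-up 1-3-1 network and mapminmax settings (stand-ins for net1 fields)
IW, b1 = rng.standard_normal((3, 1)), rng.standard_normal((3, 1))
LW, b2 = rng.standard_normal((1, 3)), rng.standard_normal((1, 1))
gain_x, xmin_x, ymin_x = 0.5, 1.0, -1.0   # input processSettings
gain_y, xmin_y, ymin_y = 0.25, 2.0, -1.0  # output processSettings

def predict(x):                 # x: [1 x n] row of raw inputs
    nx = (x - xmin_x) * gain_x + ymin_x   # normalize input
    h = np.tanh(b1 + IW @ nx)             # [3 x n] hidden activations
    ny = b2 + LW @ h                      # [1 x n] normalized output
    return (ny - ymin_y) / gain_y + xmin_y  # de-normalize output

x = np.array([[1.5, 2.0, 4.0]])
nx = (x - xmin_x) * gain_x + ymin_x
h = np.tanh(b1 + IW @ nx)
dnydnx = np.sum(LW.T * IW * (1 - h**2), axis=0)  # normalized-scale derivative
dydx = dnydnx * gain_x / gain_y                  # real-scale derivative

# Central finite-difference check against the full predict() pipeline
eps = 1e-6
fd = (predict(x + eps) - predict(x - eps)) / (2 * eps)
print(np.max(np.abs(dydx - fd.ravel())))         # agreement to roughly 1e-9
```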
0 Comments