How to adjust the derivatives of backpropagation according to a custom error function
3 views (last 30 days)
  
    
I want to implement a custom error function in the backpropagation algorithm and force the weight updates to take the output-layer outputs into account.
The network consists of one hidden layer. The input is an (n,1) vector and the output has the same dimensions.
Let's say we are given an input and a target. I want to calculate the error function as
E = ((input*output - target)^2)/2
so that at each iteration the updated target,
updatedTarget = input*output
moves toward the initial target. Accordingly, the derivative is modified to
dE_dOutput(j) = sum(input .* netOutputLayerOutputs - initialTarget(j)) * input(j);
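For the scalar case, the chain rule applied to E = ((input*output - target)^2)/2 gives dE/dOutput = (input*output - target)*input; a minimal finite-difference sanity check of that form (arbitrary test values, not code from the network itself) would be:

in  = 0.7; out = 0.4; tgt = 0.9;               % arbitrary test values
E   = @(o) ((in*o - tgt)^2)/2;                 % scalar custom error
h   = 1e-6;                                    % finite-difference step
numGrad = (E(out + h) - E(out - h)) / (2*h);   % numerical derivative
anaGrad = (in*out - tgt) * in;                 % analytic derivative
fprintf('numerical: %.6f   analytic: %.6f\n', numGrad, anaGrad);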
The relevant part of the backpropagation code is provided below.
for i = 1:100      
    for j = 1 : length(input)        
        % forward propagation
        netHiddenLayerValues(j) = sum(netHiddenLayerWeights(:,j) .* input(j))  + netBiasValue1 * 1;
        netHiddenLayerOutputs(j) = 1/(1 + exp(-netHiddenLayerValues(j)));
        netOutputLayerValues(j) = sum(netOutputLayerWeights(:,j) * netHiddenLayerOutputs(j)) + netBiasValue2 * 1;
        netOutputLayerOutputs(j) = 1/(1 + exp(-netOutputLayerValues(j)));    
        % Custom target function
        target(j) = input(j) * netOutputLayerOutputs(j);    
        % back propagation for output layer        
        % customized error derivative with respect to output layer outputs
        dE_dOutput(j) = sum(input .* netOutputLayerOutputs - initialTarget(j)) * input(j);
        %dE_dOutput(j) = -(target(j) - netOutputLayerOutputs(j));  % this is for standard MSE
        % partial derivative of logistic function with respect to network output value
        dOutput_dNetout(j) = netOutputLayerOutputs(j) * (1 - netOutputLayerOutputs(j));    
        % partial derivative of network output with respect to weight
        dNetout_dw(j) = netHiddenLayerOutputs(j);    
        % gradient of the total error with respect to the output-layer weight
        d_EtotalOut(j) = dE_dOutput(j) * dOutput_dNetout(j) * netHiddenLayerOutputs(j);
        % back-propagate the error to the hidden layer
        d_EtotalHidden_dOut(j) = dE_dOutput(j) * dOutput_dNetout(j) * netOutputLayerWeights(j);
        dOut_dNetHidden(j) = netHiddenLayerOutputs(j) * (1 - netHiddenLayerOutputs(j));
        dNetHidden_dw(j) = input(j);        
        d_EtotalHiddenOut(j) = d_EtotalHidden_dOut(j) * dOut_dNetHidden(j) * dNetHidden_dw(j);    
        % update weights for hidden layer
        netHiddenLayerWeights(:,j) = netHiddenLayerWeights(:,j) - eta * d_EtotalHiddenOut(j);
        % update weights for output layer 
        netOutputLayerWeights(:,j) = netOutputLayerWeights(:,j) - eta * d_EtotalOut(j);
    end
end
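One way to isolate a mistake in hand-coded gradients is a finite-difference check against the analytic values. A minimal sketch follows; computeError is a hypothetical helper that would run the forward pass for all j and return the scalar total error:

epsilonFD = 1e-6;                              % finite-difference step
w0 = netOutputLayerWeights(1,1);               % probe a single weight
netOutputLayerWeights(1,1) = w0 + epsilonFD;
Eplus  = computeError(input, initialTarget, netHiddenLayerWeights, ...
                      netOutputLayerWeights, netBiasValue1, netBiasValue2);
netOutputLayerWeights(1,1) = w0 - epsilonFD;
Eminus = computeError(input, initialTarget, netHiddenLayerWeights, ...
                      netOutputLayerWeights, netBiasValue1, netBiasValue2);
netOutputLayerWeights(1,1) = w0;               % restore the probed weight
numericalGrad = (Eplus - Eminus) / (2*epsilonFD);
% numericalGrad should match the analytic gradient d_EtotalOut for this
% weight; a large mismatch points to an error in the derivative code.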
The figures above show how the network outputs (orange) and the updated target (blue) change during the iterations (after 5, 10, 20, 50, and 100 iterations, respectively). The static curves are the inputs (yellow) and the initial, i.e. real, target (magenta).
The issue is that the target updated at each iteration does not reach the real initial target.
Please write if you notice an error in the implementation or logic of the algorithm.
The backpropagation algorithm with the MSE error is described step by step here: step-by-step-backpropagation-example.
Thanks.
1 Comment
Greg Heath on 16 Jan 2019
Why in the world do you think this is better than the current performance measures?
Greg
Accepted Answer
Greg Heath on 4 Feb 2019
Your error function
E = ((input*output - target)^2)/2
is not at a minimum when output = target.
Why did you not use the standard
E = (output - target)^2
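Numerically, the custom error is zero at output = target/input rather than at output = target; a minimal sketch (arbitrary values) illustrating the point:

in = 0.5; tgt = 0.8;                         % arbitrary values
E  = @(o) ((in*o - tgt)^2)/2;                % the custom error from the question
fprintf('E at output = target       : %.4f\n', E(tgt));     % 0.0800, not a minimum
fprintf('E at output = target/input : %.4f\n', E(tgt/in));  % 0.0000, the true minimum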
Thank you for formally accepting my correct answer
Greg
More Answers (1)
BERGHOUT Tarek on 3 Feb 2019
Try this code: https://www.mathworks.com/matlabcentral/fileexchange/69947-back-propagation-algorithm-for-training-an-mlp?s_tid=prof_contriblnk
0 Comments