I want custom code for multivariate regression
Hi~
I am working on a multivariate regression problem, and I want to implement it with the deep learning low-level API (dlfeval, dlgradient, etc.).
Below is my code; training the network to fit the 6 outputs is not working properly. I would like a simple sample of multivariate regression along these lines using the low-level API (dlfeval and dlgradient).
--------------------------------------------------------------------------------------------
...
...
...
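% Transpose so that rows are features/responses and columns are observations
% (the 'CB' dlarray format used below expects channel-by-batch data)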
Inputs=transpose(Input);
Outputs=transpose(Output);
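% Network: 2 input features -> four tanh hidden layers -> 6 regression outputs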
layers = [
    featureInputLayer(2,'Name','in')
    fullyConnectedLayer(64,'Name','fc1')
    tanhLayer('Name','tanh1')
    fullyConnectedLayer(32,'Name','fc2')
    tanhLayer('Name','tanh2')
    fullyConnectedLayer(16,'Name','fc3')
    tanhLayer('Name','tanh3')
    fullyConnectedLayer(8,'Name','fc4')
    tanhLayer('Name','tanh4')
    fullyConnectedLayer(6,'Name','fc5')
    ];
lgraph=layerGraph(layers);
dlnet=dlnetwork(lgraph);
% Adam optimizer state and hyperparameters
iteration = 0;   % incremented before each update, so the first adamupdate call sees iteration 1
averageGrad = [];
averageSqGrad = [];
learnRate = 0.005;
gradDecay = 0.75;
sqGradDecay = 0.95;
% Format the whole input set as a dlarray in 'CB' (channel-by-batch) format
dlX = dlarray(Inputs,'CB');
% Custom training loop
for it = 1:500
    iteration = iteration + 1;
    % Evaluate the loss and gradients via dlfeval, then take one Adam step
    [out,loss,NNgrad] = dlfeval(@gradients,dlnet,dlX,Outputs);
    [dlnet.Learnables,averageGrad,averageSqGrad] = adamupdate(dlnet.Learnables,NNgrad, ...
        averageGrad,averageSqGrad,iteration,learnRate,gradDecay,sqGradDecay);
    if mod(it,100) == 0
        disp(it);
    end
end
function [out,loss,NNgrad] = gradients(dlnet,dlx,t)
% Forward pass through the network
out = forward(dlnet,dlx);
% Sum of squared errors over all six outputs and all observations
loss = sum((out - t).^2,'all');
% Gradients of the loss with respect to the learnable parameters
NNgrad = dlgradient(loss,dlnet.Learnables);
end
-------------------------------------------------------------------------------------------------------------------------------------------------
Thanks for reading my question. I hope someone can help.
Answers (1)
Shashank Gupta
25 Feb 2021
Hi,
From a first look at the code, I can give you some suggestions to get the model working. First, you are passing the whole input data set at every weight update; generally that does not work well. A better option is to split the data into mini-batches and update the weights one batch at a time. Second, you can use mse directly when defining the loss function; it is quicker and leaves less room for error. The documentation has an example of training a network with a custom training loop; try to replicate it. If your network architecture is well suited to the dataset, it should definitely work.
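For illustration, here is a minimal sketch of both suggestions (mini-batches plus the built-in mse loss), assuming the dlnet, Inputs (2-by-N), and Outputs (6-by-N) from the question above; the batch size, epoch count, and the helper name modelLoss are just illustrative placeholders:
% Minimal mini-batch custom training loop (sketch; adjust sizes to your data)
numObs        = size(Inputs,2);   % Inputs: 2-by-numObs, Outputs: 6-by-numObs
miniBatchSize = 32;               % assumed batch size
numEpochs     = 200;              % assumed number of epochs

learnRate   = 0.005;              % same Adam settings as in the question
gradDecay   = 0.75;
sqGradDecay = 0.95;

iteration     = 0;
averageGrad   = [];
averageSqGrad = [];

for epoch = 1:numEpochs
    idx = randperm(numObs);                       % shuffle observations every epoch
    for i = 1:miniBatchSize:numObs
        batchIdx = idx(i:min(i+miniBatchSize-1,numObs));
        dlX = dlarray(Inputs(:,batchIdx),'CB');   % channel-by-batch mini-batch
        dlT = dlarray(Outputs(:,batchIdx),'CB');

        iteration = iteration + 1;
        [loss,grad] = dlfeval(@modelLoss,dlnet,dlX,dlT);
        [dlnet.Learnables,averageGrad,averageSqGrad] = adamupdate( ...
            dlnet.Learnables,grad,averageGrad,averageSqGrad, ...
            iteration,learnRate,gradDecay,sqGradDecay);
    end
end

function [loss,grad] = modelLoss(dlnet,dlX,dlT)
% Forward pass and half-mean-squared-error loss over all 6 outputs
dlY  = forward(dlnet,dlX);
loss = mse(dlY,dlT);
grad = dlgradient(loss,dlnet.Learnables);
end
Note that mse returns the half mean squared error rather than a summed squared error, so the reported loss sits on a different scale than in the original code, but it drives the weights toward the same fit.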
I hope this helps.
Cheers.