Deep learning layer with custom backward() function

Views: 3 (last 30 days)
Damien T on 3 Feb 2021
Commented: Damien T on 23 Feb 2021
I need to implement a complicated function (one that computes a regularizing penalty of a deep learning model) whose gradient with respect to the weights of the model I will then take in order to optimize them.
One operation within this "complicated function" is not currently supported for automatic differentiation (the determinant of a matrix). I'm using the option (mentioned here) of implementing the operation as a custom layer whose derivative I define myself in a custom backward() function. It works, but:
  1. In order to evaluate this custom layer, do I need to encapsulate it in a layerGraph/dlnetwork (a 2-layer network, since it also needs an input layer) so that I can then call predict() on the network, as in the sketches after this list? Is there anything simpler that does not require building a network? It would be nice to be able to call something like y = myLayer.predict(x); and have the underlying gradient trace built to go through my custom backward function.
  2. I am using the automatic differentiation for second-order derivatives available in the R2021a prerelease. Does it support layers with a custom backward() function?
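For concreteness, here is a minimal sketch of what such a layer can look like (the name detLayer, the flattened N^2-by-1 input layout, and batch size 1 are illustrative assumptions, not the original code). Because det is not supported for automatic differentiation, the forward pass computes the value on extracted data and the custom backward() supplies the gradient via Jacobi's formula, d(det(A))/dA = det(A)*inv(A).':

classdef detLayer < nnet.layer.Layer
    % Illustrative custom layer: determinant of a square matrix that
    % arrives flattened as an N^2-by-1 input (assumes batch size 1).
    properties
        N   % side length of the square matrix
    end
    methods
        function layer = detLayer(name, n)
            layer.Name = name;
            layer.N = n;
        end
        function Z = predict(layer, X)
            % det is not traced by automatic differentiation, so the
            % value is computed on extracted data; the custom backward()
            % below supplies the gradient instead of a trace.
            A = reshape(extractdata(X), layer.N, layer.N);
            Z = dlarray(det(A));
        end
        function dLdX = backward(layer, X, ~, dLdZ, ~)
            % Jacobi's formula: d(det(A))/dA = det(A)*inv(A).'
            % (assumes A is invertible).
            A = reshape(extractdata(X), layer.N, layer.N);
            dLdA = det(A) * inv(A).';
            dLdX = dLdZ .* reshape(dLdA, [], 1);
        end
    end
end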
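And a sketch of the two-layer wrapper described in question 1, again with illustrative names and sizes (detLayer.m saved on the path):

n = 4;
layers = [
    featureInputLayer(n*n, 'Name', 'in')    % dlnetwork requires an input layer
    detLayer('det', n)                      % the custom layer sketched above
    ];
net = dlnetwork(layerGraph(layers));

x = dlarray(randn(n*n, 1), 'CB');           % 'C' = features, 'B' = batch
y = predict(net, x);                        % evaluates the custom layer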

Answers (1)

Jack Xiao on 22 Feb 2021
dlnetwork is for custom training loops, and dlfeval is for custom model gradients using a user-defined gradient function.
You can refer to the GAN demo:
https://www.mathworks.com/help/deeplearning/ug/train-generative-adversarial-network.html?s_tid=srchtitle
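For what the answer is pointing at, a minimal sketch of the dlfeval/dlgradient pattern, reusing the illustrative wrapper network from the question (here the gradient is taken with respect to the input x; for the second-order use in question 2, dlgradient in R2021a also accepts an 'EnableHigherDerivatives' option):

function [y, grad] = modelGradients(net, x)
    % Forward pass; the gradient trace goes through detLayer's
    % custom backward() function.
    y = predict(net, x);
    % Gradient of the scalar penalty with respect to the input.
    % For second-order derivatives, pass 'EnableHigherDerivatives', true.
    grad = dlgradient(y, x);
end

% Usage (dlgradient must be called inside a function evaluated by dlfeval):
% x = dlarray(randn(16, 1), 'CB');
% [y, grad] = dlfeval(@modelGradients, net, x);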
  Comments: 1
Damien T on 23 Feb 2021
I don't think this answers either of the original questions.
