Is it possible (yet) to implement a maxout activation "layer" in 2017b Deep Learning network?

2 views (last 30 days)
Maxout is an activation function that includes RELU and “leaky” RELUs as special cases, basically allowing for piecewise linear (planar/hyperplanar) activation functions. They seem to work better than either in a number of cases. Here’s a reference: Goodfellow, I. J., Warde-Farley, D., Mirza, M., Courville, A., & Bengio, Y. (2013). Maxout networks. arXiv preprint arXiv:1302.4389. https://arxiv.org/abs/1302.4389
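To illustrate the definition: maxout takes the maximum over k affine pieces, h(x) = max_j (w_j·x + b_j), and with two pieces (identity and zero) it reduces to ReLU. A minimal sketch in plain MATLAB (the variable names are illustrative, not from any toolbox):

```matlab
% Maxout over k affine pieces: h(x) = max_j (w_j*x + b_j)
% w and b are k-by-1 vectors of per-piece slopes and offsets
maxout = @(x, w, b) max(w.*x + b);

% Two pieces with w = [1; 0], b = [0; 0] recover ReLU: max(x, 0)
maxout(-2, [1; 0], [0; 0])    % returns 0
maxout( 3, [1; 0], [0; 0])    % returns 3

% Slopes [1; 0.1] give a leaky ReLU: max(x, 0.1*x)
maxout(-2, [1; 0.1], [0; 0])  % returns -0.2
```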
Ultimately I’m interested in playing with architectures like this one, which use maxouts extensively:
Zhang, Y., Pezeshki, M., Brakel, P., Zhang, S., Bengio, C. L. Y., & Courville, A. (2017). Towards end-to-end speech recognition with deep convolutional neural networks. arXiv preprint arXiv:1701.02720. Speech recognition using convolutional nets with maxout activation. https://arxiv.org/abs/1701.02720
But I simply can’t see any way to fake a maxout activation in a convolutional network framework in 2017b. While I’m a Matlab vet (since Version 4, I think), I’m a total newbie to Matlab deep learning networks, so maybe I’m missing something. Any suggestions greatly appreciated.
-Terry Nearey

Answers (2)

Pankaj Wasnik
Pankaj Wasnik on 2 Jan 2018
Hi, you can try https://github.com/yechengxi/LightNet, a simpler CNN toolbox that is easier to understand and debug. There you can try implementing the maxout layer yourself. I am attempting the same; if I finish before you, I will share the code.
Regards, Pankaj Wasnik

Gayathri
Gayathri on 5 Jun 2025
Edited: Gayathri on 5 Jun 2025
To implement a "Maxout" activation layer, you will have to write it as a custom layer.
The following code is a minimal working example of a custom “Maxout” layer:
classdef MaxoutLayer < nnet.layer.Layer
    properties
        NumPieces   % number of affine pieces reduced by the max
    end
    methods
        function layer = MaxoutLayer(numPieces, name)
            layer.Name = name;
            layer.Description = ['Maxout with ', num2str(numPieces), ' pieces'];
            layer.NumPieces = numPieces;
        end
        function Z = predict(layer, X)
            % Use size(X,dim) rather than indexing into size(X), which
            % errors when the trailing batch dimension is 1.
            H = size(X, 1);
            W = size(X, 2);
            C = size(X, 3) / layer.NumPieces;   % channels per output group
            N = size(X, 4);
            % Split the channel dimension into NumPieces groups and take
            % the elementwise maximum across the groups.
            X = reshape(X, H, W, C, layer.NumPieces, N);
            Z = max(X, [], 4);
        end
    end
end
Please make sure that the convolutional layer feeding into the “Maxout” layer produces a number of channels divisible by "NumPieces", as required by the reshape logic.
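For example, with NumPieces = 2 the preceding convolution must output an even channel count; a sketch of how the layer might be dropped into a small network (the layer sizes here are illustrative only):

```matlab
% Illustrative network: the convolution outputs 2*16 = 32 channels,
% and the maxout layer with 2 pieces reduces them to 16.
layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3, 32, 'Padding', 'same')   % 32 = 2 pieces x 16 channels
    MaxoutLayer(2, 'maxout1')
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];
```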
Hope you find this useful!
