Is it possible (yet) to implement a maxout activation "layer" in an R2017b Deep Learning network?
Maxout is an activation function that includes ReLU and "leaky" ReLUs as special cases, basically allowing for piecewise-linear (planar/hyperplanar) activation functions. They seem to work better than either in a number of cases. Here's a reference: Goodfellow, I. J., Warde-Farley, D., Mirza, M., Courville, A., & Bengio, Y. (2013). Maxout networks. arXiv preprint arXiv:1302.4389. https://arxiv.org/abs/1302.4389
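For concreteness, a maxout unit computes h(x) = max_j (W_j x + b_j) over k affine pieces, and the ReLU / leaky-ReLU special cases fall out by picking particular pieces. A minimal sketch of that math (NumPy is used here only to illustrate the computation, not any particular deep learning toolbox):

```python
import numpy as np

def maxout(x, Ws, bs):
    """Maxout unit: elementwise max over k affine pieces W_j @ x + b_j."""
    return np.max([W @ x + b for W, b in zip(Ws, bs)], axis=0)

x = np.array([-2.0, 3.0])
I, Z = np.eye(2), np.zeros((2, 2))
z = np.zeros(2)

# k = 2 pieces, identity and zero map: recovers ReLU, max(x, 0)
relu = maxout(x, [I, Z], [z, z])
# identity and 0.1 * identity: recovers a leaky ReLU, max(x, 0.1 * x)
leaky = maxout(x, [I, 0.1 * I], [z, z])
```

With learned (rather than fixed) pieces, the unit can approximate an arbitrary convex piecewise-linear activation, which is the point of the Goodfellow et al. paper.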
Ultimately I’m interested in playing with architectures like this one, which use maxouts extensively:
Zhang, Y., Pezeshki, M., Brakel, P., Zhang, S., Laurent, C., Bengio, Y., & Courville, A. (2017). Towards end-to-end speech recognition with deep convolutional neural networks. arXiv preprint arXiv:1701.02720. (Speech recognition using convolutional nets with maxout activations.) https://arxiv.org/abs/1701.02720
But I simply can't see any way to fake a maxout activation in a convolutional network framework in R2017b. While I'm a MATLAB vet (since Version 4, I think), I'm a total newbie to MATLAB deep learning networks, so maybe I'm missing something. Any suggestions greatly appreciated.
-Terry Nearey
Answers (1)
Pankaj Wasnik
2 Jan 2018
Hi, you can try https://github.com/yechengxi/LightNet, a somewhat simpler CNN toolbox that is easier to understand and to debug. You could try implementing the maxout layer yourself there. I am also trying the same; if I finish before you, I will share the code.
Regards, Pankaj Wasnik
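Whichever toolbox you use, the core operation in a convolutional maxout layer (as in the Zhang et al. architecture) is just an elementwise max over groups of channels produced by the preceding convolution. A sketch of that reduction, shown in NumPy for brevity (the same reshape-and-max pattern carries over directly to MATLAB arrays):

```python
import numpy as np

def maxout_channels(fmap, k):
    """Maxout over channel groups: (H, W, C) -> (H, W, C // k),
    taking an elementwise max across each group of k consecutive
    channels. Used after a conv layer that emits k * C_out channels."""
    H, W, C = fmap.shape
    assert C % k == 0, "channel count must be divisible by group size k"
    return fmap.reshape(H, W, C // k, k).max(axis=3)

rng = np.random.default_rng(0)
fmap = rng.standard_normal((4, 4, 8))  # toy 4x4 feature map, 8 channels
out = maxout_channels(fmap, k=2)       # shape (4, 4, 4)
```

So a conv layer with k times as many filters, followed by this reduction, behaves as a maxout conv layer; the difficulty in R2017b is only that there is no built-in layer type to hold the reduction step.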