Trouble using input parameters on a custom deep learning layer
I'm trying to apply feature selection to a deep learning layer. My initial idea was to pass a learnable parameter into the custom layer: a vector with a '1' at every index I want to select. But when I inspect layer.Alpha, all of its values have changed and are no longer consistent with what I initially passed in.
classdef globalindexPooling < nnet.layer.Layer

    properties (Learnable)
        Alpha
    end

    methods
        function layer = globalindexPooling(numChannels, name)
            % Store the selection mask in the learnable parameter and set the layer name
            layer.Alpha = numChannels;
            layer.Name = name;
        end

        function Z = predict(layer, X)
            % Flatten the spatial and channel dimensions, then keep only the masked rows
            input_size = [size(X,1) size(X,2) size(X,3)];
            N = size(X,4);
            X_reshaped = reshape(X, [prod(input_size) N]);
            Z = X_reshaped(logical(gather(layer.Alpha)), :);
            Z = reshape(Z, [6 6 256 N]);
        end

        function [Z, memory] = forward(layer, X)
            % Same selection as predict; no extra state is needed for backward
            input_size = [size(X,1) size(X,2) size(X,3)];
            N = size(X,4);
            X_reshaped = reshape(X, [prod(input_size) N]);
            Z = X_reshaped(logical(gather(layer.Alpha)), :);
            Z = reshape(Z, [6 6 256 N]);
            memory = [];
        end

        function [dLdX, dLdalpha] = backward(layer, X, Z, dLdZ, ~)
            % Scatter the upstream gradient back into the selected positions of the input
            input_size = [size(X,1) size(X,2) size(X,3)];
            N = size(X,4);
            dLdZ = reshape(dLdZ, [6*6*256 N]);
            dLdX_prima = zeros([prod(input_size) N], 'like', X);
            dLdX_prima(logical(gather(layer.Alpha)), :) = dLdZ;
            dLdX = reshape(dLdX_prima, [input_size N]);
            dLdalpha = ones(size(gather(layer.Alpha)), 'like', X);   % gradient w.r.t. the mask (currently all ones)
        end
    end
end
The input is a logical vector with ones at the selected indexes, but what I get back is values of 0.0784 and -0.9216 everywhere; sometimes the first values are still the original ones and sometimes they are not. I tried using round(), but since the assignment of those values to each original entry is inconsistent, it didn't work.
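In case it helps reproduce the issue, this is a minimal sanity check of the layer's predict/forward/backward code in isolation using checkLayer. Again, the 13x13x256 input size and the random mask are assumptions for illustration only:

% Hypothetical sanity check of the custom layer with checkLayer
% (sizes and mask are placeholders, not taken from my real network).
inputNumel = 13*13*256;
mask = false(inputNumel, 1);
mask(randperm(inputNumel, 6*6*256)) = true;

layer = globalindexPooling(mask, 'indexPool');
checkLayer(layer, [13 13 256], 'ObservationDimension', 4)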