
Custom Vnet layer validation failed error

Amira Youssef on 10 Sep 2022
Commented: Amira Youssef on 15 Sep 2022
I am trying to train a custom V-Net network, but it keeps giving me the following error:
Layer 'en3_prelu1': Layer validation failed. Error using 'predict' in Layer prelu3dLayer. The function threw an error and could not be executed.
Array dimensions must match for binary array op.
This is my class definition:
classdef prelu3dLayer < nnet.layer.Layer
    % Custom 3-D PReLU layer.
    %
    % Copyright 2019 The MathWorks, Inc.

    properties (Learnable)
        % Layer learnable parameters

        % Scaling coefficient
        Alpha
    end

    methods
        function layer = prelu3dLayer(numChannels, name)
            % layer = prelu3dLayer(numChannels, name) creates a PReLU layer
            % with numChannels channels and specifies the layer name.

            % Set layer name.
            layer.Name = name;

            % Set layer description.
            layer.Description = "PReLU with " + num2str(numChannels) + " channels";

            % Initialize scaling coefficient.
            layer.Alpha = rand(numChannels);
        end

        function Z = predict(layer, X)
            % Z = predict(layer, X) forwards the input data X through the
            % layer and outputs the result Z.
            Z = max(0, X) + layer.Alpha .* min(0, X);
        end

        function [dLdX, dLdAlpha] = backward(layer, X, ~, dLdZ, memory)
            % [dLdX, dLdAlpha] = backward(layer, X, Z, dLdZ, memory)
            % backward propagates the derivative of the loss function
            % through the layer.
            %
            % Inputs:
            %     layer    - Layer to backward propagate through
            %     X        - Input data
            %     Z        - Output of layer forward function
            %     dLdZ     - Gradient propagated from the deeper layer
            %     memory   - Memory value which can be used in backward
            %                propagation
            % Outputs:
            %     dLdX     - Derivative of the loss with respect to the
            %                input data
            %     dLdAlpha - Derivative of the loss with respect to the
            %                learnable parameter Alpha

            dLdX = layer.Alpha .* dLdZ;
            dLdX(X>0) = dLdZ(X>0);
            dLdAlpha = min(0,X) .* dLdZ;
            %dLdAlpha = sum(sum(dLdAlpha,1),2);

            % Sum over all observations in mini-batch.
            dLdAlpha = sum(dLdAlpha,5);
        end
    end
end
And these are the network options:
inputSize = [40 40 40];
numClasses = 2;

lgraph = createVnetBn(inputSize, numClasses);
figure;
lgraph.plot

maxEpochs = 250;
options = trainingOptions('adam', ...
    'MaxEpochs',maxEpochs, ...
    'InitialLearnRate',1e-3, ...
    'LearnRateSchedule','piecewise', ...
    'LearnRateDropPeriod',5, ...
    'LearnRateDropFactor',0.97, ...
    'ValidationData',dsVal, ...
    'ValidationFrequency',400, ...
    'Plots','training-progress', ...
    'Verbose',false, ...
    'MiniBatchSize',miniBatchSize);

doTraining = true;
if doTraining
    modelDateTime = datestr(now,'dd-mmm-yyyy-HH-MM-SS');
    [net,info] = trainNetwork(dsTrain,lgraph,options);
    save(['trained3DUNet-' modelDateTime '-Epoch-' num2str(maxEpochs) '.mat'],'net');
end

Answers (1)

Amanjit Dulai on 11 Sep 2022
It's tricky to tell without the code for createVnetBn, but I would guess the problem is this line in prelu3dLayer:
% Initialize scaling coefficient
layer.Alpha = rand(numChannels);
This will initialize layer.Alpha to a numChannels-by-numChannels matrix, which is not what you would normally want for a 3D input. For a 3D input, you probably want something like this:
% Initialize scaling coefficient
layer.Alpha = rand([1 1 1 numChannels]);
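To see why the original initialization trips the "Array dimensions must match for binary array op" error, here is a minimal sketch (numChannels = 16 is just an example value, and it assumes the layer receives an H-by-W-by-D-by-C activation block): a numChannels-by-numChannels Alpha cannot be combined element-wise with such a block, while a 1-by-1-by-1-by-numChannels Alpha expands cleanly across the spatial dimensions.
% Minimal sketch; numChannels = 16 is an arbitrary example value.
numChannels = 16;
X = randn(40, 40, 40, numChannels, 'single');   % H-by-W-by-D-by-C activations

alphaBad  = rand(numChannels);                  % 16-by-16 matrix (original code)
alphaGood = rand([1 1 1 numChannels]);          % one coefficient per channel

% Errors: a 16-by-16 matrix is not compatible with a 40-by-40-by-40-by-16 array.
% Z = max(0, X) + alphaBad .* min(0, X);

% Works: the singleton dimensions expand across H, W and D, and the 4th
% dimension matches the channel dimension of X.
Z = max(0, X) + alphaGood .* min(0, X);
size(Z)   % 40 40 40 16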
7 Comments
Amanjit Dulai on 15 Sep 2022
I've attached an edited version of createVnetBn.m which should work for an input size of 40x40x40. I have removed one of the downsampling and upsampling stages.
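For context, here is a rough sketch of the spatial-size arithmetic behind that change, assuming each encoder stage halves the spatial resolution as in the standard V-Net design: a 40-voxel dimension only survives three clean halvings, so a fourth downsampling/upsampling pair can no longer produce feature maps that line up with their encoder counterparts.
% Sketch only: assumes each encoder stage of createVnetBn downsamples by 2.
sz = 40;
for stage = 1:4
    sz = sz / 2;
    fprintf('After downsampling stage %d: %g voxels per dimension\n', stage, sz);
end
% After downsampling stage 1: 20
% After downsampling stage 2: 10
% After downsampling stage 3: 5
% After downsampling stage 4: 2.5   <- not an integer, so the fourth stage
%                                      cannot be mirrored by an upsampling
%                                      stage that recovers the 5-voxel map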
Amira Youssef on 15 Sep 2022
Thank you so much sir, I really can't thank you enough for this

