Define a custom input layer in the deep learning toolbox
In the Deep Learning Toolbox, it is possible to define one's own custom output layers and hidden layers. Is there no way to define a custom input layer?
0 Comments
Accepted Answer
James Gross
16 Apr 2025
Hi Matt,
Unfortunately, there is no way to create a fully custom input layer. However, since R2023b, you can use inputLayer to define an input layer that accepts a custom data format.
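For example, here is a minimal sketch (the 256x256 RGB size and "SSCB" format are just illustrative):
% Declare an input of size 256x256x3 with an unknown batch size, in
% "SSCB" (spatial, spatial, channel, batch) format
layer = inputLayer([256 256 3 NaN], "SSCB");
net = dlnetwork([layer; convolution2dLayer(3, 16)]);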
Hopefully, this addresses your needs, but if it doesn't, please let us know. We'd love to hear about your use case and see what we can do to help!
Cheers,
James
5 Comments
James Gross
28 Apr 2025
Hi Matt,
Sure thing, happy to elaborate! What I meant was that imageInputLayer and dlnetwork do not, by themselves, prevent an input of a different size than the one specified in the imageInputLayer.
However, subsequent layers may still error if the sizes specified in the imageInputLayer were used to initialize learnable parameters that require inputs of a particular size. In the particular case provided above, it errors because initializing the network with those sizes gives the fully connected layer a weight matrix with 196608 (i.e. 256 x 256 x 3) input elements, whereas you provided an input with 49152 (i.e. 128 x 128 x 3) elements, so the matrix multiplication cannot be performed.
If you instead used convolution layers rather than a fully connected layer, it would not necessarily error, because the convolution operation is less sensitive to the input size. For example, modifying the code above to use convolution and global max pooling leads to it no longer erroring:
% Global max pooling collapses the spatial dimensions, so the fully
% connected layer's input size no longer depends on the image size
dln = dlnetwork([imageInputLayer([256,256,3]), ...
    convolution2dLayer(3,3), ...
    globalMaxPooling2dLayer, ...
    fullyConnectedLayer(1)], ...
    Initialize=false);
trainnet(rand(128,128,3),1,dln,'mse',trainingOptions('adam'))
Hope this answers your question!
Cheers,
James
More Answers (1)
Aastha
24 Mar 2025
As I understand it, you want to define a custom input layer using the Deep Learning Toolbox that performs a desired operation on the input. You can accomplish this with the following steps:
1) Create a custom input layer class called “newInputLayer” that inherits from the “nnet.layer.Layer” class. For more information on the “nnet.layer.Layer” class, refer to the MATLAB documentation.
2) In this class, define the constructor method to set the “InputSize” property and the name of the input layer. Implement the “predict” method, which takes an input “X” and applies a transformation function, “yourInputTransformation”, to perform the desired operation in the input layer.
If the layer includes learnable parameters, declare them in a “properties (Learnable)” block; a custom “backward” function is only needed when the operations in “predict” do not support automatic differentiation. The MATLAB code below illustrates this:
classdef newInputLayer < nnet.layer.Layer
    properties
        % Define any properties your layer needs
        InputSize
    end
    methods
        function layer = newInputLayer(inputSize, name)
            % Set layer name
            layer.Name = name;
            % Set expected input size
            layer.InputSize = inputSize;
        end
        function Z = predict(layer, X)
            % Forward operation; yourInputTransformation is a
            % placeholder for the desired preprocessing
            Z = yourInputTransformation(X);
        end
        function dLdX = backward(layer, X, Z, dLdZ, memory)
            % Optional backward operation; only needed when predict
            % uses operations without automatic differentiation
        end
    end
end
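Once “yourInputTransformation” is implemented, you can sanity-check the layer with checkLayer (a sketch; the 28x28x1 size is illustrative):
layer = newInputLayer([28, 28, 1], 'custom_input');
% Validate the layer's forward/backward behavior; observations run
% along dimension 4
checkLayer(layer, [28 28 1], ObservationDimension=4)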
3) You can then incorporate the “newInputLayer” class into your network architecture.
As an example, create a simple network using the custom input layer, with two fully connected hidden layers, each followed by a ReLU activation:
% Avoid naming the variable "inputLayer", which would shadow the built-in function
customInput = newInputLayer([28, 28, 1], 'custom_input');
% Define the network architecture
layers = [
    customInput
    fullyConnectedLayer(128, 'Name', 'fc1')
    reluLayer('Name', 'relu1')
    fullyConnectedLayer(64, 'Name', 'fc2')
    reluLayer('Name', 'relu2')
];
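To train this network, one option (a sketch; the random data and targets are purely illustrative) is to wrap the layer array in a dlnetwork and let trainnet initialize it from the training data, since the custom layer does not declare sizes the way a built-in input layer does:
% Defer initialization; trainnet initializes the network from the data
net = dlnetwork(layers, Initialize=false);
XTrain = rand(28, 28, 1, 100);   % 100 random 28x28x1 inputs (illustrative)
TTrain = rand(100, 64);          % random targets matching the 64 outputs
net = trainnet(XTrain, TTrain, net, "mse", trainingOptions("adam"));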
Hope this is helpful!