Example of using the self-attention layer in MATLAB R2023a

MAHMOUD EID on 21 March 2023
Commented: Tian,HCong on 25 May 2024
In MATLAB R2023a, the self-attention layer was introduced.
Could an example be provided showing how to use it for image classification tasks?
2 Comments
Kuo on 7 July 2023
Same question; could there also be an example for time series forecasting? Thanks!


Accepted Answer

Himanshu on 29 March 2023
Hi Mahmoud,
I understand that you want to use "selfAttentionLayer" for an image classification task in MATLAB.
A self-attention layer computes single-head or multihead self-attention of its input. The following example uses the "DigitDataset" that ships with MATLAB.
% load the digit dataset that ships with MATLAB
digitDatasetPath = fullfile(matlabroot, 'toolbox', 'nnet', 'nndemos', 'nndatasets', 'DigitDataset');
imds = imageDatastore(digitDatasetPath, ...
    'IncludeSubfolders', true, 'LabelSource', 'foldernames');
% split into 70% training and 30% validation sets
[imdsTrain, imdsValidation] = splitEachLabel(imds, 0.7, 'randomized');
% define the network architecture
layers = [
    imageInputLayer([28 28 1], 'Name', 'input')
    convolution2dLayer(3, 32, 'Padding', 'same', 'Name', 'conv1')
    batchNormalizationLayer('Name', 'bn1')
    reluLayer('Name', 'relu1')
    maxPooling2dLayer(2, 'Stride', 2, 'Name', 'maxpool1')
    convolution2dLayer(3, 64, 'Padding', 'same', 'Name', 'conv2')
    batchNormalizationLayer('Name', 'bn2')
    reluLayer('Name', 'relu2')
    maxPooling2dLayer(2, 'Stride', 2, 'Name', 'maxpool2')
    flattenLayer('Name', 'flatten')
    selfAttentionLayer(8, 64, 'Name', 'self_attention') % 8 heads, 64 key-query channels
    fullyConnectedLayer(10, 'Name', 'fc')               % 10 digit classes
    softmaxLayer('Name', 'softmax')
    classificationLayer('Name', 'output')];
% set the training options
options = trainingOptions('sgdm', ...
    'InitialLearnRate', 0.01, ...
    'MaxEpochs', 5, ...
    'Shuffle', 'every-epoch', ...
    'ValidationData', imdsValidation, ...
    'ValidationFrequency', 30, ...
    'Verbose', false, ...
    'Plots', 'training-progress');
% train the network
net = trainNetwork(imdsTrain, layers, options);
Training output: (training-progress plot not reproduced here)
In this code, the selfAttentionLayer is used to process the features extracted from the 28x28 grayscale images. The self-attention mechanism helps the model capture long-range dependencies in the input data, meaning it can learn to relate different parts of the image to each other. By placing the selfAttentionLayer after a series of convolutional and pooling layers, the model can enhance its feature representation by considering the relationships between different regions of the input image.
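As a quick sanity check (not part of the original answer, but standard practice), you can classify the validation images with the trained network and compute the accuracy:
% evaluate the trained network on the validation set
YPred = classify(net, imdsValidation);
% proportion of correctly classified validation images
accuracy = mean(YPred == imdsValidation.Labels)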
You can refer to the documentation on creating and training a simple convolutional neural network for deep learning classification to learn more.
7 Comments
Philip Brown on 17 May 2024
For time series data, you could take a look at this recent blog post and GitHub repo. That uses a transformer network containing selfAttentionLayer for time series prediction. The use case there is finance, but the DL techniques would be generally applicable.
Tian,HCong on 25 May 2024
This answer is very helpful to me, but if the input is an RGB image, how should I adjust the program? Can you give me some guidance?
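For reference, the main adjustment for RGB input is the number of channels in the input layer; a minimal sketch, assuming 28x28 RGB images, would swap in the line below. The rest of the network can stay the same, because the convolutional layers infer their input depth automatically:
% for RGB input, change the third element of the input size from 1 to 3
imageInputLayer([28 28 3], 'Name', 'input')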
