How to use GPU for deep learning

Alexey Kozhakin on 1 October 2022
Answered: KSSV on 1 October 2022
I'm training a YOLO v4 detection model in MATLAB. I just got a computer with a graphics card, an NVIDIA GeForce RTX 3070 Ti, and I want to get the most out of it. Please help me: what do I need to write in my MATLAB code to perform training on the GPU?
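
Before changing any training code, it helps to confirm that MATLAB can see the card at all. A minimal check, assuming the Parallel Computing Toolbox is installed (MATLAB's GPU support requires it):

% Number of supported GPUs MATLAB can detect
gpuDeviceCount

% Select and display the default GPU; the Name field should
% report the GeForce RTX 3070 Ti
d = gpuDevice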

Answers (1)

KSSV on 1 October 2022
Check trainingOptions; there you have the option to specify the execution environment.
Example:
options = trainingOptions('sgdm', ...
    'Momentum',0.9, ...
    'InitialLearnRate',initLearningRate, ...
    'LearnRateSchedule','piecewise', ...
    'LearnRateDropPeriod',learningDropPeriod, ...
    'LearnRateDropFactor',learningRateFactor, ...
    'L2Regularization',l2reg, ...
    'MaxEpochs',maxEpochs, ...
    'ValidationData',{inputVal, targetVal}, ...
    'ValidationFrequency',50, ...
    'ValidationPatience',10, ...
    'Shuffle','every-epoch', ...
    'MiniBatchSize',miniBatchSize, ...
    'GradientThresholdMethod','l2norm', ...
    'GradientThreshold',0.01, ...
    'Plots','training-progress', ...
    'ExecutionEnvironment','auto', ... %<------ check this. Keep it 'auto' so MATLAB can pick the best device
    'Verbose',true);
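
With 'ExecutionEnvironment' left as 'auto', MATLAB uses the GPU automatically whenever a supported one is available; set it to 'gpu' to force GPU training (it errors if no GPU is found). The options are then passed to the training call. A minimal sketch, assuming trainingData is your training datastore and detector is a yolov4ObjectDetector you have already set up:

% Force GPU training rather than letting MATLAB choose
options = trainingOptions('sgdm', ...
    'ExecutionEnvironment','gpu', ... % errors if no supported GPU is available
    'MiniBatchSize',miniBatchSize, ...
    'MaxEpochs',maxEpochs);

% Train the YOLO v4 detector with these options (Computer Vision Toolbox)
[detector, info] = trainYOLOv4ObjectDetector(trainingData, detector, options);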

Release: R2022a
