GPU support in Multilabel Example?
Hi all,
I am currently working with the following MATLAB example:
https://de.mathworks.com/help/deeplearning/ug/multilabel-text-classification-using-deep-learning.html
My problem is that even though the "canUseGPU" function returns true (I am working with a GeForce RTX 2080 Ti on my local machine),
I think the example does not actually run on the GPU. However, the example states that it should run on the GPU (lines 181-183 in the live script):
% If training on a GPU, then convert data to gpuArray.
if (executionEnvironment == "auto" && canUseGPU) || executionEnvironment == "gpu"
    dlX = gpuArray(dlX);
end
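As a minimal check, the class of the underlying data can be inspected right after that block (extractdata unwraps the dlarray, so this should work as-is):
% Prints 'gpuArray' when the underlying data lives on the device,
% and 'single'/'double' when it is still on the CPU.
disp(class(extractdata(dlX)))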
When I run the code and start training, Task Manager shows no activity on the GPU.
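(As far as I know, the default GPU graphs in Windows Task Manager show the 3D and copy engines rather than CUDA compute, so they can stay near zero even while CUDA kernels are running. A more direct check from within MATLAB is the device memory:)
% If data and parameters really land on the GPU, AvailableMemory
% should drop noticeably after they are created.
d = gpuDevice;
fprintf("Available GPU memory: %.2f GB\n", d.AvailableMemory/1e9)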
Furthermore, I saw that the following embedding function removes the gpuArray property of the data X (lines 387-397):
function Z = embedding(X, weights)
    % Reshape inputs into a vector.
    [N, T] = size(X, 2:3);
    X = reshape(X, N*T, 1);

    % Index into embedding matrix.
    Z = weights(:, X);

    % Reshape outputs by separating batch and sequence dimensions.
    Z = reshape(Z, [], N, T);
end
I assume that this is why the computation runs on the CPU instead of the GPU.
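If that is the cause, one possible fix (a sketch, not part of the original example) would be to move the learnable parameters to the GPU as well before the training loop, so that indexing into weights returns a gpuArray and Z stays on the device:
% Sketch: assumes 'parameters' is the struct of learnables from the
% example and that it fits into device memory. dlupdate applies the
% given function to every dlarray in the struct.
if canUseGPU
    parameters = dlupdate(@gpuArray, parameters);
end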
Can anybody help me solve this issue?
Thank you very much!