Hi everyone,
Can you explain why the GPU time for my code is slower than the CPU time?
This is my code:
tic
n = 4096;
m = 1000;
A1 = ones(1,m);
B1 = 2*ones(m,n);
C1 = sum(A1*B1);                     % CPU: (1 x m)*(m x n) matrix product, then sum -> scalar
% wait(gpuDevice)
toc
tic
n = 4096;
m = 1000;
A2 = ones(1,m);
B2 = 2*ones(m,n);
C2 = sum(sum(bsxfun(@times,A2',B2)));   % CPU: elementwise expansion, then sum -> scalar
% wait(gpuDevice)
toc
tic
n = 4096;
m = 1000;
A3 = gpuArray(ones(1,m));
B3 = gpuArray(2*ones(m,n));
C3 = sum(A3*B3);                     % GPU: same computation as C1, on gpuArrays
% CB = sum(bsxfun(@times,AB',BB));
% wait(gpuDevice)                    % block until GPU kernels finish before toc
toc
Thanks.
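For context, a common pitfall when timing GPU code with `tic`/`toc` is that GPU operations run asynchronously, so `toc` can fire before the kernels have actually finished, and the very first GPU call also pays a one-off device initialization cost. A minimal sketch of a fairer comparison using `timeit`/`gputimeit` (assuming Parallel Computing Toolbox and a supported GPU are available):

```matlab
n = 4096;
m = 1000;
A = ones(1, m);
B = 2 * ones(m, n);
Ag = gpuArray(A);                   % copy inputs to the device once, outside the timed region
Bg = gpuArray(B);

tCPU = timeit(@() sum(A*B));        % CPU: synchronous, averaged over several runs
tGPU = gputimeit(@() sum(Ag*Bg));   % GPU: waits for all kernels to finish before stopping the clock
fprintf('CPU: %.3g s, GPU: %.3g s\n', tCPU, tGPU);
```

Note that `gputimeit` excludes the host-to-device transfer here; for a problem this small, including `gpuArray(...)` inside the timed function would likely dominate the measurement.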

1 Comment

Joss Knight, 21 Jul 2018
Can you double-check your code? Because you're not actually doing the same thing on the GPU. For instance, you do a matrix multiplication on the GPU that you don't do on the CPU. The results aren't even the same dimensions.
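A quick way to check whether the CPU and GPU variants really produce the same result is to gather the GPU output back to the host and compare sizes and values directly; a sketch, assuming `C1`, `C2`, and `C3` from the question are in the workspace:

```matlab
C3h = gather(C3);                              % copy the gpuArray result back to host memory
disp([size(C1); size(C2); size(C3h)])          % compare dimensions of the three results
fprintf('max |C1 - C3|: %g\n', max(abs(C1(:) - C3h(:))));
```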


Answers (0)

Asked: 20 Jul 2018

