Support for NVLink with multiple GPUs
7 views (last 30 days)
I am new to GPU acceleration. I am currently using arrayfun on a 1080 Ti and running out of memory. I am planning to build a new workstation with two RTX 6000 cards. Will MATLAB support NVLink to pool the memory together, or will I have to split the code to run on two threads on different GPUs?
Accepted Answer
Joss Knight
26 Oct 2018
MATLAB supports NVLink in deep learning applications (calling trainNetwork and similar) and explicitly through the gop function. If you invoke gop using the special syntax gop(..., 'gpuArray'), it will use whatever peer-to-peer communication is available between GPUs, which includes NVLink.
However, there is no general support for distributed computation on multiple GPUs akin to the use of a distributed array. You can get the behaviour you want by implementing your own algorithms using a parallel pool and spmd, for example:
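Here is a minimal sketch of that approach, assuming a parallel pool with one worker per GPU is already open; myKernel and the random input are placeholders for your own computation and data:
spmd
    % Each worker in the pool has its own GPU selected by default
    localPart = gpuArray(rand(4096, 'single'));  % stand-in for this worker's share of the data
    localResult = myKernel(localPart);           % your own algorithm, running on this worker's GPU
    combined = gplus(localResult);               % combine the per-GPU results across workers
end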
Alternatively, if you have a non-communicating workflow (e.g. batch computation), you can take advantage of both GPUs with no need to share data directly between them. Parallel language constructs such as parfor and parfeval are appropriate here, for instance:
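A sketch of that kind of batch split might look like the following, where chunks and processChunk are placeholders for your own data and computation:
parpool('local', gpuDeviceCount);          % one worker per GPU
results = cell(1, numel(chunks));
parfor k = 1:numel(chunks)
    x = gpuArray(chunks{k});               % move this chunk onto the worker's GPU
    results{k} = gather(processChunk(x));  % compute and bring the result back to the client
end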
3 Comments
Joss Knight
29 Oct 2018
Edited: Joss Knight, 29 Oct 2018
Only gop(..., 'gpuArray') (and, equivalently, gplus), with numeric inputs, supported functions, and the Linux operating system, uses NVLink or other peer-to-peer communication. See help gpuArray/gop. Otherwise communication will go via the normal mechanism (staging via CPU memory), but it will still work, if that's what you are asking.
So, if you need to broadcast data between GPUs and you really need it to use NVLink, you'll have to express the broadcast as a reduction, for instance:
% Zero-pad the local matrix into a 3-D slab, one slice per worker
alldata = zeros([size(mydata) numlabs], 'like', mydata);
alldata(:,:,labindex) = mydata;
% Summing the zero-padded slabs across workers gathers every worker's matrix
alldata = gop(@plus, alldata, 'gpuArray');
This effectively concatenates the matrices stored on each worker along the third dimension.
If you do this on Windows, or use gcat instead, or the data is not numeric, then it will still work; it just won't use peer-to-peer communication.
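Putting that together, a sketch of the whole workflow, run inside spmd on a pool with one worker per GPU, might look like this; the call to gpuArray.rand is only a stand-in for however your data actually arrives on each GPU:
spmd
    mydata = gpuArray.rand(1000, 'single');         % stand-in for this worker's local matrix
    alldata = zeros([size(mydata) numlabs], 'like', mydata);
    alldata(:,:,labindex) = mydata;                 % each worker fills its own slice
    alldata = gop(@plus, alldata, 'gpuArray');      % every worker now holds the full stack
end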
More Answers (0)