Background Data Dispatch with Custom Training Loop
I have a question regarding training a deep neural network with MATLAB.
I have built a custom training loop to train a regression network on a machine with 2 GPUs.
The training loop works fine, but it is rather slow compared to the automatic trainNetwork function.
The trainNetwork function does not provide the kind of training progress monitor I like. It also seems to error unpredictably on my machine, and sometimes the networks are not "finished" properly. This is why I use a custom training loop.
I use a parallel pool with 2 workers and a randomPatchExtractionDatastore (which is partitionable). The parallel operations are written in an spmd block.
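For illustration, the structure of my spmd block is roughly as follows (names such as patchDs are placeholders and the gradient computation is omitted):

% Rough sketch of the current 2-worker / 2-GPU setup. patchDs is the
% partitionable datastore created on the client; model code is omitted.
spmd
    gpuDevice(labindex);                             % one GPU per worker
    myDs = partition(patchDs, numlabs, labindex);    % split the datastore across workers
    while hasdata(myDs)
        data = read(myDs);                           % read a patch batch on this worker
        % ... form a gpuArray/dlarray mini-batch, compute gradients with dlfeval,
        % aggregate across workers (e.g. with gplus), and update the weights ...
    end
end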
What would be the best way to dispatch data in the background in a custom training loop?
I have tried to scale up the number of workers in the parallel pool, but then some workers cannot read data, since the datastore is only partitioned according to the number of GPUs, not the number of workers.
Which operations do I have to assign to the workers that are supposed to preload data?
Has anybody tried "self-written" data dispatching in a custom training loop, roughly along the lines of the sketch below?
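To make the question concrete, here is a rough, untested sketch of the kind of background dispatching I have in mind: 2 training workers on the GPUs and 2 loader workers that only read data and pass it on with labSend/labReceive. All names, the iteration count, and the worker pairing are purely illustrative.

% Illustrative only: 4-worker pool, workers 1-2 train, workers 3-4 preload.
% patchDs and numIterations are assumed to be defined on the client.
spmd
    numTrainWorkers = 2;
    if labindex <= numTrainWorkers
        gpuDevice(labindex);                                 % training worker
        for iter = 1:numIterations
            batch = labReceive(labindex + numTrainWorkers);  % wait for its paired loader
            % ... move batch to the GPU, compute gradients, update weights ...
        end
    else
        % loader worker: partition the datastore over the loaders only
        loaderDs = partition(patchDs, numTrainWorkers, labindex - numTrainWorkers);
        for iter = 1:numIterations
            if ~hasdata(loaderDs)
                reset(loaderDs);                             % start over when exhausted
            end
            batch = read(loaderDs);                          % read/preprocess on the CPU
            labSend(batch, labindex - numTrainWorkers);      % dispatch to paired GPU worker
        end
    end
end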
Thanks in advance!
Accepted Answer
Joss Knight
22 Nov 2020
4 Comments
Joss Knight
25 Nov 2020
Great! labSend is blocking, so you can't have both workers 3 and 4 call labSend at the same time. You need to choose which one goes first.
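For illustration only (the code discussed in the earlier comments is not shown in this thread): one way to fix an order is for the receiving worker to take the batches in a deterministic sequence, so that only one of the two sends has to complete at a time. Variable names here are assumptions, and this would sit inside the existing spmd block.

% Sketch: workers 3 and 4 both send to worker 1; the receive order on
% worker 1 decides which labSend completes first.
if labindex == 1
    batchFrom3 = labReceive(3);   % always take worker 3's batch first
    batchFrom4 = labReceive(4);   % then worker 4's batch
elseif labindex == 3 || labindex == 4
    labSend(localBatch, 1);       % each loader sends its locally read batch
end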