
Best way to integrate GPU use in my code?

2 views (last 30 days)
AlexRD on 18 May 2021
Commented: Infinite_king on 18 Apr 2024
I've started doing a lot of work on a neural-net implementation I've built from scratch in MATLAB. Initially I switched from the GPU to the CPU only, since that was easier to write and debug, and it let me defer the GPU side until later.
I'm now on the GPU implementation, but I'm struggling to get an optimized result. I've noticed the GPU struggles with multiple layers: processing time is often directly proportional to the number of layers. The CPU, by contrast, is largely insensitive to layer count (as long as the neuron counts aren't extreme) but struggles with the input layer, given the number of weights and biases there.
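For what it's worth, a pattern like the one described above often comes from per-layer kernel-launch overhead rather than arithmetic cost. The sketch below (illustrative sizes, not taken from the posted project) runs the same forward pass on CPU and GPU so the scaling with layer count can be compared:

```matlab
% Sketch: many small layers mean many small GPU kernel launches,
% so launch latency, not arithmetic, can dominate the GPU time.
nLayers = 50;
W = cell(1, nLayers);
for k = 1:nLayers
    W{k} = randn(256, 256, 'single');   % small, assumed weight matrices
end
x = randn(256, 1, 'single');

% CPU forward pass
tic;
a = x;
for k = 1:nLayers
    a = max(W{k} * a, 0);               % ReLU layer
end
tCPU = toc;

% GPU forward pass: same work, but each iteration pays launch latency
Wg = cellfun(@gpuArray, W, 'UniformOutput', false);
ag = gpuArray(x);
tic;
for k = 1:nLayers
    ag = max(Wg{k} * ag, 0);
end
wait(gpuDevice);                        % ensure kernels finish before timing
tGPU = toc;
fprintf('CPU: %.4f s   GPU: %.4f s\n', tCPU, tGPU);
```

If the GPU time here grows roughly linearly with `nLayers` while the per-layer work stays tiny, batching more inputs per forward pass (widening each matrix multiply) is usually the first thing to try.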
I've also tried a hybrid approach, where the input layer and any convolutional layers are assigned to the GPU, and the GPU data is then fetched and processed by the CPU. But the fetch time often isn't worth the hassle.
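On the fetch cost: device-to-host transfers are expensive, so a common mitigation is to keep every layer of the forward pass on the GPU and call `gather` exactly once at the end. A minimal sketch, with illustrative names and sizes (not from the posted project):

```matlab
% Sketch: keep all intermediate activations on the GPU and
% gather once, instead of fetching after the input/conv layers.
X  = gpuArray(randn(784, 512, 'single'));  % assumed mini-batch of 512 inputs
W1 = gpuArray(randn(256, 784, 'single'));  % assumed hidden-layer weights
W2 = gpuArray(randn(10, 256, 'single'));   % assumed output-layer weights

H = max(W1 * X, 0);    % hidden layer stays on the GPU
Y = W2 * H;            % output layer stays on the GPU

out = gather(Y);       % single device-to-host transfer at the end
```

With one `gather` per batch, the transfer cost is amortized over the whole batch rather than paid per layer.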
Some feedback would be very welcome. My project, fully documented, can be found here: https://github.com/AlexRDX/Neural-Net
It is also attached to this post. Any criticism at all is welcome.
Thank you for your time!

Answers (0)

Categories

Find more on Deep Learning Toolbox in Help Center and File Exchange

Release

R2021a

