Running GLMs on GPU
I was wondering if it is possible to fit generalized linear models on a GPU to speed things up, since model fitting is the rate-limiting step in my code: fitting the many models I need ends up taking about a week.
The training data for each model is on the order of [100x10]. Is it possible with a simple gpuArray? If not, how do I run GLMs on a GPU?
Thanks in advance. (P.S. I have no idea how GPU programming is done.)
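For reference, a minimal sketch of one common workaround, not a verified answer: to the best of my knowledge the GLM fitting functions in Statistics and Machine Learning Toolbox (glmfit, fitglm) do not accept gpuArray inputs, so wrapping the data in gpuArray alone will not move the fit to the GPU (check the "Extended Capabilities" section of each function's documentation for your release). When the bottleneck is fitting many independent models rather than one large model, parallelizing the fits across CPU workers with parfor (Parallel Computing Toolbox) is a common alternative. All names and sizes below (nModels, X, Y) are hypothetical placeholders.

% Fit many independent GLMs in parallel across CPU workers.
nModels = 1000;                        % hypothetical number of models to fit
X = randn(100, 10);                    % predictors, ~[100x10] as in the question
Y = double(rand(100, nModels) > 0.5);  % hypothetical binary responses, one column per model

coeffs = zeros(size(X, 2) + 1, nModels);  % intercept plus one slope per predictor

if isempty(gcp('nocreate'))            % start a worker pool if one is not already running
    parpool;
end

parfor k = 1:nModels
    % Each iteration fits one GLM independently; glmfit returns the
    % coefficient vector [intercept; slopes] for a binomial (logistic) model.
    coeffs(:, k) = glmfit(X, Y(:, k), 'binomial');
end

With ~1000 such fits on small [100x10] data, the per-fit cost is tiny and the speedup should scale roughly with the number of workers; whether this beats a week depends on how many models and workers you actually have.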
Answers (0)