Running GLMs on GPU

8 views (last 30 days)
Aravind Krishna on 15 Sep 2017
I was wondering whether it is possible to fit generalized linear models on a GPU to speed up the process, since that is the rate-limiting step in my code: it ends up taking about a week for many models.
The training data is on the order of [100x10]. Is it possible with a simple gpuArray? If not, how do I run GLMs on a GPU?
Thanks in advance. (P.S. I have no idea how GPU programming is done.)
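For context, a minimal sketch of one possible approach: to my knowledge, fitglm and glmfit do not accept gpuArray inputs, so wrapping the data in gpuArray will not by itself move the fitting onto the GPU. What you can do is implement the underlying iteratively reweighted least squares (IRLS) loop yourself on gpuArray data. The example below fits a logistic GLM; the data X and y, the iteration cap, and all variable names are placeholders rather than anything from the original question, and it assumes Parallel Computing Toolbox is installed.

    % Minimal IRLS sketch for a logistic GLM on the GPU.
    % X, y, and the iteration cap are dummy placeholders.
    X = gpuArray(randn(100, 10));             % 100x10 design matrix (dummy data)
    y = gpuArray(double(rand(100, 1) > 0.5)); % binary response (dummy data)

    Xd   = [ones(100, 1, 'gpuArray'), X];      % prepend intercept column
    beta = zeros(size(Xd, 2), 1, 'gpuArray');  % initial coefficients

    for iter = 1:25                        % fixed cap; a convergence test is more usual
        eta = Xd * beta;                   % linear predictor
        mu  = 1 ./ (1 + exp(-eta));        % logistic (inverse-link) mean
        w   = max(mu .* (1 - mu), eps);    % IRLS weights, floored for stability
        z   = eta + (y - mu) ./ w;         % working response
        beta = (Xd' * (Xd .* w)) \ (Xd' * (w .* z)); % weighted LS solve on the GPU
    end

    beta = gather(beta);                   % copy coefficients back to host memory

Note that Xd .* w relies on implicit expansion (R2016b or later); on older releases use bsxfun(@times, Xd, w). Also, a 100x10 problem is far too small for a GPU to beat the CPU on a single fit, since transfer overhead will likely dominate. If the week is spent fitting many such models, the bigger win is usually fitting models in parallel (e.g. parfor across models) rather than moving one small fit to the GPU.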

Answers (0)
