How to avoid memory problems while processing a huge table?

Views: 2 (last 30 days)
Nitinkumar Ambekar on 31 Aug 2016
Commented: Nitinkumar Ambekar on 1 Sep 2016
I have a huge observation table with around 30 lakh (about 3 million) rows and 12 columns. While training a kNN classifier in R2016a, I am getting memory-related errors. Is there any way to avoid this? I have tried reducing the number of rows, but that affects the output quality.
Each row in the table is one pixel, and the remaining columns hold its feature values. One MRI scan set has around 20 images of 512x512, and I load one set to create the observation table. Is there another way to pass a large amount of data to the kNN classifier?
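Roughly, the table is built like this (a simplified sketch only; the variable names and the commented-out training call are just illustrative):

```matlab
% Simplified sketch of how the observation table is assembled (illustrative).
numSlices   = 20;                      % one MRI scan set
pixPerSlice = 512*512;                 % pixels per slice
numFeatures = 12;

X = zeros(numSlices*pixPerSlice, numFeatures);   % one row per pixel
for s = 1:numSlices
    % ... compute the 12 feature values for every pixel of slice s
    %     and copy them into the corresponding rows of X ...
end

% mdl = fitcknn(X, y);   % training the kNN classifier on the full table
                         % is where the memory errors appear
```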

Answers (1)

KSSV on 31 Aug 2016
doc datastore, memmapfile, mapreduce.
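For example, a datastore lets you read the observation table from disk in manageable chunks instead of loading everything at once. A minimal sketch, assuming the observations have been exported to a CSV file; the file name, chunk size, and column layout are only assumptions:

```matlab
% Sketch: stream a large observation table from disk chunk by chunk.
% 'observations.csv' and the 100000-row ReadSize are illustrative.
ds = datastore('observations.csv');   % returns a TabularTextDatastore
ds.ReadSize = 100000;                 % number of rows returned per read

while hasdata(ds)
    chunk = read(ds);                 % table holding the next block of rows
    % ... process, subsample, or accumulate statistics per chunk ...
end
```

Along the same lines, memmapfile maps a binary file into memory so that only the rows you actually index get read from disk, and mapreduce runs a computation over a datastore one chunk at a time, so the full table never has to sit in RAM.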
1 Comment
Nitinkumar Ambekar on 1 Sep 2016
Thanks @Dr. Siva, one small query: can I pass one of these to a function that takes a `table` or a `matrix`?

