Info
This question is closed. Reopen it to edit or post an answer.
What to do when you really ARE out of memory?
2 views (last 30 days)
What is the solution for optimizing code when the dataset really is just too large?
Currently I need to run TriScatteredInterp on three vectors, each 100,000,000 x 1.
scatteredInterpolant does not work any better in this instance.
Answers (3)
the cyclist
4 Aug 2015
Edited: the cyclist on 4 Aug 2015
For very large datasets, processing a random sample of the data will often give satisfactory results.
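A minimal sketch of that idea, assuming the three vectors are `x`, `y`, and `v`, and that `xq`, `yq` are the query points (names chosen for illustration): draw a random subset small enough to fit in memory and build the interpolant from it.

```matlab
% Random sampling: build the interpolant from a subset of the data.
% x, y, v are the full 100,000,000 x 1 vectors; k is a size that fits
% comfortably in memory (tune to your machine).
n   = numel(x);
k   = 1e6;
idx = randperm(n, k);          % k distinct random indices

F  = scatteredInterpolant(x(idx), y(idx), v(idx));
vq = F(xq, yq);                % evaluate at the query points
```

To check whether the sample is adequate, one option is to repeat with a different random subset (or a larger `k`) and compare the interpolated values; if they agree to within your tolerance, the sample is representative enough.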
Walter Roberson
4 Aug 2015
Store the data in a spatial hierarchy such as an octree, which lets you extract a subset that fits within working memory and do the fine-grained work on that subset.
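A full octree is more machinery than fits here, but the same principle can be sketched with coarse spatial binning: tile the domain, then interpolate the queries in each tile using only the data points in and around that tile, so only a small subset is in play at a time. This is an illustrative sketch, not the answerer's implementation; `x`, `y`, `v`, `xq`, `yq`, and the tile count are assumed names and parameters.

```matlab
% Tile the domain and interpolate each tile from nearby points only.
nTiles = 32;
xe = linspace(min(x), max(x), nTiles + 1);   % tile edges in x
ye = linspace(min(y), max(y), nTiles + 1);   % tile edges in y
vq = nan(size(xq));

for i = 1:nTiles
    for j = 1:nTiles
        % Data points in this tile plus a one-tile margin, so results
        % near tile boundaries stay smooth.
        in = x >= xe(max(i-1, 1)) & x <= xe(min(i+2, nTiles+1)) & ...
             y >= ye(max(j-1, 1)) & y <= ye(min(j+2, nTiles+1));
        % Query points inside this tile.
        q  = xq >= xe(i) & xq < xe(i+1) & yq >= ye(j) & yq < ye(j+1);
        if any(q) && nnz(in) >= 3
            F     = scatteredInterpolant(x(in), y(in), v(in));
            vq(q) = F(xq(q), yq(q));
        end
    end
end
```

An octree does the same partitioning adaptively (subdividing only where points are dense), which matters when the data are very unevenly distributed; the fixed grid above is the simplest version of the idea.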