Matlab performance handling large arrays
I have code which, depending on how many iterations I choose, may end up with arrays in excess of 5,000,000 by 3. I soon started running into "out of memory" errors because of the size of the individual large matrices.
I initialised my matrices beforehand, so all memory should have been allocated. Still I would sometimes get memory problems, and more interestingly, the simulation got gradually slower as it progressed; it eventually reached 100%, but at what seemed an exponentially slower pace.
I solved it by using smaller arrays: after a set number of steps, I assign those matrices to other fixed matrices (which are not accessed in every loop) and then restart the "looping array". For example, say A is an array accessed in every loop. After the first 100 loops, I assign A to B (which is fixed and not accessed). I clear out A, then use it to fill in steps 101 to 200, assign that portion to, say, array C, and so on. So A is the only "dynamic" variable here.
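A minimal sketch of the chunked-buffer scheme described above (the variable names, chunk size, and the `rand` stand-in for the real per-step computation are illustrative, not the poster's actual code):

```matlab
chunkSize = 100;             % steps held in the small working array
nChunks   = 10;              % total chunks in the run
A = zeros(chunkSize, 3);     % small "looping" array, preallocated once
results = cell(nChunks, 1);  % fixed storage, touched once per chunk

for c = 1:nChunks
    for k = 1:chunkSize
        A(k, :) = rand(1, 3);  % stand-in for the per-step computation
    end
    results{c} = A;            % hand the finished chunk to fixed storage
    A(:) = 0;                  % reuse A in place for the next chunk
end

allData = vertcat(results{:});  % combine chunks into one array at the end
```

Collecting chunks in a cell array and concatenating once at the end avoids repeatedly touching one huge matrix inside the hot loop.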
That fixed the issue and it runs much better now; I am just curious why this would happen. Can anyone shed some light?
Answers (1)
Jan
19 August 2012
A [5e6 x 3] double array needs 120 MB of RAM. This should not cause out-of-memory messages. To understand the cause of your problems, we would have to see the code.
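The 120 MB figure follows directly from the 8-byte size of a double:

```matlab
% 5e6 rows x 3 columns, 8 bytes per double element
bytes     = 5e6 * 3 * 8;   % 1.2e8 bytes
megabytes = bytes / 1e6;   % 120 MB
```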