Why does an interactive MATLAB job require less memory than a non-interactive job on a cluster?
Hi everyone,
I am running MATLAB on my school's cluster (a Linux system). The original data read into MATLAB is up to 4 GB, and my code also needs a 24 GB array for one calculation. I requested 12 cores and 24 GB of memory with this command (qsh -pe smp 12 -l h_vmem=2 matlab=12) for an interactive MATLAB job on the cluster. That job runs successfully.
However, when I requested 12 cores with 50 GB for a non-interactive job, it failed partway through my code. I then increased the memory to 80 GB; it ran further but still stopped. Even using the clear command to clear the big arrays did not help.
Can anyone tell me what is wrong with the non-interactive job?
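For context: in Grid Engine (which the qsh/qsub commands suggest this cluster runs), h_vmem is commonly enforced per slot, so an interactive request of -pe smp 12 -l h_vmem=2 would grant 12 × 2 GB = 24 GB in total. Below is a minimal sketch of an equivalent non-interactive submission script, assuming per-slot h_vmem accounting on this cluster; the script name mycode.m and the resource names are assumptions for illustration, not the cluster's actual configuration.

```shell
#!/bin/bash
# Hypothetical Grid Engine batch script (submitted with: qsub myjob.sh).
# ASSUMPTION: h_vmem is a per-slot limit on this cluster, so the total
# memory granted is slots * h_vmem (here 12 * 2G = 24G), matching the
# interactive session's effective allocation.
#$ -pe smp 12        # request 12 slots on one node
#$ -l h_vmem=2G      # per-slot virtual memory limit (assumed semantics)
#$ -cwd              # run in the submission directory

# Run MATLAB without a display; mycode.m is a placeholder script name.
matlab -nodisplay -nosplash -r "run('mycode.m'); exit"
```

If the batch queue instead treats the memory request as a whole-job limit (or applies stricter virtual-memory enforcement than the interactive queue), the same code can need a noticeably different request non-interactively, which may account for the 50 GB failure.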
2 Comments
Kojiro Saito
13 January 2018
Which function do you use for the non-interactive job: parfor, batch, or spmd? One point is that parfor has a transparency restriction (for example, clear cannot be called inside a parfor loop), so please take a look at the documentation on Transparency in parfor.
Answers (0)