JVM out of memory when running parfor on cluster

3 views (last 30 days)
Krzysztof Fajst on 30 Jan 2014
Commented: Krzysztof Fajst on 30 Jan 2014
I'm running a parfor loop on a cluster pool opened with matlabpool open and getting a JVM out-of-memory (Java heap space) error. The same parfor works perfectly when I open matlabpool with the 'local' profile and run it on just one machine.
MDCS is installed properly and passes validation.
It's definitely not a lack of memory on the remote cluster PC - it has plenty of memory free and runs 64-bit Windows 7. Both PCs have 16 GB of memory; one has 8 cores, the other 4.
The Java heap size is set to 2048 MB, java.opts requests 4096 MB, and the heap size in the Java application is also set high. All of these settings work fine for both serial code and a local parfor.
Krzysztof

Accepted Answer

Thomas Ibbotson on 30 Jan 2014
It's possible that the worker JVMs are not picking up the java.opts file. Workers have a different startup folder from the client MATLAB, so to get them to pick up java.opts I would recommend putting it in the matlab/bin/<arch> folder.
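For reference, java.opts is a plain text file containing one JVM startup option per line. A minimal sketch that raises the maximum heap to 4 GB (the value here is an assumption - match it to what you actually need) would be a file containing just:
-Xmx4096m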
You can check the java heap memory on the workers by running the following command with a pool open:
pctRunOnAll('r = java.lang.Runtime.getRuntime(); r.maxMemory / (1024*1024)')
The first result will be the memory on the client machine followed by the memory on each worker in MB.
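As a usage sketch with the matlabpool syntax of that MATLAB release (the profile name MyClusterProfile and pool size 4 are placeholders, not taken from this thread):
matlabpool open MyClusterProfile 4
pctRunOnAll('r = java.lang.Runtime.getRuntime(); r.maxMemory / (1024*1024)')
matlabpool close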
1 Comment
Krzysztof Fajst on 30 Jan 2014
Yes, that was it! When I opened matlabpool on the cluster, the heap memory was set to the default. So I put java.opts more or less everywhere - in /bin, /bin/w64 and /MATLAB. It was still not OK at first, but after a reboot the heap was set higher.
Thanks, Krzysztof


More Answers (0)
