
Plenty of memory, but hard faults seem to bring my code to a standstill

Views: 18 (last 30 days)
Brian on 1 Feb 2017
Answered: Image Analyst on 2 Feb 2017
I have a sizeable simulation that includes interpolation of a 3D data set. It should take a long time (10-20 min), but lately it seems to be coming to a standstill!
In Resource Monitor, my CPU and disk get hit hard at first, but everything really stops once I see a large number of hard faults. Then my computer starts to crawl even though less than 50% of my 16 GB is in use.
Are there system settings I can change to avoid this behavior? It is clearly not just a question of "buy more memory or a faster CPU".
My system:
  • Win 7 Pro / 64-bit
  • i7-4720HQ CPU @ 2.60 GHz (8 logical cores)
  • 16.0 GB RAM
  • SSD (main drive, but only 15 GB left)
  • HDD (secondary drive, tons of space remaining)

Answers (2)

Walter Roberson on 1 Feb 2017
This may sound strange, but you just might get more performance if you add one more row. You might be encountering cache resonance.
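A minimal sketch of that idea (hypothetical sizes, not the original simulation; whether the gap shows up depends on the CPU's cache geometry):

% Hypothetical illustration: row-wise traversal of a column-major array
% strides through memory by the leading dimension. With a power-of-two
% leading dimension the accesses keep mapping to the same cache sets;
% padding one extra row changes the stride.
N = 4096;
A = rand(N, N);        % leading dimension is a power of two
B = rand(N + 1, N);    % one padding row breaks the power-of-two stride

tic; rowSumA = sum(A, 2); tA = toc;   % row sums -> stride-N access pattern
tic; rowSumB = sum(B, 2); tB = toc;   % stride-(N+1) access pattern
fprintf('power-of-two stride: %.4f s, padded stride: %.4f s\n', tA, tB);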
  2 Comments
Brian on 1 Feb 2017
I think I understand. Your hypothesis is that there is something idiosyncratic in my simulation that causes it to page through memory in a way that prevents it from making progress.
The simulation has a scaling parameter, so I can increase or decrease the number of meta-elements in it. Running it on two different computers with similar processor and memory, one never completes, while the other finishes more than an hour after starting. Both appear to be limited by hard faults.
Seems like a little more than just the size of my simulation...
Walter Roberson on 2 Feb 2017
Hard faults are at the operating system level.
You mention that you still have a lot of memory left. Thinking about the situation, the one thing that comes to mind at the moment is that Windows would start hard-faulting when the virtual memory required exceeds the physical memory. That could happen if your work involves large arrays with bad locality (low re-use). I could also imagine problems if your code dynamically resizes arrays, i.e., fails to preallocate (see the sketch below).
But as to why it would happen on one system but not the other: I would have to think about that more.
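As a minimal sketch of the preallocation point (hypothetical code, not the original simulation): growing an array inside a loop forces repeated reallocation and copying, while allocating once and filling in place keeps the memory footprint flat.

n = 1e5;

% Growing the result on every iteration: MATLAB reallocates and copies the
% whole array again and again, multiplying memory traffic.
slow = [];
for k = 1:n
    slow(end+1) = k^2;   %#ok<SAGROW>
end

% Preallocating once and writing in place touches each element exactly once.
fast = zeros(1, n);
for k = 1:n
    fast(k) = k^2;
end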



Image Analyst on 2 Feb 2017
16 GB of RAM is not that much. How big are your 3-D arrays? We sometimes work with 3-D CT data that is like 20 GB.
By the way, you'll see hard faults ( https://en.wikipedia.org/wiki/Page_fault ) even if your code is not causing them. Other things are running in the background on your computer, and they can cause hard faults too. If you monitor hard faults, does the count suddenly and dramatically increase when you run your code? Maybe the computer needs to swap your data in and out of RAM to handle other tasks it's doing in the background (backups, virus scans, and so on).
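A quick way to answer the first question is to measure the footprint of the simulation's arrays and how much MATLAB can still allocate. A minimal sketch, assuming the variables live in the base workspace and that you are on Windows, where the memory function is available:

info = whos;                               % every variable in the current workspace
fprintf('Workspace uses %.2f GB\n', sum([info.bytes]) / 2^30);

userview = memory;                         % Windows-only
fprintf('Largest array MATLAB could allocate now: %.2f GB\n', ...
        userview.MaxPossibleArrayBytes / 2^30);

If the workspace total is nowhere near 16 GB and the hard faults still spike only while your code runs, the background-task explanation becomes more plausible.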
