Save "big" object to file fails

16 views (last 30 days)
Vincent on 22 Nov 2011
Edited: Riccardo Scorretti on 23 Sep 2021
Hi,
I'm working on a project with OOP. There is an object called "database" containing a "big" cell array (nested, with mixed contents).
In this database, I stored the contents of a number of files. Until now, with about 2000 files in the database, the object could be saved properly with "save", producing a roughly 20 MB file. But after I added another 1000 files, the saving process stops after some time and produces a rudimentary 1 KB .mat file (no error message or anything else).
I tried the "pack" command, but then Matlab crashed. Of course I could post the log here if desired. I'm using Windows XP SP3 and Matlab 7.5.0 (R2007b), and I have tried saving the file on several file systems (FAT/NTFS).
Is this a common issue? I couldn't find anything similar out there...
Greetings

Answers (5)

Andrea Gentilini on 7 May 2012
Try going to File -> Preferences -> General -> MAT-Files and selecting the option "MATLAB Version 7.3 or later". This allows you to save variables that are in excess of 2 GB.
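The same format can also be requested per call. A minimal sketch, assuming a variable named database and an example file name (not from the original post):

  % Request the HDF5-based v7.3 MAT-file format, which supports
  % variables larger than 2 GB (variable and file names are examples).
  save('database.mat', 'database', '-v7.3');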
1 Comment
Vincent on 10 May 2012
I am sorry Andrea, but this does not help. Just for clarification: files larger than 20 MB can be stored as long as they do not contain an object.



Jan on 22 Nov 2011
If Matlab crashes inside the pack command, you have a serious problem with memory management. Do you use user-defined Mex functions?
BTW, although you can create a database using Matlab, dedicated database programs will do this much better.

Vincent on 22 Nov 2011
No, I don't use any user-defined Mex functions so far. I only have two instances of Matlab running at the same time, but I can't imagine that this is a problem...
And yup, I know of Access, SQL and so on. I'd really like to use them, but the people around me prefer Matlab ;) Thanks anyway for your hint.

Vincent on 24 Nov 2011
Hi there again, I tried to run the same thing on a newer Matlab version (R2011b) and got the following error message: "Out of Memory error during serialization of subsystem data" (or similar).
Does this help anyone find a solution for how I can save my object?
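On Windows, one way to see how close the save is to the memory ceiling is the memory function (Windows-only); a minimal sketch, to be run right before the failing save:

  % Report how large an array MATLAB could still allocate; serializing a
  % big object needs a comparable amount of additional memory.
  m = memory;
  fprintf('Largest possible array: %.0f MB\n', m.MaxPossibleArrayBytes / 1e6);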
2 Comments
Peter O on 30 Nov 2011
Hi Vincent, I'm getting the same problem here today. A roughly 300 MB dataset won't save, but the 52 MB version _sometimes_ will. R2011a here. I think the issue, for me, is that we have 7 MB 'profile' spaces on the network for temporary program work and it's hitting the wall. I'll let you know if I find anything.
Peter O on 30 Nov 2011
Oh, and pack doesn't crash, but gives the same out-of-memory error.



Martin Kahn on 1 Jul 2018
Hi guys,
Given that this question still gets some views: I just had an issue that sounds very similar (with Matlab 2018a and Windows 10). When trying to save with "save('filename.mat','myFile')" I just got a 1 KB file. I don't really know the details of why, but this fixed it: "save('filename.mat','myFile','-v7.3')". I guess this is what Andrea suggested? Sorry if it's not helpful...
1 Comment
Riccardo Scorretti on 23 Sep 2021
Edited: Riccardo Scorretti on 23 Sep 2021
Hi there.
Unfortunately I'm experiencing the same problem (Matlab 2020b, Linux Fedora 34). As shown by a plot of memory usage, as soon as serialization is triggered the amount of used memory nearly doubles.
It looks as if Matlab makes a temporary copy of the data to be saved (with option -v7.3, of course), and in some circumstances this ends in an out-of-memory error.
In my case, I was trying to save the whole workspace, which contains many huge variables. To work around the problem, I suggest saving each huge variable separately to a different file, so as to lower the peak temporary memory usage that is apparently required to serialize the data.
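A minimal sketch of that workaround (the variable names are just examples, not from the original post):

  % Save each large variable to its own MAT-file instead of the whole
  % workspace at once, keeping the temporary serialization copy smaller.
  bigVars = {'database', 'results', 'rawSignals'};   % assumed variable names
  for k = 1:numel(bigVars)
      name = bigVars{k};
      save([name '.mat'], name, '-v7.3');   % -v7.3 needed for variables > 2 GB
  end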

