MATLAB Answers

Save 'v7.3' apparently uses no compression - how to turn it on?

69 views (last 30 days)
arnold 6 Sep 2014
Commented: Bob photonics 27 Mar 2020
How do I switch on compression when using 'save -v7.3'? This morning I ran into a problem using 'save': for the first time I tried saving variables large enough that MATLAB forced me to use '-v7.3'. I thought that would be the end of it, but apparently not.
Two problems with v7.3:
  1. It's VERY slow
  2. Files are HUGE compared to normal saving, which might explain why reading/writing is so slow
Externally zipping the produced *.mat file results in a file about the size of the one produced by the default 'save'. For example:
The first file is 778 kB in size, the second a whopping 11.4 MB. Zipping the 11.4 MB file results in a 228 kB file, so there's much room for improvement. While 11 MB could be handled, the same of course happens with larger variables. I just edited one of my physics simulations with many multidimensional arrays. Saving via the normal method gives a 500 MB file; doing the same using '-v7.3' gives me a 6.3 GB file. Zipping that one gives me a 480 MB file. This is unacceptable; it can't be how this was intended to be used.
So apparently 'save -v7.3' just doesn't compress the file. This makes no sense to me. Especially if this format was specifically implemented for saving large variables, why is compression not on?
How do I switch this on? Going through the documentation, I haven't found an option.
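A minimal sketch of the comparison described above (file and variable names are illustrative, not from the original post):

```matlab
% Save the same highly compressible array in both formats and compare sizes.
m = ones(5e3);                       % all-constant data compresses very well
save('default.mat', 'm');            % default (v7) format
save('v73.mat',     'm', '-v7.3');   % HDF5-based v7.3 format
d = dir('*.mat');
for k = 1:numel(d)
    fprintf('%-12s %12d bytes\n', d(k).name, d(k).bytes);
end
```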

  9 comments

Jacob Lynch August 18 Dec 2015
Adding the '-v7.3' switch to my save command turned a 2 GB struct into an 18 GB file on the hard drive! I feel something is very wrong with the '-v7.3' switch...
Cheeba 4 Nov 2016
I have this exact same problem. When I use the '-v7.3' switch my files get enormously larger. Something is broken... I have yet to find a workaround. Very frustrating.
Bob photonics 27 Mar 2020
Yep, I have the same problem in R2016a and R2018a: files that are otherwise smaller than 0.5 GB turn into a 9.8 GB file, and it's supposedly compressed...
I had to switch to R2016a because that's what we run on our Linux servers and I was out of memory on my laptop.



per isakson 6 Sep 2014
Edited: per isakson 6 Sep 2014
Testing compression with ones(15e3) gives unrealistic results; an all-constant matrix compresses almost perfectly. Instead, test with random numbers:
m = rand(15e3);
tic, save('normal.mat', 'm'); toc
tic, save('v73.mat', 'm', '-v7.3'); toc
sad = dir('*.mat');    % compare the .bytes field of the two entries
or with a matrix that is closer to your real data.

  2 comments

arnold 6 Sep 2014
why? Just kidding :)
I do get what you mean, but plenty of people use or generate data that is not very random, even if the initial data might be. E.g. in some of my analyses I discard most of the data and set it to NaN, or I use masking arrays which are logicals, yet huge, with vast connected chunks. This type of data is ideal for compression, yet v7.3 does nothing with it. I use sparse matrices too, but sparse matrices are not always ideal and can't replace everything, especially when I don't know in advance what the result will look like.
In this case storing with this option is not a very good idea.
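The kind of data described here, a huge logical mask with large connected constant regions, can be reproduced with a sketch like this (file names illustrative):

```matlab
% Illustrative sketch: a mostly-constant logical mask saved in both formats.
mask = false(1e4);             % 100-million-element logical mask
mask(1:500, :) = true;         % one big connected 'true' block
save('mask_v7.mat',  'mask');
save('mask_v73.mat', 'mask', '-v7.3');
d = dir('mask_*.mat');         % compare d.bytes for the two formats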
per isakson 7 Sep 2014
"yet the v7.3 does nothing with it" &nbsp -v7.3 does indeed compress ones(15000).
>> 15e3^2*8/11.4e6
ans =
"arrays which are logical" &nbsp HDF5 doesn't have logical. I guess Matlab may use uint8 to store logical in -v7.3.
"In this case storing using this option is not a very good idea." &nbsp With 2GB+ items (cannot find find a better word), using HDF5 directly might be better. IMO: the Matlab support of HDF5 works well enough.


Walter Roberson 5 Nov 2016

  0 comments


