Compress only selected variables when saving to .mat

Oleg Komarov on 25 Jun 2014
Commented: Oleg Komarov on 23 Jan 2015
I have two variables, data and meta, which I am saving in a compressed .mat file (version '-v7'). The data variable is usually 800 MB uncompressed, while meta might not even be 1 MB. I have lots of these .mat files, and sometimes I just need to loop through all the meta variables. However, since the file is compressed, loading the meta variable alone still takes a long time, i.e. the same time as if I were to load both variables.
Is it possible to selectively compress specific variables in a .mat file? Are there alternative data designs?
Note: I already keep an overall single meta, which is basically the concatenation of the smaller ones. However, I will need to abandon this approach because it does not scale well, either in size or in performance.
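One low-tech design that sidesteps the problem is to keep each meta in a tiny sidecar .mat file next to the big one, so the metadata loop never touches the 800 MB blobs. A minimal sketch, assuming one file pair per block (hypothetical file names):

% Hypothetical file names; 'data' and 'meta' as in the question.
save('block0042_data.mat', 'data', '-v7');  % large payload, compressed
save('block0042_meta.mat', 'meta', '-v7');  % tiny sidecar; ~1 MB loads fast even compressed

% Looping over all metadata now only touches the small files:
files = dir('block*_meta.mat');
for k = 1:numel(files)
    s = load(files(k).name, 'meta');
    % ... process s.meta ...
end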
6 Comments
per isakson on 25 Jun 2014
Edited: per isakson on 25 Jun 2014
I've spent too much time experimenting with the low- and high-level HDF5 APIs of MATLAB and with alternatives. I'm not sure my "results" are relevant to your use case.
My use case:
  • many hundreds of 1 MB time series
  • reading performance matters much more than writing performance
  • typically, entire time series are read
My conclusions:
  • the low-level HDF5 API is not worth the trouble (in my case)
  • the system cache is important to performance (buy more RAM)
  • store double only when necessary
  • chunking comes at a high price (performance); see the sketch after this list
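For concreteness, here is how that trade-off is opted into with MATLAB's high-level HDF5 functions (hypothetical file and dataset names). By default, h5create makes a contiguous, uncompressed dataset; chunking plus Deflate is what buys the smaller file at a read-time cost:

% Contiguous, uncompressed: fastest for whole-dataset reads.
h5create('series.h5', '/ts_contig', [1e6 1], 'Datatype', 'double');

% Chunked + deflated: smaller on disk, but every read must
% locate and decompress whole chunks -- the price mentioned above.
h5create('series.h5', '/ts_chunk', [1e6 1], 'Datatype', 'double', ...
    'ChunkSize', [1e4 1], 'Deflate', 6);

x = randn(1e6, 1);
h5write('series.h5', '/ts_contig', x);
h5write('series.h5', '/ts_chunk',  x);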
I think it is difficult to recommend anything without knowing a bit more about the internal structure of the 800 MB and the 1 MB, together with descriptions of some typical "queries".
"However, it means changing my API in many places, and risk of introducing bugs."
I guess the documentation refers to
Tool Name: h5repack
Purpose:
Copies an HDF5 file to a new file with or without compression
and/or chunking.
Whether h5repack can help depends on your queries.
Oleg Komarov on 26 Jun 2014
So, matfile() does not improve the loading speed (though this also depends on the setup; I am currently testing with an SSD).
From my previous link, it seemed possible to use the low-level HDF5 API directly on .mat files, but that is not the case. Also, I have neither the time nor the intention to learn and rewrite everything in pure HDF5 (even though .mat files are based on it).
I have not mentioned that meta is a dataset, and this might affect the way it is stored, i.e. not contiguously, forcing many chunks to be unzipped and loaded. I will test serializing the dataset before saving it.
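For reference, the partial-I/O path that matfile is meant to provide only exists for -v7.3 (HDF5-based) files; on older formats it warns and falls back to reading whole variables. A minimal sketch with a hypothetical file name:

% Requires the file to have been saved with -v7.3:
m = matfile('block0042.mat');
meta = m.meta;            % reads only this variable
part = m.data(1:100, :); % partial read of the large matrix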

Sign in to comment.

Answers (1)

Jeremy on 23 Jan 2015
I know this is old, but I have been doing similar work recently, since I could tell it was taking longer than it should to load a small portion of the data using the matfile method. I also spent two days learning how to use all the low-level HDF5 commands, only to find that they did not really help on the read side.
Then I realized that the issue originates on the write side, with the compression that occurs there. The savefast utility on the File Exchange saves using the high-level HDF5 commands, and this does NOT compress the data. It didn't quite work for me since I am saving complex numbers, but I was able to use the same approach, and I am now saving uncompressed v7.3 files and reading small portions of my matrix over 100 times faster!
If your matrix is just real numbers, you should be able to create a v7.3 file with your metadata and then use the simple high-level h5write command to save additional variables to the same file uncompressed.
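A minimal sketch of that recipe (hypothetical names; note that a dataset written with h5write carries no MATLAB class attributes, so read it back with h5read rather than load):

% Small metadata as a normal, compressed v7.3 variable:
save('results.mat', 'meta', '-v7.3');

% Append the big real-valued matrix uncompressed
% (no ChunkSize/Deflate => contiguous, uncompressed dataset):
h5create('results.mat', '/data', size(data), 'Datatype', 'double');
h5write('results.mat', '/data', data);

% Fast partial read of the first 100 rows:
slice = h5read('results.mat', '/data', [1 1], [100 size(data, 2)]);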
