- Why do you use a cell array in the first place?
- Does it contain a mixture of string and numerical values?
- "good ideas": that requires a better description of your cell array
- "a long time to wait": why do you think so?
I have a cell array (100000000×2 cell). How can I save it to Excel more quickly?
Views: 1 (last 30 days)
If I use the function csvwrite(), I think it will take a long time to finish. Do you have any good ideas for saving it more quickly? Thanks.
7 Comments
per isakson
15 October 2014
Edited: per isakson
15 October 2014
I repeat my question: "Why do you use a cell array in the first place?".
What does
whos the_name_of_your_variable
return?
Is it important that the data is saved to a text file? Saving to a binary file is faster, e.g. with save.
Stephen23
15 October 2014
A cell array does seem like a pretty inefficient storage choice for this data. Why not simply use a numeric array? And for that matter, save the data in compressed binary form (e.g. a .mat file).
Accepted Answer
per isakson
15 October 2014
Edited: per isakson
16 October 2014
A good way to learn about performance is to test it, so I made a little comparison. I use a smaller array to save time; I think elapsed time increases linearly with array size (for large arrays).
>> double2file_performance
Elapsed time is 38.712062 seconds.
Elapsed time is 2.991493 seconds.
Elapsed time is 0.032539 seconds.
Elapsed time is 0.037617 seconds.
where double2file_performance is
M = [
4400002970000003533,8500000190000013093
4400002970000003533,8500000190000045501
4400002970000003533,8500000840000005660
4400002970000003533,8500000840000006008
4400002970000003533,8500090100000000354
4400002970000003533,8500090100000007316
4400002970000003533,8500090100000009112
4400002970000003533,8500090100000010547
8500000190000013093,8500000190000045501
8500000190000013093,8500000840000005660 ];
m1e6 = repmat( M, [1e5,1] );
tic
csvwrite( 'c:\tmp\test.csv', m1e6 )
toc
%%
tic
fid = fopen( 'c:\tmp\test.txt', 'w' );
fprintf( fid, '%f,%f\n', m1e6.' );   % transpose so the file lines match the rows of m1e6
fclose( fid );
toc
tic
fid = fopen( 'c:\tmp\test.bin', 'w' );
cnt = fwrite( fid, m1e6 );
fclose( fid );
toc
tic
save( 'c:\tmp\test.mat', 'm1e6', '-v6' )
toc
Convert from a double to a cell array and back
tic, c1e6 = num2cell( m1e6 ); toc
tic, n1e6 = cell2mat( c1e6 ); toc
returns
Elapsed time is 1.073974 seconds.
Elapsed time is 1.679481 seconds.
and check the sizes of the arrays
>> whos
Name Size Bytes Class Attributes
c1e6 1000000x2 240000000 cell
m1e6 1000000x2 16000000 double
n1e6 1000000x2 16000000 double
 
Comments on this comparison
- csvwrite and fprintf create text files; fwrite and save create binary files. I used save(...,'-v6') because it is a bit faster than the default.
- csvwrite is slow. That is partly because it is a wrapper around dlmwrite, which in turn is a wrapper around sprintf and fwrite.
- fprintf is an order of magnitude faster than csvwrite. I think fprintf is the fastest way to write to a text file.
- Writing to a binary file is two orders of magnitude faster than fprintf to a text file.
WARNING
The content of the text files depends on the precision specification used when writing. Of course it does! However, with csvwrite the precision cannot be specified, and the default is not appropriate in this case. The first lines of the file test.csv are
4.4e+18,8.5e+18
4.4e+18,8.5e+18
4.4e+18,8.5e+18
4.4e+18,8.5e+18
4.4e+18,8.5001e+18
4.4e+18,8.5001e+18
which might not be the expected result. With dlmwrite it is possible to specify precision.
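As a sketch of that option, using the same m1e6 array as above (the file name is made up), the precision can be passed to dlmwrite as a format string:

```matlab
% Sketch: unlike csvwrite, dlmwrite accepts a 'precision' format string.
% '%19.0f' keeps all 19 digits of these integer-valued doubles instead of
% the default short scientific notation.
dlmwrite( 'c:\tmp\test_precise.csv', m1e6, 'delimiter', ',', 'precision', '%19.0f' )
```

Expect it to be roughly as slow as csvwrite, since both are built on the same machinery.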
Furthermore,
fprintf( fid, '%d,%d\n', m1e6.' );
saves 30% in elapsed time and 25% in file size compared to
fprintf( fid, '%f,%f\n', m1e6.' );
without losing any precision.
More Answers (2)
Iain
15 October 2014
Here's a better idea.
Don't save it to a text file. 100 million lines of text, at 40 bytes per line (more if nicely formatted), is a 40 × 100M = 4 GB file. That is a truly immense file, saved in an extremely inefficient fashion.
You have 19 significant figures, so I'm guessing these numbers are unsigned 64-bit integers. Why not write them to file as such?
That way you only need a few hundred characters at the top (as text, to describe the file type, possibly even including the MATLAB code to read the file), then 8 bytes per number, i.e. 16 bytes per line. 16 × 100M = 1.6 GB file. It will write to disk in less than half the time, use less space, AND be easier to read in chunks.
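A minimal sketch of that idea, assuming the data is held as a uint64 array u1e6 (a hypothetical name; the file name is also made up):

```matlab
% Sketch: store the two columns as raw unsigned 64-bit integers.
% Note: values above 2^53 cannot be represented exactly in double, so the
% data should be kept as uint64 from the start, e.g. u1e6 = uint64(...).
fid = fopen( 'c:\tmp\test_u64.bin', 'w' );
fwrite( fid, u1e6.', 'uint64' );   % transpose so each row is stored contiguously
fclose( fid );

% Reading it back: each row is 16 bytes (two uint64 values).
fid = fopen( 'c:\tmp\test_u64.bin', 'r' );
u = fread( fid, [2, Inf], '*uint64' ).';
fclose( fid );
```

The '*uint64' precision tells fread to return uint64 rather than double, so no digits are lost on the way back in.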
Orion
14 October 2014
If your cell array really is a 100000000×2 cell, it is too big for Excel (maximum number of rows per sheet: 2^20 = 1048576).
You should consider writing your data to a simple text file.
Or, if you really need to write it to an xls file, you have to split your cell array into smaller chunks and write each "small" cell to a new sheet.
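A rough sketch of that splitting, assuming the cell array is called C (a hypothetical name). Note that with 1e8 rows this still means on the order of a hundred sheets, and xlswrite is slow, so a text or binary file is usually the better choice:

```matlab
% Sketch: write a huge cell array to Excel in chunks of at most 2^20 rows,
% one chunk per sheet.
maxRows = 2^20;                        % Excel's row limit per sheet
nRows   = size( C, 1 );
nSheets = ceil( nRows / maxRows );
for k = 1:nSheets
    rows = (k-1)*maxRows + 1 : min( k*maxRows, nRows );
    xlswrite( 'c:\tmp\test.xlsx', C(rows,:), sprintf( 'Sheet%d', k ) );
end
```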