Memory errors with large numbers of CSV files
I am running a model that produces thousands of CSV files which I need to read into MATLAB. This particular run generated 27,178 files.
After 18,839 files, MATLAB gave me an 'out of memory' error. Could anybody provide a solution, or a more effective way of coding this, so that all the files can be included?
Error using readtable (line 216)
Out of memory. Type "help memory" for your options.
filelist = dir('*.csv'); %list all the .csv files in the current folder
num_files = length(filelist); %record how many files were found
[~, index] = natsort({filelist.name}); %sort the files into proper numerical order (1,2,3,...); natsort is from the File Exchange
filelist = filelist(index);
particledata = cell(num_files, 1); %create a one-column cell array with one cell per file
%for each file found, read its table of data into the corresponding cell of 'particledata'
for a = 1:num_files
    particledata{a} = readtable(filelist(a).name);
end
%% calculate how many particles leave the rice pile
%for each .csv file, count the number of particles beyond a certain y coordinate
ymax = -0.13;
grains = zeros(length(particledata), 1); %preallocate to avoid growing the array inside the loop
for b = 1:length(particledata)
    %extract the 6th column (y-coordinates) of each table as a new variable y
    y = particledata{b}(:,6);
    %use table2array to convert the data from a table to a numeric array
    y_array = table2array(y);
    %count the grains leaving the rice pile in each file, and save into 'grains'
    grains(b) = sum(y_array < ymax);
end
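Since the error comes from holding all 27,000+ tables in memory at once (only the per-file grain count is actually needed), one fix is to compute that count inside the read loop and let each table be discarded. A minimal sketch, assuming as in the original code that column 6 holds the y-coordinates and that natsort is on the path:

```matlab
filelist = dir('*.csv');                %list all .csv files in the current folder
[~, index] = natsort({filelist.name});  %natsort is from the File Exchange
filelist = filelist(index);

ymax = -0.13;
grains = zeros(numel(filelist), 1);     %preallocate one count per file
for a = 1:numel(filelist)
    t = readtable(filelist(a).name);    %only one table in memory at a time
    grains(a) = sum(t{:,6} < ymax);     %count particles past ymax in this file
end                                     %t is overwritten on the next pass
```

If the raw tables really are needed later, another option is an out-of-memory workflow with tabularTextDatastore, which reads the files in chunks on demand instead of loading everything up front.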