Workaround for dealing with a heavy figure without slowing, freezing, or crashing MATLAB?
Views: 20 (last 30 days)
I have a large dataset stored as a 740864x8192 matrix of double-precision values. The matrix is essentially an image with 740864 rows and 8192 columns, and I intend to visualize it with MATLAB's pcolor function. However, when I attempt to plot this data, MATLAB slows down and crashes due to the sheer size of the dataset.
As a workaround, I tried deleting unnecessary variables during the process to free up memory and keep the matrix from overwhelming the system. Despite these efforts, MATLAB still crashes when it attempts to render the final figure on screen.
I used the following command:
pcolor(X, Y, Z)
Is it possible to deal with such a heavy figure without slowing, freezing, or crashing MATLAB?
3 Comments
dpb
3 November 2023
740864*8192*8/1024/1024/1024
is about 45 GB; for starters, how much physical memory does your machine actually have?
If you used single instead of double it would be about 23 GB, which you might be able to handle computationally.
But even a fancy 8K monitor offers only something approaching 8K pixels of resolution in the one direction if every pixel were devoted to the image;
740864/8192
is roughly 90X more points than there are pixels to display them on -- and more likely you've got only 1920 pixels, which is a further factor of about 4X less than that.
IOW, "decimate, decimate!" -- keep only every 400th-500th point in each direction to even have a chance.
But it's not at all clear how the 2D array relates to the pcolor X, Y, Z arrangement...
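To make dpb's decimation concrete, here is a minimal sketch, assuming X, Y, and Z are meshgrid-style matrices of the same 740864x8192 size (the step factors are assumptions to be tuned to your display):
% rows outnumber a 1920-pixel display by roughly 400:1, columns by roughly 4:1
rstep = 400;
cstep = 4;
Xd = X(1:rstep:end, 1:cstep:end);
Yd = Y(1:rstep:end, 1:cstep:end);
Zd = single(Z(1:rstep:end, 1:cstep:end));   % single also halves memory (~45 GB -> ~23 GB for the full array)
pcolor(Xd, Yd, Zd);
shading flat   % suppress per-cell edge lines, which dominate at this density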
Answers (2)
Anurag
23 November 2023
Hi Megha,
I understand you want to visualize a matrix with very large dimensions. Please follow the steps below:
- Use downsampling: reducing the number of samples in a signal or dataset, typically to shrink its size and processing cost.
downsample_factor = 10;   % increase as needed; 740864 rows is far beyond any screen resolution
X_downsampled = X(1:downsample_factor:end, :);   % rows only; decimate the columns the same way if 8192 is still too wide
Y_downsampled = Y(1:downsample_factor:end, :);
Z_downsampled = Z(1:downsample_factor:end, :);
pcolor(X_downsampled, Y_downsampled, Z_downsampled);
- Plot smaller sections iteratively to avoid overwhelming MATLAB.
section_size = 1000;
for i = 1:section_size:size(Z, 1)
    end_idx = min(i + section_size - 1, size(Z, 1));   % clamp the last section
    pcolor(X(i:end_idx, :), Y(i:end_idx, :), Z(i:end_idx, :));
    pause(0.1);   % let each section render before the next one replaces it
end
- Consider exporting the data for visualization in an external tool such as ParaView, or writing it out as an image (a sketch of the ParaView route follows this list).
imwrite(rescale(Z), 'large_data.png');   % rescale maps the doubles into [0,1], which imwrite expects
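For the ParaView route, here is a hedged sketch of one way to hand the matrix over: ParaView's raw-binary image reader can import a flat float32 file if you tell it the dimensions and scalar type. The file name is hypothetical, and you will likely want to decimate first, since the full array is still about 23 GB even as float32.
fid = fopen('large_data.raw', 'w');
fwrite(fid, single(Z), 'single');   % written column-major: 740864 rows x 8192 columns, float32
fclose(fid);
% In ParaView: open large_data.raw, pick the raw (binary) image reader, and
% set the data extent to 740864 x 8192 x 1 with scalar type float.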
Hope this helps!
1 Comment
Walter Roberson
23 November 2023
You might want to consider using imresize() instead of downsampling by indexing: imresize() offers a choice of resizing methods, some of which allow the intermediate data to contribute to the final pixels.
img = rgb2gray(imread('flamingos.jpg'));
% paint 1-pixel-wide, 50-pixel-long white tick marks on a 100-pixel grid
for K = 1 : 50
    img(K:100:end, 1:100:end) = 255;
    img(1:100:end, K:100:end) = 255;
end
% scatter isolated single pixels of varying intensity via linear indexing
L = numel(img(1:71:end));
img(1:71:end) = 255 - mod(1:L, 256);
imshow(img)
% compare how each resizing method treats those fine features at 100x100
methods = ["nearest", "bilinear", "bicubic", "box", "lanczos2", "lanczos3"];
for K = 1 : numel(methods)
    figure();
    newimg = imresize(img, [100 100], methods(K));
    imshow(newimg); title(methods(K));
end
Shubham
24 November 2023
Hey Megha,
I understand that you want to visualize an image but MATLAB crashes while attempting to display the result. As pointed out by Les and dpb in the comments, dealing with a 45 GB dataset computationally is difficult.
You can try out the following:
- Reduce the resolution of the image by downsampling the dataset. You can shrink it by averaging the values within specified intervals; in your case a reduction factor on the order of 1000 (or at least 100) along the long dimension is needed to bring the size down significantly. You can try Anurag's and Walter's suggestions as well, or break the dataset into chunks if you only need to visualize a certain part of the image (see the first sketch after this list).
- Refer to the documentation on visualizing tall arrays: https://www.mathworks.com/help/matlab/import_export/tall-data-visualization.html "Tall arrays" let you work with datasets that do not fit in memory, and the documentation lists the plotting functions that support them. The key idea is to aggregate the data into small bins (see the second sketch after this list).
- Have a look at this previous MATLAB Answer: https://ms-intl.mathworks.com/matlabcentral/answers/1915155-how-to-plot-2d-density-plot-with-very-large-files There, the solution divides the data into intervals along both axes to form a rectangular grid, then averages the values that fall into each grid cell.
- You can use the function "reduce_plot", which decimates a plot to the number of pixels actually available and recomputes when the user zooms in; it can be useful for exploring large datasets. "reduce_plot" is available on the File Exchange: https://www.mathworks.com/matlabcentral/fileexchange/40790-plot-big
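As a concrete version of the bin-averaging idea in the first and third points above, here is a minimal sketch that averages non-overlapping blocks of Z, so each kept value summarizes its neighborhood instead of discarding it (the block sizes are assumptions to tune):
rb = 400;   % rows per block
cb = 4;     % columns per block
[r, c] = size(Z);
r2 = floor(r/rb)*rb;   c2 = floor(c/cb)*cb;   % trim so the blocks tile exactly
Zt = Z(1:r2, 1:c2);
% fold each block into its own dimensions, then average those dimensions away
Zm = squeeze(mean(mean(reshape(Zt, rb, r2/rb, cb, c2/cb), 1), 3));
pcolor(Zm);
shading flat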
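And for the tall-array route, a generic sketch assuming the raw data lives on disk as a delimited text file with columns named x and y (the file name and column names are hypothetical):
ds = tabularTextDatastore('raw_points.csv');
t = tall(ds);
binScatterPlot(t.x, t.y);   % aggregates into bins chunk by chunk; never loads everything at once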
Hope this helps!
0 Comments