Run length image compression does not create image matrix
Hi friends! I am working on a project where I want to apply a run-length algorithm for lossless image compression. I am a newbie, so I searched for example code to understand the algorithm better. Here are some links to the code. But when I tried both of them, no image was created. For example, as input I give a 512x512 image, then I run the run-length encoder, then the run-length decoder, but the output size is 1x512. How can I use these codes to create an image? Thanks in advance :)
Walter Roberson on 26 June 2020
The first of those, https://www.mathworks.com/matlabcentral/fileexchange/31123-rle-run-length-encoding, is not coded to expect anything other than a vector. When given a non-empty 2D or 3D array, it finds the maximum dimension size and accesses that many elements of the array (even if that count does not correspond to the first dimension). For example, if you pass in a 40 x 60 matrix, it would access the first 60 elements of the array, taking all of the first column and the first half of the second column (in the case of this example).
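A small sketch of why that happens: MATLAB linear indexing is column-major, so taking the first max(size(A)) elements walks down the columns rather than across the first row. (The variable names here are illustrative, not from the File Exchange code.)

```matlab
% Column-major linear indexing: the first max(size(A)) elements of a
% 40 x 60 matrix are all of column 1 plus the top half of column 2.
A = reshape(1:40*60, 40, 60);    % 40 rows, 60 columns
v = A(1:max(size(A)));           % first 60 elements in linear (column) order

isequal(v(1:40),  A(:, 1).')     % all of the first column
isequal(v(41:60), A(1:20, 2).')  % first 20 entries of the second column
```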
The only circumstance under which you should do run-length compression row by row or column by column is if you are transmitting each row or column separately.
Otherwise, you should reshape your array into a vector and do run-length compression on that. Or you could take each row (or column) separately, do run-length compression on it, and then concatenate all of the results together (this would typically be less efficient).
After you do the run-length decompression, reshape() back to the original image size.
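The reshape-encode-decode-reshape round trip might look like the sketch below. The `rle_encode`/`rle_decode` helpers here are simple stand-ins for the File Exchange functions, not their actual code; they assume a vector input, as described above.

```matlab
% Sketch: run-length code the whole image as one vector, then restore it.
img = uint8(randi([0 3], 512, 512));   % small-alphabet test image

v = img(:).';                          % flatten to a 1 x (512*512) vector
[vals, runs] = rle_encode(v);
w = rle_decode(vals, runs);
img2 = reshape(w, size(img));          % restore the 512 x 512 matrix
assert(isequal(img, img2))

function [vals, runs] = rle_encode(v)
    % Indices where the value changes, plus the final element,
    % mark the end of each run.
    idx  = [find(diff(v) ~= 0), numel(v)];
    vals = v(idx);
    runs = diff([0, idx]);             % length of each run
end

function w = rle_decode(vals, runs)
    w = repelem(vals, runs);           % expand each value by its run length
end
```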
If you are not always processing the same size of image, you should figure out how to transmit the image size as well, so that the receiving end knows what to reshape() back to.
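One possible framing (an assumed layout, not part of the File Exchange codes) is to prepend the dimensions as a two-element header ahead of the encoded data, so the receiver can recover the size before reshaping. For simplicity the "payload" below is the raw pixel vector rather than actual run-length output.

```matlab
% Hypothetical framing: [rows, cols] header followed by the data.
img = uint8(magic(8));                 % 8 x 8 example image
v = double(img(:).');                  % flatten; double so the header fits

header = size(img);                    % [rows, cols]
stream = [header, v];                  % what would be transmitted

% Receiving side: peel off the header, then rebuild the matrix.
sz   = stream(1:2);
img2 = reshape(uint8(stream(3:end)), sz);
assert(isequal(img, img2))
```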