Reducing running time in image processing
I wrote code that uses imfindcircles to detect circles in an image and then performs some further calculations on the detected circles. I plan to apply it to 250,000 images; the current code takes 0.8 seconds per image, and each image is processed completely independently of the others. I am aware of the parfor command, but I would rather not use it because my code is already complex enough and I do not want to make it more so. Is there any way to run the script in parallel to reduce the total time (rather than the 0.8 seconds per image)? It should be noted that in some parts of the code I take advantage of the GPU as well.
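For context, the per-image work is roughly of this shape (a simplified sketch; the function name, file handling, and radius range are illustrative, not my actual code):

```matlab
function centers = processOneImage(filename)
    % Sketch of one independent per-image step (names are hypothetical)
    img = imread(filename);
    if size(img, 3) == 3
        img = rgb2gray(img);          % imfindcircles expects grayscale/binary
    end
    % Detect circles within an assumed radius range of 10-40 pixels
    [centers, radii] = imfindcircles(img, [10 40]);
    % ... further calculations on the detected circles go here ...
end
```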
0 Comments
Accepted Answer
Walter Roberson
31 Aug 2013
parfor() and related commands such as spmd() are the main approach. Otherwise, especially on Linux or OS X, you can run a script that spawns a number of separate MATLAB processes, each with slightly different parameters. However, if you are already keeping a GPU busy, it is not certain that running multiple such processes would be any faster.
The usual method is to (A) optimize the algorithm; and (B) vectorize the code.
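Since each image is independent, a parfor loop over the image files is about as simple as it gets. A minimal sketch (the folder path, file pattern, and imfindcircles radius range are assumptions for illustration):

```matlab
% Sketch: parallelize across independent images with parfor.
% 'images/*.png' and the [10 40] radius range are hypothetical.
files = dir(fullfile('images', '*.png'));
results = cell(numel(files), 1);
parfor k = 1:numel(files)
    img = imread(fullfile('images', files(k).name));
    [centers, radii] = imfindcircles(img, [10 40]);
    results{k} = centers;    % collect per-image output by slice
end
```

Because the loop body only reads its own file and writes its own cell of `results`, the iterations have no cross-dependencies, which is exactly the pattern parfor handles without further restructuring.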
0 Comments
More Answers (0)