Optimizing MATLAB-generated code for NVIDIA Drive AGX
Hi,
Is it possible to optimize MATLAB-generated code to use the maximum capacity of the NVIDIA Drive platform? I am generating code that creates an around-view image from the GMSL cameras on an NVIDIA Drive board. However, the MATLAB-generated CUDA code does not seem to help: I get the correct output, but there is a 5-second delay between the camera input and the output. Is there any way to reduce this delay and increase the processing speed?
Thank you
1 Comment
Hariprasad Ravishankar
18 July 2022
Hi Abhijith,
Does the time (the 5-second delay between camera input and output) include the frame grab from the camera? For high-resolution images, the frame grab itself can be a significant bottleneck. Is it possible to measure the timing after the frame grab, for example as in the sketch below?
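A minimal timing sketch, where grabFrames and createAroundView are hypothetical placeholder names for your actual frame-grab call and the generated entry-point function:
frames = grabFrames();              % frame grab (hypothetical helper), left outside the timed region
tStart = tic;
outImg = createAroundView(frames);  % generated CUDA entry point (placeholder name)
tProc = toc(tStart);
fprintf('Processing time excluding frame grab: %.3f s\n', tProc);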
Also, consider turning on GPU Coder's memory manager through the GpuConfig object:
cfg = coder.gpuConfig('dll');              % code-generation configuration for a dynamic library target
cfg.GpuConfig.EnableMemoryManager = true;  % reuse GPU memory to avoid repeated cudaMalloc/cudaFree calls
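As a further sketch, you can also set the target compute capability and memory allocation mode on the same configuration object; the values below assume a Drive AGX Xavier target, so verify them against your hardware and GPU Coder release:
cfg.GpuConfig.ComputeCapability = '7.2';  % integrated Volta GPU on Drive AGX Xavier (assumption)
cfg.GpuConfig.MallocMode = 'unified';     % unified memory, available on Tegra-based boards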
See the GPU Coder documentation on the memory manager for more details.
Hari
Answers (0)