GTX1060 for deep learning semantic image segmentation
Views: 4 (last 30 days)
Hello!
I am attempting to train SegNet for semantic segmentation following the example set here: https://www.mathworks.com/examples/matlab/community/24778-semantic-segmentation-using-deep-learning
However, I keep running into an out-of-memory error. I am wondering whether the error is in my code, or whether my GTX 1060 3GB GPU is simply not powerful enough to train SegNet as in the example. I have already reduced the mini-batch size to 1, so I'm unsure what other fixes I can make.
Thanks!
Comments: 0
Answers (1)
Joss Knight
23 January 2018
Edited: Joss Knight, 23 January 2018
Yes, 3GB isn't enough for this example, sorry; SegNet is simply too high-resolution a network. You could try training on the CPU instead. Alternatively, 3GB might be enough if your 1060 were not also driving the display.
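For anyone trying the CPU route, a minimal sketch of the change is below. `trainingOptions` and `trainNetwork` are standard Deep Learning Toolbox functions; `lgraph` and `pximds` are placeholders standing in for the SegNet layer graph and the pixel-label datastore built in the linked example, and the solver settings are illustrative, not tuned values.

```matlab
% Sketch only: route training to the CPU to avoid the 3GB GPU limit.
% 'lgraph' and 'pximds' are assumed to come from the linked example.
opts = trainingOptions('sgdm', ...
    'ExecutionEnvironment', 'cpu', ... % train on CPU instead of the GPU
    'MiniBatchSize', 1, ...            % smallest possible batch
    'InitialLearnRate', 1e-3);

net = trainNetwork(pximds, lgraph, opts);
```

Note that CPU training of SegNet will be far slower than GPU training, so this is mainly a way to verify the pipeline works rather than a practical substitute for a larger GPU.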
Comments: 0