Solving a PDE using the GPU
Hello
I am currently solving a nonlinear, time-dependent PDE numerically using the Euler method. When I run my code on the CPU everything seems to work fine, but when I use the GPU (converting all my variables with gpuArray) the solution changes significantly and at some point "explodes". I compared the CPU and GPU solutions at the same time step and they were totally different. Has anyone seen something like this before? Thank you.
7 Comments
Joss Knight
14 Mar 2016
That bug workaround is for the FFT, not for FFT2, and it only does anything for vector inputs of certain lengths, so it will be doing literally nothing in your case. All you've done is make the FFT2 implementation more like the CPU one: less efficient, but with results closer to the CPU's.
Looks like all we're talking about here is numerical accuracy. You haven't actually shown what it is you are iterating on, but it's a fair assumption that you are repeatedly calling ifft2(fft2(X)). That will inevitably cause slight numerical offsets to grow, which is why you need to insert real(X) into the loop to remove the extraneous imaginary part.
I can't exactly explain how the CPU's IFFT manages to invert the FFT so perfectly that the result comes out with a zero imaginary part - no doubt it just falls out of the equations. The GPU FFT, however, is computed in parallel and won't have the same perfect mirroring property.
In short, it is perfectly valid for you to remove the imaginary part when you know the result is supposed to be real.
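For illustration, here is a minimal sketch of the pattern I mean. The operator, step size and initial data are placeholders (you haven't posted your actual scheme); the only point is the real() wrapped around the inverse transform inside the loop:

    % Placeholder spectral Euler loop on the GPU - not your actual equation,
    % just the pattern. The key line is the real() around ifft2.
    N  = 128;
    dt = 1e-4;
    k  = [0:N/2-1, -N/2:-1];               % Fourier wavenumbers on [0, 2*pi)
    [KX, KY] = meshgrid(k, k);
    L  = gpuArray(-(KX.^2 + KY.^2));        % e.g. a Laplacian in Fourier space
    u  = gpuArray(rand(N));                 % real-valued field
    for n = 1:1000
        uhat = fft2(u);
        % Explicit Euler step; real() discards the tiny imaginary part that
        % the GPU FFT/IFFT round trip can leave behind on every iteration.
        u = u + dt * real(ifft2(L .* uhat));
    end
    u = gather(u);                          % bring the result back to the CPU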
Answers (0)