Error in testing_gan (line 54)
dlnetGenerator = dlnetwork(lgraphGenerator)
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
Error in dlnetwork (line 3)
imageInputLayer([64 64 1], 'Name', 'input', 'Mean', mean(XTrain,0))
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
Error in mean (line 127)
y = sum(x, dim, flag) ./ mysize(x,dim);
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
Error using sum
Invalid data type. First argument must be numeric or logical.

3 Comments

nasir mehmood · 24 May 2020
Are you available? Please help me solve this:
Output argument "b" (and maybe others) not assigned during call to "dlnetwork".
Error in bb (line 59)
dlnetGenerator = dlnetwork(lgraphGenerator);
For me, the code in the example that you linked runs as expected.
It seems like you may have modified and saved the original example. There's no function or script named testing_gan in the original example.
From the errors you provided, the problem seems to be the mean value used in the imageInputLayer, which is causing the error inside dlnetwork when it initializes the layer.
imageInputLayer([64 64 1], 'Name', 'input', 'Mean', mean(XTrain,0))
I don’t know how you are providing XTrain, as that variable is not present in the example you linked. Is it a datastore? That would explain the error in sum inside mean.
It would help if you provide the exact code that is causing the error.
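As a hedged sketch of the fix: assuming XTrain is a numeric 64×64×1×N array of training images (as in the original example), the per-pixel mean for imageInputLayer has to be taken along the fourth (observation) dimension. `mean(XTrain, 0)` is invalid because MATLAB dimensions start at 1, and if XTrain is a datastore it cannot be passed to `mean` at all — it would have to be read into a numeric array first.

```matlab
% Sketch, assuming XTrain is a numeric 64x64x1xN array of training images.
% Average over the 4th (observation) dimension to get a 64x64x1 mean image:
meanImage = mean(XTrain, 4);

% Pass the per-pixel mean to the input layer for zero-center normalization:
layer = imageInputLayer([64 64 1], ...
    'Name', 'input', ...
    'Normalization', 'zerocenter', ...
    'Mean', meanImage);
```

With a valid numeric mean, `dlnetwork(lgraphGenerator)` should then be able to initialize the layer without the error inside `sum`.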


Answers (1)

Mahmoud Afifi · 23 May 2020

Can you please give a link to the original code? In the meantime, have a look at this GitHub page. It has several GANs with MATLAB implementations.

5 Comments

nasir mehmood · 23 May 2020
Your recommended example requires CUDA; it generated this error:
Epoch 0
Unable to load CUDA driver. The library name used was nvcuda.dll. The error was:
The specified module could not be found.
Update or reinstall your GPU driver. For more information on GPU support, see GPU Support by Release.
Error in parallel.internal.gpu.isAnyDeviceSelected
Error in parallel.gpu.GPUDevice.isAvailable (line 119)
if parallel.internal.gpu.isAnyDeviceSelected
Error in canUseGPU (line 25)
ok = canUsePCT() && parallel.gpu.GPUDevice.isAvailable();
Error in bb (line 107)
dlZValidation = dlarray(ZValidation);
Mahmoud Afifi · 24 May 2020
But for any of these examples you need CUDA installed on your machine. Otherwise it is hard to train a GAN on a CPU.
nasir mehmood · 24 May 2020
Output argument "b" (and maybe others) not assigned during call to "dlnetwork".
dlnetGenerator = dlnetwork(lgraphGenerator);
Sophia Lloyd · 28 June 2020
It is possible to train a GAN on a CPU, though it is usually not recommended because it will be very slow.
The example at https://www.mathworks.com/help/deeplearning/ug/train-generative-adversarial-network.html will run on the CPU if no GPU is available.
The examples in the GitHub page assume that you have a GPU. If you do not, you need to modify the code and remove the call to gpuArray. This should be enough to run the code on the CPU.
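Rather than deleting the gpuArray calls outright, one alternative (a sketch following the pattern used in the MathWorks GAN example; the variable names here are illustrative) is to guard them so the same code runs on either CPU or GPU:

```matlab
% Sketch: make the code CPU/GPU-agnostic instead of removing gpuArray.
executionEnvironment = "auto";

ZValidation = randn(100, 64, 'single');      % example latent batch
dlZValidation = dlarray(ZValidation, 'CB');  % label as channel x batch

% Move the data to the GPU only when one is actually usable:
if (executionEnvironment == "auto" && canUseGPU) || ...
        executionEnvironment == "gpu"
    dlZValidation = gpuArray(dlZValidation);
end
```

On a machine without a CUDA device, `canUseGPU` returns false and the code stays on the CPU, avoiding the "Unable to load CUDA driver" error above.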
If you do have a supported GPU, you need a suitable driver for your device and platform. We recommend you use the most up-to-date driver for your device. You can get drivers from NVIDIA here: https://www.nvidia.com/Download/index.aspx. You can check whether your GPU is supported here: https://www.mathworks.com/help/parallel-computing/gpu-support-by-release.html
To use the GPU for training, you only need the driver. You do not need to install the CUDA Toolkit.



Asked: 23 May 2020
Last comment: 28 June 2020
