There is a variable error in the sample code of "Training variational Autoencoder (VAE) to generate images"

The practical effect of including a KL loss term is to pack the clusters learned due to the reconstruction loss tightly around the center of the latent space, forming a continuous space to sample from.
function loss = elboLoss(Y,T,mu,logSigmaSq)
    % Reconstruction loss.
    reconstructionLoss = mse(Y,T);
    % KL divergence.
    KL = -0.5 * sum(1 + logSigmaSq - mu.^2 - exp(logSigmaSq),1);
    KL = mean(KL);
    % Combined loss.
    loss = reconstructionLoss + KL;
end
% Reconstruction loss.
reconstructionLoss = mse(Y,T);
The variable T is not defined anywhere in the sample code. Strangely, this does not prevent the code from running normally. The variable T should be the input image (X), corresponding to X in the training code:
% Loop over epochs.
while epoch < numEpochs && ~monitor.Stop
    epoch = epoch + 1;
    % Shuffle data.
    shuffle(mbq);
    % Loop over mini-batches.
    while hasdata(mbq) && ~monitor.Stop
        iteration = iteration + 1;
        % Read mini-batch of data.
        X = next(mbq);
        % Evaluate loss and gradients.
        [loss,gradientsE,gradientsD] = dlfeval(@modelLoss,netE,netD,X);
        % Update learnable parameters.
        [netE,trailingAvgE,trailingAvgSqE] = adamupdate(netE, ...
            gradientsE,trailingAvgE,trailingAvgSqE,iteration,learnRate);
        [netD,trailingAvgD,trailingAvgSqD] = adamupdate(netD, ...
            gradientsD,trailingAvgD,trailingAvgSqD,iteration,learnRate);
        % Update the training progress monitor.
        recordMetrics(monitor,iteration,Loss=loss);
        updateInfo(monitor,Epoch=epoch + " of " + numEpochs);
        monitor.Progress = 100*iteration/numIterations;
    end
end

Accepted Answer

Pramil on 29 Jul 2025
Hi Hua,
The variable "T" is being supplied to the function "elboLoss" when it is called in the "modelLoss" function:
function [loss,gradientsE,gradientsD] = modelLoss(netE,netD,X)
    % Forward through encoder.
    [Z,mu,logSigmaSq] = forward(netE,X);
    % Forward through decoder.
    Y = forward(netD,Z);
    % Calculate loss and gradients.
    loss = elboLoss(Y,X,mu,logSigmaSq); % X is mapped to T
    [gradientsE,gradientsD] = dlgradient(loss,netE.Learnables,netD.Learnables);
end
As you can see, the "modelLoss" function calls the "elboLoss" function with these arguments, and it is here that "X" is mapped to the parameter "T".
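In other words, T is just the name of the second input parameter in the definition of elboLoss; MATLAB binds arguments positionally, so whatever is passed second in the call (here X) becomes T inside the function body. A toy illustration with a hypothetical function:

```matlab
% Hypothetical example: the parameter is named T, the caller's variable is X.
X = 5;
y = addOne(X);  % X is bound to the parameter T, so y is 6

function out = addOne(T)
    out = T + 1;  % T refers to the value the caller passed in (X)
end
```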
Hope this helps!
1 Comment
hua on 1 Aug 2025
Wow, I checked the code and you're right. Thank you, Pramil. Your answer has helped me. Wish you a happy life.


More Answers (0)
