# Kullback-Leibler Divergence for NMF in Matlab

Answered: Matt Tearle on 16 Jan 2019
I am trying to write the Kullback-Leibler divergence cost function in MATLAB by looking at how the Euclidean distance was written.

The Euclidean distance for matrix factorization has the following structure:

f = ||X - X_hat||_F^2

where X is the original matrix and X_hat is the product W*H. This reduces to the MATLAB code

f = norm(X - W * H,'fro')^2

Now I have the Kullback-Leibler divergence, with structure

D(X || X_hat) = sum_ij ( X_ij * log(X_ij / X_hat_ij) - X_ij + X_hat_ij )

where X is the original matrix and X_hat is the product W*H.

I wish to write this in MATLAB, but I am confused about how to deal with the summation; in the Euclidean case the norm function suddenly appears instead of an explicit sum. Could someone help me write a decent code for this expression? Thanks.


### Accepted Answer

Matt Tearle on 16 Jan 2019
If X and X_hat are just matrices, then I think you should be able to compute all the terms element-wise and sum the result (unless I misunderstand the formula).
div = X .* log(X ./ X_hat) - X + X_hat;
KLD = sum(div,'all'); % in R2018b onward
KLD = sum(div(:)); % in any version
I'm interpreting "log" in the formula in the mathematical sense (natural log) rather than the engineering sense (base 10). If it's base 10, use the log10 function instead.
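One edge case worth noting (my own addition, not part of the original answer): if X contains any zero entries, `X .* log(X ./ X_hat)` evaluates to NaN in MATLAB, even though mathematically x*log(x/y) tends to 0 as x approaches 0. A hedged sketch of a helper that applies that convention (the function name is illustrative, not from the thread):

```
% Sketch of a KL-divergence cost for NMF, assuming X and X_hat = W*H
% are nonnegative matrices of the same size. Zero entries of X are
% handled with the convention 0*log(0) = 0.
function KLD = kl_nmf(X, X_hat)
    div = X .* log(X ./ X_hat) - X + X_hat;
    div(X == 0) = X_hat(X == 0);   % limit of x*log(x/y) - x + y as x -> 0
    KLD = sum(div(:));             % works in any MATLAB version
end
```

Entries where X_hat is zero but the corresponding X entry is positive still give Inf, which matches the mathematical definition of the divergence.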



