Matrix Factorization In Matlab using Stochastic Gradient Descent

Views: 6 (last 30 days)
oMiD Mousazadeh on 7 Oct 2013
Commented: Matt J on 7 Oct 2013
I have to factorize a matrix R [m*n] into two low-rank matrices, U [K*m] and V [K*n], so that I can predict the missing values of R from U and V.
The problem is that I can't use MATLAB's built-in factorization methods on R, so I have to work with an objective function that minimizes the sum of squared errors to improve the factorization accuracy. With I_ij = 1 when R_ij is observed and 0 otherwise, the objective is
F(U, V) = sum over i, j of I_ij * (R_ij - U(:,i)' * V(:,j))^2
My question in this post is how to minimize the function F in MATLAB using the stochastic gradient descent method, so that R is decomposed into the U and V matrices.
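A minimal SGD sketch for this objective, assuming missing entries of R are marked as NaN and using hypothetical settings for the rank K, learning rate eta, regularization weight lambda, and epoch count nEpochs (none of these come from the question itself), could look like this:

function [U, V] = sgd_factorize(R, K, eta, lambda, nEpochs)
% Sketch of stochastic gradient descent for R approximated by U'*V,
% with U of size K-by-m and V of size K-by-n. Assumes missing entries
% of R are NaN; only observed entries (I_ij = 1) are visited.
[m, n] = size(R);
U = 0.1 * randn(K, m);              % small random initialization
V = 0.1 * randn(K, n);
[iObs, jObs] = find(~isnan(R));     % row/column indices of observed entries
nObs = numel(iObs);
for epoch = 1:nEpochs
    for t = randperm(nObs)          % visit observed entries in random order
        i = iObs(t);
        j = jObs(t);
        e = R(i,j) - U(:,i)' * V(:,j);                      % prediction error
        Ui = U(:,i);                                        % old column, used for the V update
        U(:,i) = U(:,i) + eta * (e * V(:,j) - lambda * U(:,i));
        V(:,j) = V(:,j) + eta * (e * Ui      - lambda * V(:,j));
    end
end
end

For example, [U, V] = sgd_factorize(R, 10, 0.01, 0.02, 50) followed by Rhat = U' * V gives a dense prediction of R; lambda is only needed if a regularization term is added to F, and can be set to 0 for the plain sum-of-squared-errors objective.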
1 Comment
Matt J on 7 Oct 2013
Since your function is not continuous/differentiable (because I_ij is not), I wonder whether any kind of gradient method applies.
How large are R, U, and V typically? You might be able to use the genetic algorithm ga() in the Global Optimization Toolbox.
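A rough sketch of the ga() route mentioned here, assuming the Global Optimization Toolbox is available, missing entries of R are marked as NaN, and a hypothetical rank K, is shown below; U and V are packed into a single parameter vector, which is just one possible layout:

K = 10;                                 % hypothetical rank
[m, n] = size(R);
I = ~isnan(R);                          % observed-entry mask (I_ij)
Rz = R;
Rz(~I) = 0;                             % zero out missing entries so NaN never enters the sum
obj = @(x) sum(sum((I .* (Rz - reshape(x(1:K*m), K, m)' * ...
                               reshape(x(K*m+1:end), K, n))).^2));
x = ga(obj, K*(m+n));                   % likely slow when K*(m+n) is large
U = reshape(x(1:K*m), K, m);
V = reshape(x(K*m+1:end), K, n);

Packing U column by column followed by V keeps the reshape calls simple; any consistent layout works as long as the objective and the unpacking agree.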


Answers (0)
