Question

I have to factorize a matrix R[m*n] into two low-rank matrices, U[K*m] and V[K*n], in order to predict the missing values of R from U and V.
(figure: Matrix Factorization)

The problem is that I can't use MATLAB's built-in factorization methods on R, so I have to work with an objective function that minimizes the sum of squared errors to get an accurate factorization.
The details of the objective function are shown below:

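The original image is missing. A common form of this objective for low-rank matrix factorization, written as a sketch under the assumption of squared error over the observed entries plus a Frobenius-norm regularizer with weight \lambda (the regularizer in the original image may differ), is:

    F(U, V) = \sum_{(i,j)\in\Omega} \left( R_{ij} - U_i^\top V_j \right)^2 + \lambda \left( \lVert U \rVert_F^2 + \lVert V \rVert_F^2 \right)

where \Omega is the set of observed entries of R, and U_i and V_j are the i-th and j-th columns of U and V, so that R_{ij} \approx U_i^\top V_j.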


My question in this post is how to minimize the function F in MATLAB using the stochastic gradient descent (SGD) method, so that R is decomposed into U and V.

Thanks for any help!


Solution

Finally I figured it out with the help of this page :)
I'll explain the approach in a few steps:

  1. Create U[K*m] and V[K*n] and fill them arbitrarily (e.g. with random values).

  2. Compute the derivatives of the objective function F with respect to Ui and Vj (the gradient formulas are given after this list).

  3. Do gradient descent as follows (a MATLAB sketch of the full loop is given after this list):

     while the stopping criterion on the error function F is not met
         Ui = Ui + a * U'i
         Vj = Vj + a * V'j
         evaluate F using the new values of Ui and Vj
     end

     Here U'i and V'j denote the descent direction for Ui and Vj, i.e. the negative of the derivatives from step 2.

  4. With the minimum F, take the corresponding U and V and compute transpose(U)*V; the result is the estimated R. (Here a is the step size, or learning rate.)
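For step 2, under the regularized objective sketched above, the derivatives of F with respect to the single columns U_i and V_j are:

    \frac{\partial F}{\partial U_i} = -2 \sum_{j:(i,j)\in\Omega} \left( R_{ij} - U_i^\top V_j \right) V_j + 2\lambda U_i
    \frac{\partial F}{\partial V_j} = -2 \sum_{i:(i,j)\in\Omega} \left( R_{ij} - U_i^\top V_j \right) U_i + 2\lambda V_j

Stochastic gradient descent applies the single-entry term of these sums at each update. The following is a minimal MATLAB sketch of the whole procedure, not a definitive implementation: the toy matrix R, the rank K, the learning rate a, the regularization weight lam, the epoch count, and the stopping threshold are all illustrative choices (the factor of 2 in the gradients is absorbed into a):

    % Toy data: 0 marks a missing entry of R (illustrative values)
    R = [5 3 0 1; 4 0 0 1; 1 1 0 5; 1 0 0 4; 0 1 5 4];
    [m, n] = size(R);
    K   = 2;      % target rank (assumption)
    a   = 0.01;   % step size / learning rate (assumption)
    lam = 0.02;   % regularization weight (assumption)

    % Step 1: create U[K*m] and V[K*n] and fill them arbitrarily
    U = rand(K, m);
    V = rand(K, n);

    [I, J] = find(R ~= 0);             % indices of the observed entries
    for epoch = 1:5000
        for t = randperm(numel(I))     % visit observed entries in random order
            i = I(t);  j = J(t);
            e = R(i, j) - U(:, i)' * V(:, j);   % error on this single entry
            % Steps 2-3: move each column along its negative gradient
            Unew = U(:, i) + a * (e * V(:, j) - lam * U(:, i));
            Vnew = V(:, j) + a * (e * U(:, i) - lam * V(:, j));
            U(:, i) = Unew;
            V(:, j) = Vnew;
        end
        % Step 3: evaluate F with the new U and V, stop when small enough
        E = R - U' * V;
        F = sum(E(R ~= 0).^2) + lam * (norm(U, 'fro')^2 + norm(V, 'fro')^2);
        if F < 1e-3, break; end
    end

    Rhat = U' * V;   % Step 4: estimated R, with missing entries filled in

Each run starts from a different random U and V, so results vary slightly; fixing the seed with rng(0) before the initialization makes the sketch reproducible.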

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow