An iteratively reweighted least squares (IRLS) algorithm, which can be interpreted as a smoothing Newton method applied to a non-convex rank surrogate cost function, and which is therefore able to escape saddle points.
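To make the IRLS idea concrete, here is a minimal sketch, assuming a smoothed Schatten-p surrogate with weight matrix W = (X Xᵀ + εI)^(p/2 − 1) and a decreasing smoothing schedule; the function name, parameters, and schedule are illustrative assumptions, not the authors' exact algorithm. Each iteration holds the observed entries fixed and solves a weighted least-squares problem column by column.

```python
import numpy as np

def irls_complete(M, mask, p=0.0, eps0=1.0, n_iter=50):
    """Sketch of IRLS for matrix completion (hypothetical helper).

    Minimizes a smoothed Schatten-p rank surrogate by repeatedly solving
    weighted least-squares subproblems: observed entries stay fixed, and
    the free entries of each column solve a small linear system under the
    current weight matrix W.
    """
    m, n = M.shape
    X = np.where(mask, M, 0.0)               # start from the observed data
    eps = eps0
    for _ in range(n_iter):
        # Weight matrix of the smoothed surrogate: W = (X X^T + eps I)^(p/2 - 1)
        G = X @ X.T + eps * np.eye(m)
        w, U = np.linalg.eigh(G)
        W = (U * w ** (p / 2 - 1)) @ U.T
        # Column-wise weighted least squares with observed entries held fixed
        for j in range(n):
            obs = mask[:, j]
            free = ~obs
            if free.any():
                rhs = -W[np.ix_(free, obs)] @ M[obs, j]
                X[free, j] = np.linalg.solve(W[np.ix_(free, free)], rhs)
            X[obs, j] = M[obs, j]
        eps = max(eps / 10, 1e-8)            # smoothing schedule (assumed)
    return X
```

With p = 0 the weight is simply (X Xᵀ + εI)⁻¹, corresponding to a log-det surrogate; the smoothing parameter ε is what makes the Newton-like interpretation well defined away from low-rank points.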
An alternating coordinate descent method, in which a single coordinate of one factor is updated at a time, so that the solution is approached iteratively through a sequence of closed-form scalar updates.
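A sketch of this scheme, assuming a bilinear model M ≈ U Vᵀ fitted on the observed entries (the factorization, rank, and function name are illustrative assumptions): each scalar entry of U or V is updated by its one-dimensional least-squares minimizer while all other coordinates are held fixed.

```python
import numpy as np

def coord_descent_mf(M, mask, rank=2, n_iter=100, seed=0):
    """Sketch of alternating coordinate descent for M ~ U V^T (assumed model).

    One scalar coordinate is updated at a time via its closed-form 1-D
    least-squares minimizer over the observed entries; repeated sweeps over
    all coordinates drive the fit down without any matrix decompositions.
    """
    rng = np.random.default_rng(seed)
    m, n = M.shape
    U = 0.1 * rng.standard_normal((m, rank))
    V = 0.1 * rng.standard_normal((n, rank))
    for _ in range(n_iter):
        R = np.where(mask, M - U @ V.T, 0.0)       # residual on observed entries
        for k in range(rank):
            R += np.outer(U[:, k], V[:, k]) * mask  # exclude component k
            # Update column k of U, one coordinate at a time
            for i in range(m):
                denom = np.sum(mask[i] * V[:, k] ** 2)
                if denom > 0:
                    U[i, k] = np.sum(mask[i] * R[i] * V[:, k]) / denom
            # Then column k of V, against the freshly updated U[:, k]
            for j in range(n):
                denom = np.sum(mask[:, j] * U[:, k] ** 2)
                if denom > 0:
                    V[j, k] = np.sum(mask[:, j] * R[:, j] * U[:, k]) / denom
            R -= np.outer(U[:, k], V[:, k]) * mask  # restore full residual
    return U, V
```

Maintaining the residual R incrementally is what keeps each scalar update cheap: adding back component k before the sweep and subtracting the updated component afterwards avoids recomputing U Vᵀ inside the inner loops.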
An efficient non-linear successive relaxation strategy, in which only one linear least-squares problem must be solved per iteration instead of a singular value decomposition (SVD).
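The SVD-free character of this strategy can be sketched as follows, assuming a factorization M ≈ X Y and an over-relaxation weight omega applied to the data-fitting correction; the update order, the relaxation placement, and all names here are illustrative assumptions rather than the exact published scheme.

```python
import numpy as np

def sor_complete(M, mask, rank=2, omega=1.0, n_iter=200, seed=0):
    """Sketch of a nonlinear successive-relaxation scheme for M ~ X Y.

    Each iteration solves plain linear least-squares problems for the two
    factors against the current completed matrix Z -- no SVD is needed --
    and a relaxation weight omega > 1 can accelerate convergence (assumed
    placement of the relaxation step).
    """
    m, n = M.shape
    rng = np.random.default_rng(seed)
    X = rng.standard_normal((m, rank))
    Y = rng.standard_normal((rank, n))
    Z = np.where(mask, M, 0.0)                 # current completed matrix
    for _ in range(n_iter):
        X = Z @ np.linalg.pinv(Y)              # linear least squares for X
        Y = np.linalg.pinv(X) @ Z              # linear least squares for Y
        XY = X @ Y
        # Relaxation step: correct the fit on the observed entries only
        Z = XY + omega * np.where(mask, M - XY, 0.0)
    return X, Y
```

With omega = 1 this reduces to plain alternating least squares with hard imputation of the observed entries; the per-iteration cost is dominated by two small least-squares solves rather than a full SVD.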
The low-rank matrix completion problem is reformulated as an unconstrained minimization problem on a Riemannian manifold; differentiability on the manifold is defined, and a modified conjugate gradient algorithm is used to solve the reformulated problem.
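A minimal sketch of the manifold viewpoint, assuming the fixed-rank manifold of rank-r matrices and the cost f(X) = ½‖P_Ω(X − M)‖²_F; for brevity it uses plain Riemannian gradient descent rather than the modified conjugate gradient method, and all names are assumptions. The key ingredients are the same: projection of the Euclidean gradient onto the tangent space at X = U diag(s) Vᵀ, and a truncated-SVD retraction back onto the manifold.

```python
import numpy as np

def riemannian_gd_complete(M, mask, rank=2, step=1.0, n_iter=300):
    """Sketch of fixed-rank Riemannian optimization for matrix completion.

    Gradient descent stand-in for the manifold CG method: project the
    Euclidean gradient onto the tangent space, step along the negative
    projected gradient, and retract via rank-r truncated SVD.
    """
    # Initialize on the manifold from the rank-r truncated SVD of the data
    U, s, Vt = np.linalg.svd(np.where(mask, M, 0.0), full_matrices=False)
    U, s, Vt = U[:, :rank], s[:rank], Vt[:rank]
    for _ in range(n_iter):
        X = (U * s) @ Vt
        G = np.where(mask, X - M, 0.0)          # Euclidean gradient of f
        # Tangent-space projection: P_U G + G P_V - P_U G P_V
        GV = G @ Vt.T
        PG = U @ (U.T @ G) + GV @ Vt - U @ (U.T @ GV) @ Vt
        # Retraction: step along -grad, then truncate back to rank r
        U, s, Vt = np.linalg.svd(X - step * PG, full_matrices=False)
        U, s, Vt = U[:, :rank], s[:rank], Vt[:rank]
    return (U * s) @ Vt
```

Restricting the search to the tangent space is what makes the gradient (and the conjugate directions built from it) intrinsic to the manifold; the retraction only ever factors a matrix of the form "rank-r point plus tangent vector", which keeps each iteration cheap.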