Academic Journal

Nonconvex Rectangular Matrix Completion via Gradient Descent Without ℓ₂,∞ Regularization.

Bibliographic Details
Title: Nonconvex Rectangular Matrix Completion via Gradient Descent Without ℓ₂,∞ Regularization.
Authors: Chen, Ji¹ (ljichen@ucdavis.edu); Liu, Dekai² (dekailiu@zju.edu.cn); Li, Xiaodong³ (xdgli@ucdavis.edu)
Source: IEEE Transactions on Information Theory, Sep 2020, Vol. 66, Issue 9, pp. 5806-5841. 36 pp.
Subject Terms: MATRICES (Mathematics), MACHINE learning, DIFFERENTIAL inclusions, LEARNING communities, CONVEX functions
Abstract: The analysis of nonconvex matrix completion has recently attracted much attention in the machine learning community thanks to its computational convenience. Existing analyses of this problem, however, usually rely on $\ell_{2,\infty}$ projection or regularization involving unknown model parameters, even though numerical simulations suggest these are unnecessary. In this paper, we extend the analysis of vanilla gradient descent for positive semidefinite matrix completion in the literature to the rectangular case and, more significantly, improve the required sampling rate from $O(\mathrm{poly}(\kappa)\,\mu^{3} r^{3} \log^{3} n/n)$ to $O(\mu^{2} r^{2} \kappa^{14} \log n/n)$. Our technical ideas and contributions are potentially useful for improving the leave-one-out analysis in other related problems. [ABSTRACT FROM AUTHOR]
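The setting described in the abstract can be illustrated with a minimal sketch (not the authors' code): vanilla gradient descent on the factorized objective $f(X, Y) = \frac{1}{2}\|P_{\Omega}(XY^{\top} - M)\|_F^2$ for a rectangular rank-$r$ matrix, with spectral initialization and no $\ell_{2,\infty}$ projection or regularization. All dimensions, the sampling rate, the step size, and the iteration count below are arbitrary illustrative choices.

```python
import numpy as np

# Hypothetical sketch of vanilla gradient descent for rectangular matrix
# completion: minimize 0.5 * ||P_Omega(X @ Y.T - M)||_F^2 over factors
# X (n1 x r) and Y (n2 x r), with no l_{2,inf} projection/regularization.
rng = np.random.default_rng(0)
n1, n2, r, p = 60, 50, 3, 0.5
M = rng.standard_normal((n1, r)) @ rng.standard_normal((r, n2))  # rank-r target
mask = rng.random((n1, n2)) < p                                  # observed set Omega

# Spectral initialization from the zero-filled, rescaled observations
U, s, Vt = np.linalg.svd(np.where(mask, M, 0.0) / p, full_matrices=False)
X = U[:, :r] * np.sqrt(s[:r])
Y = Vt[:r].T * np.sqrt(s[:r])

eta = 0.25 / s[0]  # conservative step size relative to the top singular value
for _ in range(1000):
    R = np.where(mask, X @ Y.T - M, 0.0)             # residual on observed entries
    X, Y = X - eta * (R @ Y), Y - eta * (R.T @ X)    # plain gradient step

rel_err = np.linalg.norm(X @ Y.T - M) / np.linalg.norm(M)
print(f"relative recovery error: {rel_err:.2e}")
```

With a well-conditioned low-rank target and a sampling rate well above the information-theoretic threshold, the unregularized iteration drives the relative error toward zero, consistent with the numerical observations the abstract refers to.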
Database: Business Source Index
Description
ISSN: 0018-9448
DOI: 10.1109/TIT.2020.2992234