Academic Journal

A novel metric learning framework by exploiting global and local information.

Bibliographic Details
Title: A novel metric learning framework by exploiting global and local information.
Authors: Ren, Qiangqiang1 (AUTHOR), Yuan, Chao1 (AUTHOR), Zhao, Yifeng1 (AUTHOR), Yang, Liming1,2 (AUTHOR) cauyanglm@163.com
Source: Neurocomputing. Oct 2022, Vol. 507, p84-96. 13p.
Subject Terms: *BOOSTING algorithms, *K-nearest neighbor classification, *DISTANCE education, *PRIOR learning, *DATA structures
Abstract: Distance metric learning plays a significant role in improving the generalization of algorithms that rely on distance metrics. In this paper, we first propose a generalized Mahalanobis metric learning framework (called GLML) that exploits prior knowledge from the samples, including both global and local information; its main goal is to simultaneously enlarge the distance between dissimilar sample pairs and shrink the distance between similar sample pairs. Specifically, the metric learning problem is guided by a discriminative regularization that incorporates pair-wise and class-wise information. Moreover, an effective alternating iterative algorithm is developed to optimize the proposed GLML, and the convergence of the algorithm is demonstrated theoretically. We then propose a boosting variant of GLML (BGLML), in which low-rank basis learning is jointly optimized with the metric to better uncover the data structure and mitigate the computational burden. Numerical experiments are then carried out on binary and multi-class classification problems. Under different assessment criteria, experimental results on datasets of different scales confirm that the proposed methods either improve generalization or achieve results comparable with state-of-the-art algorithms. [ABSTRACT FROM AUTHOR]
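The abstract's core object is the Mahalanobis metric, d_M²(x, y) = (x − y)ᵀ M (x − y) for a symmetric positive semi-definite matrix M; learning M reweights feature directions so that dissimilar pairs move apart and similar pairs move closer. The paper's GLML optimization itself is not reproduced here; the following is only a minimal sketch of how a (hypothetical, hand-chosen) learned M changes pairwise distances:

```python
def mahalanobis_sq(x, y, M):
    """Squared Mahalanobis distance (x - y)^T M (x - y), where M is a
    symmetric positive semi-definite matrix given as a list of rows."""
    diff = [a - b for a, b in zip(x, y)]
    # Expand the quadratic form as a double sum over components.
    n = len(diff)
    return sum(diff[i] * M[i][j] * diff[j]
               for i in range(n) for j in range(n))

x, y = [1.0, 2.0], [4.0, 6.0]

# With M = I the metric reduces to the squared Euclidean distance.
identity = [[1.0, 0.0], [0.0, 1.0]]
d_euclid = mahalanobis_sq(x, y, identity)    # 9 + 16 = 25

# A learned M can stretch a discriminative direction, enlarging the
# distance along it (here the first axis, chosen purely for illustration).
stretched = [[4.0, 0.0], [0.0, 1.0]]
d_learned = mahalanobis_sq(x, y, stretched)  # 4*9 + 16 = 52
```

A metric learner such as GLML would choose M from data (subject to the pair-wise and class-wise constraints described above) rather than by hand as in this toy example.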
Database: Academic Search Index
Description
ISSN: 0925-2312
DOI: 10.1016/j.neucom.2022.08.003