Hierarchical Lifelong Learning by Sharing Representations and Integrating Hypothesis
Title: | Hierarchical Lifelong Learning by Sharing Representations and Integrating Hypothesis |
Authors: | Chunmei Qing, Guoxi Su, Bolun Cai, Tong Zhang, Xiaofen Xing, Xiangmin Xu |
Source: | IEEE Transactions on Systems, Man, and Cybernetics: Systems, 51:1004-1014 |
Publication Details: | Institute of Electrical and Electronics Engineers (IEEE), 2021. |
Publication Year: | 2021 |
Subject Terms: | Boosting (machine learning), Artificial neural network, Basis (linear algebra), Computer science, Feature extraction, Lifelong learning, Machine learning, Task analysis, Artificial intelligence, Electrical and Electronic Engineering, Computer Science Applications, Human-Computer Interaction, Control and Systems Engineering, Layer (object-oriented design), Software |
Description: | In lifelong machine learning (LML) systems, consecutive new tasks arising from changing circumstances are learned and added to the system. However, classical supervised LML systems require sufficient labeled data to extract intertask relationships before transferring knowledge, and inadequate labels may degrade performance because of the poor initial approximation. To extend the typical LML system, we propose a novel hierarchical lifelong learning algorithm (HLLA) consisting of the following two layers: 1) a knowledge layer, composed of shared representations and an integrated knowledge basis, at the bottom and 2) parameterized hypothesis functions with features at the top. HLLA leverages unlabeled data to pretrain the shared representations. We also consider a selective inherited updating method to handle intertask distribution shift. Experiments show that our HLLA method outperforms many other recent LML algorithms, especially on problems with higher dimensionality, lower task correlation, and fewer labeled data. |
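The two-layer structure in the abstract (a shared knowledge basis at the bottom, per-task hypothesis parameters at the top) can be illustrated with a minimal sketch in the style of shared-basis lifelong learners. All names here (`L`, `s_t`, `phi`) are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

d, k = 8, 3                       # feature dimension, number of shared basis columns
L = rng.normal(size=(d, k))       # shared knowledge basis (bottom layer, learned across tasks)

def phi(x):
    """Shared representation; an identity placeholder standing in for
    features that could be pretrained on unlabeled data."""
    return x

def task_hypothesis(s_t):
    """Top layer: each task's parameters combine the shared basis columns."""
    return L @ s_t                # theta_t = L s_t

s_t = np.array([0.5, -1.0, 0.2])  # hypothetical task-specific coefficients
theta_t = task_hypothesis(s_t)

x = rng.normal(size=d)
prediction = theta_t @ phi(x)     # linear hypothesis evaluated on one input
```

Under this sketch, knowledge transfer amounts to reusing `L` across tasks while only the small coefficient vector `s_t` is fit per task.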
ISSN: | 2168-2232 2168-2216 |
Open Access: | https://explore.openaire.eu/search/publication?articleId=doi_________::403364231f258012447e96226aaab5cc https://doi.org/10.1109/tsmc.2018.2884996 |
Rights: | CLOSED |
Accession Number: | edsair.doi...........403364231f258012447e96226aaab5cc |
Database: | OpenAIRE |