On the Language-specificity of Multilingual BERT and the Impact of Fine-tuning

Bibliographic Details
Title: On the Language-specificity of Multilingual BERT and the Impact of Fine-tuning
Authors: Tanti, M, van der Plas, L, Borg, C, Gatt, A
Contributors: Sub Natural Language Processing, Natural Language Processing, Bastings, Jasmijn, Belinkov, Yonatan, Dupoux, Emmanuel, Giulianelli, Mario, Hupkes, Dieuwke, Pinter, Yuval, Sajjad, Hassan
Publication Year: 2021
Subject Terms: multilinguality, transfer learning, natural language inference
Description: Recent work has shown evidence that the knowledge acquired by multilingual BERT (mBERT) has two components: a language-specific one and a language-neutral one. This paper analyses the relationship between them in the context of fine-tuning on two tasks – POS tagging and natural language inference – which require the model to bring to bear different degrees of language-specific knowledge. Visualisations reveal that mBERT loses the ability to cluster representations by language after fine-tuning, a result that is supported by evidence from language identification experiments. However, further experiments on ‘unlearning’ language-specific representations using gradient reversal and iterative adversarial learning are shown not to add further improvement to the language-independent component over and above the effect of fine-tuning. The results presented here suggest that the process of fine-tuning causes a reorganisation of the model’s limited representational capacity, enhancing language-independent representations at the expense of language-specific ones.
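The abstract mentions gradient reversal as one way of ‘unlearning’ language-specific representations. The sketch below is a minimal, hedged illustration of that general technique (a gradient reversal layer feeding a language classifier), not the authors' actual implementation; the class names, `hidden_size`, `num_languages`, and `lambd` are assumptions introduced purely for illustration.

```python
import torch
from torch import nn
from torch.autograd import Function


class GradReverse(Function):
    """Identity in the forward pass; multiplies the gradient by -lambda in the backward pass."""

    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # Reverse (and scale) the gradient flowing back into the encoder,
        # pushing it towards language-indistinguishable representations.
        return grad_output.neg() * ctx.lambd, None


class LanguageAdversary(nn.Module):
    """Illustrative language classifier trained on top of reversed gradients.

    The classifier itself learns to predict the language of an encoder
    representation, while the reversed gradient discourages the encoder
    from encoding language identity.
    """

    def __init__(self, hidden_size: int, num_languages: int, lambd: float = 1.0):
        super().__init__()
        self.lambd = lambd
        self.classifier = nn.Linear(hidden_size, num_languages)

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        reversed_states = GradReverse.apply(hidden_states, self.lambd)
        return self.classifier(reversed_states)
```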
Document Type: book part
File Description: application/pdf
Language: English
Relation: https://dspace.library.uu.nl/handle/1874/416478
Availability: https://dspace.library.uu.nl/handle/1874/416478
Rights: info:eu-repo/semantics/OpenAccess
Accession Number: edsbas.67D1AC27
Database: BASE