Report
Scientific and Creative Analogies in Pretrained Language Models
Title: | Scientific and Creative Analogies in Pretrained Language Models |
---|---|
Authors: | Czinczoll, Tamara; Yannakoudakis, Helen; Mishra, Pushkar; Shutova, Ekaterina |
Publication Year: | 2022 |
Collection: | Computer Science |
Subject Terms: | Computer Science - Computation and Language, Computer Science - Machine Learning |
Description: | This paper examines the encoding of analogy in large-scale pretrained language models, such as BERT and GPT-2. Existing analogy datasets typically focus on a limited set of analogical relations, with a high similarity between the two domains across which the analogy holds. As a more realistic setup, we introduce the Scientific and Creative Analogy dataset (SCAN), a novel analogy dataset containing systematic mappings of multiple attributes and relational structures across dissimilar domains. Using this dataset, we test the analogical reasoning capabilities of several widely used pretrained language models (LMs). We find that state-of-the-art LMs achieve low performance on these complex analogy tasks, highlighting the challenges still posed by analogy understanding. Comment: To be published in Findings of EMNLP 2022 |
Document Type: | Working Paper |
Open Access: | http://arxiv.org/abs/2211.15268 |
Accession Number: | edsarx.2211.15268 |
Database: | arXiv |