Transformer-based approaches to Sentiment Detection

Bibliographic Details
Title: Transformer-based approaches to Sentiment Detection
Authors: Ojo, Olumide Ebenezer, Ta, Hoang Thang, Gelbukh, Alexander, Calvo, Hiram, Adebanji, Olaronke Oluwayemisi, Sidorov, Grigori
Publication Year: 2023
Collection: Computer Science
Subject Terms: Computer Science - Computation and Language
Description: The use of transfer learning methods is largely responsible for the present breakthrough in Natural Language Processing (NLP) tasks across multiple domains. To address the problem of sentiment detection, we examined the performance of four well-known state-of-the-art transformer models for text classification: Bidirectional Encoder Representations from Transformers (BERT), the Robustly Optimized BERT Pre-training Approach (RoBERTa), a distilled version of BERT (DistilBERT), and a large bidirectional neural network architecture (XLNet). The four models were compared on the task of detecting disaster in text. All of the models performed well, indicating that transformer-based models are suitable for this task. The RoBERTa model performed best on the test dataset, with a score of 82.6%, and is highly recommended for quality predictions. Furthermore, we found that the learning algorithms' performance was influenced by the pre-processing techniques, the nature of the words in the vocabulary, unbalanced labeling, and the model parameters.
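The following is a minimal sketch, not the authors' code, of how one of the compared models (RoBERTa) could be fine-tuned for the binary disaster-detection task described above, using the Hugging Face transformers library. The toy texts, labels, learning rate, and epoch count are illustrative assumptions, not values reported in the paper; the other three models can be swapped in by changing the checkpoint name.

```python
# Illustrative fine-tuning sketch for binary disaster detection (assumption:
# this mirrors the paper's setup only in outline, not in its actual details).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Swap in "bert-base-uncased", "distilbert-base-uncased", or "xlnet-base-cased"
# to reproduce the four-way comparison described in the abstract.
model_name = "roberta-base"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Hypothetical examples standing in for the disaster dataset (not the paper's data).
texts = [
    "Forest fire spreading near the highway, evacuate now!",
    "I love the smell of fresh coffee in the morning.",
]
labels = torch.tensor([1, 0])  # 1 = disaster, 0 = not disaster

enc = tokenizer(texts, padding=True, truncation=True, max_length=128, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)  # assumed hyperparameter

model.train()
for _ in range(3):  # a few illustrative epochs
    optimizer.zero_grad()
    out = model(**enc, labels=labels)  # cross-entropy loss computed internally
    out.loss.backward()
    optimizer.step()

model.eval()
with torch.no_grad():
    preds = model(**enc).logits.argmax(dim=-1)
print(preds.tolist())
```

In practice, the checkpoint name is the only change needed per model, which is what makes this kind of four-model comparison straightforward with a shared tokenization and training pipeline.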
Comment: This submission has been removed from arXiv because the submitter did not have the authority to grant the license at the time of submission
Document Type: Working Paper
Open Access: http://arxiv.org/abs/2303.07292
Accession Number: edsarx.2303.07292
Database: arXiv