Baichuan2-Sum: Instruction Finetune Baichuan2-7B Model for Dialogue Summarization

Bibliographic Details
Title: Baichuan2-Sum: Instruction Finetune Baichuan2-7B Model for Dialogue Summarization
Authors: Xiao, Jianfei; Chen, Yancan; Ou, Yimin; Yu, Hanyi; Shu, Kai; Xiao, Yiyong
Publication Year: 2024
Collection: Computer Science
Subject Terms: Computer Science - Computation and Language, Computer Science - Artificial Intelligence, Computer Science - Machine Learning
Description: Large language models (LLMs) such as Llama, Baichuan, and Bloom show remarkable ability with instruction fine-tuning on many natural language tasks. Nevertheless, for the dialogue summarization task, which aims to generate summaries for different roles in a dialogue, most state-of-the-art methods are built on small models (e.g., BART and BERT). Existing methods add task-specific optimizations to small models, such as incorporating a global-local centrality score. In this paper, we propose an instruction fine-tuned model, Baichuan2-Sum, for role-oriented dialogue summarization. By setting different instructions for different roles, the model can learn from the dialogue interactions and output the expected summaries. Furthermore, we apply the NEFTune technique to add suitable noise during training to improve the results. The experiments demonstrate that the proposed model achieves new state-of-the-art results on two public dialogue summarization datasets: CSDS and SAMSum. We release our model and related code to facilitate future studies on the dialogue summarization task.
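The NEFTune technique mentioned in the description can be sketched as follows. This is a minimal illustration assuming the standard formulation (uniform noise added to token embeddings during training, with magnitude scaled by alpha divided by the square root of sequence length times embedding dimension); the function name and the alpha value are illustrative, not from the paper.

```python
import torch

def neftune_noise(embeddings: torch.Tensor, alpha: float = 5.0) -> torch.Tensor:
    """Add NEFTune-style uniform noise to token embeddings.

    embeddings: tensor of shape (batch, seq_len, dim).
    alpha: noise scale hyperparameter (illustrative default).
    """
    seq_len, dim = embeddings.shape[1], embeddings.shape[2]
    # Noise magnitude: alpha / sqrt(L * d), per the standard NEFTune scaling.
    mag = alpha / (seq_len * dim) ** 0.5
    noise = torch.empty_like(embeddings).uniform_(-mag, mag)
    return embeddings + noise
```

In practice this noise is applied only during fine-tuning (not at inference), typically by hooking the model's embedding layer.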
Document Type: Working Paper
Open Access: http://arxiv.org/abs/2401.15496
Accession Number: edsarx.2401.15496
Database: arXiv