Summarization of COVID-19 news documents deep learning-based using transformer architecture

Hayatin, Nur and Ghufron, Kharisma Muzaki and Wicaksono, Galih Wasis (2021) Summarization of COVID-19 news documents deep learning-based using transformer architecture. TELKOMNIKA (Telecommunication Computing Electronics and Control), 19 (3). pp. 754-761. ISSN 1693-6930

Full text: Hayatin Ghufron Wicaksono - COVID-19 Deep Learning News summarization Transformer architecture.pdf (778kB)
Similarity report: Similarity - Hayatin Ghufron Wicaksono - COVID-19 Deep Learning News summarization Transformer architecture.pdf (2MB)

Abstract

Facing the news on the internet about the spread of coronavirus disease 2019 (COVID-19) is challenging because extracting valuable information from the news takes a long time. Deep learning has had a significant impact on NLP research. However, the deep learning models used in several studies, especially in document summarization, still have shortcomings: the output for long texts is often incorrect, and the results are redundant or contain repeated characters, so the generated sentences are poorly organized and the recall obtained is low. This study aims to summarize COVID-19 news documents using a deep learning model. We propose the transformer as the base language model, with architectural modifications as the basis for designing a model that significantly improves document summarization results. We build a transformer-based architecture in which the encoder and decoder can be stacked several times, and we compare layer modifications based on their scores. In the resulting experiments, ROUGE-1 and ROUGE-2 show good performance for the proposed model, with scores of 0.58 and 0.42, respectively, and a training time of 11,438 seconds. The proposed model is thus effective at improving performance in abstractive document summarization.
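To make the described setup concrete, below is a minimal sketch (in PyTorch) of a stacked encoder-decoder transformer for summarization together with a simple ROUGE-N recall check. The class name SummaryTransformer, the helper rouge_n_recall, the vocabulary size, and the layer counts are illustrative assumptions for this sketch, not the authors' published configuration.

```python
# Minimal sketch of a transformer encoder-decoder summarizer and a ROUGE-N
# recall check. Hyperparameters and names here are assumptions, not the
# configuration reported in the paper.
import torch
import torch.nn as nn
from collections import Counter


class SummaryTransformer(nn.Module):
    """Encoder-decoder transformer mapping article tokens to summary tokens."""

    def __init__(self, vocab_size=30000, d_model=512, nhead=8,
                 num_encoder_layers=6, num_decoder_layers=6):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        # Encoder and decoder layers can be stacked (repeated) several times;
        # the paper compares such layer modifications by score.
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=num_encoder_layers,
            num_decoder_layers=num_decoder_layers,
            batch_first=True)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, src_ids, tgt_ids):
        # src_ids: (batch, src_len) article tokens; tgt_ids: (batch, tgt_len) summary tokens.
        src = self.embed(src_ids)
        tgt = self.embed(tgt_ids)
        # Causal mask so each summary position only attends to earlier positions.
        tgt_mask = self.transformer.generate_square_subsequent_mask(tgt_ids.size(1))
        hidden = self.transformer(src, tgt, tgt_mask=tgt_mask)
        return self.out(hidden)  # (batch, tgt_len, vocab_size) token logits


def rouge_n_recall(reference, candidate, n=1):
    """ROUGE-N recall: clipped overlapping n-grams / reference n-grams."""
    ref_tokens, cand_tokens = reference.split(), candidate.split()
    ref_ngrams = Counter(tuple(ref_tokens[i:i + n])
                         for i in range(len(ref_tokens) - n + 1))
    cand_ngrams = Counter(tuple(cand_tokens[i:i + n])
                          for i in range(len(cand_tokens) - n + 1))
    overlap = sum(min(c, cand_ngrams[g]) for g, c in ref_ngrams.items())
    return overlap / max(sum(ref_ngrams.values()), 1)


if __name__ == "__main__":
    model = SummaryTransformer()
    src = torch.randint(0, 30000, (2, 128))   # two toy articles of 128 tokens
    tgt = torch.randint(0, 30000, (2, 32))    # two toy summaries of 32 tokens
    print(model(src, tgt).shape)              # torch.Size([2, 32, 30000])
    print(rouge_n_recall("covid cases rise in jakarta", "covid cases rise", n=1))
```

In practice the logits would be trained with cross-entropy against reference summaries, and ROUGE-1/ROUGE-2 (n=1 and n=2 above) would be computed between generated and reference summaries, as in the evaluation the abstract reports.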

Item Type: Article
Keywords: COVID-19; Deep learning; News summarization; Transformer architecture.
Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Divisions: Faculty of Engineering > Department of Informatics (55201)
Depositing User: galih.w.w Galih Wasis Wicaksono, S.Kom
Date Deposited: 14 Mar 2024 09:19
Last Modified: 14 Mar 2024 09:19
URI: https://eprints.umm.ac.id/id/eprint/4770
