Investigation of Pre-Trained Bidirectional Encoder Representations from Transformers Checkpoints for Indonesian Abstractive Text Summarization

Authors

  • Henry Lucky, Computer Science Department, BINUS Graduate Program, Master of Computer Science, Bina Nusantara University, Jakarta, Indonesia
  • Derwin Suhartono, Computer Science Department, School of Computer Science, Bina Nusantara University, Jakarta, Indonesia

DOI:

https://doi.org/10.32890/jict2022.21.1.4

Keywords:

Abstractive text summarization, BERT, Indonesian, Natural language processing, Transformer

Abstract

Text summarization aims to condense a text by removing less useful information so that the key information can be obtained quickly and precisely. Research on Indonesian abstractive text summarization has mostly focused on multi-document summarization, whose methods do not work optimally for single-document summarization. Since public summarization datasets and studies in English focus on single-document summarization, this study emphasized Indonesian single-document summarization. Abstractive text summarization studies in English frequently use Bidirectional Encoder Representations from Transformers (BERT), and since an Indonesian BERT checkpoint is available, it was employed in this study. This study investigated the use of Indonesian BERT for abstractive text summarization on the IndoSum dataset using the BERTSum model. The investigation proceeded with various combinations of model encoders, model embedding sizes, and model decoders. Evaluation results showed that models with a larger embedding size and a Generative Pre-Training (GPT)-like decoder improved the Recall-Oriented Understudy for Gisting Evaluation (ROUGE) scores and the BERTScore of the model outputs.
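The BERT-encoder plus GPT-like-decoder pairing mentioned in the abstract can be sketched with the Hugging Face transformers library. The snippet below is a minimal illustration under stated assumptions, not the authors' exact BERTSum setup: the checkpoint names (indobenchmark/indobert-base-p1 as an Indonesian BERT encoder and cahya/gpt2-small-indonesian-522M as an Indonesian GPT-2 decoder) are assumed publicly available checkpoints, and the composed model would still need fine-tuning on IndoSum before producing useful summaries.

    # Minimal sketch (not the paper's exact setup): pair an Indonesian BERT encoder
    # with a GPT-like decoder via transformers' EncoderDecoderModel.
    # Checkpoint names below are assumptions, used only for illustration.
    from transformers import AutoTokenizer, EncoderDecoderModel

    encoder_ckpt = "indobenchmark/indobert-base-p1"    # assumed Indonesian BERT checkpoint
    decoder_ckpt = "cahya/gpt2-small-indonesian-522M"  # assumed Indonesian GPT-2 checkpoint

    enc_tok = AutoTokenizer.from_pretrained(encoder_ckpt)
    dec_tok = AutoTokenizer.from_pretrained(decoder_ckpt)
    dec_tok.pad_token = dec_tok.eos_token  # GPT-2 tokenizers have no pad token by default

    # Cross-attention layers are added to the decoder and randomly initialized,
    # so the model must be fine-tuned (e.g. on IndoSum) before it summarizes well.
    model = EncoderDecoderModel.from_encoder_decoder_pretrained(encoder_ckpt, decoder_ckpt)
    model.config.decoder_start_token_id = dec_tok.bos_token_id
    model.config.pad_token_id = dec_tok.pad_token_id

    article = "Contoh artikel berita berbahasa Indonesia yang akan diringkas."
    inputs = enc_tok(article, return_tensors="pt", truncation=True, max_length=512)

    summary_ids = model.generate(inputs.input_ids, max_length=64, num_beams=4)
    print(dec_tok.decode(summary_ids[0], skip_special_tokens=True))

Generated summaries would then be compared against reference summaries with ROUGE and BERTScore, which are the evaluation measures reported in the paper.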

Published

11-11-2021

How to Cite

Lucky, H., & Suhartono, D. (2021). Investigation of Pre-Trained Bidirectional Encoder Representations from Transformers Checkpoints for Indonesian Abstractive Text Summarization. Journal of Information and Communication Technology, 21(1), 71–94. https://doi.org/10.32890/jict2022.21.1.4