Comparing Abstractive and Extractive Approach for Text Summarization

AUTHORS

Pinak Divecha, Department of Computer Science, Lakehead University, Canada
Jinan Fiaidhi, Department of Computer Science, Lakehead University, Canada

ABSTRACT

Text summarization attracts researchers because of its many practical uses. In this paper we investigate and implement both abstractive and extractive approaches to text summarization, applying them not only to simple text but also to general text and documents. Depending on the goal, the paper provides both a supervised and an unsupervised summarization technique. Abstractive summarization is based on supervised learning: the text is interpreted and examined using advanced natural language processing techniques, and a new, shorter text is generated that contains the most significant and helpful content of the original. Such summaries are more complex and perform similarly to summaries written by humans. Extractive summarization, by contrast, is unsupervised and selects the most important sentences directly from the source text. We implement both approaches with their real-world applications in mind, along with an evaluation process for the resulting summaries: each machine-generated summary is compared against a genuine human-written summary, and its quality is judged by the summary's significant terms and its length. Overall, this work can be beneficial to readers in a variety of situations.
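The evaluation described above — judging a machine summary against a human reference by its significant terms and its length — can be sketched in a few lines. This is an illustrative approximation, not the paper's exact procedure; the stopword list, function names, and the two example summaries are assumptions for demonstration.

```python
# Hypothetical sketch of an overlap-and-length evaluation: score a machine
# summary against a human reference by the fraction of significant
# (non-stopword) reference terms it recovers, plus a length ratio.
# The stopword list and example texts are illustrative, not from the paper.

STOPWORDS = {"the", "a", "an", "is", "are", "of", "to", "and", "in", "on"}

def significant_terms(text):
    """Lowercased non-stopword tokens of a summary."""
    return {w for w in text.lower().split() if w not in STOPWORDS}

def evaluate_summary(machine, human):
    """Return (term recall, length ratio) of a machine summary
    measured against a human-written reference summary."""
    ref = significant_terms(human)
    hyp = significant_terms(machine)
    recall = len(ref & hyp) / len(ref) if ref else 0.0
    length_ratio = len(machine.split()) / max(len(human.split()), 1)
    return recall, length_ratio

recall, ratio = evaluate_summary(
    "stocks rose sharply on friday",
    "the stocks rose sharply in trading on friday",
)
# Here the machine summary recovers 4 of the 5 significant reference
# terms (recall 0.8) at 5/8 of the reference length (ratio 0.625).
```

In practice this is close in spirit to unigram ROUGE recall; a fuller evaluation would also use stemming and n-gram overlap.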


KEYWORDS

Text summarization, Abstractive, Extractive, Recurrent neural network

REFERENCES

[1] D. R. Radev, E. Hovy, & K. McKeown, “Introduction to the special issue on summarization,” Computational linguistics, vol.28, no.4, pp.399-408, (2002)
[2] D. Suleiman and A. Awajan, “Deep learning based abstractive text summarization: Approaches, datasets, evaluation measures, and challenges,” Mathematical Problems in Engineering, (2020) DOI: 10.1155/2020/9365340
[3] A. Pai, “Comprehensive guide to text summarization using deep learning in Python,” June 10, 2019, analyticsvidhya.com
[4] R. Mihalcea and P. Tarau, “TextRank: Bringing order into texts,” EMNLP, (2004)
[5] M. V. Balipa, H. Jathanna, C. Ramasamy, and Balasubramani, “Text summarization for psoriasis of text extracted from online health forums using text rank algorithm,” International Journal of Engineering and Technology, (2018)
[6] S. G. Tanwi, V. Kumar, Y. S. Jain, and B. Avinash, “Automatic text summarization using text rank,” International Research Journal of Engineering and Technology (IRJET), (2018)
[7] J. Pennington, R. Socher, and C. Manning, “GloVe: Global vectors for word representation,” EMNLP, pp.1532-1543, (2014) DOI: 10.3115/v1/D14-1162
[8] P. Srivastava, “Essentials of deep learning: Introduction to long short term memory,” (2017) analyticsvidhya.com
[9] S. Yan, “Understanding LSTM and its diagrams,” (2017) medium.com
[10] F. Shaikh, “Essentials of deep learning: Sequence to sequence modeling with attention (using Python),” (2018) analyticsvidhya.com
[11] A. Pai, “Comprehensive guide to text summarization using deep learning in python,” (2019) analyticsvidhya.com
[12] https://www.kaggle.com/datasets/sunnysai12345/news-summary?select=news_summary_more.csv
[13] https://www.kaggle.com/datasets/snap/amazon-fine-food-reviews

CITATION

  • APA:
    Divecha, P., & Fiaidhi, J. (2023). Comparing Abstractive and Extractive Approach for Text Summarization. International Journal of Hybrid Information Technology, 3(1), 29-40. 10.21742/ijhit.2653-309X.2023.3.1.03
  • Harvard:
    Divecha, P. and Fiaidhi, J. (2023). "Comparing Abstractive and Extractive Approach for Text Summarization". International Journal of Hybrid Information Technology, 3(1), pp.29-40. doi:10.21742/ijhit.2653-309X.2023.3.1.03
  • IEEE:
    [1] P. Divecha and J. Fiaidhi, "Comparing Abstractive and Extractive Approach for Text Summarization". International Journal of Hybrid Information Technology, vol.3, no.1, pp.29-40, Aug. 2023
  • MLA:
    Divecha, Pinak, and Jinan Fiaidhi. "Comparing Abstractive and Extractive Approach for Text Summarization". International Journal of Hybrid Information Technology, vol.3, no.1, Aug. 2023, pp.29-40, doi:10.21742/ijhit.2653-309X.2023.3.1.03

ISSUE INFO

  • Volume 3, No. 1, 2023
  • ISSN(p):1738-9968
  • ISSN(e):2652-2233
  • Published:Aug. 2023
