Extractive Text Summarization: An Application Based Study

Year: 2024 | Volume: 11 | Issue: 02 | Page: –
By

Deepanshu Anand

Yugansh Gupta

Arnav Sabharwal

Vinay Kumar Saini

Anshu Khurana

  1. Student, Department of Information Technology, Maharaja Agrasen Institute of Technology, Delhi, India
  2. Student, Department of Information Technology, Maharaja Agrasen Institute of Technology, Delhi, India
  3. Student, Department of Information Technology, Maharaja Agrasen Institute of Technology, Delhi, India
  4. Associate Professor, Department of Information Technology, Maharaja Agrasen Institute of Technology, Delhi, India
  5. Assistant Professor, Department of Information Technology, Maharaja Agrasen Institute of Technology, Delhi, India

Abstract

Text summarization is an essential tool for extracting important information from lengthy texts or documents. It has two main methodologies: extractive summarization and abstractive summarization. This study concentrates on extractive summarization, which builds a summary by selecting significant sentences directly from the source material. Because the selected sentences are reproduced verbatim, extractive summaries tend to be more faithful to the source content, making the approach a popular choice for many practical applications. Abstractive summarization, by contrast, generates summaries using words that need not appear in the original text; this becomes a disadvantage wherever the original wording must be preserved, which is precisely where extractive summarization is needed. Extractive summarization does have drawbacks of its own, including the possibility of repetition and its dependence on pre-existing content. This study investigates how extractive summarization can be used to build end-to-end applications that are broadly applicable across domains such as legal document analysis. By addressing these constraints and leveraging advances in natural language processing, extractive summarization can provide beneficial outcomes for a wide range of applications.
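To illustrate the extractive approach discussed above, the following sketch (not from the paper itself; a minimal frequency-based scorer in the spirit of Luhn's classic method, with all names chosen here for illustration) selects the highest-scoring sentences verbatim from the source text:

```python
import re
from collections import Counter

def extractive_summary(text, num_sentences=2):
    """Score each sentence by the average frequency of its words in the
    document and return the top-scoring sentences in original order."""
    # Split into sentences at ., !, or ? followed by whitespace.
    sentences = re.split(r'(?<=[.!?])\s+', text.strip())
    # Document-wide word frequencies (lowercased, letters only).
    freq = Counter(re.findall(r'[a-z]+', text.lower()))

    def score(sentence):
        tokens = re.findall(r'[a-z]+', sentence.lower())
        # Normalize by length so long sentences are not automatically favored.
        return sum(freq[t] for t in tokens) / max(len(tokens), 1)

    ranked = sorted(range(len(sentences)),
                    key=lambda i: score(sentences[i]), reverse=True)
    # Re-sort the chosen indices so the summary keeps the original order.
    chosen = sorted(ranked[:num_sentences])
    return ' '.join(sentences[i] for i in chosen)
```

Sentences whose words recur throughout the document score highest, so the summary retains the source's own wording, which is exactly the fidelity property that motivates extractive over abstractive methods.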

Keywords: Text Summarization, Supervised Learning, Unsupervised Learning, Semantics, Extractive Summary

[This article belongs to Journal of Artificial Intelligence Research & Advances (joaira)]

How to cite this article: Deepanshu Anand, Yugansh Gupta, Arnav Sabharwal, Vinay Kumar Saini, Anshu Khurana. Extractive Text Summarization: An Application Based Study. Journal of Artificial Intelligence Research & Advances. 2024; 11(02):–. Available from: https://journals.stmjournals.com/joaira/article=2024/view=0

References

  1. El-Kassas, W. S., Salama, C. R., Rafea, A. A., & Mohamed, H. K. (2021). Automatic text summarization: A comprehensive survey. Expert Systems with Applications, 165, 113679.
  2. Sethi, P., Sonawane, S., Khanwalker, S., & Keskar, R. B. (2017, December). Automatic text summarization of news articles. In 2017 International Conference on Big Data, IoT and Data Science (BID) (pp. 23–29). IEEE.
  3. Kanapala, A., Pal, S., & Pamula, R. (2019). Text summarization from legal documents: A survey. Artificial Intelligence Review, 51, 371–402.
  4. Moratanch, N., & Chitrakala, S. (2017, January). A survey on extractive text summarization. In 2017 International Conference on Computer, Communication and Signal Processing (ICCCSP) (pp. 1–6). IEEE.
  5. Moratanch, N., & Chitrakala, S. (2016, March). A survey on abstractive text summarization. In 2016 International Conference on Circuit, Power and Computing Technologies (ICCPCT) (pp. 1–7). IEEE.
  6. Luhn, H. P. (1958). The automatic creation of literature abstracts. IBM Journal of Research and Development, 2(2), 159–165. doi: 10.1147/rd.22.0159.
  7. Erkan, G., & Radev, D. R. (2004). LexRank: Graph-based lexical centrality as salience in text summarization. Journal of Artificial Intelligence Research, 22, 457–479.
  8. Mihalcea, R., & Tarau, P. (2004, July). TextRank: Bringing order into text. In Proceedings of the 2004 Conference on Empirical Methods in Natural Language Processing (pp. 404–411).
  9. Nallapati, R., Zhai, F., & Zhou, B. (2017, February). SummaRuNNer: A recurrent neural network-based sequence model for extractive summarization of documents. In Proceedings of the AAAI Conference on Artificial Intelligence (Vol. 31, No. 1).
  10. Cheng, J., & Lapata, M. (2016). Neural summarization by extracting sentences and words. arXiv preprint arXiv:1603.07252.
  11. Liu, Y. (2019). Fine-tune BERT for extractive summarization. arXiv preprint arXiv:1903.10318.
  12. Lin, C. Y. (2004, July). ROUGE: A package for automatic evaluation of summaries. In Text Summarization Branches Out (pp. 74–81).

Regular Issue | Subscription | Review Article
Received May 24, 2024
Accepted June 6, 2024
Published July 10, 2024
