Volume 18, No. 6, 2021

Selective Multi-Head Attention for Abstractive Text Summarization Using Coverage and Pointer Generator


Pramod Kumar Amaravarapu, Akhil Khare

Abstract

Abstractive text summarization produces a concise version of a text without reusing phrases from the original, while retaining its context and key content. In this work, we present a sequence-to-sequence (seq2seq) architecture for summarizing lengthy texts. Previous abstractive summarization models generate summaries but suffer from duplicated content and semantic irrelevance. The primary reason is that the source text being summarized is long, typically spans multiple sentences, and contains a significant amount of repetitive information. To address these problems, we propose selective multi-head attention with coverage and pointer generation. The selective mechanism refines the encoded representation, making it more accurate. Repetition is controlled by a coverage mechanism within the multi-head attention, which tracks the tokens that have already been summarized. A pointer network is also integrated into the multi-head architecture to handle the out-of-vocabulary problem. Experiments are carried out on the standard CNN/Daily Mail (CNN/DM) dataset. The findings show that the proposed model outperforms baseline extractive and abstractive summarization models.
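For intuition, the sketch below illustrates one decoder step of coverage-augmented attention and the pointer-generator mixture described above, in the single-head case; the selective multi-head variant would apply the same computation per attention head. It follows the standard coverage and pointer-generator formulations (See et al., 2017) rather than this paper's exact equations, and all weight names, dimensions, and the simplifying assumption that every source token is in-vocabulary are illustrative only.

import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def coverage_attention_step(dec_state, enc_states, coverage, W_h, W_s, w_c, v):
    # Additive attention with a coverage term:
    #   e_i = v . tanh(W_h h_i + W_s s_t + w_c * c_i)
    # where c_i is the attention mass source token i has already received,
    # so previously attended tokens can be penalized (weights are learned).
    scores = np.array([v @ np.tanh(W_h @ h + W_s @ dec_state + w_c * c)
                       for h, c in zip(enc_states, coverage)])
    attn = softmax(scores)            # attention distribution over source tokens
    new_coverage = coverage + attn    # accumulate per-token attention mass
    context = attn @ enc_states       # context vector for this decoder step
    return attn, context, new_coverage

def pointer_generator_dist(p_gen, vocab_dist, attn, src_ids):
    # Final distribution mixes generating from the vocabulary with copying
    # from the source: P(w) = p_gen * P_vocab(w)
    #                         + (1 - p_gen) * sum over {i : src_i = w} of attn_i
    # (Assumes src_ids are in-vocabulary; a real pointer-generator extends the
    # vocabulary with source OOV tokens, which is how OOV words get copied.)
    final = p_gen * vocab_dist
    for pos, tok in enumerate(src_ids):
        final[tok] += (1.0 - p_gen) * attn[pos]
    return final

# Toy usage with random weights (all shapes are illustrative assumptions).
rng = np.random.default_rng(0)
T, d, a, V = 5, 8, 16, 50                 # source length, hidden, attn, vocab sizes
enc = rng.normal(size=(T, d))             # encoder states h_1..h_T
s = rng.normal(size=d)                    # decoder state s_t
W_h, W_s = rng.normal(size=(a, d)), rng.normal(size=(a, d))
w_c, v = rng.normal(size=a), rng.normal(size=a)

attn, ctx, cov = coverage_attention_step(s, enc, np.zeros(T), W_h, W_s, w_c, v)
final = pointer_generator_dist(0.7, softmax(rng.normal(size=V)), attn, [3, 7, 7, 12, 4])
print(final.sum())                        # ~1.0: the mixture is still a valid distribution

In the full model, a coverage loss additionally penalizes overlap between the current attention and the accumulated coverage vector, which is what discourages repeated summarization of the same source tokens.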


Pages: 1452-1463

Keywords: multi-head attention; coverage mechanism; pointer-generator; abstractive summarization
