Tags: Abstractive text summarization, Attention mechanism, BLEU, RNN
Abstract:
In this study, the aim was to evaluate the feasibility of using an attention mechanism in automatic abstractive text summarization. The experiments compared RNN-based sequence-to-sequence architectures built from stacked LSTM layers. The attention mechanism helps the model keep its focus on the important parts of the input sequence, enabling it to perform well on longer sentences. The proposed model achieves a better BLEU score, which indicates good potential for further improvement in the field of abstractive summarization.
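The core idea of the attention mechanism described above can be sketched as follows. This is a minimal illustration, not the paper's exact implementation: the scoring function is not specified in the abstract, so simple dot-product scoring between the decoder state and each encoder hidden state is assumed here.

```python
import numpy as np

def attention_context(encoder_states, decoder_state):
    """Dot-product attention: score each encoder hidden state against the
    current decoder state, normalize with softmax, and return the weights
    plus the weighted sum (context vector) fed to the decoder."""
    scores = encoder_states @ decoder_state           # one score per time step, shape (T,)
    weights = np.exp(scores - scores.max())           # numerically stable softmax
    weights /= weights.sum()
    context = weights @ encoder_states                # weighted sum, shape (H,)
    return weights, context

# Toy example: 4 encoder time steps, hidden size 3 (random states for illustration)
rng = np.random.default_rng(0)
enc = rng.standard_normal((4, 3))
dec = rng.standard_normal(3)
w, ctx = attention_context(enc, dec)
```

Because the weights are a distribution over input time steps, the decoder can attend more strongly to the relevant source tokens at each output step, which is what keeps performance from degrading on longer sentences.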
Feasibility of Using Attention Mechanism in Abstractive Summarization