Research Article

Deep Learning Based Abstractive Text Summarization: Approaches, Datasets, Evaluation Measures, and Challenges

Table 1

Encoder and decoder components.

Reference | Year | Encoder | Decoder
[18] | 2015 | Bag-of-words, convolutional, and attention-based |
[29] | 2015 | RNN with LSTM units and attention | RNN with LSTM units and attention
[39] | 2016 | RNN-LSTM | Word-based decoder RNN
[50] | 2016 | GRU + QRNN + attention | GRU + QRNN
[38] | 2016 | Unidirectional RNN attentive encoder-decoder (LSTM) | Unidirectional RNN attentive encoder-decoder (LSTM)
 | | Bidirectional LSTM | Unidirectional LSTM
 | | Bidirectional LSTM | Decoder with global attention
[51] | 2016 | LSTM-RNN | LSTM-RNN
[55] | 2016 | Two bidirectional GRU-RNNs | Unidirectional GRU-RNN
[52] | 2017 | Bidirectional GRU | Unidirectional GRU
[53] | 2017 | Bidirectional GRU | Unidirectional GRU
[56] | 2017 | Single-layer bidirectional LSTM + attention | Single-layer unidirectional LSTM
[57] | 2017 | Bidirectional LSTM-RNN + intra-attention | Single LSTM decoder + intra-attention
[58] | 2018 | Bidirectional LSTM | Unidirectional LSTM
[30] | 2018 | Bidirectional LSTM | Unidirectional LSTM
[35] | 2018 | Bidirectional LSTM | Bidirectional LSTM
[59] | 2018 | Bidirectional LSTM | Unidirectional LSTM
[60] | 2018 | Bidirectional LSTM | 3-layer unidirectional LSTM
[61] | 2018 | Bidirectional GRU | Unidirectional GRU
[62] | 2018 | Bidirectional LSTM | Two-decoder unidirectional LSTM
[63] | 2019 | Bidirectional GRU | Unidirectional GRU
[64] | 2019 | Unidirectional GRU | Unidirectional GRU
[49] | 2020 | Bidirectional LSTM | Unidirectional LSTM
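The dominant pattern in the table is a bidirectional recurrent encoder whose final states form a context vector for a unidirectional recurrent decoder. The following is a minimal NumPy sketch of that pattern with a GRU cell; the weights are random and untrained, and all names (`GRUCell`, `encode_bidirectional`, the toy dimensions) are illustrative, not any cited paper's implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """Minimal GRU cell (illustrative; weights are random, untrained)."""
    def __init__(self, input_size, hidden_size, rng):
        shape = (hidden_size, input_size + hidden_size)
        self.Wz = rng.normal(0, 0.1, shape)  # update-gate weights
        self.Wr = rng.normal(0, 0.1, shape)  # reset-gate weights
        self.Wh = rng.normal(0, 0.1, shape)  # candidate-state weights

    def step(self, x, h):
        xh = np.concatenate([x, h])
        z = sigmoid(self.Wz @ xh)                             # update gate
        r = sigmoid(self.Wr @ xh)                             # reset gate
        h_tilde = np.tanh(self.Wh @ np.concatenate([x, r * h]))
        return (1 - z) * h + z * h_tilde

def encode_bidirectional(cell_fwd, cell_bwd, xs, hidden_size):
    """Run the input forwards and backwards; concatenate the final states."""
    h_fwd = np.zeros(hidden_size)
    for x in xs:
        h_fwd = cell_fwd.step(x, h_fwd)
    h_bwd = np.zeros(hidden_size)
    for x in reversed(xs):
        h_bwd = cell_bwd.step(x, h_bwd)
    return np.concatenate([h_fwd, h_bwd])  # context vector, size 2*hidden

rng = np.random.default_rng(0)
emb, hid, steps = 8, 16, 5
xs = [rng.normal(size=emb) for _ in range(steps)]  # toy "word embeddings"

enc_fwd = GRUCell(emb, hid, rng)
enc_bwd = GRUCell(emb, hid, rng)
context = encode_bidirectional(enc_fwd, enc_bwd, xs, hid)

# Unidirectional decoder: initialized from the context, steps autoregressively.
dec = GRUCell(emb, 2 * hid, rng)
h = context
for _ in range(3):
    y = rng.normal(size=emb)  # stand-in for the previous output's embedding
    h = dec.step(y, h)

print(context.shape, h.shape)  # (32,) (32,)
```

Note the dimension bookkeeping this pattern forces: because the encoder concatenates forward and backward states, the unidirectional decoder's hidden size is twice the encoder's, which is why several of the cited systems instead project the context down before decoding.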