Research Article

[Retracted] N-GPETS: Neural Attention Graph-Based Pretrained Statistical Model for Extractive Text Summarization

Table 3

Non-BERT graph structures.

| Author(s) | Model name | Non-BERT graph structure |
|---|---|---|
| Zhou et al., 2018 [11] | NeuSum | A seq2seq attention-based framework that scores and selects sentences for the extractive summary. |
| Wang et al., 2020 [9] | HSG | A heterogeneous graph-based model that builds a word-sentence graph of the document from word occurrence; word and sentence nodes are encoded with a CNN and a BiLSTM. |
| Xu et al., 2020 [13] | JECS | A joint extraction-compression model that selects sentences and then reduces redundancy by compressing them through dependency-tree pruning. |
| Crawford et al., 2018 [61] | BanditSum | Treats sentence selection as a contextual bandit problem; the model policy is trained with policy gradient methods. |
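To make the graph construction in the HSG row concrete, below is a minimal sketch of how a word-sentence graph can be built from word occurrence, with edges weighted by TF-IDF as in heterogeneous-graph summarizers. The function name and whitespace tokenization are illustrative assumptions; the CNN/BiLSTM node encoders and the graph message passing of the actual model are omitted.

```python
# Sketch: build a bipartite word-sentence graph from word occurrence.
# Hypothetical helper; real systems use proper tokenization and learned
# node encoders (CNN for words, BiLSTM for sentences) on top of this graph.
import math
from collections import Counter

def build_word_sentence_graph(sentences):
    """Return word nodes, sentence node indices, and TF-IDF weighted edges.

    An edge connects a word node to every sentence node that contains it,
    which is how word occurrence links sentences in the graph.
    """
    tokenized = [s.lower().split() for s in sentences]
    word_nodes = sorted({w for toks in tokenized for w in toks})
    # Document frequency: number of sentences containing each word.
    df = Counter(w for toks in tokenized for w in set(toks))
    n = len(tokenized)
    edges = {}  # (word, sentence_index) -> TF-IDF weight
    for i, toks in enumerate(tokenized):
        tf = Counter(toks)
        for w, count in tf.items():
            idf = math.log(n / df[w])
            edges[(w, i)] = (count / len(toks)) * idf
    return word_nodes, list(range(n)), edges

if __name__ == "__main__":
    doc = ["the cat sat on the mat", "the dog chased the cat"]
    words, sents, edges = build_word_sentence_graph(doc)
    print(len(words), "word nodes,", len(sents), "sentence nodes,", len(edges), "edges")
```

Words that appear in every sentence receive zero weight under this scheme, so only discriminative words contribute signal when sentence representations are later aggregated over the graph.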