Computational Intelligence and Neuroscience, 2022, Research Article
[Retracted] N-GPETS: Neural Attention Graph-Based Pretrained Statistical Model for Extractive Text Summarization
Table 3. Non-BERT graph structures.
Author(s) | Model name | Non-BERT graph structure
Zhou et al., 2018 [11] | NeuSum | A seq2seq attention-based framework that jointly scores and selects sentences for extractive summarization.
Wang et al., 2020 [9] | HSG | A heterogeneous-graph model that builds a word–sentence document graph from word occurrence; word and sentence nodes are encoded with a CNN and a BiLSTM.
Xu et al., 2020 [13] | JECS | A selection-and-compression model that first selects sentences and then, to reduce redundancy, compresses them by pruning their dependency trees.
Crawford et al., 2018 [61] | BanditSum | Casts sentence selection as a contextual bandit problem; the selection policy is trained with policy-gradient methods.
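The word–sentence occurrence graph behind HSG-style models can be sketched as a bipartite adjacency: word nodes connect to the sentence nodes they occur in. The helper name and plain-Python structure below are illustrative assumptions; the full model additionally uses TF-IDF edge features and CNN/BiLSTM node encoders, which this sketch omits.

```python
from collections import defaultdict

def build_word_sentence_graph(sentences):
    """Bipartite occurrence graph: word nodes <-> sentence nodes.

    A minimal stand-in for a heterogeneous sentence-document graph
    (no edge weights or neural node encoders, which HSG itself uses).
    """
    word_to_sents = defaultdict(set)   # word -> indices of sentences containing it
    sent_to_words = []                 # sentence index -> set of its words
    for si, sent in enumerate(sentences):
        words = set(sent.lower().split())
        sent_to_words.append(words)
        for w in words:
            word_to_sents[w].add(si)
    return word_to_sents, sent_to_words

doc = ["The cat sat on the mat", "The dog chased the cat"]
w2s, s2w = build_word_sentence_graph(doc)
# "cat" links both sentences, so they are two hops apart in the graph.
```

Two sentences that share a word are connected through that word node, which is what lets sentence representations exchange information during message passing in the full model.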
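BanditSum's contextual-bandit framing can be illustrated with a minimal REINFORCE sketch: sample a few sentences under a stochastic policy, observe a summary-quality reward, and push the policy toward higher-reward selections. The bag-of-words linear policy, toy document, and unigram-recall reward here are assumptions for illustration only; the actual model encodes documents neurally and uses ROUGE as the reward.

```python
import math
import random

random.seed(0)

# Toy document and reference summary (a real system would reward with ROUGE).
DOC = [
    "the cat sat on the mat",
    "stocks fell sharply today",
    "the dog chased the cat",
    "it rained in the city",
]
REF = "the cat sat on the mat and the dog chased the cat"

VOCAB = sorted({w for s in DOC + [REF] for w in s.split()})
IDX = {w: i for i, w in enumerate(VOCAB)}

def phi(sent):
    """Bag-of-words feature vector for one sentence."""
    v = [0.0] * len(VOCAB)
    for w in sent.split():
        v[IDX[w]] += 1.0
    return v

FEATS = [phi(s) for s in DOC]
theta = [0.0] * len(VOCAB)  # linear policy weights

def scores():
    return [sum(t * f for t, f in zip(theta, fv)) for fv in FEATS]

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    z = sum(es)
    return [e / z for e in es]

def sample_k(p, k):
    """Sample k distinct sentence indices, renormalizing after each pick."""
    chosen = []
    for _ in range(k):
        rem = [i for i in range(len(p)) if i not in chosen]
        z = sum(p[i] for i in rem)
        r, c, pick = random.random() * z, 0.0, rem[-1]
        for i in rem:
            c += p[i]
            if r <= c:
                pick = i
                break
        chosen.append(pick)
    return chosen

def reward(chosen):
    """Unigram recall against the reference, a crude ROUGE stand-in."""
    ref = set(REF.split())
    sel = {w for i in chosen for w in DOC[i].split()}
    return len(ref & sel) / len(ref)

def train(steps=200, k=2, lr=0.5):
    for _ in range(steps):
        p = softmax(scores())
        chosen = sample_k(p, k)
        r = reward(chosen)
        # REINFORCE update (independent-softmax approximation of the
        # without-replacement sampling): grad log pi(i) = phi(i) - E_p[phi].
        exp_phi = [sum(p[j] * FEATS[j][d] for j in range(len(DOC)))
                   for d in range(len(VOCAB))]
        for i in chosen:
            for d in range(len(VOCAB)):
                theta[d] += lr * r * (FEATS[i][d] - exp_phi[d])

train()
best = sorted(range(len(DOC)), key=lambda i: scores()[i], reverse=True)[:2]
```

After training, the sentences overlapping the reference outscore the irrelevant one, so greedy top-k selection recovers a high-reward summary; the bandit view avoids needing per-sentence extraction labels.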