Research Article

[Retracted] N-GPETS: Neural Attention Graph-Based Pretrained Statistical Model for Extractive Text Summarization

Table 5

ROUGE F1 scores on the CNN/Daily Mail dataset.

Models                               CNN/Daily Mail
                                     R-1      R-2      R-L

Lead-3                               40.42    17.62    36.67
Oracle                               55.61    32.84    51.88

Non-BERT graph models
NeuSum                               41.59    19.01    37.98
HSG                                  42.31    19.51    38.74
HSG + tri-blocking                   42.95    19.76    39.23
BanditSum                            41.50    18.70    37.60
JECS                                 41.70    18.50    37.90

BERT-based graph models
BERTSUM (sent)                       43.25    20.24    39.63
HiBERT                               42.37    19.95    38.83
DISCOBERT                            43.77    20.85    40.67
Topic-GraphSum                       44.02    20.81    40.55
DiscoCorrelation-GraphSum (EDU)      43.61    20.81    41.12
N-GPETS (proposed)                   44.15    20.86    40.97

Bold values indicate the model that achieves the highest score in the corresponding column among all models listed in the table.
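For readers unfamiliar with the metrics in the table, the following is a minimal self-contained sketch of how ROUGE-N and ROUGE-L F1 scores are computed between a candidate summary and a reference. It is an illustrative implementation of the standard definitions (n-gram overlap for ROUGE-N, longest common subsequence for ROUGE-L), not the exact scoring script used in the paper, which typically applies additional preprocessing such as stemming.

```python
from collections import Counter

def ngrams(tokens, n):
    """Multiset of n-grams in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def rouge_n_f1(candidate, reference, n):
    """ROUGE-N F1: n-gram overlap between candidate and reference."""
    cand, ref = ngrams(candidate.split(), n), ngrams(reference.split(), n)
    overlap = sum((cand & ref).values())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

def lcs_len(a, b):
    """Length of the longest common subsequence (dynamic programming)."""
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            dp[i + 1][j + 1] = dp[i][j] + 1 if x == y else max(dp[i][j + 1], dp[i + 1][j])
    return dp[-1][-1]

def rouge_l_f1(candidate, reference):
    """ROUGE-L F1: based on the longest common subsequence of tokens."""
    a, b = candidate.split(), reference.split()
    lcs = lcs_len(a, b)
    if lcs == 0:
        return 0.0
    precision, recall = lcs / len(a), lcs / len(b)
    return 2 * precision * recall / (precision + recall)
```

For example, comparing "the cat sat on the mat" against the reference "the cat was on the mat" gives a ROUGE-1 F1 of 5/6 (five of six unigrams overlap, counting multiplicity) and a ROUGE-2 F1 of 3/5.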