Research Article
[Retracted] N-GPETS: Neural Attention Graph-Based Pretrained Statistical Model for Extractive Text Summarization
Table 5
ROUGE F1 scores on the CNN/Daily Mail data set.
| Models | R-1 | R-2 | R-L |
| --- | --- | --- | --- |
| Lead-3 | 40.42 | 17.62 | 36.67 |
| Oracle | 55.61 | 32.84 | 51.88 |
| *Non-BERT graph models* | | | |
| NeuSum | 41.59 | 19.01 | 37.98 |
| HSG | 42.31 | 19.51 | 38.74 |
| HSG + tri-blocking | 42.95 | 19.76 | 39.23 |
| BanditSum | 41.50 | 18.70 | 37.60 |
| JECS | 41.70 | 18.50 | 37.90 |
| *BERT-based graph models* | | | |
| BERTSUM (sent) | 43.25 | 20.24 | 39.63 |
| HiBERT | 42.37 | 19.95 | 38.83 |
| DISCOBERT | 43.77 | 20.85 | 40.67 |
| Topic-GraphSum | 44.02 | 20.81 | 40.55 |
| DiscoCorrelation-GraphSum (EDU) | 43.61 | 20.81 | **41.12** |
| N-GPETS (proposed) | **44.15** | **20.86** | 40.97 |
Bold values mark the highest score for each metric among the models listed in the table (the Oracle upper bound is excluded from the comparison).
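For context, ROUGE F1 scores such as those reported above are commonly computed with Google's open-source `rouge-score` package. The snippet below is a minimal sketch of that computation, not the paper's actual evaluation pipeline; the reference and candidate strings are toy placeholders, and real evaluations would aggregate scores over the full CNN/Daily Mail test set.

```python
# Minimal illustration of computing ROUGE-1/2/L F1 scores with the
# `rouge-score` package (pip install rouge-score). The inputs here are
# toy strings standing in for a gold summary and a system summary.
from rouge_score import rouge_scorer

# use_stemmer=True applies Porter stemming, a common setting when
# reporting ROUGE on CNN/Daily Mail.
scorer = rouge_scorer.RougeScorer(
    ["rouge1", "rouge2", "rougeL"], use_stemmer=True
)

reference = "the cat sat on the mat"           # gold (target) summary
candidate = "the cat was sitting on the mat"   # system (predicted) summary

# score(target, prediction) returns a dict of Score tuples with
# precision, recall, and fmeasure fields; Table 5 reports the F1 values.
scores = scorer.score(reference, candidate)
for name, result in scores.items():
    print(f"{name}: F1 = {result.fmeasure:.4f}")
```

Corpus-level numbers like those in Table 5 would then be obtained by averaging the per-example F1 values across all test documents.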