Research Article
Leveraging Contextual Sentences for Text Classification by Using a Neural Attention Model
Table 5
Results of models when using additional features in different ways.
| Models | Accuracy | Time |
| --- | --- | --- |
| Attended representation with length + TF-IDF (CNN) | 72.7/83.1 | 379/310 |
| Attended representation with probs (CNN) | 72.8/83.0 | 396/321 |
| Attended representation with probs + length + TF-IDF (CNN) | 73.0/83.4 | 412/335 |
| Attended representation with length + TF-IDF (BLSTM) | 73.7/85.6 | 1084/905 |
| Attended representation with probs (BLSTM) | 73.6/85.7 | 1107/918 |
| Attended representation with probs + length + TF-IDF (BLSTM) | 73.9/85.9 | 1141/924 |
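The table compares ways of appending auxiliary features (sentence length, TF-IDF scores, class probabilities) to the attended representation before the final classifier. A minimal sketch of this concatenation step, assuming NumPy and hypothetical feature values (the function name and dimensions are illustrative, not the paper's actual implementation):

```python
import numpy as np

def combine_features(attended, length=None, tfidf=None, probs=None):
    """Concatenate an attended text representation with optional
    auxiliary features (length, TF-IDF, probabilities) into one
    vector to feed the final classification layer."""
    parts = [np.asarray(attended, dtype=float).ravel()]
    for feat in (length, tfidf, probs):
        if feat is not None:
            parts.append(np.atleast_1d(np.asarray(feat, dtype=float)))
    return np.concatenate(parts)

# Hypothetical example: a 4-dim attended vector plus a length
# scalar and two TF-IDF scores yields a 7-dim input vector.
vec = combine_features([0.1, 0.2, 0.3, 0.4], length=12, tfidf=[0.5, 0.7])
```

The "probs + length + TF-IDF" rows correspond to passing all three auxiliary arguments; the accuracy gains in the table suggest the features are complementary, at the cost of slightly longer training time.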