Research Article

Leveraging Contextual Sentences for Text Classification by Using a Neural Attention Model

Table 5

Results of models when using additional features in different ways.

Models                                                         Accuracy     Time

Attended representation with length + TF-IDF (CNN)             72.7/83.1    379/310
Attended representation with probs (CNN)                       72.8/83.0    396/321
Attended representation with probs + length + TF-IDF (CNN)     73.0/83.4    412/335
Attended representation with length + TF-IDF (BLSTM)           73.7/85.6    1084/905
Attended representation with probs (BLSTM)                     73.6/85.7    1107/918
Attended representation with probs + length + TF-IDF (BLSTM)   73.9/85.9    1141/924
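The rows above compare ways of combining auxiliary features (sentence length, TF-IDF, class probabilities) with the attended representation before classification. As a minimal sketch of one plausible combination scheme, assuming simple feature concatenation ahead of the final classifier layer (the function name and dimensions here are illustrative, not from the paper):

```python
import numpy as np

def combine_features(attended_repr, length, tfidf, probs=None):
    """Concatenate an attended sentence representation with scalar
    auxiliary features (length, TF-IDF) and, optionally, a class
    probability vector. Hypothetical illustration of the idea only."""
    extras = [np.atleast_1d(length), np.atleast_1d(tfidf)]
    if probs is not None:
        extras.append(np.asarray(probs, dtype=float))
    return np.concatenate([np.asarray(attended_repr, dtype=float)] + extras)

# Illustrative call: a 4-dim attended vector plus length, TF-IDF,
# and a 3-class probability vector yields a 9-dim input feature.
vec = combine_features(np.ones(4), length=12, tfidf=0.37, probs=[0.2, 0.5, 0.3])
```

Under this sketch, the "probs + length + TF-IDF" rows would correspond to passing all three extras, which also explains their slightly higher runtime in the Time column.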