Research Article
Multi-Self-Attention for Aspect Category Detection and Biomedical Multilabel Text Classification with BERT
Table 4
The test set results for the Chinese EMRC task.
| Model | Precision | Recall | F1 |
| --- | --- | --- | --- |
| ML-KNN | 71.85 | 84.09 | 76.94 |
| Rank SVM | 81.62 | 84.83 | 82.68 |
| TextCNN | 95.20 | 92.60 | 93.00 |
| BiLSTM-Attention | 92.55 | 91.95 | 89.22 |
| Attention-XML | 92.64 | 92.55 | 89.78 |
| BERT base | 95.55 | 93.33 | 93.41 |
| BERT Att | 94.97 | 95.28 | 93.48 |
| BERT-MSA | 96.75 | 94.57 | 93.94 |