Research Article

Multi-Self-Attention for Aspect Category Detection and Biomedical Multilabel Text Classification with BERT

Table 4

The test set results for the Chinese EMRC task.

Model              Precision (%)  Recall (%)  F1 (%)

ML-KNN             71.85          84.09       76.94
Rank SVM           81.62          84.83       82.68
TextCNN            95.20          92.60       93.00
BiLSTM-Attention   92.55          91.95       89.22
Attention-XML      92.64          92.55       89.78
BERT base          95.55          93.33       93.41
BERT Att           94.97          95.28       93.48
BERT-MSA           96.75          94.57       93.94
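For reference, the precision, recall, and F1 figures reported above are the standard multilabel classification metrics. The sketch below shows one common way such scores could be computed with scikit-learn; the micro-averaging scheme, the toy label matrices, and the use of precision_recall_fscore_support are illustrative assumptions, not details taken from the paper.

import numpy as np
from sklearn.metrics import precision_recall_fscore_support

# Toy multilabel ground truth and predictions (rows = documents, columns = labels).
# The shapes and values are made up purely for illustration.
y_true = np.array([[1, 0, 1],
                   [0, 1, 0],
                   [1, 1, 0]])
y_pred = np.array([[1, 0, 0],
                   [0, 1, 1],
                   [1, 1, 0]])

# Micro-averaging pools true/false positives over all labels before computing
# the scores; whether the paper uses micro or macro averaging is an assumption here.
precision, recall, f1, _ = precision_recall_fscore_support(
    y_true, y_pred, average="micro", zero_division=0
)
print(f"Precision: {precision:.2%}  Recall: {recall:.2%}  F1: {f1:.2%}")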