Research Article
Natural Language Processing Algorithms for Normalizing Expressions of Synonymous Symptoms in Traditional Chinese Medicine
Table 2
Model parameters and development set results on HFDS.
| Model | LR | DR | MC | Accuracy | Precision | Recall | F1-score |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Encoder (Char)-Decoder (Char) | 0.0003 | 0.5 | 512 | 0.8631 ± 0.0042 | 0.8637 ± 0.0091 | 0.8587 ± 0.0038 | 0.8611 ± 0.0053 |
| Encoder (Char)-Decoder (Label) | 0.0005 | 0.5 | 256 | 0.8688 ± 0.0046 | 0.8812 ± 0.0070 | 0.8623 ± 0.0044 | 0.8716 ± 0.0048 |
| Encoder (Word)-Decoder (Label) | 0.0005 | 0.3 | 512 | 0.8631 ± 0.0042 | 0.8637 ± 0.0091 | 0.8587 ± 0.0038 | 0.8611 ± 0.0053 |
| Encoder (Word)-Decoder (Word) | 0.0005 | 0.3 | 512 | 0.8549 ± 0.0055 | 0.8596 ± 0.0047 | 0.8468 ± 0.0065 | 0.8531 ± 0.0052 |
| Encoder (Char)-Classification | 0.005 | 0.3 | 512 | 0.8377 ± 0.0060 | 0.9020 ± 0.0109 | 0.8414 ± 0.0062 | 0.8706 ± 0.0058 |
| Encoder (Word)-Classification | 0.005 | 0.5 | 512 | 0.8326 ± 0.0061 | 0.8978 ± 0.0068 | 0.8335 ± 0.0056 | 0.8645 ± 0.0043 |
| BERT-UniLM (Char) | 0.00003 | 0.1 | N/A | 0.8966 ± 0.0027 | 0.9013 ± 0.0064 | 0.8920 ± 0.0041 | 0.8966 ± 0.0025 |
| BERT-UniLM (Label) | 0.00003 | 0.1 | N/A | 0.8957 ± 0.0042 | 0.8996 ± 0.0063 | 0.8895 ± 0.0038 | 0.8945 ± 0.0039 |
| BERT-Classification | 0.00003 | 0.1 | N/A | 0.9087 ± 0.0029 | 0.9216 ± 0.0027 | 0.9084 ± 0.0034 | 0.9150 ± 0.0018 |
Note. LR: learning rate; DR: dropout rate; MC: number of memory cells in the RNN; N/A: not applicable. Values are reported as mean ± standard deviation.
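As a sanity check on the reported metrics, the F1-score is the harmonic mean of precision and recall. A minimal sketch (the row values below are the mean precision and recall for BERT-Classification from Table 2; since the table reports means over repeated runs, the harmonic mean of the mean precision and mean recall only approximately reproduces the reported mean F1):

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# BERT-Classification row: precision 0.9216, recall 0.9084 -> reported F1 0.9150
f1 = f1_score(0.9216, 0.9084)
print(f"{f1:.4f}")
```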