[Retracted] PLncWX: A Machine-Learning Algorithm for Plant lncRNA Identification Based on WOA-XGBoost
Table 4
Comparative performance of four feature selection methods, each combined with seven classifiers: KNN, GaussianNB, SVM, Decision Tree, Random Forest, AdaBoost, and XGBoost.
| Methods | Models | Accuracy | Precision | Recall | AUC | F1_score |
|---|---|---|---|---|---|---|
| FCD | KNN | 91.14 | 90.46 | 95.17 | 94.54 | 91.27 |
| | GaussianNB | 90.21 | 87.29 | **96.13** | 95.50 | 90.83 |
| | SVM | 91.21 | 88.99 | 94.82 | 96.17 | **91.56** |
| | Decision Tree | 91.37 | **91.34** | 91.70 | 91.37 | 91.34 |
| | Random Forest | **91.47** | 91.17 | 93.30 | 95.88 | 91.48 |
| | AdaBoost | **91.47** | **91.34** | 92.80 | 96.05 | 91.54 |
| | XGBoost | 91.33 | 91.27 | 92.58 | **96.38** | 91.35 |
| GWO | KNN | 91.12 | 89.98 | 92.93 | 94.97 | 91.28 |
| | GaussianNB | 87.73 | 84.38 | 93.37 | 94.52 | 88.48 |
| | SVM | 90.76 | 88.36 | **94.57** | 95.92 | 91.16 |
| | Decision Tree | 91.37 | 91.34 | 91.70 | 91.37 | 91.34 |
| | Random Forest | 91.75 | 91.06 | 92.87 | 96.35 | **91.95** |
| | AdaBoost | **91.89** | **91.71** | 92.48 | **96.51** | 91.90 |
| | XGBoost | 91.56 | 90.73 | 93.05 | 96.50 | 91.69 |
| WOA | KNN | 84.79 | **91.02** | 77.55 | 91.91 | 82.70 |
| | GaussianNB | 79.73 | 79.27 | 86.52 | 90.17 | 81.42 |
| | SVM | 86.78 | 87.02 | 87.53 | 94.35 | 86.78 |
| | Decision Tree | 79.69 | 82.75 | 76.07 | 79.69 | 78.26 |
| | Random Forest | 90.54 | 88.08 | **94.22** | 96.48 | 90.80 |
| | AdaBoost | 89.46 | 88.51 | 92.05 | 95.57 | 89.78 |
| | XGBoost | **91.55** | 90.46 | 93.33 | **96.78** | **91.68** |
| HHO | KNN | 83.58 | **91.79** | 74.35 | 91.43 | 80.70 |
| | GaussianNB | 74.87 | 78.36 | 81.58 | 88.04 | 76.98 |
| | SVM | 86.33 | 87.19 | 86.42 | 94.39 | 86.17 |
| | Decision Tree | 79.69 | 82.75 | 76.07 | 79.69 | 78.26 |
| | Random Forest | 89.71 | 87.35 | **94.23** | 96.05 | 90.69 |
| | AdaBoost | 89.23 | 88.59 | 91.31 | 95.52 | 89.47 |
| | XGBoost | **91.41** | 90.65 | 92.77 | **96.79** | **91.48** |
The bold values represent the maximum value in each column of evaluation indicators under each feature selection method.
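The five column metrics in Table 4 can be computed from first principles. Below is a minimal self-contained sketch that defines each metric exactly as reported (on a 0–100 scale), using hand-made toy labels and scores rather than the paper's data; the FCD/GWO/WOA/HHO feature-selection step and the classifiers themselves are not reimplemented here.

```python
# Compute Accuracy, Precision, Recall, AUC, and F1 on a 0-100 scale,
# matching the columns of Table 4. Toy labels/scores for illustration.

def confusion(y_true, y_pred):
    """Return (TP, FP, FN, TN) counts for binary labels."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    return tp, fp, fn, tn

def auc(y_true, scores):
    """ROC AUC as the probability that a random positive outranks a
    random negative (ties counted as half)."""
    pos = [s for t, s in zip(y_true, scores) if t == 1]
    neg = [s for t, s in zip(y_true, scores) if t == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: 1 = lncRNA (positive class), 0 = non-lncRNA.
y_true = [1, 1, 0, 1, 0, 0, 1, 0]
scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.55, 0.2]
y_pred = [int(s >= 0.5) for s in scores]  # threshold at 0.5

tp, fp, fn, tn = confusion(y_true, y_pred)
accuracy  = 100 * (tp + tn) / len(y_true)
precision = 100 * tp / (tp + fp)
recall    = 100 * tp / (tp + fn)
f1        = 2 * precision * recall / (precision + recall)
print(accuracy, precision, recall, f1, 100 * auc(y_true, scores))
```

The tabulated scores suggest the same protocol: hard predictions for Accuracy, Precision, Recall, and F1, and continuous class scores (e.g. predicted probabilities) for AUC.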