Research Article
Fake News Detection Using Machine Learning Ensemble Methods
Table 3
Precision on the 4 datasets.
| Algorithm | DS1 | DS2 | DS3 | DS4 |
| --- | --- | --- | --- | --- |
| Logistic regression (LR) | 0.98 | 0.92 | 0.93 | 0.88 |
| Linear SVM (LSVM) | 0.98 | 0.31 | 0.54 | 0.88 |
| Multilayer perceptron | 0.97 | 0.32 | 0.93 | 0.92 |
| K-nearest neighbors (KNN) | 0.91 | 0.22 | 0.85 | 0.80 |
| *Ensemble learners* | | | | |
| Random forest (RF) | 0.99 | 0.30 | 0.98 | 0.92 |
| Voting classifier (RF, LR, KNN) | 0.96 | 0.88 | 0.92 | 0.86 |
| Voting classifier (LR, LSVM, CART) | 0.94 | 0.86 | 0.88 | 0.83 |
| Bagging classifier (decision trees) | 0.98 | 0.94 | 0.93 | 0.90 |
| Boosting classifier (AdaBoost) | 0.98 | 0.92 | 0.92 | 0.86 |
| Boosting classifier (XGBoost) | 0.99 | 0.94 | 0.96 | 0.92 |
| *Benchmark algorithms* | | | | |
| Perez-LSVM | 0.99 | 0.79 | 0.96 | 0.90 |
| Wang-CNN | 0.84 | 0.65 | 0.48 | 0.72 |
| Wang-Bi-LSTM | 0.92 | 0.43 | 0.50 | 0.65 |
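The ensemble learners listed in the table can be assembled directly in scikit-learn. The sketch below is illustrative only, not the paper's exact pipeline: it uses synthetic features as a stand-in for the TF-IDF vectors derived from the news datasets, and all hyperparameters are assumptions for demonstration.

```python
# Sketch of the ensemble configurations from Table 3 (assumed settings;
# synthetic data stands in for the paper's text features).
from sklearn.datasets import make_classification
from sklearn.ensemble import (
    AdaBoostClassifier,
    BaggingClassifier,
    RandomForestClassifier,
    VotingClassifier,
)
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import precision_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Synthetic binary-classification data (placeholder for TF-IDF features).
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Ensemble learners mirroring the Table 3 rows; BaggingClassifier uses
# decision trees as its default base estimator.
ensembles = {
    "Random forest (RF)": RandomForestClassifier(random_state=0),
    "Voting classifier (RF, LR, KNN)": VotingClassifier(
        estimators=[
            ("rf", RandomForestClassifier(random_state=0)),
            ("lr", LogisticRegression(max_iter=1000)),
            ("knn", KNeighborsClassifier()),
        ],
        voting="hard",
    ),
    "Bagging classifier (decision trees)": BaggingClassifier(random_state=0),
    "Boosting classifier (AdaBoost)": AdaBoostClassifier(random_state=0),
}

for name, model in ensembles.items():
    model.fit(X_tr, y_tr)
    prec = precision_score(y_te, model.predict(X_te))
    print(f"{name}: precision = {prec:.2f}")
```

The XGBoost row would use `xgboost.XGBClassifier` in place of a scikit-learn estimator; it is omitted here to keep the sketch dependent on scikit-learn alone.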