| Ref. | Models used | Data division | Best model | Statistical metrics |
| --- | --- | --- | --- | --- |
| [1] | Random forest, XGBoost, and deep learning | 70% for training; 30% for testing | XGBoost | R, MAE, RMSE |
| [43] | Random forest | 10-fold cross-validation | RF | R², RPE |
| [52] | SVM, ANN | 80% for training; 20% for testing | ANN | R, MSE |
| [53] | EEMD-GRNN, ANFIS, MLR, MLP | 90% for training; 10% for testing | ANFIS | R, R², MAE, RMSE |
| [54] | GTWR, LR, ANN, ANFIS, GRNN | Not reported | GRNN | R, MAE, RMSE |
| [55] | | 10-fold cross-validation | ANN | R, MAE, MAPE, RMSE, IA |
| [56] | EEMD-GRNN, GRNN, MLR, PCR, ARIMA | 90% for training; 10% for testing | EEMD-GRNN | MAE, MAPE, RMSE, IA |
| [57] | MLP | 70% for training; 15% for validation; 15% for testing | MLP | MAE, RMSE, IA |
| [58] | XGBoost, NELRM | 10-fold cross-validation | XGBoost | R², RMSE, MAPE |
| [59] | RF | 10-fold cross-validation | RF | R², RMSPE, MPE |
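The metrics that recur in the last column (MAE, RMSE, MAPE, and the Pearson correlation R) follow standard definitions. As a minimal numpy sketch (not code taken from any of the cited studies, and the function name `metrics` is our own), they can be computed as:

```python
import numpy as np

def metrics(y_true, y_pred):
    """Compute the error metrics that recur in the table above."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    err = y_pred - y_true
    mae = np.mean(np.abs(err))                  # mean absolute error
    rmse = np.sqrt(np.mean(err ** 2))           # root-mean-square error
    mape = np.mean(np.abs(err / y_true)) * 100  # mean absolute percentage error (%)
    r = np.corrcoef(y_true, y_pred)[0, 1]       # Pearson correlation R
    return {"MAE": mae, "RMSE": rmse, "MAPE": mape, "R": r}
```

R² (the coefficient of determination) is related but distinct; several rows report both R and R².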
|
|
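Several of the studies above ([43], [55], [58], [59]) use 10-fold cross-validation rather than a fixed train/test split. As an illustrative sketch of that data-division scheme (our own helper, not code from the cited work), each sample appears in the test fold exactly once:

```python
import numpy as np

def kfold_indices(n_samples, k=10, seed=0):
    """Yield (train_idx, test_idx) pairs for k-fold cross-validation.

    The data are shuffled once, split into k near-equal folds, and each
    fold serves as the test set exactly once while the rest train.
    """
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n_samples)
    folds = np.array_split(idx, k)
    for i in range(k):
        test_idx = folds[i]
        train_idx = np.concatenate([folds[j] for j in range(k) if j != i])
        yield train_idx, test_idx
```

Reported metrics under this scheme are typically averaged over the k test folds, which gives a less split-dependent estimate than a single percentage hold-out.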