Research Article
Impact of Parameter Tuning for Optimizing Deep Neural Network Models for Predicting Software Faults
Table 7
Confusion matrix analysis for the KC1, KC3, PC1, and PC2 datasets (TPR: True Positive Rate, TNR: True Negative Rate, FPR: False Positive Rate, and FNR: False Negative Rate).
| Algorithm | KC1 TPR | KC1 TNR | KC1 FPR | KC1 FNR | KC3 TPR | KC3 TNR | KC3 FPR | KC3 FNR | PC1 TPR | PC1 TNR | PC1 FPR | PC1 FNR | PC2 TPR | PC2 TNR | PC2 FPR | PC2 FNR |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| RF | 0.330 | 0.960 | 0.040 | 0.670 | 0.140 | 0.970 | 0.030 | 0.860 | 0.290 | 0.980 | 0.015 | 0.700 | 0.000 | 1.000 | 0.000 | 1.000 |
| DT | 0.170 | 0.970 | 0.030 | 0.830 | 0.380 | 0.910 | 0.070 | 0.610 | 0.290 | 0.930 | 0.060 | 0.700 | 0.130 | 0.920 | 0.070 | 0.880 |
| NB | 0.380 | 0.900 | 0.070 | 0.620 | 0.380 | 0.900 | 0.120 | 0.610 | 0.290 | 0.930 | 0.060 | 0.700 | 0.130 | 0.920 | 0.070 | 0.880 |
| Without dropout DNN | 0.470 | 0.980 | 0.020 | 0.530 | 0.170 | 0.970 | 0.030 | 0.830 | 0.020 | 0.990 | 0.010 | 0.980 | 0.020 | 0.990 | 0.010 | 0.980 |
| With dropout DNN | 0.520 | 0.980 | 0.017 | 0.470 | 0.970 | 0.980 | 0.012 | 0.026 | 0.410 | 0.980 | 0.011 | 0.580 | 1.000 | 0.990 | 0.001 | 0.000 |
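As a point of reference for how the four rates in Table 7 are derived, the sketch below computes TPR, TNR, FPR, and FNR from raw confusion-matrix counts. The counts used in the example are illustrative only; they are not taken from the paper's experiments.

```python
# Illustrative sketch: deriving the rates reported in Table 7 from
# confusion-matrix counts. All counts here are made up for demonstration.

def confusion_rates(tp, fn, fp, tn):
    """Return (TPR, TNR, FPR, FNR) from confusion-matrix counts."""
    tpr = tp / (tp + fn)  # True Positive Rate: faulty modules correctly flagged
    tnr = tn / (tn + fp)  # True Negative Rate: clean modules correctly passed
    fpr = fp / (fp + tn)  # False Positive Rate = 1 - TNR
    fnr = fn / (fn + tp)  # False Negative Rate = 1 - TPR
    return tpr, tnr, fpr, fnr

# Hypothetical example: 52 of 100 faulty modules flagged, 48 missed,
# 17 false alarms among 1000 clean modules.
tpr, tnr, fpr, fnr = confusion_rates(tp=52, fn=48, fp=17, tn=983)
print(f"TPR={tpr:.3f} TNR={tnr:.3f} FPR={fpr:.3f} FNR={fnr:.3f}")
```

Note that FNR is the complement of TPR and FPR is the complement of TNR, which is a quick consistency check one can apply to each row of the table.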