Research Article

Impact of Parameter Tuning for Optimizing Deep Neural Network Models for Predicting Software Faults

Table 11

Experiments performed on the PC2 dataset by tuning the parameters: learning rate = 1e−5; epochs = 100, 200, 300, 500, 1000, and 2000; number of layers = 5; and dropout = 0.2 and 0.5.

| Epochs | No. of layers | No. of neurons (dropout per layer)                 | Learning rate | Accuracy | Loss   |
|--------|---------------|----------------------------------------------------|---------------|----------|--------|
| 100    | 5             | 80 (0.2), 80, 80, 80, and 200                      | 1.00E−05      | 0.9623   | 0.5104 |
| 200    | 5             | 80 (0.2), 80, 80, 80, and 200                      | 1.00E−05      | 0.9623   | 0.4097 |
| 300    | 5             | 80 (0.2), 80 (0.2), 80, 80, and 200                | 1.00E−05      | 0.9623   | 0.3033 |
| 500    | 5             | 80 (0.5), 80 (0.5), 80 (0.5), 80 (0.5), and 200 (0.5) | 1.00E−05   | 0.9623   | 0.1055 |
| 1000   | 5             | 80 (0.5), 80 (0.5), 80 (0.5), 80 (0.5), and 200 (0.5) | 1.00E−05   | 0.9623   | 0.1112 |
| 2000   | 5             | 80 (0.5), 80 (0.5), 80 (0.5), 80 (0.5), and 200 (0.5) | 1.00E−05   | 0.9623   | 0.2213 |
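The configuration in Table 11 (five hidden layers of 80, 80, 80, 80, and 200 neurons, dropout of 0.2 or 0.5 on the indicated layers, learning rate 1e−5) can be sketched roughly as follows. This is a minimal Keras sketch under stated assumptions: the activation functions, optimizer, batch size, and output layer are not reported in the table and are assumed here, not taken from the original study.

```python
# Minimal sketch of the Table 11 setup (the epochs = 500 column, dropout = 0.5).
# ReLU activations, the Adam optimizer, and a sigmoid output are assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_model(n_features: int) -> tf.keras.Model:
    model = models.Sequential()
    model.add(layers.Input(shape=(n_features,)))
    # Hidden layers as listed in the "No. of neurons (dropout per layer)" column.
    for units in (80, 80, 80, 80, 200):
        model.add(layers.Dense(units, activation="relu"))  # activation assumed
        model.add(layers.Dropout(0.5))                      # dropout = 0.5 setting
    model.add(layers.Dense(1, activation="sigmoid"))        # binary faulty/non-faulty output
    model.compile(
        optimizer=tf.keras.optimizers.Adam(learning_rate=1e-5),  # Lr = 1.00E−05
        loss="binary_crossentropy",
        metrics=["accuracy"],
    )
    return model

# Example training call for the run with the lowest loss in Table 11 (epochs = 500):
# model = build_model(n_features=X_train.shape[1])
# history = model.fit(X_train, y_train, epochs=500,
#                     validation_data=(X_test, y_test))
```

Note that accuracy stays at 0.9623 across all six runs while the loss reaches its minimum (0.1055) at 500 epochs and rises again at 1000 and 2000 epochs, which is consistent with the loss, rather than accuracy, being the more informative signal for choosing the epoch count on this imbalanced dataset.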