Research Article

Two-Stage Bagging Pruning for Reducing the Ensemble Size and Improving the Classification Performance

Table 3

Comparison of classification performance on 28 datasets among the single classifier GNB, bagging with GNB as the base learner, and the proposed pruning methods AP, DP, “AP+DP”, and “DP+AP”. The values in parentheses give the optimized parameters (ta for AP; td for DP; ta, td for “AP+DP” and “DP+AP”) at which the corresponding pruning method achieved its best classification performance.

| Dataset | GNB | Bagging | AP | “AP+DP” | DP | “DP+AP” |
|---|---|---|---|---|---|---|
| Aba | 51.59 | 51.81 | 52.21 (9) | 52.55 (9, 1) | 51.90 (7) | 51.90 (2, 7) |
| Adult | 81.52 | 81.51 | 81.76 (9) | 81.93 (9, 1) | 81.52 (2) | 81.89 (9, 1) |
| Aus | 79.57 | 79.57 | 80.58 (8) | 80.58 (8, 10) | 79.57 (9) | 79.57 (9, 8) |
| Bcw | 93.56 | 93.56 | 93.56 (8) | 93.85 (9, 1) | 93.56 (4) | 93.71 (9, 1) |
| Bld | 54.78 | 54.78 | 61.16 (9) | 62.03 (9, 1) | 55.94 (5) | 61.45 (9, 1) |
| Cmc | 48.54 | 48.20 | 50.24 (9) | 50.51 (9, 1) | 48.74 (6) | 48.88 (8, 2) |
| Col | 36.68 | 36.68 | 38.86 (9) | 45.65 (9, 1) | 37.23 (2) | 66.30 (8, 1) |
| Cre | 80.43 | 80.87 | 81.88 (9) | 82.32 (9, 5) | 81.01 (2) | 82.32 (9, 6) |
| Der | 88.25 | 88.80 | 91.80 (9) | 95.36 (9, 1) | 89.07 (1) | 89.07 (5, 1) |
| Ger | 72.70 | 72.00 | 73.80 (8) | 73.80 (7, 8) | 72.20 (9) | 74.00 (8, 9) |
| Gla | 39.72 | 39.72 | 51.40 (9) | 53.74 (9, 8) | 41.12 (7) | 41.12 (5, 7) |
| Hea | 84.07 | 83.33 | 84.44 (9) | 85.19 (8, 2) | 84.44 (2) | 85.19 (5, 2) |
| Hep | 58.71 | 58.71 | 67.74 (9) | 69.03 (9, 1) | 58.71 (2) | 70.97 (9, 1) |
| Ion | 89.46 | 89.46 | 90.03 (9) | 90.03 (8, 2) | 90.03 (8) | 90.31 (9, 4) |
| kr-vs-kp | 62.58 | 62.67 | 64.99 (8) | 64.99 (8, 6) | 62.86 (2) | 63.67 (9, 1) |
| Mam | 78.67 | 78.56 | 79.29 (9) | 79.40 (6, 6) | 78.77 (4) | 79.29 (6, 6) |
| Pid | 75.03 | 75.42 | 75.81 (9) | 76.59 (9, 4) | 75.55 (9) | 76.20 (9, 5) |
| Spe | 69.66 | 71.54 | 72.28 (9) | 73.03 (9, 1) | 71.91 (3) | 74.91 (9, 1) |
| Tel | 72.66 | 72.64 | 72.77 (9) | 72.87 (9, 3) | 72.67 (7) | 72.86 (9, 2) |
| Veh | 43.85 | 44.56 | 46.57 (7) | 46.81 (9, 2) | 44.56 (10) | 44.92 (7, 6) |
| Vot | 94.48 | 94.48 | 94.71 (9) | 94.71 (9, 4) | 94.48 (1) | 94.71 (8, 1) |
| Vow | 67.68 | 68.18 | 69.90 (6) | 70.10 (9, 9) | 68.18 (9) | 68.18 (0, 9) |
| Yea | 14.42 | 17.12 | 44.00 (9) | 44.34 (9, 8) | 19.07 (1) | 34.30 (3, 1) |
| Spambase | 81.72 | 81.70 | 82.03 (9) | 82.48 (9, 1) | 81.72 (3) | 82.42 (9, 3) |
| Tictacto | 69.62 | 69.62 | 70.67 (9) | 70.67 (9, 10) | 69.62 (7) | 69.73 (3, 8) |
| Wdbc | 93.85 | 93.85 | 93.85 (5) | 94.38 (6, 1) | 94.02 (2) | 94.20 (9, 1) |
| Wpbc | 62.12 | 64.14 | 67.17 (9) | 71.72 (9, 1) | 64.14 (6, 10) | 76.26 (8, 1) |
| Spect | 55.43 | 58.43 | 63.67 (9) | 67.04 (9, 3) | 60.30 (2, 10) | 62.17 (9, 3) |
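To make the comparison in Table 3 concrete, the following is a minimal, self-contained sketch of the general setup it evaluates: a bagged ensemble of Gaussian Naive Bayes (GNB) base learners, followed by a simple accuracy-based pruning step that keeps only the best-scoring members on a validation split. This is an illustration of the generic idea only; the paper's exact AP and DP procedures and the precise roles of the thresholds ta and td are not defined in this excerpt, and the synthetic data merely stands in for the benchmark datasets.

```python
# Illustrative sketch (NOT the paper's exact AP/DP procedure): bagging with
# hand-rolled Gaussian Naive Bayes learners, then accuracy-based pruning.
import numpy as np

rng = np.random.default_rng(0)

def fit_gnb(X, y):
    """Per-class feature means, variances, and priors for Gaussian NB."""
    classes = np.unique(y)
    mu = np.array([X[y == c].mean(axis=0) for c in classes])
    var = np.array([X[y == c].var(axis=0) + 1e-9 for c in classes])
    prior = np.array([(y == c).mean() for c in classes])
    return classes, mu, var, prior

def predict_gnb(model, X):
    classes, mu, var, prior = model
    # Log joint likelihood per class under the feature-independence assumption.
    ll = -0.5 * (((X[:, None, :] - mu) ** 2) / var
                 + np.log(2 * np.pi * var)).sum(axis=2)
    return classes[np.argmax(ll + np.log(prior), axis=1)]

def majority_vote(models, X):
    votes = np.stack([predict_gnb(m, X) for m in models])
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)

# Synthetic two-class data (a stand-in for any of the 28 benchmark datasets).
n = 600
X = np.vstack([rng.normal(0.0, 1.0, (n // 2, 5)),
               rng.normal(1.0, 1.0, (n // 2, 5))])
y = np.repeat([0, 1], n // 2)
idx = rng.permutation(n)
tr, val, te = idx[:300], idx[300:450], idx[450:]

# Bagging: train each GNB on a bootstrap resample of the training split.
B = 20
models = []
for _ in range(B):
    b = rng.integers(0, len(tr), len(tr))
    models.append(fit_gnb(X[tr][b], y[tr][b]))

# Simple accuracy-based pruning heuristic: rank members by validation
# accuracy and keep the top `keep` of them (a hypothetical stand-in for
# the thresholded selection the table's parameters control).
keep = 9
scores = [np.mean(predict_gnb(m, X[val]) == y[val]) for m in models]
pruned = [models[i] for i in np.argsort(scores)[::-1][:keep]]

acc_full = np.mean(majority_vote(models, X[te]) == y[te])
acc_pruned = np.mean(majority_vote(pruned, X[te]) == y[te])
print(f"full ensemble ({B} members): {acc_full:.3f}")
print(f"pruned ensemble ({keep} members): {acc_pruned:.3f}")
```

The design point the table makes is visible even in this toy setting: pruning shrinks the ensemble from B members to `keep` members while the majority-vote accuracy stays comparable, and on many of the 28 datasets the pruned ensemble actually improves on full bagging.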