Research Article

Two-Stage Bagging Pruning for Reducing the Ensemble Size and Improving the Classification Performance

Table 4

Comparison of the classification performance on the 28 datasets between the single classifier KNN, bagging with KNN as the base learner, and the proposed pruning methods AP, DP, “AP+DP”, and “DP+AP”. The values in parentheses are the optimized parameters (t_a for AP, t_d for DP, and the pair t_a, t_d for “AP+DP” and “DP+AP”) at which the corresponding pruning method achieved its best classification performance.

Dataset    KNN     Bagging   AP           “AP+DP”       DP           “DP+AP”
Aba        52.91   53.27     53.60 (7)    53.72 (5,6)   53.32 (4)    53.48 (8,4)
Adult      81.54   81.95     81.95 (4)    81.98 (4,9)   81.97 (8)    81.99 (4,9)
Aus        66.52   67.83     67.83 (2)    70.43 (6,1)   69.86 (1)    69.86 (1,1)
Bcw        94.71   94.13     94.42 (6)    94.42 (6,3)   94.28 (8)    94.56 (5,8)
Bld        61.74   63.48     64.64 (9)    65.80 (9,8)   63.48 (10)   66.09 (9,8)
Cmc        50.10   50.51     51.05 (4)    51.26 (1,8)   50.51 (4)    50.64 (8,5)
Col        80.71   81.52     82.07 (3)    82.88 (6,2)   81.79 (2)    82.07 (3,6)
Cre        77.68   78.41     78.84 (9)    79.13 (3,5)   78.99 (4)    79.71 (9,5)
Der        96.99   97.27     97.27 (7)    97.81 (7,1)   97.27 (2)    97.27 (5,2)
Ger        67.80   68.00     68.30 (9)    68.90 (6,2)   68.00 (2)    68.80 (7,2)
Gla        73.36   71.50     75.70 (1)    75.70 (6,7)   73.36 (6)    73.36 (5,6)
Hea        77.78   78.15     79.63 (8)    80.00 (9,5)   78.89 (6)    80.37 (9,4)
Hep        83.23   81.94     81.94 (4)    83.87 (6,6)   83.23 (6)    83.87 (9,8)
Ion        82.62   82.91     84.90 (9)    84.90 (9,10)  82.91 (1)    84.33 (9,8)
kr-vs-kp   95.09   96.21     96.31 (7)    96.40 (1,7)   96.31 (6)    96.34 (9,7)
Mam        77.42   78.46     78.46 (4)    79.29 (3,2)   79.08 (2)    79.50 (5,3)
Pid        72.69   72.69     73.60 (8)    74.51 (8,4)   73.60 (2)    74.51 (4,3)
Spe        74.91   74.53     75.28 (8)    76.40 (8,4)   75.28 (2)    76.40 (8,1)
Tel        82.35   82.99     83.00 (3)    83.08 (2,4)   83.04 (3)    83.07 (5,8)
Veh        65.60   65.37     65.60 (4)    66.90 (9,3)   66.08 (6)    66.08 (3,6)
Vot        94.94   94.94     94.94 (9)    95.40 (2,2)   95.17 (2)    95.40 (2,2)
Vow        95.05   95.35     95.56 (4)    95.66 (2,6)   95.66 (6)    95.66 (0,6)
Yea        53.98   56.40     56.74 (3)    56.87 (2,8)   56.67 (7)    56.67 (9,7)
Spambase   90.00   90.74     91.02 (7)    91.22 (9,8)   90.94 (6)    91.15 (6,3)
Tictacto   82.36   84.13     84.34 (9)    84.45 (1,3)   84.13 (7)    84.45 (7,1)
Wdbc       93.32   93.32     93.32 (0)    93.50 (4,8)   93.32 (9)    93.32 (7,1)
Wpbc       70.20   70.71     72.73 (9)    74.75 (9,2)   74.24 (1)    76.26 (8,1)
Spect      78.28   79.78     80.52 (5)    82.40 (3,1)   81.27 (1)    81.65 (9,1)
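The two baselines compared in the table (a single KNN classifier versus bagging with KNN base learners) can be sketched with scikit-learn. This is a minimal illustration only: the dataset, the number of neighbors, the ensemble size, and the cross-validation scheme are assumptions, not the paper's exact experimental settings, and the pruning stages (AP, DP) are not implemented here.

```python
# Sketch of the table's two baselines: single KNN vs. bagged KNN.
# Dataset, k, n_estimators, and cv are illustrative assumptions.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Stand-in dataset (comparable in spirit to the table's Wdbc entry).
X, y = load_breast_cancer(return_X_y=True)

# Single KNN classifier (the "KNN" column).
knn = KNeighborsClassifier(n_neighbors=5)
single = cross_val_score(knn, X, y, cv=5).mean()

# Bagging ensemble with KNN base learners (the "Bagging" column).
bagging = BaggingClassifier(knn, n_estimators=25, random_state=0)
ensemble = cross_val_score(bagging, X, y, cv=5).mean()

print(f"KNN alone:  {single:.2%}")
print(f"Bagged KNN: {ensemble:.2%}")
```

A pruning stage would then discard some of the 25 fitted base learners before prediction; the parenthesized values in the table record the thresholds at which that selection performed best.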