| Index | Model structure | Accuracy (%) | Hyper-parameters |
|---|---|---|---|
| 1 | 96×96-80C3×3-MP2×2-160C2×2-MP2×2-240C2×2-MP2×2-320C2×2-MP2×2-400C2×2-MP2×2-480C1×1-512FC-10 | 99.12 | minBatch = 64, iterNum = 9,400, dropout (fc) = 0.5, reducing 0.1 every 3 epochs |
| 2 | 96×96-100C3×3-MP2×2-200C2×2-MP2×2-300C2×2-MP2×2-400C2×2-MP2×2-500C2×2-600C1×1-512FC-10 | 99.32 | minBatch = 64, iterNum = 9,400, dropout (fc) = 0.5, reducing 0.1 every 3 epochs |
| 3 | 96×96-96C3×3-MP3×3-128C3×3-MP3×3-160C3×3-MP3×3-256C3×3-256C3×3-MP3×3-384C3×3-384C3×3-MP3×3-1024FC-10 | 99.56 | minBatch = 64, iterNum = 9,400, dropout (fc) = 0.5, reducing 0.1 every 3 epochs |
| 4 | Fisher vector-based method | 99.66 | — |
| 5 | Defensive distillation DCNN (transfer temperature = 20) | 99.05 | — |
| 6 | DSEDR | 99.26 | — |
| 7 | Data-augmentation CNN, 5,000 training samples per class (ELASTIC / SMOTE / DBSMOTE) | ≈99.60 / 99.70 / 99.70 | — |
| 8 | Data-augmentation CSVM, 5,000 training samples per class (ELASTIC / SMOTE / DBSMOTE) | ≈89.80 / 99.80 / 99.80 | — |
| 9 | Data-augmentation CLEM, 5,000 training samples per class (ELASTIC / SMOTE / DBSMOTE) | ≈99.60 / 99.80 / 99.80 | — |
| 10 | RSFKM | 59.48 | — |
| 11 | GA-bayes (2K2K MNIST) | 56.83 | — |
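The compact structure notation in rows 1–3 can be expanded mechanically, assuming the usual reading of this shorthand: `W×H` is the input size, `nCk×k` a convolution with `n` feature maps and a `k×k` kernel, `MPk×k` a `k×k` max-pooling layer, `nFC` a fully connected layer of width `n`, and the trailing number the class count. The following sketch (a hypothetical helper, not part of any cited method; multiplication signs are written as `x` in code) parses model 1 under that assumed grammar:

```python
import re

def parse_structure(spec: str):
    """Expand compact layer notation into (layer_type, *params) tuples."""
    layers = []
    for tok in spec.split("-"):
        if not layers and (m := re.fullmatch(r"(\d+)x(\d+)", tok)):
            layers.append(("input", int(m[1]), int(m[2])))            # input W x H
        elif m := re.fullmatch(r"(\d+)C(\d+)x(\d+)", tok):
            layers.append(("conv", int(m[1]), int(m[2]), int(m[3])))  # n maps, k x k kernel
        elif m := re.fullmatch(r"MP(\d+)x(\d+)", tok):
            layers.append(("maxpool", int(m[1]), int(m[2])))          # k x k max-pooling
        elif m := re.fullmatch(r"(\d+)FC", tok):
            layers.append(("fc", int(m[1])))                          # fully connected width
        elif tok.isdigit():
            layers.append(("output", int(tok)))                       # number of classes
        else:
            raise ValueError(f"unrecognized token: {tok!r}")
    return layers

# Model 1 from the table:
model_1 = ("96x96-80C3x3-MP2x2-160C2x2-MP2x2-240C2x2-MP2x2-"
           "320C2x2-MP2x2-400C2x2-MP2x2-480C1x1-512FC-10")
layers = parse_structure(model_1)
```

Such a parsed list can then be mapped one-to-one onto layers in any deep-learning framework.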