Research Article
A Chinese Lip-Reading System Based on Convolutional Block Attention Module
Table 1
The accuracy of each model.
| Model | Params (M) | Top-1 accuracy (%) |
| --- | --- | --- |
| VGG16 + LSTM + Attention | 144.4 | 95.2 |
| VGG16 + GRU + Attention | 142.4 | 95.3 |
| InceptionV3 + LSTM + Attention | 31.9 | 98.2 |
| InceptionV3 + GRU + Attention | 29.9 | 99.1 |
| ResNet50 + LSTM + Attention | 33.6 | 98.2 |
| ResNet50 + GRU + Attention | 31.6 | 99.3 |
| ResNet101 + LSTM + Attention | 52.6 | 97.3 |
| ResNet101 + GRU + Attention | 50.6 | 99.6 |
| ResNet152 + LSTM + Attention | 68.3 | 98.4 |
| ResNet152 + GRU + Attention | 66.3 | 99.8 |
| ResNet50 + CBAM + LSTM + Attention | 36.1 | 98.7 |
| ResNet50 + CBAM + GRU + Attention | 34.1 | 99.6 |