Research Article
[Retracted] A Post-training Quantization Method for the Design of Fixed-Point-Based FPGA/ASIC Hardware Accelerators for LSTM/GRU Algorithms
Table 3
Comparison of quantization results on the PTB dataset (PPW: perplexity per word; lower is better).
| Ref. | Model | #Layers | #Units | Quantization method | Weight bits | Activation bits | FP model PPW | Quantized model PPW | PPW variation |
|---|---|---|---|---|---|---|---|---|---|
| [13] | LSTM | 1 | 300 | In-training | 3 | 3 | 89.8 | 87.9 | −1.9 |
| [1] | LSTM | 1 | 300 | In-training | 4 | 4 | 109 | 114 | +5 |
| [41] | LSTM | 1 | 300 | In-training | 4 | 4 | 97 | 100 | +3 |
| [42] | LSTM | 1 | 300 | In-training | 2 | 2 | 97.2 | 110.3 | +13.1 |
| Our work | LSTM | 1 | 300 | Post-training | 11 | 10 | 92.8 | 93.7 | +0.9 |
| [13] | GRU | 1 | 300 | In-training | 3 | 3 | 92.5 | 92.9 | +0.4 |
| [1] | GRU | 1 | 300 | In-training | 4 | 4 | 100 | 102 | +2 |
| Our work | GRU | 1 | 300 | Post-training | 8 | 3 | 91.3 | 90.6 | −0.7 |
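The "Weight bits" and "Activation bits" columns give the fixed-point word lengths each scheme uses. As a rough illustration of what post-training quantization of a weight tensor to a given bit width involves, the sketch below applies generic uniform symmetric quantization; this is a minimal assumption-laden example, not the specific method of the article or of the cited works, and the function name `quantize_fixed_point` is hypothetical.

```python
import numpy as np

def quantize_fixed_point(x, n_bits):
    """Simulate uniform symmetric fixed-point quantization to n_bits.

    Hypothetical sketch: the tensor is scaled so its largest magnitude
    maps to the largest representable signed integer, rounded, clipped,
    and rescaled back ("fake quantization"), as in typical post-training
    schemes. Not the article's exact procedure.
    """
    qmax = 2 ** (n_bits - 1) - 1          # e.g. 127 for 8 bits
    scale = np.max(np.abs(x)) / qmax      # one scale for the whole tensor
    q = np.clip(np.round(x / scale), -qmax - 1, qmax)  # signed integers
    return q * scale                      # dequantized (simulated) values
```

With 8 bits the round-trip error of any element is at most half the quantization step, which is why a well-chosen post-training word length (11/10 bits for the LSTM row above) can keep the PPW variation below one point.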