Research Article

Adoption of Wireless Network and Artificial Intelligence Algorithm in Chinese-English Tense Translation

Table 2

Main parameter settings and descriptions of baseline model training.

Parameter | Value | Description
Src vocab size | 30000 | Size of the source (Chinese) vocabulary
Tgt vocab size | 30000 | Size of the target (English) vocabulary
Batch size | 64 | Minibatch size: the number of training samples used in each training step
Embedding size | 500 | Dimension of the source and target word embeddings
Encoder type | BiLSTM | Type of neural network used by the encoder (a bidirectional LSTM here)
Decoder type | LSTM | Type of neural network used by the decoder (a standard LSTM here)
Enc/Dec layers | 2 | Number of network layers in the encoder and the decoder
LSTM size | 500 | Dimension of the LSTM hidden layer
Optimization | Adam | Optimization function (optimizer) used for training
Learning rate | 0.001 | Learning rate for neural network training
Beam size | 10 | Size of the candidate set kept at each step of beam search
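The settings in Table 2 describe a standard sequence-to-sequence baseline. The sketch below shows, in PyTorch, how a model with these hyperparameters could be assembled and trained for one step. The paper does not publish its implementation, so the class and variable names (Seq2SeqBaseline, src_embed, etc.) and the way the bidirectional encoder states are merged into the decoder state are illustrative assumptions, not the authors' code.

```python
# Minimal sketch of the Table 2 baseline (assumed PyTorch implementation).
import torch
import torch.nn as nn

# Hyperparameters taken from Table 2
SRC_VOCAB_SIZE = 30000   # source (Chinese) vocabulary size
TGT_VOCAB_SIZE = 30000   # target (English) vocabulary size
EMBEDDING_SIZE = 500     # word-embedding dimension for both sides
LSTM_SIZE = 500          # hidden-layer dimension
NUM_LAYERS = 2           # encoder/decoder layers
LEARNING_RATE = 0.001    # Adam learning rate
BATCH_SIZE = 64

class Seq2SeqBaseline(nn.Module):
    """BiLSTM encoder + LSTM decoder, following the settings in Table 2."""

    def __init__(self):
        super().__init__()
        self.src_embed = nn.Embedding(SRC_VOCAB_SIZE, EMBEDDING_SIZE)
        self.tgt_embed = nn.Embedding(TGT_VOCAB_SIZE, EMBEDDING_SIZE)
        # Each encoder direction uses LSTM_SIZE // 2 units so the concatenated
        # states match the decoder's hidden size (an assumption; the paper does
        # not state how the dimensions are reconciled).
        self.encoder = nn.LSTM(EMBEDDING_SIZE, LSTM_SIZE // 2, NUM_LAYERS,
                               batch_first=True, bidirectional=True)
        self.decoder = nn.LSTM(EMBEDDING_SIZE, LSTM_SIZE, NUM_LAYERS,
                               batch_first=True)
        self.generator = nn.Linear(LSTM_SIZE, TGT_VOCAB_SIZE)

    def forward(self, src_ids, tgt_ids):
        _, (h, c) = self.encoder(self.src_embed(src_ids))
        # Merge the forward/backward states of each layer for the decoder.
        h = h.view(NUM_LAYERS, 2, -1, LSTM_SIZE // 2).transpose(1, 2) \
             .reshape(NUM_LAYERS, -1, LSTM_SIZE)
        c = c.view(NUM_LAYERS, 2, -1, LSTM_SIZE // 2).transpose(1, 2) \
             .reshape(NUM_LAYERS, -1, LSTM_SIZE)
        dec_out, _ = self.decoder(self.tgt_embed(tgt_ids), (h, c))
        return self.generator(dec_out)  # logits over the target vocabulary

model = Seq2SeqBaseline()
optimizer = torch.optim.Adam(model.parameters(), lr=LEARNING_RATE)

# One illustrative training step on random token ids of shape (BATCH_SIZE, seq_len).
src = torch.randint(0, SRC_VOCAB_SIZE, (BATCH_SIZE, 20))
tgt = torch.randint(0, TGT_VOCAB_SIZE, (BATCH_SIZE, 22))
logits = model(src, tgt[:, :-1])  # teacher forcing: predict the next target token
loss = nn.functional.cross_entropy(
    logits.reshape(-1, TGT_VOCAB_SIZE), tgt[:, 1:].reshape(-1))
loss.backward()
optimizer.step()
```

At inference time, decoding with a beam size of 10 (the last row of Table 2) would replace the teacher-forced decoder pass above with a step-by-step search that keeps the 10 highest-scoring partial translations at each step; that search loop is omitted here for brevity.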