Figure 1: Sentence-level attention neural network. The network consists of five layers: an input layer, an embedding layer, a Bi-LSTM layer, an attention layer, and a merging layer.
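The data flow through the five layers in Figure 1 can be sketched as below. This is a minimal NumPy illustration, not the paper's implementation: the embedding table, dimensions, and weights are random placeholders, and the Bi-LSTM is stood in for by two simple tanh projections (one per direction) so the sketch stays dependency-free; the attention step itself (score, softmax, weighted sum feeding the merged sentence vector) is shown in full.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax for the attention weights.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
seq_len, emb_dim, hidden, vocab = 7, 16, 8, 100  # assumed toy sizes

# Input + embedding layers: map token ids to dense vectors
# (random lookup table here, trained embeddings in practice).
embedding = rng.normal(size=(vocab, emb_dim))
token_ids = rng.integers(0, vocab, size=seq_len)
x = embedding[token_ids]                                # (seq_len, emb_dim)

# Bi-LSTM layer stand-in: a real Bi-LSTM has gated recurrence; here two
# direction-specific projections are concatenated to mimic its output shape.
W_f = rng.normal(size=(emb_dim, hidden))
W_b = rng.normal(size=(emb_dim, hidden))
h = np.concatenate([np.tanh(x @ W_f), np.tanh(x @ W_b)], axis=1)  # (seq_len, 2*hidden)

# Attention layer: score each time step, normalise to weights alpha.
w = rng.normal(size=(2 * hidden,))
alpha = softmax(h @ w)                                  # (seq_len,), sums to 1

# Merging layer: attention-weighted sum of hidden states -> sentence vector.
s = alpha @ h                                           # (2*hidden,)
```

The key point the sketch makes is dimensional: whatever the sequence length, the attention weights collapse the `(seq_len, 2*hidden)` Bi-LSTM outputs into a single fixed-size sentence representation.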