Research Article

MTQA: Text-Based Multitype Question and Answer Reading Comprehension Model

Table 1

Best results and key innovations of the models.

| Model design perspective | Model | EM | F1 | Innovations |
|---|---|---|---|---|
| Operations on external methods | NAQANet | 46.20 | 49.24 | The first model built for the DROP corpus, with four decoding heads |
| | NABERT+ | 64.61 | 67.35 | Switches to BERT encoding, building on NAQANet |
| | MTMSN | 76.68 | 80.54 | Adds two decoding heads and a beam search algorithm on top of NABERT+ |
| | TbMS | 76.91 | 79.92 | Improves and extends the TASE answer-prediction algorithm |
| Operations on the model itself | QANet + ELMo | 27.71 | 30.33 | Trained and evaluated directly on the DROP corpus |
| | BERTBase | 30.10 | 33.36 | Trained and evaluated directly on the DROP corpus |
| | GENBERT | 68.20 | 72.80 | Uses the Transformer's internal structure for decoding, with secondary pretraining |
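The EM and F1 columns above follow the standard reading-comprehension evaluation convention (as used for SQuAD and DROP): exact match scores 1 only when the normalized prediction equals the gold answer, while F1 measures token-level overlap. A minimal sketch, assuming simple lowercase whitespace tokenization (the official DROP evaluator additionally strips punctuation and articles, and handles multi-span and numeric answers):

```python
from collections import Counter

def exact_match(prediction: str, gold: str) -> float:
    """EM: 1.0 iff the normalized prediction equals the gold answer."""
    return float(prediction.strip().lower() == gold.strip().lower())

def f1_score(prediction: str, gold: str) -> float:
    """Token-level F1 between the prediction and the gold answer."""
    pred_tokens = prediction.strip().lower().split()
    gold_tokens = gold.strip().lower().split()
    # Multiset intersection counts each shared token at most as often
    # as it appears in both strings.
    common = Counter(pred_tokens) & Counter(gold_tokens)
    num_same = sum(common.values())
    if num_same == 0:
        return 0.0
    precision = num_same / len(pred_tokens)
    recall = num_same / len(gold_tokens)
    return 2 * precision * recall / (precision + recall)

# A partially correct answer scores 0 on EM but still earns F1 credit.
print(exact_match("four touchdowns", "four touchdowns"))  # 1.0
print(f1_score("four", "four touchdowns"))                # 0.666...
```

Corpus-level EM and F1 (the numbers reported in the table) are the averages of these per-question scores over the evaluation set, taking the maximum over gold answers when several are provided.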