Kun Ma

DIMN: Dual Integrated Matching Network for Multi-Choice Reading Comprehension


Multi-choice reading comprehension is a task that involves selecting the correct answer from a set of candidate options. Recently, the attention mechanism has been widely used to acquire embedding representations. However, there are two significant challenges: 1) generating contextualized representations, namely, drawing on associated information, and 2) capturing the global interactive relationship, namely, drawing on local semantics. To address these issues, we propose the Dual Integrated Matching Network (DIMN) for multi-choice reading comprehension. It consists of two major parts. Fusing Information from Passage and Question-option pair into Enhanced Embedding Representation (FEER) draws associated information to enhance the embedding representation, incorporating the information that reflects the most salient supporting entities for answering the question into the contextualized representations. Linear Integration of Co-Attention and Convolution (LIAC) captures interactive information and local semantics to construct the global interactive relationship, incorporating the local semantics of a single sequence into the question-option-aware passage and passage-aware question-option representations. Experiments show that our DIMN achieves better accuracy on three datasets: RACE (69.34%), DREAM (68.45%), and MCTest (71.81% on MCTest160 and 78.83% on MCTest500). DIMN is beneficial for improving the ability of machines to understand natural language, and the system we have developed has been applied to customer service support. Our source code is accessible at https://github.com/vqiangv/DIMN.

Fusing Information from Passage and Question-option pair into Enhanced Embedding Representation (abbreviated as FEER). To draw contextualized representations among the passage, question, and option, we propose a method to generate an enhanced embedding representation. To form an overall impression of the given passage or question, FEER concentrates the passage embeddings of the same dimension into one concentrated embedding generated by pooling. The word-level similarity between the concentrated embedding and the contextualized representations is then measured. Finally, a dot product between the original embeddings of the question-option pair and this word-level similarity is adopted to update the representation of the question-option pair.
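The FEER steps above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: the function name `feer_update`, the choice of max pooling, the softmax normalization of the similarities, and all shapes are assumptions made for the sketch.

```python
import numpy as np

def feer_update(passage_emb, qo_emb, qo_ctx):
    """Illustrative FEER update (all names and shapes are hypothetical).

    passage_emb : (Lp, d) token embeddings of the passage
    qo_emb      : (Lq, d) original embeddings of the question-option pair
    qo_ctx      : (Lq, d) contextualized question-option representations
    """
    # 1) Concentrate the passage embeddings into one vector by pooling.
    concentrated = passage_emb.max(axis=0)             # (d,)

    # 2) Word-level similarity between the concentrated embedding and
    #    each contextualized question-option token.
    sim = qo_ctx @ concentrated                        # (Lq,)
    sim = np.exp(sim - sim.max())
    sim = sim / sim.sum()                              # normalized weights

    # 3) Reweight the original question-option embeddings by the
    #    similarity (a per-token dot-product weighting).
    return qo_emb * sim[:, None]                       # (Lq, d)

rng = np.random.default_rng(0)
enhanced = feer_update(rng.normal(size=(7, 16)),
                       rng.normal(size=(5, 16)),
                       rng.normal(size=(5, 16)))
print(enhanced.shape)  # (5, 16)
```

The enhanced output keeps the question-option sequence length, so it can replace the original embeddings in any downstream matching layer.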
Linear Integration of Co-Attention and Convolution for Capturing Global Interactive Relationship (abbreviated as LIAC). To capture the global interactive relationship, we propose Linear Integration of Co-Attention and Convolution. LIAC is comprised of two symmetrical integration layers, each of which is a linear integration of a co-attention module and a convolution module. In the left integration layer, the co-attention module computes the attention of each token in one sequence to a target token in the other sequence and aggregates the attention to model the interactive information. In the convolution module, 1) the passage and question-option pair are projected by convolutional operations with a fixed window size, 2) the feature maps are generated by a fully connected layer, and 3) the feature weights are aggregated to obtain the local semantics. The question-option-aware passage is then constructed as a linear integration of the interactive information and the local semantics, weighted by two learnable scalars. The process of the right integration layer is similar. The global interactive relationship is the combination of the outputs of both layers.
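One integration layer of LIAC can be sketched as follows. This is a simplified NumPy illustration under stated assumptions: `co_attention` is standard dot-product cross-attention, `conv_local` is a single fixed-width 1-D convolution standing in for the full convolution module, and the fixed `alpha`/`beta` stand in for the paper's two learnable scalars.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def co_attention(a, b):
    # Each token of sequence `a` attends over all tokens of sequence `b`
    # and aggregates them, modeling the interactive information.
    scores = a @ b.T                        # (La, Lb)
    return softmax(scores, axis=-1) @ b     # (La, d)

def conv_local(x, w, width=3):
    # Fixed-size 1-D convolution over the sequence: each output token
    # aggregates a window of its neighbours (local semantics).
    L, d = x.shape
    pad = width // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))
    out = np.zeros_like(x)
    for i in range(L):
        out[i] = (xp[i:i + width] * w).sum(axis=0)
    return out

def liac_layer(x, y, w_conv, alpha=0.5, beta=0.5):
    """One integration layer: linear mix of co-attention (interactive
    information) and convolution (local semantics); alpha and beta
    stand in for the two learnable scalars."""
    inter = co_attention(x, y)              # x attends to y
    local = conv_local(x, w_conv)           # local semantics of x
    return alpha * inter + beta * local

rng = np.random.default_rng(1)
p, qo = rng.normal(size=(7, 8)), rng.normal(size=(5, 8))
w = rng.normal(size=(3, 8)) * 0.1
left = liac_layer(p, qo, w)    # question-option-aware passage
right = liac_layer(qo, p, w)   # passage-aware question-option
print(left.shape, right.shape)  # (7, 8) (5, 8)
```

The two calls mirror the two symmetrical layers: the left layer produces the question-option-aware passage, the right layer the passage-aware question-option representation, and their outputs are combined to form the global interactive relationship.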

Code & Data: https://github.com/vqiangv/DIMN



Qiang Wei, Kun Ma*, Xinyu Liu, Ke Ji, Bo Yang, and Ajith Abraham, "DIMN: Dual Integrated Matching Network for Multi-Choice Reading Comprehension," Engineering Applications of Artificial Intelligence, 2024, 130 (4): 1-11 


@article{wei2024dimn,
  title={DIMN: Dual Integrated Matching Network for multi-choice reading comprehension},
  author={Wei, Qiang and Ma, Kun and Liu, Xinyu and Ji, Ke and Yang, Bo and Abraham, Ajith},
  journal={Engineering Applications of Artificial Intelligence},
  year={2024},
  volume={130},
  pages={1--11}
}