Article

Multigranularity Syntax Guidance with Graph Structure for Machine Reading Comprehension

1 School of Artificial Intelligence, Chongqing University of Technology, Chongqing 400054, China
2 College of Computer and Information Science, Chongqing Normal University, Chongqing 401331, China
* Authors to whom correspondence should be addressed.
Academic Editor: Valentino Santucci
Appl. Sci. 2022, 12(19), 9525; https://doi.org/10.3390/app12199525
Received: 2 August 2022 / Revised: 18 September 2022 / Accepted: 19 September 2022 / Published: 22 September 2022
(This article belongs to the Special Issue Natural Language Processing (NLP) and Applications)
In recent years, pre-trained language models (PrLMs), represented by bidirectional encoder representations from transformers (BERT), have achieved remarkable success in machine reading comprehension (MRC). However, limited by the structure of BERT-based MRC models (for example, restrictions on input length), such models cannot effectively integrate significant features such as syntactic relations, semantic connections, and long-distance semantics between sentences. As a result, existing models struggle to capture the intrinsic connections between a text and the questions to be answered about it. In this paper, a multi-granularity syntax guidance (MgSG) module is proposed, consisting of a “graph with dependence” module and a “graph with entity” module. MgSG operates at both sentence and word granularity to guide the text model in deciphering the text. In particular, syntactic constraints guide the text model, while the global nature of graph neural networks is exploited to enhance the model’s ability to construct long-range semantics. At the same time, named entities play an important role in texts and answers, and focusing on entities improves the model’s understanding of a text’s main idea. Finally, multiple embedding representations are fused to form a representation that captures the semantics of the context and the questions. Experiments demonstrate that the proposed method outperforms the traditional BERT baseline on the Stanford Question Answering Dataset (SQuAD). The results illustrate that the proposed MgSG module effectively utilizes the graph structure to learn the internal features of sentences and to mitigate the long-distance semantics problem, while effectively improving the performance of PrLMs in machine reading comprehension.
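The “graph with dependence” idea described above — propagating information between syntactically related tokens by attending over a dependency-parse graph — can be sketched as a single graph-attention (GAT-style) layer. This is a minimal illustrative sketch, not the authors’ implementation: the function name, the dense adjacency-matrix layout, the single attention head, and the tanh output nonlinearity are all assumptions for clarity.

```python
import numpy as np

def gat_layer(h, adj, W, a, leaky_slope=0.2):
    """One minimal single-head graph-attention layer over a dependency graph.

    h:   (n, d_in) node features (e.g., contextual token embeddings)
    adj: (n, n) binary adjacency from a dependency parse (1 = edge;
         self-loops included so each token attends to itself)
    W:   (d_in, d_out) shared linear projection
    a:   (2 * d_out,) attention vector
    """
    z = h @ W                      # project node features
    n = z.shape[0]
    # Pairwise attention logits e_ij = LeakyReLU(a^T [z_i || z_j]).
    e = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            s = a @ np.concatenate([z[i], z[j]])
            e[i, j] = s if s > 0 else leaky_slope * s
    # Mask non-edges so attention only flows along dependency arcs,
    # then softmax over each node's neighborhood and aggregate.
    e = np.where(adj > 0, e, -1e9)
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha = alpha / alpha.sum(axis=1, keepdims=True)
    return np.tanh(alpha @ z)
```

Because the softmax is restricted to parse-graph neighbors, a token can reach a syntactically related but textually distant token in one hop — which is how a graph structure helps with the long-distance semantics problem the abstract mentions. Stacking several such layers widens the receptive field along the parse tree.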
Keywords: machine reading comprehension; graph attention network; BERT; SQuAD
MDPI and ACS Style

Xu, C.; Liu, Z.; Li, G.; Zhu, C.; Zhang, Y. Multigranularity Syntax Guidance with Graph Structure for Machine Reading Comprehension. Appl. Sci. 2022, 12, 9525. https://doi.org/10.3390/app12199525
