Article

Augmenting Paraphrase Generation with Syntax Information Using Graph Convolutional Networks

College of Electronics and Information Engineering, Tongji University, Shanghai 201804, China
* Author to whom correspondence should be addressed.
Academic Editor: Zoran H. Peric
Entropy 2021, 23(5), 566; https://doi.org/10.3390/e23050566
Received: 29 March 2021 / Revised: 29 April 2021 / Accepted: 29 April 2021 / Published: 2 May 2021
(This article belongs to the Special Issue Methods in Artificial Intelligence and Information Processing)
Abstract: Paraphrase generation is an important yet challenging task in natural language processing. Neural network-based approaches have achieved remarkable success in sequence-to-sequence learning. Previous work on paraphrase generation generally ignores syntactic information, even when it is available, on the assumption that neural networks can learn such linguistic knowledge implicitly. In this work, we investigate the efficacy of explicit syntactic information for paraphrase generation. Syntactic information can take the form of dependency trees, which are easily obtained from off-the-shelf syntactic parsers. Such tree structures can be conveniently encoded with graph convolutional networks to obtain more meaningful sentence representations, which in turn improve the generated paraphrases. Through extensive experiments on four paraphrase datasets of different sizes and genres, we demonstrate the utility of syntactic information in neural paraphrase generation under the framework of sequence-to-sequence modeling. Specifically, our graph convolutional network-enhanced models consistently outperform their syntax-agnostic counterparts on multiple evaluation metrics.
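To make the pipeline described in the abstract concrete, below is a minimal sketch of how a dependency tree from an off-the-shelf parser can be turned into an adjacency matrix and encoded with a single graph-convolution layer to produce syntax-aware token representations that a sequence-to-sequence encoder could consume. It assumes PyTorch and spaCy; the parser choice, layer definition, dimensions, and variable names are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): parse a sentence with an
# off-the-shelf dependency parser, build an adjacency matrix from the
# dependency tree, and encode token embeddings with one GCN layer.
import torch
import torch.nn as nn
import spacy

nlp = spacy.load("en_core_web_sm")  # off-the-shelf dependency parser

def dependency_adjacency(sentence):
    """Return token texts and a symmetric adjacency matrix (with
    self-loops) derived from the sentence's dependency tree."""
    doc = nlp(sentence)
    n = len(doc)
    adj = torch.eye(n)                      # self-loops
    for tok in doc:
        if tok.i != tok.head.i:             # skip the root's self-reference
            adj[tok.i, tok.head.i] = 1.0    # child -> head
            adj[tok.head.i, tok.i] = 1.0    # head -> child (undirected)
    return [tok.text for tok in doc], adj

class GCNLayer(nn.Module):
    """One graph-convolution layer: H' = ReLU(D^-1 A H W)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, h, adj):
        deg = adj.sum(dim=-1, keepdim=True)   # node degrees (>= 1 via self-loops)
        h = adj @ self.linear(h) / deg        # mean aggregation over neighbors
        return torch.relu(h)

# Usage: syntax-aware token representations for a downstream seq2seq encoder.
tokens, adj = dependency_adjacency("The cat sat on the mat.")
emb = nn.Embedding(len(tokens), 128)          # toy embeddings for this example
h = emb(torch.arange(len(tokens)))            # (n_tokens, 128)
gcn = GCNLayer(128, 128)
syntax_aware = gcn(h, adj)                    # (n_tokens, 128)
print(syntax_aware.shape)
```

A full model would stack such layers alongside (or on top of) the sequence encoder before decoding the paraphrase; the single-layer, mean-aggregation form above is only meant to show how the dependency structure enters the computation.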
Keywords: paraphrase generation; syntax information; graph convolutional network; sequence-to-sequence

MDPI and ACS Style

Chi, X.; Xiang, Y. Augmenting Paraphrase Generation with Syntax Information Using Graph Convolutional Networks. Entropy 2021, 23, 566. https://doi.org/10.3390/e23050566

AMA Style

Chi X, Xiang Y. Augmenting Paraphrase Generation with Syntax Information Using Graph Convolutional Networks. Entropy. 2021; 23(5):566. https://doi.org/10.3390/e23050566

Chicago/Turabian Style

Chi, Xiaoqiang, and Yang Xiang. 2021. "Augmenting Paraphrase Generation with Syntax Information Using Graph Convolutional Networks" Entropy 23, no. 5: 566. https://doi.org/10.3390/e23050566

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
