Open Access Article

Selectively Connected Self-Attentions for Semantic Role Labeling

Department of Computer Science and Engineering, Incheon National University, Incheon 400-011, Korea
Appl. Sci. 2019, 9(8), 1716; https://doi.org/10.3390/app9081716
Received: 8 March 2019 / Revised: 31 March 2019 / Accepted: 19 April 2019 / Published: 25 April 2019
(This article belongs to the Section Computing and Artificial Intelligence)

Abstract

Semantic role labeling is an effective approach to understanding the underlying meanings associated with word relationships in natural language sentences. Recent studies using deep neural networks, specifically recurrent neural networks, have significantly improved on traditional shallow models. However, because of the limitations of recurrent updates, they require long training times over large data sets, and they cannot capture the hierarchical structures of languages. We propose a novel deep neural model for semantic role labeling that provides selective connections among attentive representations and removes recurrent updates. Experimental results show that our model outperforms state-of-the-art models in accuracy, achieving F1 scores of 86.6 and 83.6 on the CoNLL 2005 and CoNLL 2012 shared tasks, respectively. The accuracy gains come from capturing hierarchical information through the connection module. Moreover, we show that our model can be parallelized to avoid the repetitive updates of recurrence; as a result, it reduces training time by 62 percent relative to the baseline.
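To make the architecture described in the abstract concrete, the following is a minimal, hypothetical PyTorch sketch of stacked self-attention layers whose outputs are selectively connected to an earlier representation through a learned gate. The gating formulation, layer sizes, and tagging head here are illustrative assumptions, not the paper's exact model.

# Hypothetical sketch of the idea in the abstract: stacked self-attention layers
# (no recurrence) whose outputs are selectively connected to an earlier layer's
# representation through a learned gate. Dimensions, the gating formulation, and
# the tagging head are illustrative assumptions, not the authors' exact model.
import torch
import torch.nn as nn

class SelectiveConnection(nn.Module):
    """Gate that mixes the current layer output with an earlier representation."""
    def __init__(self, hidden_dim: int):
        super().__init__()
        self.gate = nn.Linear(2 * hidden_dim, hidden_dim)

    def forward(self, current: torch.Tensor, earlier: torch.Tensor) -> torch.Tensor:
        # g in (0, 1) decides, per position and dimension, how much of the
        # earlier (more lexical) representation is passed upward.
        g = torch.sigmoid(self.gate(torch.cat([current, earlier], dim=-1)))
        return g * current + (1.0 - g) * earlier

class SelfAttentionSRLTagger(nn.Module):
    """Self-attention encoder with selective inter-layer connections for SRL tagging."""
    def __init__(self, vocab_size: int, num_labels: int,
                 hidden_dim: int = 256, num_heads: int = 8, num_layers: int = 4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_dim)
        self.attn_layers = nn.ModuleList(
            [nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)
             for _ in range(num_layers)])
        self.connections = nn.ModuleList(
            [SelectiveConnection(hidden_dim) for _ in range(num_layers)])
        self.norms = nn.ModuleList(
            [nn.LayerNorm(hidden_dim) for _ in range(num_layers)])
        self.classifier = nn.Linear(hidden_dim, num_labels)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        x = self.embed(token_ids)                    # (batch, seq_len, hidden)
        lowest = x                                   # keep the lexical representation
        for attn, connect, norm in zip(self.attn_layers, self.connections, self.norms):
            attended, _ = attn(x, x, x)              # self-attention: parallel over positions
            x = norm(connect(attended, lowest))      # selectively reintroduce lower-level information
        return self.classifier(x)                    # per-token SRL label scores

if __name__ == "__main__":
    model = SelfAttentionSRLTagger(vocab_size=10000, num_labels=60)
    tokens = torch.randint(0, 10000, (2, 12))        # toy batch: two 12-token sentences
    print(model(tokens).shape)                        # torch.Size([2, 12, 60])

Because each layer applies self-attention over the whole sentence rather than a position-by-position recurrence, the forward pass parallelizes across tokens, which is the property the abstract credits for the reduced training time.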
Keywords: semantic role labeling; attention mechanism; selective connection

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

MDPI and ACS Style

Park, J. Selectively Connected Self-Attentions for Semantic Role Labeling. Appl. Sci. 2019, 9, 1716.
