Open Access Article

Modeling Word Learning and Processing with Recurrent Neural Networks

Institute for Computational Linguistics—Italian National Research Council, I-56124 Pisa, Italy
Information 2020, 11(6), 320; https://doi.org/10.3390/info11060320
Received: 24 April 2020 / Revised: 20 May 2020 / Accepted: 10 June 2020 / Published: 13 June 2020
(This article belongs to the Special Issue Advances in Computational Linguistics)
The paper focuses on what two different types of Recurrent Neural Networks, namely a recurrent Long Short-Term Memory and a recurrent variant of self-organizing memories, a Temporal Self-Organizing Map, can tell us about speakers’ learning and processing of a set of fully inflected verb forms selected from the top-frequency paradigms of Italian and German. Thanks to a re-entrant layer of temporal connectivity, both architectures can develop a strong sensitivity to sequential patterns that are highly attested in the training data. The main goal is to evaluate the learning and processing dynamics of verb inflection data in the two neural networks, focusing on the effects of morphological structure on word production and word recognition, as well as on generalization to untrained verb forms. For both models, results show that production, recognition, and generalization are facilitated for verb forms in regular paradigms. However, the two models are differently influenced by structural effects, with the Temporal Self-Organizing Map more prone to adaptively finding a balance between the processing issues of learnability and generalization, on the one side, and discriminability on the other.
Keywords: word learning; serial word processing; recurrent neural networks; long short-term memories; temporal self-organizing memories

Figure 1

MDPI and ACS Style

Marzi, C. Modeling Word Learning and Processing with Recurrent Neural Networks. Information 2020, 11, 320.

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
