Article

An Entropy Metric for Regular Grammar Classification and Learning with Recurrent Neural Networks

1 Information Sciences and Technology, Pennsylvania State University, University Park, PA 16802, USA
2 Alibaba Group, Building A2, Lane 55 Chuan He Road Zhangjiang, Pudong New District, Shanghai 200135, China
* Author to whom correspondence should be addressed.
Entropy 2021, 23(1), 127; https://doi.org/10.3390/e23010127
Received: 5 December 2020 / Revised: 28 December 2020 / Accepted: 13 January 2021 / Published: 19 January 2021
(This article belongs to the Special Issue Entropy in Data Analysis)
Recently, there has been a resurgence of formal language theory in deep learning research. However, most of this research has focused on the practical problem of representing symbolic knowledge with machine learning models; comparatively little work has explored the fundamental connection between the two. To better understand the internal structure of regular grammars and their corresponding complexity, we categorize regular grammars using both theoretical analysis and empirical evidence. Specifically, motivated by the concentric ring representation, we relax the original order information and introduce an entropy metric that describes the complexity of different regular grammars. Based on this metric, we categorize regular grammars into three disjoint subclasses: the polynomial, exponential, and proportional classes. In addition, we provide several classification theorems for different representations of regular grammars. Our analysis is validated by examining the process of learning grammars with multiple recurrent neural networks. The results show that, as expected, more complex grammars are generally more difficult to learn.
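The idea of an entropy-based complexity measure for regular grammars can be illustrated with a rough sketch. The code below is not the paper's metric (which is built on the concentric ring representation); it is a simpler, related quantity: the per-symbol growth rate log2(count)/n of the number of length-n strings a DFA accepts. The three toy languages are hypothetical examples chosen so that their accepted-string counts grow like a constant, an exponential with base below 2, and the full 2^n respectively, loosely mirroring the polynomial/exponential/proportional distinction:

```python
from math import log2

def count_accepted(transitions, start, accepting, alphabet, n):
    """Count the strings of length n accepted by a DFA, via dynamic programming.

    transitions: dict mapping (state, symbol) -> next state (missing = dead).
    """
    counts = {start: 1}  # number of length-k prefixes reaching each state
    for _ in range(n):
        nxt = {}
        for state, c in counts.items():
            for sym in alphabet:
                t = transitions.get((state, sym))
                if t is not None:
                    nxt[t] = nxt.get(t, 0) + c
        counts = nxt
    return sum(c for s, c in counts.items() if s in accepting)

# Three toy languages over {0, 1} (illustrative, not from the paper):
# "only 1s"      -> exactly one accepted string per length (growth rate 0)
# "no 00 factor" -> Fibonacci-style growth (rate strictly between 0 and 1 bit)
# "all strings"  -> 2^n accepted strings (rate = 1 bit per symbol)
only_ones = {("q0", "1"): "q0"}
no_00 = {("a", "1"): "a", ("a", "0"): "b", ("b", "1"): "a"}
all_str = {("s", "0"): "s", ("s", "1"): "s"}

n = 20
for name, (trans, start, acc) in {
    "only 1s": (only_ones, "q0", {"q0"}),
    "no 00": (no_00, "a", {"a", "b"}),
    "all": (all_str, "s", {"s"}),
}.items():
    c = count_accepted(trans, start, acc, {"0", "1"}, n)
    print(name, c, round(log2(c) / n, 3))
```

Under this crude measure, a "more complex" grammar is one whose accepted-string count approaches the 2^n ceiling, which is consistent with the paper's finding that such grammars are harder for recurrent networks to learn.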
Keywords: entropy; regular grammar classification; complexity analysis; recurrent neural network
MDPI and ACS Style

Zhang, K.; Wang, Q.; Giles, C.L. An Entropy Metric for Regular Grammar Classification and Learning with Recurrent Neural Networks. Entropy 2021, 23, 127. https://doi.org/10.3390/e23010127

