Entropy and Information Theory

A special issue of Axioms (ISSN 2075-1680).

Deadline for manuscript submissions: closed (20 December 2017) | Viewed by 12158

Special Issue Editors


Guest Editor
Prof. Dr. Abbe Mowshowitz
Department of Computer Science, The City College of New York, New York, NY 10031, USA
Interests: network science; complexity of graphs and networks; dynamic distributed database systems; virtual organization; management and economics of information

Guest Editor
Prof. Dr. Matthias Dehmer
1. Institute for Intelligent Production, Faculty for Management, University of Applied Sciences Upper Austria, Campus Steyr, Wehrgrabengasse 1, 4040 Steyr, Austria
2. College of Artificial Intelligence, Nankai University, Tianjin 300071, China
Interests: applied mathematics; bioinformatics; data mining; machine learning; systems biology; graph theory; complexity and information theory

Special Issue Information

Dear Colleagues,

The relationship between entropy and information theory has a long history and has led to many important results, as well as to applications in fields outside of mathematics and physics. This Special Issue will focus on applications of Shannon information theory to the measurement of entropy in mathematical systems and models, primarily those based on graphs or networks. Theoretical papers, as well as those reporting on experimental results, are welcome. Suggested topics include:

  • Information-theoretic measures on complex networks
  • Shannon entropy measures on random graphs
  • Applications of information-theoretic measures in chemistry, biology, social sciences and the humanities
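As a deliberately minimal illustration of the kind of information-theoretic network measure in scope, consider the Shannon entropy of a graph's degree distribution. The sketch below is our own (the function name `degree_entropy` is hypothetical, not from any submission) and assumes a graph given as an edge list:

```python
import math
from collections import Counter

def degree_entropy(edges):
    """Shannon entropy (in bits) of a graph's degree distribution."""
    degree = Counter()
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1
    n = len(degree)
    # Probability that a uniformly chosen vertex has a given degree
    dist = Counter(degree.values())
    return -sum((c / n) * math.log2(c / n) for c in dist.values())

# A 4-cycle is degree-regular, so its entropy is zero;
# a star on 4 vertices mixes two degree values.
cycle = [(0, 1), (1, 2), (2, 3), (3, 0)]
star = [(0, 1), (0, 2), (0, 3)]
```

Degree-regular graphs minimize such measures; measures of this general form (entropies of vertex partitions) recur throughout the literature on graph entropy.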

Prof. Dr. Abbe Mowshowitz
Prof. Dr. Matthias Dehmer
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Axioms is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • entropy-based information measures
  • entropy of graphs and networks
  • applications of entropy-based information measures

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (2 papers)


Research

Article
Toward Measuring Network Aesthetics Based on Symmetry
by Zengqiang Chen, Matthias Dehmer, Frank Emmert-Streib, Abbe Mowshowitz and Yongtang Shi
Axioms 2017, 6(2), 12; https://doi.org/10.3390/axioms6020012 - 6 May 2017
Cited by 5 | Viewed by 4911
Abstract
In this exploratory paper, we discuss quantitative graph-theoretical measures of network aesthetics. Related work in this area has typically focused on geometrical features (e.g., line crossings or edge bendiness) of drawings or visual representations of graphs which purportedly affect an observer's perception. Here we take a very different approach, abandoning reliance on geometrical properties, and apply information-theoretic measures to abstract graphs and networks directly (rather than to their visual representations) as a means of capturing classical appreciation of structural symmetry. Examples are used solely to motivate the approach to measurement, and to elucidate our symmetry-based mathematical theory of network aesthetics.
(This article belongs to the Special Issue Entropy and Information Theory)
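To make the symmetry-based idea concrete: one classical graph entropy assigns a graph the Shannon entropy of its vertex partition into automorphism orbits, which vanishes for vertex-transitive (fully symmetric) graphs. The brute-force sketch below illustrates that measure under our own naming; it is not the authors' implementation, and it is feasible only for very small graphs:

```python
import math
from itertools import permutations

def orbit_entropy(n, edges):
    """Entropy (bits) of the partition of n vertices into automorphism
    orbits, computed by checking every permutation (small graphs only)."""
    eset = {frozenset(e) for e in edges}
    autos = [p for p in permutations(range(n))
             if {frozenset((p[u], p[v])) for u, v in edges} == eset]
    orbits, seen = [], set()
    for v in range(n):
        if v in seen:
            continue
        orbit = {p[v] for p in autos}  # all images of v under the group
        orbits.append(orbit)
        seen |= orbit
    return -sum((len(o) / n) * math.log2(len(o) / n) for o in orbits)

# A path on 4 vertices splits into two orbits (ends vs. middles),
# while the 4-cycle is vertex-transitive and has a single orbit.
path = [(0, 1), (1, 2), (2, 3)]
cycle = [(0, 1), (1, 2), (2, 3), (3, 0)]
```

Lower orbit entropy corresponds to greater structural symmetry, which is the intuition the paper builds its aesthetic measures on.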

Article
Expansion of the Kullback-Leibler Divergence, and a New Class of Information Metrics
by David J. Galas, Gregory Dewey, James Kunert-Graf and Nikita A. Sakhanenko
Axioms 2017, 6(2), 8; https://doi.org/10.3390/axioms6020008 - 1 Apr 2017
Cited by 21 | Viewed by 6446
Abstract
Inferring and comparing complex, multivariable probability density functions is fundamental to problems in several fields, including probabilistic learning, network theory, and data analysis. Classification and prediction are the two faces of this class of problem. This study takes an approach that simplifies many aspects of these problems by presenting a structured series expansion of the Kullback-Leibler divergence, a function central to information theory, and devising a distance metric based on this divergence. Using the Möbius inversion duality between multivariable entropies and multivariable interaction information, we express the divergence as an additive series in the number of interacting variables, which provides a restricted and simplified set of distributions to use as approximations and with which to model data. Truncations of this series yield approximations based on the number of interacting variables. The first few terms of the expansion-truncation are illustrated and shown to lead naturally to familiar approximations, including the well-known Kirkwood superposition approximation. Truncation can also induce a simple relation between the multi-information and the interaction information. A measure of distance between distributions, based on the Kullback-Leibler divergence, is then described and shown to be a true metric if properly restricted. The expansion is shown to generate a hierarchy of metrics and connects this work to information geometry formalisms. An example applying these metrics to a graph comparison problem shows that the formalism can be applied to a wide range of network problems, providing a general approach for systematic approximations in the number of interactions or connections, as well as a related quantitative metric.
(This article belongs to the Special Issue Entropy and Information Theory)
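As background for the divergence the paper expands: the Kullback-Leibler divergence between discrete distributions is non-negative and vanishes only when the distributions coincide, but it is asymmetric, which is why a genuine metric has to be constructed from it rather than taken directly. A minimal illustration (the helper name `kl` is ours, not code from the paper):

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) in bits, for discrete
    distributions given as equal-length probability lists."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.9, 0.1]
q = [0.5, 0.5]
# Non-negative and zero iff p == q, but asymmetric:
# kl(p, q) and kl(q, p) generally differ, so D itself is not a metric.
d_pq, d_qp = kl(p, q), kl(q, p)
```

The asymmetry shown here is exactly the obstacle the paper's restricted construction removes in order to obtain a true distance metric.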
