Special Issue "Entropy and Complexity of Data"

A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: closed (30 June 2018)

Special Issue Editor

Guest Editor
Prof. Dr. Alexander N. Gorban

Department of Mathematics, University of Leicester, Leicester LE1 7RH, UK
Phone: +441162231433
Interests: neural networks; chemical and biological kinetics; human adaptation to hard living conditions; methods and technologies of collective thinking

Special Issue Information

Dear Colleagues,

Discovery of interesting knowledge from complex databases is one of the most important applied problems and theoretical challenges in data science and technology. Many approaches to the notions of “interestingness” and “complexity” have been proposed during the last three decades. Nevertheless, we have to return to these topics again and again, because no satisfactory general theory exists that meets all of the challenges. We need to evaluate the complexity of high-dimensional and large datasets and reveal what interesting knowledge exists inside them.

Many of the approaches to interestingness and complexity are based on entropy and related notions. The modern development of the theory of measure concentration effects in high dimensions brings new light and new challenges: essentially multidimensional datasets are boring from most points of view, and the discovery of reliable and interesting knowledge from them becomes a very difficult task.

Dimensionality reduction is a fundamental part of complexity analysis and facilitates the extraction of interesting and new information. Two limit cases have been studied in some depth: (i) reducible complexity, which allows for the extraction of non-trivial, low-dimensional structures, and (ii) self-averaging complexity, which reduces to the simple analysis of multidimensional spheres and normal distributions. The no-man's-land between these extremes may well be very important in the development of complex artificial intelligence (AI) and in the study of real and artificial neural nets.
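The self-averaging limit can be seen directly in a few lines of NumPy: pairwise distances between i.i.d. Gaussian points concentrate around sqrt(2d) as the dimension d grows, so a generic high-dimensional cloud looks like a featureless sphere. All sizes below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200   # number of sample points

spreads = {}
for d in (2, 100, 10000):
    X = rng.standard_normal((n, d))
    # Pairwise distances via |x - y|^2 = |x|^2 + |y|^2 - 2 x.y
    sq = (X ** 2).sum(axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    dist = np.sqrt(np.maximum(d2, 0.0)[np.triu_indices(n, k=1)])
    spreads[d] = dist.std() / dist.mean()   # relative spread of distances

print(spreads)   # relative spread decays roughly like 1/sqrt(2d)
```

In the d = 10000 run, every pair of points is nearly equidistant: exactly the "boring" regime in which naive distance-based mining finds nothing interesting.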

We invite papers on entropic methods in the analysis of the complexity of high-dimensional data and the extraction of interesting knowledge from them. Related topics from the theory and methods of AI, neural networks, and mathematical neuroscience are also welcome. We also encourage contributions that explore the connections of measure concentration theory with the complexity of high-dimensional data, the interestingness of knowledge, and the curse and blessing of dimensionality.

Prof. Dr. Alexander N. Gorban
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1500 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (4 papers)


Research

Open Access Article: Investigation of Finite-Size 2D Ising Model with a Noisy Matrix of Spin-Spin Interactions
Entropy 2018, 20(8), 585; https://doi.org/10.3390/e20080585
Received: 20 June 2018 / Revised: 13 July 2018 / Accepted: 2 August 2018 / Published: 7 August 2018
Abstract
We analyze changes in the thermodynamic properties of a spin system when it passes from the classical two-dimensional Ising model to the spin glass model, where spin-spin interactions are random in their values and signs. Formally, the transition reduces to a gradual change in the amplitude of the multiplicative noise (distributed uniformly with a mean equal to one) superimposed over the initial Ising matrix of interacting spins. Accounting for the noise, we obtain analytical expressions that are valid for lattices of finite sizes. We compare our results with the results of computer simulations performed for square N = L × L lattices with linear dimensions L = 50–1000. We find experimentally the dependencies of the critical values (the critical temperature, the internal energy, the entropy, and the specific heat), as well as the dependencies of the ground-state energy and magnetization, on the amplitude of the noise. We show that when the variance of the noise reaches one, the ground state jumps from a fully correlated state to an uncorrelated state and its magnetization jumps from 1 to 0. At the same time, the phase transition that is present at lower noise levels disappears.
(This article belongs to the Special Issue Entropy and Complexity of Data)
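The construction studied here, classical 2D Ising couplings multiplied by uniform noise with mean one, is easy to sketch as a toy Metropolis simulation. The parameterization of the noise amplitude, the lattice size, and the temperature below are illustrative assumptions, not the paper's settings:

```python
import numpy as np

rng = np.random.default_rng(0)

L = 16        # linear lattice size (toy scale; the paper uses L = 50-1000)
a = 0.5       # assumed half-width of the uniform noise around 1
T = 2.0       # temperature (the pure-Ising critical point is ~2.269)

# Multiplicative noise on the ferromagnetic bonds: uniform with mean 1,
# so couplings can change sign once the noise is wide enough.
Jx = rng.uniform(1 - a, 1 + a, size=(L, L))   # bond (i, j)-(i, j+1)
Jy = rng.uniform(1 - a, 1 + a, size=(L, L))   # bond (i, j)-(i+1, j)

spins = rng.choice([-1, 1], size=(L, L))

def local_field(s, i, j):
    """Sum of bond couplings times neighboring spins (periodic boundaries)."""
    return (Jx[i, j] * s[i, (j + 1) % L]
            + Jx[i, (j - 1) % L] * s[i, (j - 1) % L]
            + Jy[i, j] * s[(i + 1) % L, j]
            + Jy[(i - 1) % L, j] * s[(i - 1) % L, j])

def metropolis_sweep(s, T):
    """One Monte Carlo sweep: L*L single-spin-flip Metropolis updates."""
    for _ in range(L * L):
        i, j = rng.integers(0, L, size=2)
        dE = 2.0 * s[i, j] * local_field(s, i, j)   # energy cost of flipping
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            s[i, j] = -s[i, j]

for _ in range(200):
    metropolis_sweep(spins, T)

m = abs(spins.mean())   # magnetization per spin, in [0, 1]
```

Sweeping `a` (and with it the noise variance) is what interpolates between the pure Ising model and the spin glass regime described in the abstract.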

Open Access Article: Energy and Entropy Measures of Fuzzy Relations for Data Analysis
Entropy 2018, 20(6), 424; https://doi.org/10.3390/e20060424
Received: 18 April 2018 / Revised: 25 May 2018 / Accepted: 30 May 2018 / Published: 31 May 2018
Abstract
We present a new method for assessing the strength of fuzzy rules with respect to a dataset, based on the measures of the greatest energy and smallest entropy of a fuzzy relation. Considering a fuzzy automaton (relation) in which A is the input fuzzy set and B the output fuzzy set, the fuzzy relation R1 with the greatest energy provides information about the greatest strength of the input-output relationship, and the fuzzy relation R2 with the smallest entropy provides information about the uncertainty of the input-output relationship. We consider a new index of the fuzziness of the input-output relationship based on R1 and R2. In our method, this index is calculated for each pair of input and output fuzzy sets in a fuzzy rule. A threshold value is set in order to choose the most relevant fuzzy rules with respect to the data.
(This article belongs to the Special Issue Entropy and Complexity of Data)
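The energy and entropy measures lend themselves to a compact sketch. The paper's exact definitions may differ; a common convention takes the energy of a relation as the sum of squared membership grades and the entropy as the De Luca-Termini fuzzy entropy, with the relation built from A and B by the min t-norm:

```python
import numpy as np

def fuzzy_relation(A, B):
    # Relation via the min t-norm: R(x, y) = min(A(x), B(y)).
    # (An illustrative choice of composition.)
    return np.minimum.outer(A, B)

def energy(R):
    # Energy as the sum of squared membership grades.
    return float(np.sum(R ** 2))

def entropy(R):
    # De Luca-Termini fuzzy entropy, with the convention 0*log(0) = 0.
    mu = np.clip(R, 1e-12, 1 - 1e-12)
    return float(-np.sum(mu * np.log(mu) + (1 - mu) * np.log(1 - mu)))

A = np.array([0.2, 0.8, 1.0])   # input fuzzy set (membership grades)
B = np.array([0.5, 0.9])        # output fuzzy set

R = fuzzy_relation(A, B)
print(energy(R))    # 2.03 for this R
print(entropy(R))   # positive: the relation is genuinely fuzzy
```

A crisp (0/1) relation has zero entropy under this measure, while grades near 0.5 contribute the most, which is why a small-entropy relation signals a less uncertain input-output link.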

Open Access Article: Do We Really Need to Catch Them All? A New User-Guided Social Media Crawling Method
Entropy 2017, 19(12), 686; https://doi.org/10.3390/e19120686
Received: 18 October 2017 / Revised: 28 November 2017 / Accepted: 11 December 2017 / Published: 13 December 2017
Abstract
With the growing use of popular social media services like Facebook and Twitter, it is challenging to collect all content from the networks without access to the core infrastructure or paying for it. Thus, if all content cannot be collected, one must consider which data are of most importance. In this work, we present a novel User-guided Social Media Crawling method (USMC) that is able to collect data from social media, utilizing the wisdom of the crowd to decide the order in which user-generated content should be collected so as to cover as many user interactions as possible. USMC is validated by crawling 160 public Facebook pages, containing content from 368 million users and 1.3 billion interactions, and it is compared with two other crawling methods. The results show that it is possible to cover approximately 75% of the interactions on a Facebook page by sampling just 20% of its posts, while reducing the crawling time by 53%. In addition, the social network constructed from the 20% sample contains more than 75% of the users and edges of the social network created from all posts, and it has a similar degree distribution.
(This article belongs to the Special Issue Entropy and Complexity of Data)
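The headline number, about 75% of interactions from only 20% of posts, is what one expects when interactions per post are heavy-tailed and the crawler fetches the most popular posts first. A toy sketch (the Zipf distribution and all sizes are assumptions for illustration, not the paper's data):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical page: interaction counts per post, drawn heavy-tailed
# the way real social media engagement tends to be.
interactions = rng.zipf(2.0, size=1000)

# Crawl the most-interacted posts first and stop after 20% of them.
order = np.argsort(interactions)[::-1]
sample = order[: len(order) // 5]

coverage = interactions[sample].sum() / interactions.sum()
print(f"20% of posts cover {coverage:.0%} of interactions")
```

USMC cannot sort by true interaction counts before crawling; it estimates post importance from crowd signals, so the real ordering is noisier than this idealized one.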

Open Access Article: Capturing Causality for Fault Diagnosis Based on Multi-Valued Alarm Series Using Transfer Entropy
Entropy 2017, 19(12), 663; https://doi.org/10.3390/e19120663
Received: 1 November 2017 / Revised: 24 November 2017 / Accepted: 29 November 2017 / Published: 4 December 2017
Abstract
Transfer entropy (TE) is a model-free approach, based on information theory, to capture causality between variables; it has been used for the modeling and monitoring of, and fault diagnosis in, complex industrial processes. It is able to detect causality between variables without assuming any underlying model, but it is computationally burdensome. To overcome this limitation, a hybrid method combining TE with a modified conditional mutual information (CMI) approach is proposed, using generated multi-valued alarm series. To obtain a process topology, TE generates a causal map of all sub-processes, and the modified CMI is then used to distinguish direct connectivity within that causal map using the multi-valued alarm series. The effectiveness and accuracy of the proposed method in capturing process topology from multi-valued alarm series are validated on simulated and real industrial cases (the Tennessee Eastman process).
(This article belongs to the Special Issue Entropy and Complexity of Data)
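Transfer entropy on discrete (multi-valued) alarm series reduces to counting joint symbol frequencies. Below is a minimal plug-in estimator with history length 1; this is a simplification, as practical TE analysis typically uses longer histories and bias correction:

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in estimate of TE(x -> y) for discrete series, history length 1:
    TE = sum over (y1, y0, x0) of p(y1, y0, x0) * log2(p(y1|y0, x0) / p(y1|y0))."""
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))
    pairs_yx = Counter(zip(y[:-1], x[:-1]))
    pairs_yy = Counter(zip(y[1:], y[:-1]))
    singles_y = Counter(y[:-1])
    n = len(y) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_full = c / pairs_yx[(y0, x0)]               # p(y1 | y0, x0)
        p_marg = pairs_yy[(y1, y0)] / singles_y[y0]   # p(y1 | y0)
        te += p_joint * np.log2(p_full / p_marg)
    return te

# Toy multi-valued alarm series: y copies x with a one-step lag, so
# information flows from x to y but not the other way around.
rng = np.random.default_rng(0)
x = rng.integers(0, 3, size=5000)   # alarm levels 0, 1, 2
y = np.empty_like(x)
y[0] = 0
y[1:] = x[:-1]

print(transfer_entropy(x, y))   # close to log2(3) ~ 1.585 bits
print(transfer_entropy(y, x))   # close to 0
```

This asymmetry is what lets TE orient edges in a causal map of sub-processes; the modified CMI step then prunes indirect links from that map.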
