Entropy and Complexity of Data

A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: closed (30 June 2018) | Viewed by 19855

Special Issue Information

Dear Colleagues,

Discovery of interesting knowledge from complex databases is one of the most important applied problems and theoretical challenges in data science and technology. Many approaches to the notions of “interestingness” and “complexity” have been proposed during the last three decades. Nevertheless, we have to return to these topics again and again, because no satisfactory general theory that meets all of these challenges yet exists. We need to evaluate the complexity of high-dimensional, large datasets and reveal what interesting knowledge exists inside them.

Many of the approaches to interestingness and complexity are based on entropy and related notions. The modern development of the theory of measure concentration effects in high dimensions brings new light and new challenges: essentially multidimensional datasets are boring from most points of view, and the discovery of reliable and interesting knowledge from them becomes a very difficult task.
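This concentration effect is easy to observe numerically. The following sketch (added here for illustration; it is not part of the Special Issue) shows that pairwise distances between random points concentrate around their mean as the dimension grows, so a generic high-dimensional point cloud looks nearly featureless:

```python
# Illustration of measure concentration: the relative spread of pairwise
# distances between i.i.d. Gaussian points shrinks as the dimension grows.
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)

for dim in (2, 20, 200, 2000):
    points = rng.standard_normal((500, dim))  # 500 random points in R^dim
    dists = pdist(points)                     # all unique pairwise distances
    # std/mean of the distance distribution decays roughly like 1/sqrt(dim):
    print(f"dim={dim:5d}  mean={dists.mean():8.2f}  "
          f"std/mean={dists.std() / dists.mean():.4f}")
```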

Dimensionality reduction is a fundamental part of complexity analysis and facilitates the extraction of interesting and new information. Two limit cases have been studied in some depth: (i) reducible complexity, which allows for the extraction of non-trivial, low-dimensional structures, and (ii) self-averaging complexity, which reduces to the simple analysis of multidimensional spheres and normal distributions. The no-man’s land between these extremes may well be very important in the development of complex artificial intelligence (AI) and in the study of real and artificial neural nets.

We invite papers on entropic methods in the analysis of the complexity of high-dimensional data and the extraction of interesting knowledge from them. Related topics from the theory and methods of AI, neural networks, and the mathematical neurosciences are also welcome. We also encourage contributions that explore the connections of measure concentration theory with the complexity of high-dimensional data, the interestingness of knowledge, and the curse and blessing of dimensionality.

Prof. Dr. Alexander N. Gorban
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (4 papers)

Research

13 pages, 5748 KiB  
Article
Investigation of Finite-Size 2D Ising Model with a Noisy Matrix of Spin-Spin Interactions
by Boris Kryzhanovsky, Magomed Malsagov and Iakov Karandashev
Entropy 2018, 20(8), 585; https://doi.org/10.3390/e20080585 - 7 Aug 2018
Cited by 7 | Viewed by 5019
Abstract
We analyze changes in the thermodynamic properties of a spin system when it passes from the classical two-dimensional Ising model to the spin glass model, where spin-spin interactions are random in their values and signs. Formally, the transition reduces to a gradual change in the amplitude of the multiplicative noise (distributed uniformly with a mean equal to one) superimposed over the initial Ising matrix of interacting spins. Accounting for the noise, we obtain analytical expressions that are valid for lattices of finite size. We compare our results with computer simulations performed for square N = L × L lattices with linear dimensions L = 50–1000. We find experimentally the dependences of the critical values (the critical temperature, the internal energy, entropy, and the specific heat), as well as of the ground-state energy and magnetization, on the amplitude of the noise. We show that when the variance of the noise reaches one, the ground state jumps from the fully correlated state to an uncorrelated state and its magnetization jumps from 1 to 0. At the same time, the phase transition that is present at lower noise levels disappears.
(This article belongs to the Special Issue Entropy and Complexity of Data)
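As a rough sketch of the model studied here (an illustration, not the authors' code), one can run single-spin Metropolis updates on a 2D lattice whose bonds carry multiplicative uniform noise with mean 1. The parameterization of the noise amplitude below is an assumption; the paper defines its own amplitude convention:

```python
# Sketch of a 2D Ising model with multiplicative bond noise of mean 1,
# sampled with Metropolis updates (periodic boundary conditions).
import numpy as np

rng = np.random.default_rng(1)

def noisy_ising_magnetization(L=32, a=0.5, T=2.0, sweeps=100):
    """a = standard deviation of the uniform noise on the couplings,
    i.e. eta ~ U(1 - a*sqrt(3), 1 + a*sqrt(3)) with mean 1 (our convention)."""
    spins = rng.choice([-1, 1], size=(L, L))
    half = a * np.sqrt(3.0)
    Jh = rng.uniform(1 - half, 1 + half, size=(L, L))  # bond (i,j)-(i,j+1)
    Jv = rng.uniform(1 - half, 1 + half, size=(L, L))  # bond (i,j)-(i+1,j)
    for _ in range(sweeps):
        for i in range(L):
            for j in range(L):
                # Energy change of flipping spin (i, j):
                dE = 2 * spins[i, j] * (
                    Jh[i, j] * spins[i, (j + 1) % L]
                    + Jh[i, (j - 1) % L] * spins[i, (j - 1) % L]
                    + Jv[i, j] * spins[(i + 1) % L, j]
                    + Jv[(i - 1) % L, j] * spins[(i - 1) % L, j]
                )
                if dE <= 0 or rng.random() < np.exp(-dE / T):
                    spins[i, j] *= -1
    return abs(spins.mean())

# Below the clean-model critical temperature (T_c ~ 2.27), increasing the
# noise amplitude should degrade the ordered phase:
for a in (0.0, 0.5, 1.0):
    print(f"noise std {a:.1f}: |m| = {noisy_ising_magnetization(a=a):.3f}")
```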

12 pages, 2601 KiB  
Article
Energy and Entropy Measures of Fuzzy Relations for Data Analysis
by Ferdinando Di Martino and Salvatore Sessa
Entropy 2018, 20(6), 424; https://doi.org/10.3390/e20060424 - 31 May 2018
Cited by 3 | Viewed by 3946
Abstract
We present a new method for assessing the strength of fuzzy rules with respect to a dataset, based on measures of the greatest energy and smallest entropy of a fuzzy relation. Considering a fuzzy automaton (relation), in which A is the input fuzzy set and B the output fuzzy set, the fuzzy relation R1 with the greatest energy provides information about the greatest strength of the input-output relationship, and the fuzzy relation R2 with the smallest entropy provides information about the uncertainty of that relationship. We consider a new index of the fuzziness of the input-output relationship based on R1 and R2. In our method, this index is calculated for each pair of input and output fuzzy sets in a fuzzy rule. A threshold value is set in order to choose the most relevant fuzzy rules with respect to the data.
(This article belongs to the Special Issue Entropy and Complexity of Data)
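Energy and entropy measures of a fuzzy relation have standard forms in the fuzzy-set literature. A minimal sketch, assuming energy as the mean squared membership and the De Luca–Termini entropy (the paper's exact definitions and normalizations may differ):

```python
# Energy and entropy of a fuzzy relation R (a matrix of membership
# degrees in [0, 1]): near-crisp relations have high energy and low
# entropy; memberships near 0.5 maximize the entropy.
import numpy as np

def energy(R):
    return float(np.mean(np.square(R)))

def entropy(R):
    r = np.clip(R, 1e-12, 1 - 1e-12)  # avoid log(0)
    return float(np.mean(-r * np.log(r) - (1 - r) * np.log(1 - r)))

R1 = np.array([[0.95, 0.05], [0.10, 0.90]])  # near-crisp relation
R2 = np.full((2, 2), 0.5)                    # maximally fuzzy relation

for name, R in (("near-crisp", R1), ("maximally fuzzy", R2)):
    print(f"{name}: energy={energy(R):.3f}, entropy={entropy(R):.3f}")
```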

774 KiB  
Article
Do We Really Need to Catch Them All? A New User-Guided Social Media Crawling Method
by Fredrik Erlandsson, Piotr Bródka, Martin Boldt and Henric Johnson
Entropy 2017, 19(12), 686; https://doi.org/10.3390/e19120686 - 13 Dec 2017
Cited by 5 | Viewed by 5731
Abstract
With the growing use of popular social media services like Facebook and Twitter, it is challenging to collect all content from the networks without access to the core infrastructure or paying for it. Thus, if all content cannot be collected, one must consider which data are of most importance. In this work we present a novel User-guided Social Media Crawling method (USMC) that is able to collect data from social media, utilizing the wisdom of the crowd to decide the order in which user-generated content should be collected so as to cover as many user interactions as possible. USMC is validated by crawling 160 public Facebook pages, containing content from 368 million users and 1.3 billion interactions, and it is compared with two other crawling methods. The results show that it is possible to cover approximately 75% of the interactions on a Facebook page by sampling just 20% of its posts, while at the same time reducing the crawling time by 53%. In addition, the social network constructed from the 20% sample contains more than 75% of the users and edges of the social network created from all posts, and it has a similar degree distribution.
(This article belongs to the Special Issue Entropy and Complexity of Data)
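The core idea, prioritizing posts by a cheap crowd signal so that a small sample covers most interactions, can be sketched as follows. This is a toy simulation with hypothetical data, not the USMC implementation:

```python
# Toy model of user-guided crawling: if a cheap "wisdom of the crowd"
# score (e.g. a like count visible in the page listing) correlates with
# a post's total interactions, crawling posts in descending score order
# covers far more interactions than a random sample of the same size.
import random

random.seed(0)

# Hypothetical posts: heavy-tailed interaction counts, noisy proxy score.
posts = []
for _ in range(10_000):
    interactions = int(random.paretovariate(1.3))    # true interaction count
    score = interactions * random.uniform(0.5, 1.5)  # cheap crowd signal
    posts.append((score, interactions))

total = sum(inter for _, inter in posts)
budget = len(posts) // 5                             # crawl only 20% of posts

covered_random = sum(i for _, i in random.sample(posts, budget))
covered_guided = sum(i for _, i in sorted(posts, reverse=True)[:budget])

print(f"random 20% sample covers {covered_random / total:6.1%} of interactions")
print(f"score-guided 20% covers  {covered_guided / total:6.1%} of interactions")
```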

6342 KiB  
Article
Capturing Causality for Fault Diagnosis Based on Multi-Valued Alarm Series Using Transfer Entropy
by Jianjun Su, Dezheng Wang, Yinong Zhang, Fan Yang, Yan Zhao and Xiangkun Pang
Entropy 2017, 19(12), 663; https://doi.org/10.3390/e19120663 - 4 Dec 2017
Cited by 22 | Viewed by 4507
Abstract
Transfer entropy (TE) is a model-free approach based on information theory to capture causality between variables, which has been used for the modeling and monitoring of, and fault diagnosis in, complex industrial processes. It is able to detect the causality between variables without assuming any underlying model, but it is computationally burdensome. To overcome this limitation, a hybrid method of TE and a modified conditional mutual information (CMI) approach is proposed, using generated multi-valued alarm series. To obtain a process topology, TE generates a causal map of all sub-processes, and the modified CMI is used to distinguish direct connectivity within that causal map using the multi-valued alarm series. The effectiveness and accuracy of the proposed method in capturing process topology from multi-valued alarm series are validated on simulated and real industrial cases (the Tennessee-Eastman process).
(This article belongs to the Special Issue Entropy and Complexity of Data)
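For discrete multi-valued series, transfer entropy with lag-1 histories has the standard form TE(X→Y) = Σ p(y_{t+1}, y_t, x_t) log[ p(y_{t+1} | y_t, x_t) / p(y_{t+1} | y_t) ]. A minimal plug-in estimator (a sketch, not the authors' implementation):

```python
# Plug-in estimate of lag-1 transfer entropy between two discrete series.
import random
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """Transfer entropy from x to y in bits, with history length 1."""
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_{t+1}, y_t, x_t)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))         # (y_t, x_t)
    pairs_yy = Counter(zip(y[1:], y[:-1]))          # (y_{t+1}, y_t)
    singles = Counter(y[:-1])                       # y_t
    n = len(x) - 1
    te = 0.0
    for (yn, yp, xp), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(yp, xp)]              # p(y+ | y, x)
        p_cond_self = pairs_yy[(yn, yp)] / singles[yp]    # p(y+ | y)
        te += p_joint * log2(p_cond_full / p_cond_self)
    return te

# Toy 3-level alarm series in which y follows x with one step of delay:
random.seed(0)
x = [random.randrange(3) for _ in range(5000)]
y = [0] + [xi if random.random() < 0.8 else random.randrange(3) for xi in x[:-1]]
print(f"TE(x -> y) = {transfer_entropy(x, y):.3f} bits")  # clearly positive
print(f"TE(y -> x) = {transfer_entropy(y, x):.3f} bits")  # near zero
```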
