Special Issue "Information-Theoretic Approaches to Computational Intelligence"
Deadline for manuscript submissions: 20 November 2019.
Computational Intelligence (CI) concerns information processing with biologically or linguistically inspired methods, such as neural networks, evolutionary computing, fuzzy logic, probabilistic and Bayesian methods, and learning theory. In contrast to this method-centered definition of CI, information theory concerns the fundamental limits of information processing, irrespective of the employed method. Nevertheless, information theory offers a great selection of quantities capturing statistical dependencies and similarities that have found applications as diverse as the analysis of cognitive processes and the design of man-made systems.
An information-theoretic approach to CI should therefore concern the theory-driven development or theoretical analysis of CI methods using information-theoretic quantities. Specifically, such an approach should answer the “what”, “how”, and “why”: What happens to a piece of information that is processed by a CI method? How is this information processing achieved? Why should information be processed in exactly this way? In this Special Issue, we thus seek contributions on:
- Input-to-output behavior of CI methods (e.g., the effect of a CI method on the information content of the processed signal). Theoretical or experimental analyses using information-theoretic quantities are welcome.
- Dynamic behavior of CI methods (e.g., information propagation in evolutionary computing, information-theoretic aspects of neural network training). Theoretical or experimental analyses using information-theoretic quantities are welcome.
- Development of CI methods using information-theoretic cost functions (e.g., information-maximizing CI methods, CI methods based on rate-distortion theory). Application-specific experimental analyses of information-theoretic cost functions are welcome, provided that the proposed cost functions are justified from first principles and rigorously developed.
We wish to mention that the concurrent Special Issue “Information-Theoretic Approaches in Deep Learning” partially overlaps with ours. We therefore reserve the right to prescreen submissions focused on neural networks and forward them to that Special Issue if they are a better fit there.
Prof. Dr. Gernot Kubin
Dr. Bernhard C. Geiger
Manuscript Submission Information
Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, the submission form is available on the website. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.
Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.
Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.
- evolutionary computing
- neural networks
- probabilistic methods
- fuzzy logic
- learning theory
- information-theoretic cost functions
- input–output behavior
The list below represents only planned manuscripts. Some of these manuscripts have not yet been received by the Editorial Office. Papers submitted to MDPI journals are subject to peer review.
Title: Information Theoretic Concept Drift Detector
Authors: Shujian Yu, Francisco J. Valverde-Albacete, Carmen Pelaez-Moreno, Jose C. Principe
Abstract: Error-based methods have been widely used to detect concept drift (i.e., changes in the joint distribution of predictor and response variables) in streaming data. The choice of error-related statistic is therefore crucial to the performance of a concept drift detector. In this paper, we suggest three novel error-related statistics from the entropy triangle, namely the redundancy, the mutual information, and the variation of information, and use Hoeffding’s inequality to monitor these three statistics over a sliding window. Experimental results, on both synthetic and real data, suggest that our new statistics perform well across different concept drift types (e.g., gradual or abrupt, recurrent or irregular) and different data stream distributions (e.g., balanced and imbalanced labels), regardless of the number of categories.
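To illustrate the general mechanism the abstract refers to (not the authors' actual detector or their entropy-triangle statistics), the sketch below monitors a generic statistic bounded in [0, 1] over a sliding window and flags drift when the means of the older and newer halves of the window differ by more than a Hoeffding bound. The class name, window size, and confidence parameter are illustrative choices, not taken from the paper.

```python
import math
from collections import deque


class HoeffdingDriftDetector:
    """Toy sliding-window drift detector for a statistic bounded in [0, 1].

    Flags drift when the means of the older and newer halves of the
    window differ by more than a Hoeffding bound. This is a generic
    illustration of the monitoring idea, not the paper's method.
    """

    def __init__(self, window_size=100, delta=0.05):
        self.window = deque(maxlen=window_size)
        self.delta = delta  # tolerated false-alarm probability

    def update(self, x):
        """Add one observation of the statistic; return True if drift is flagged."""
        self.window.append(x)
        n = len(self.window)
        if n < 2:
            return False
        half = n // 2
        values = list(self.window)
        old, new = values[:half], values[half:]
        mean_old = sum(old) / len(old)
        mean_new = sum(new) / len(new)
        # Hoeffding/McDiarmid bound for the difference of two sample means
        # of [0, 1]-bounded variables:
        #   P(|mean_old - mean_new| >= eps) <= 2 * exp(-2 * eps^2 * m),
        # with m the harmonic-style combination of the sub-window sizes.
        m = 1.0 / (1.0 / len(old) + 1.0 / len(new))
        eps = math.sqrt(math.log(2.0 / self.delta) / (2.0 * m))
        return abs(mean_old - mean_new) > eps
```

Feeding a stream whose monitored statistic jumps (say from around 0.1 to around 0.9) causes the detector to fire shortly after the change point, once enough post-change samples fill the newer half of the window.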