Information Theory, Probability and Statistics
A section of Entropy (ISSN 1099-4300).
Section Information
In 1948, C. E. Shannon published his paper “A Mathematical Theory of Communication” in the Bell System Technical Journal. He showed how information could be quantified precisely, introducing the concept of entropy, and demonstrated the essential unity of all information media. In brief, he introduced four groundbreaking concepts that helped change the world. First, and perhaps most eminently, he showed that every communication channel has a speed limit, measured in binary digits per second. Second, he realized that the content of a message is irrelevant to its transmission: once data are represented digitally, they can be regenerated and transmitted without error. Third, he opened for discussion the question of how to represent data efficiently, i.e., source coding. Finally, his paper defined the amount of information that can be sent over a noisy channel in terms of transmit power and bandwidth, thus introducing the concept of channel capacity.
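As a brief recap of the quantities at stake here, stated in standard notation ($p(x)$ denotes the symbol probabilities of a discrete source $X$, $B$ the channel bandwidth in hertz, and $S/N$ the signal-to-noise ratio), the entropy of a source and the capacity of a band-limited Gaussian channel are

$$H(X) = -\sum_{x} p(x)\,\log_{2} p(x) \;\;\text{bits per symbol}, \qquad C = B \log_{2}\!\left(1 + \frac{S}{N}\right) \;\;\text{bits per second}.$$

For example, a fair coin toss carries exactly one bit, while a 3 kHz channel at 30 dB SNR supports roughly $3000 \log_{2}(1001) \approx 30$ kbit/s.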
Since then, the theory has been widely applied in numerous scenarios, such as statistical inference, natural language processing, cryptography, neurobiology, molecular engineering, ecology, medical physics, biomedical engineering, thermal physics, quantum computing, linguistics, plagiarism detection, pattern recognition and anomaly detection, among others. Indeed, in recent decades, it has played a key role in the invention of the compact disc, the feasibility of mobile phones, the development of the Internet, the study of linguistics and of human perception, and the understanding of black holes, as well as in numerous other fields.
This section focuses on original and new research results regarding this broad and deep mathematical theory, as well as on its diverse applications. Manuscripts on source coding, channel coding, algorithmic complexity theory, algorithmic information theory, information-theoretic security, and measures of information, as well as on their application to both traditional and novel scenarios, are solicited. Critical, up-to-date reviews are also welcome.
Prof. Dr. Raúl Alcaraz Martínez
Section Editor-in-Chief
Keywords
Information Theory:
- communications and communications networks;
- sequences;
- coding theory and techniques;
- network coding and lattice theory;
- quantum information theory;
- signal processing;
- Shannon theory;
- complexity and cryptography;
- data compression;
- multi-user, multivariate, and hyperdimensional information theory;
- coded modulation;
- computational complexity;
- information dynamics and measures;
- theoretical computer science and artificial intelligence;
- information theoretic learning;
- information fusion;
- fractional order generalized information;
- emerging applications of information theory in economics, medicine, biology, industry, thermodynamics, education, chemistry, physics, cognitive science, and social science;
- application of information theory in wireless/multimedia applications;
- application of information theory in image processing, computer graphics and visualization
Statistics:
- machine learning and its applications;
- deep learning and its applications;
- learning and inference;
- pattern recognition;
- statistical learning and data mining;
- information algebra/geometry;
- stochastic processes;
- computational statistics;
- statistical modeling;
- natural language processing;
- emerging applications of statistical theory in economics, medicine, biology, industry, thermodynamics, education, chemistry, physics, cognitive science, and social science
Probability:
- detection and estimation theory;
- probability theory;
- combinatorial problems in communications;
- decision-making theory;
- emerging applications of probability theory in economics, medicine, biology, industry, thermodynamics, education, chemistry, physics, cognitive science, and social science
Editorial Board
Topical Advisory Panel
Special Issues
The following special issues within this section are currently open for submissions:
- Probabilistic Models in Machine and Human Learning (Deadline: 14 June 2023)
- Advances in Uncertain Information Fusion (Deadline: 15 June 2023)
- Information Geometry for Machine Learning (Deadline: 20 June 2023)
- Information Theory for Distributed Systems (Deadline: 20 June 2023)
- Fairness in Machine Learning: Information Theoretic Perspectives (Deadline: 30 June 2023)
- Information-Theoretic Causal Inference and Discovery (Deadline: 30 June 2023)
- Information Theory in Control Systems (Deadline: 30 June 2023)
- Shannon Entropy: Mathematical View (Deadline: 30 June 2023)
- Measures of Information III (Deadline: 30 June 2023)
- Application of Information Theory to Computer Vision and Image Processing (Deadline: 3 July 2023)
- Recent Advances in Entropy and Divergence Measures, with Applications in Statistics and Machine Learning (Deadline: 13 July 2023)
- Representation Learning: Theory, Applications and Ethical Issues II (Deadline: 15 July 2023)
- Information-Theoretic Methods in Deep Learning: Theory and Applications (Deadline: 15 July 2023)
- Information Theory and Swarm Optimization in Decision and Control (Deadline: 19 July 2023)
- Progress and Research Challenges to Catalyze B5G and 6G (Deadline: 19 July 2023)
- Information Theory in Emerging Wireless Communication Systems and Networks (Deadline: 28 July 2023)
- Integrated Information Theory and Consciousness II (Deadline: 31 July 2023)
- Selected Papers from the ICACTCE’23 Conference (Deadline: 31 July 2023)
- Information Theory and 5G/6G Wireless Communication System (Deadline: 10 August 2023)
- Entropy Based Data Hiding and Its Applications (Deadline: 15 August 2023)
- Forward Error Correction for Optical CDMA Networks (Deadline: 15 August 2023)
- Information Theory in Computer Vision and Artificial Intelligence (Deadline: 20 August 2023)
- Applied Probability, Information Theory and Applications (Deadline: 31 August 2023)
- Information Security and Privacy: From IoT to IoV (Deadline: 31 August 2023)
- Coding and Entropy (Deadline: 31 August 2023)
- Recent Advances in Statistical Inference for High Dimensional Data (Deadline: 10 September 2023)
- Learning from Games and Contests (Deadline: 15 September 2023)
- Information-Theoretic Approaches in Speech Processing and Recognition (Deadline: 15 September 2023)
- Wireless Networks: Information Theoretic Perspectives III (Deadline: 20 September 2023)
- Information Theory in Image Processing and Pattern Recognition (Deadline: 21 September 2023)
- Information Theory and Coding for Wireless Communications II (Deadline: 28 September 2023)
- Information-Theoretic Criteria for Statistical Model Selection (Deadline: 30 September 2023)
- Advances in Multiuser Information Theory (Deadline: 30 September 2023)
- Information Network Mining and Applications (Deadline: 30 September 2023)
- Delay-Doppler Domain Communications for Future Wireless Networks (Deadline: 15 October 2023)
- Information Theory in Multi-Agent Systems: Methods and Applications (Deadline: 20 October 2023)
- Synergy and Redundancy Measures: Theory and Applications to Characterize Complex Systems and Shape Neural Network Representations (Deadline: 20 October 2023)
- Stochastic Models and Statistical Inference: Analysis and Applications (Deadline: 31 October 2023)
- Maximum Entropy and Bayesian Methods for Image and Spatial Analysis (Deadline: 31 October 2023)
- Probability, Entropy, Information, and Semiosis in Living Systems (Deadline: 31 October 2023)
- Foundations of Goal-Oriented Semantic Communication in Intelligent Networks (Deadline: 25 November 2023)
- Information-Theoretic Methods in Data Analytics (Deadline: 30 November 2023)
- Robust Distance Metric Learning in the Framework of Statistical Information Theory (Deadline: 30 November 2023)
- Information Theory-Based Approach to Portfolio Optimization (Deadline: 30 November 2023)
- Wireless Communications: Signal Processing Perspectives (Deadline: 10 December 2023)
- Information-Theoretic Privacy in Retrieval, Computing, and Learning (Deadline: 15 December 2023)
- Information and Self-Organization III (Deadline: 15 December 2023)
- Information Theory and Cognitive Agents (Deadline: 15 December 2023)
- Extremal and Additive Combinatorial Aspects in Information Theory (Deadline: 31 December 2023)
- Causal Inference and Causal AI: Machine Learning Meets Information Theory (Deadline: 31 December 2023)
- Information Theory for Interpretable Machine Learning (Deadline: 31 December 2023)
- Machine Learning and Causal Inference (Deadline: 30 January 2024)
- Information Theory for Data Science (Deadline: 15 February 2024)
- Advances in Information Sciences and Applications II (Deadline: 29 February 2024)
- Quantum and Classical Physical Cryptography (Deadline: 30 April 2024)
- Integrated Sensing and Communications (Deadline: 31 May 2024)
Topical Collections
The following topical collections within this section are currently open for submissions: