
Information-Theoretic Methods in Data Analytics, 2nd Edition

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: 30 November 2025 | Viewed by 784

Special Issue Editor

Dr. Kichun Lee
Industrial Engineering, Hanyang University, Seoul 04763, Republic of Korea
Interests: data mining; machine learning; online learning; big data analysis; time series analysis

Special Issue Information

Dear Colleagues,

Information-theoretic methods in data analytics are a fundamental research tool for solving practical problems under uncertainty. They serve as building blocks in modern data mining, machine learning, pattern recognition, and deep learning, among other fields, as well as in classical data modeling. This research direction has received consistent attention from both academia and industry. Despite the necessity and success of information-based methods, the research community still lacks a dedicated venue to share information-based paradigms and their applications.

This Special Issue aims to collect works on novel information-driven methods and their applications—ideally with an emphasis on statistical frameworks and workflows—in numerous domains, such as medicine, finance, business, biology, marketing, and education. Works on topics such as information, entropy, statistical inference, data compression, feature selection and extraction, discovery of clusters and/or communities in association with prediction, outlier detection, association rule mining, recommendation systems, reinforcement learning, pattern recognition, deep neural networks, and other data-based statistical and analytical topics are of particular interest.

Dr. Kichun Lee
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • information
  • probability
  • divergence
  • statistical inference
  • data compression
  • data visualization
  • community detection
  • outlier detection
  • feature selection
  • data mining
  • machine learning
  • pattern recognition
  • neural networks
  • applications of data analysis

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • Reprint: MDPI Books provides the opportunity to republish successful Special Issues in book format, both online and in print.

Further information on MDPI's Special Issue policies can be found on the MDPI website.

Published Papers (2 papers)


Research

24 pages, 1917 KiB  
Article
Empirical Evaluation of the Relative Range for Detecting Outliers
by Dania Dallah, Hana Sulieman, Ayman Al Zaatreh and Firuz Kamalov
Entropy 2025, 27(7), 731; https://doi.org/10.3390/e27070731 - 7 Jul 2025
Viewed by 279
Abstract
Abstract
Outlier detection plays a key role in data analysis by improving data quality, uncovering data entry errors, and spotting unusual patterns, such as fraudulent activities. Choosing the right detection method is essential, as some approaches may be too complex or ineffective depending on the data distribution. In this study, we explore a simple yet powerful approach using the range distribution to identify outliers in univariate data. We compare the effectiveness of two range statistics, the range normalized by the standard deviation (σ) and the range normalized by the interquartile range (IQR), across different types of distributions, including normal, logistic, Laplace, and Weibull distributions, with varying sample sizes (n) and error rates (α). An evaluation of the range behavior across multiple distributions allows for the determination of threshold values for identifying potential outliers. Through extensive experimental work, the accuracy of both statistics in detecting outliers under various contamination strategies, sample sizes, and error rates (α=0.1, 0.05, 0.01) is investigated. The results demonstrate the flexibility of the proposed statistic, as it adapts well to different underlying distributions and maintains robust detection performance under a variety of conditions. Our findings underscore the value of an adaptive method for reliable anomaly detection in diverse data environments.
(This article belongs to the Special Issue Information-Theoretic Methods in Data Analytics, 2nd Edition)
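The two normalized-range statistics described in the abstract can be sketched in a few lines of NumPy. Note this is an illustrative sketch only: the contamination scheme and the decision threshold below are hypothetical placeholders, not the distribution- and sample-size-dependent thresholds tabulated in the paper.

```python
import numpy as np

def range_statistics(x):
    """Sample range normalized by the standard deviation and by the IQR."""
    x = np.asarray(x, dtype=float)
    r = x.max() - x.min()                 # sample range
    sigma = x.std(ddof=1)                 # unbiased-variance standard deviation
    q1, q3 = np.percentile(x, [25, 75])   # quartiles for the IQR
    return r / sigma, r / (q3 - q1)

rng = np.random.default_rng(0)
clean = rng.normal(size=100)
contaminated = np.append(clean, 8.0)      # inject a single gross outlier

r_sigma, r_iqr = range_statistics(contaminated)
# Flag an outlier when the normalized range exceeds a critical value for the
# assumed distribution, n, and alpha (6.0 here is purely illustrative).
is_outlier = r_sigma > 6.0
```

In practice the critical value would be read from the threshold tables the paper derives for each distribution, sample size, and error rate α.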

25 pages, 2002 KiB  
Article
Information-Theoretic Reliability Analysis of Linear Consecutive r-out-of-n:F Systems and Uniformity Testing
by Ghadah Alomani, Faten Alrewely and Mohamed Kayid
Entropy 2025, 27(6), 590; https://doi.org/10.3390/e27060590 - 31 May 2025
Viewed by 340
Abstract
Abstract
This paper explores the reliability of linear consecutive r-out-of-n:F systems from an information-theoretic perspective, with a particular focus on testing for uniformity. At the heart of the study is extropy, a complementary measure to entropy that we use to gain deeper insights into the uncertainty associated with system lifetimes. We begin by deriving general expressions for extropy in these systems and examine how it behaves under different component lifetime distributions, particularly highlighting the role of heterogeneity. Theoretical bounds are developed, along with new characterization results, shedding light on the unique properties of the uniform distribution within this framework. To bridge theory and application, we propose a nonparametric estimator for extropy and build a new test statistic to assess uniformity. The effectiveness of this test is evaluated through comprehensive simulation studies, where we compare its power against several well-known alternatives across a range of scenarios. Overall, our findings offer both theoretical contributions to the understanding of information measures in reliability analysis and practical tools for statistical testing in applied settings.
(This article belongs to the Special Issue Information-Theoretic Methods in Data Analytics, 2nd Edition)
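The paper's own estimator is not reproduced here, but the idea of nonparametrically estimating extropy, J(X) = -½ ∫ f(x)² dx = -½ E[f(X)], can be conveyed with a generic Vasicek-style spacing device: approximate the density at each order statistic by (2m/n) / (x₍ᵢ₊ₘ₎ − x₍ᵢ₋ₘ₎). The window choice m = √n below is an illustrative default, not a recommendation from the paper.

```python
import numpy as np

def extropy_spacing(x, m=None):
    """Spacing-based plug-in estimate of extropy J(X) = -0.5 * E[f(X)].

    The density at the i-th order statistic is approximated by
    (2m/n) / (x_(i+m) - x_(i-m)), clipping the indices at the sample edges.
    """
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    if m is None:
        m = max(1, int(np.sqrt(n)))        # illustrative window choice
    hi = np.minimum(np.arange(n) + m, n - 1)
    lo = np.maximum(np.arange(n) - m, 0)
    f_hat = (2 * m / n) / (x[hi] - x[lo])  # density estimate at each point
    return -0.5 * f_hat.mean()

rng = np.random.default_rng(1)
u = rng.uniform(size=2000)
est = extropy_spacing(u)  # true extropy of Uniform(0, 1) is -0.5
```

A uniformity test in this spirit compares such an estimate against the value the uniform distribution uniquely attains, rejecting when the discrepancy is large; the paper's characterization results make that comparison rigorous.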
