
Statistical Planning, Inference, and Decision Making in High-Dimensional Data Analysis

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: 30 April 2026 | Viewed by 1754

Special Issue Editor

Prof. Dr. S. Ejaz Ahmed, Guest Editor

Special Issue Information

Dear Colleagues,

Recent advances in technology, major progress in mathematical representations, rapid developments in artificial intelligence modeling, and the rise of deep transdisciplinary collaborations have driven an explosion in data generation and the wide adoption of statistical and computational inference methods across diverse scientific fields. These advances bring both significant challenges and exciting opportunities to experiment, innovate, validate, and translate new mathematical and statistical foundations into applied AI that enhances many aspects of human life. Statistical learning, modeling, estimation, prediction, and machine learning algorithms are evolving in parallel at an unprecedented pace. This Special Issue will highlight modern approaches in statistics, computation, data science, and artificial intelligence, with a focus on their practical applications across scientific disciplines, particularly in post-estimation strategies.

Prof. Dr. S. Ejaz Ahmed
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 250 words) can be sent to the Editorial Office for assessment.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

 

Keywords

  • artificial intelligence
  • statistics
  • statistical learning
  • statistical inference
  • machine learning

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • Reprint: MDPI Books provides the opportunity to republish successful Special Issues in book format, both online and in print.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (3 papers)


Research

38 pages, 4142 KB  
Article
Failure Mode and Effect Analysis Using Large-Scale Group Decision Making and Normal Cloud Model
by Lijie Wu, Changchun Liu and Hanwen Song
Entropy 2026, 28(3), 360; https://doi.org/10.3390/e28030360 - 22 Mar 2026
Viewed by 407
Abstract

Failure Modes and Effects Analysis (FMEA) is crucial for complex system reliability. However, traditional FMEA and its existing enhancements face significant limitations, notably difficulties in handling diverse heterogeneous data, effectively coordinating large expert groups, and robustly propagating inherent uncertainties. To bridge these gaps, this paper proposes a robust FMEA framework specifically designed for Large Group Decision Making (LGDM) under uncertainty, leveraging the Normal Cloud Model (NCM). First, LGDM is genuinely integrated into FMEA by involving an unusually large number of experts (>50). Second, a broad spectrum of heterogeneous data, including exact numbers, interval numbers, NCMs, linguistic terms, and linguistic expressions, is utilized to model and manage diverse uncertainties. Third, a four-step data preprocessing method is incorporated to screen invalid and low-quality inputs, significantly enhancing the reliability of aggregated results. Fourth, a comprehensive expert weight determination method that combines subjective factors with objective data quality is proposed, ensuring more trustworthy and equitable aggregation of judgments. Distinctively, our method explicitly preserves and propagates uncertainty information across the entire computational process, yielding results that go beyond simple rankings to include detailed quantitative uncertainty analysis. A practical case study, together with detailed result analysis, sensitivity analysis, qualitative and quantitative comparative analysis, and an analysis of advantages and limitations, confirms the effectiveness, practicality, rationality, and robustness of the proposed method. The sensitivity analyses demonstrate that the final risk rankings remain highly stable under varying trade-off coefficients, confirming the method's robustness and insensitivity to parameter fluctuations. Our framework provides a scientifically advanced and robust approach to FMEA in complex decision-making environments, particularly in high-stakes industries such as modern aviation, thereby enabling more informed risk management decisions.
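The Normal Cloud Model at the heart of this framework is not defined on this page. For orientation, here is a minimal Python sketch of the standard forward normal cloud generator that such models build on; it is a textbook construction, not the authors' LGDM pipeline, and the example parameters (a severity rating with Ex = 8 on a 0-10 scale, En = 0.8, He = 0.1) are hypothetical.

```python
import numpy as np

def forward_normal_cloud(Ex, En, He, n_drops, rng=None):
    """Standard forward normal cloud generator: draws cloud drops
    (x, membership) from a normal cloud C(Ex, En, He).

    Ex : expectation (center of the concept)
    En : entropy (granularity / spread of the concept)
    He : hyper-entropy (uncertainty about En itself)
    """
    rng = np.random.default_rng(rng)
    # Step 1: perturb the entropy -> En' ~ N(En, He^2)
    En_prime = np.abs(rng.normal(En, He, size=n_drops))  # guard against non-positive spread
    # Step 2: draw each drop around Ex with its own spread
    x = rng.normal(Ex, En_prime)
    # Step 3: certainty degree of each drop with respect to the concept
    mu = np.exp(-(x - Ex) ** 2 / (2.0 * En_prime ** 2))
    return x, mu

# Hypothetical example: one expert's "high severity" rating on a 0-10 scale
drops, certainty = forward_normal_cloud(Ex=8.0, En=0.8, He=0.1, n_drops=1000, rng=42)
print(drops.mean(), certainty.mean())
```

The He term is what lets the model carry second-order uncertainty: with He = 0 every drop shares the same spread and the cloud collapses to an ordinary Gaussian membership function.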

29 pages, 1017 KB  
Article
Bayesian Elastic Net Cox Models for Time-to-Event Prediction: Application to a Breast Cancer Cohort
by Ersin Yılmaz, Syed Ejaz Ahmed and Dursun Aydın
Entropy 2026, 28(3), 264; https://doi.org/10.3390/e28030264 - 27 Feb 2026
Viewed by 420
Abstract

High-dimensional survival analyses require calibrated risk and measurable uncertainty, but standard elastic net Cox models provide only point estimates. We develop a Bayesian elastic net Cox (BEN–Cox) model for high-dimensional proportional hazards regression that places a hierarchical global–local shrinkage prior on coefficients and performs full Bayesian inference via Hamiltonian Monte Carlo. We represent the elastic net penalty as a global–local Gaussian scale mixture with hyperpriors that learn the ℓ1/ℓ2 trade-off, enabling adaptive sparsity that preserves correlated gene groups; using HMC with the Cox partial likelihood, we obtain full posterior distributions for hazard ratios and patient-level survival curves. Methodologically, we formalize a Bayesian analogue of the elastic net grouping effect at the posterior mode and establish posterior contraction under sparsity for the Cox partial likelihood, supporting the stability of the resulting risk scores. On the METABRIC breast cancer cohort (n = 1903; p = 440 gene-level features after preprocessing, derived from an Illumina HT-12 array with ≈24,000 probes at the raw feature level), BEN–Cox achieves slightly lower prediction error, higher discrimination, and better global calibration than tuned ridge Cox, lasso Cox, and elastic net Cox baselines on a held-out test set. Posterior summaries provide credible intervals for hazard ratios and identify a compact gene panel that remains biologically plausible. BEN–Cox provides an uncertainty-aware alternative to tuned penalized Cox models with theoretical support, offering modest improvements in calibration and an interpretable sparse signature in highly correlated survival data.
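For readers unfamiliar with the elastic net Cox baseline that BEN–Cox is compared against, the sketch below fits a penalized Cox model by minimizing the negative log partial likelihood plus an elastic-net penalty, which coincides with the posterior mode under the elastic-net prior the abstract describes. This is a minimal sketch, not the paper's method: the full model learns lam and alpha through hyperpriors and samples the posterior with HMC, whereas here they are fixed hypothetical values and the ℓ1 term is smoothed so a quasi-Newton solver applies.

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_partial_likelihood(beta, X, time, event):
    """Cox negative log partial likelihood (assumes no tied event times)."""
    order = np.argsort(-time)                 # sort by time, descending
    X_s, e_s = X[order], event[order]
    eta = X_s @ beta
    # risk set of subject i = all subjects with time >= t_i, i.e. a prefix
    log_risk = np.logaddexp.accumulate(eta)   # log-sum-exp over each risk set
    return -np.sum(e_s * (eta - log_risk))

def elastic_net_cox_map(X, time, event, lam=0.1, alpha=0.5, eps=1e-8):
    """Posterior mode under an elastic-net prior, i.e. penalized Cox regression.
    lam and alpha (the l1/l2 trade-off) are fixed hypothetical values here;
    the paper instead learns them via hyperpriors and samples with HMC."""
    n, p = X.shape
    def objective(beta):
        l1 = np.sum(np.sqrt(beta ** 2 + eps))  # smoothed |beta| for the solver
        l2 = 0.5 * np.sum(beta ** 2)
        return (neg_log_partial_likelihood(beta, X, time, event) / n
                + lam * (alpha * l1 + (1 - alpha) * l2))
    return minimize(objective, np.zeros(p), method="L-BFGS-B").x

# Toy data, purely illustrative
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
time = rng.exponential(scale=np.exp(-X[:, 0]))  # first feature drives the hazard
event = rng.random(200) < 0.7                   # roughly 70% observed events
print(np.round(elastic_net_cox_map(X, time, event), 3))
```

What the Bayesian treatment adds on top of this point estimate is exactly what the abstract emphasizes: full posteriors over beta, hence credible intervals for hazard ratios rather than a single coefficient vector.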

22 pages, 1985 KB  
Article
Non-Parametric Goodness-of-Fit Tests Using Tsallis Entropy Measures
by Mehmet Siddik Cadirci
Entropy 2025, 27(12), 1210; https://doi.org/10.3390/e27121210 - 28 Nov 2025
Viewed by 469
Abstract

We develop goodness-of-fit (GOF) procedures rooted in Tsallis entropy, with a particular emphasis on multivariate exponential-power (generalized Gaussian) and q-Gaussian models. The GOF statistic compares a closed-form Tsallis entropy under the null with a non-parametric k-nearest-neighbor (k-NN) estimator. We establish consistency and mean-square convergence of the estimator under mild regularity and tail assumptions, discuss an asymptotic normality regime as q → 1, and calibrate critical values by parametric bootstrap/permutation. Extensive Monte Carlo experiments report empirical size, power, and runtime across dimensions, k, and q. An applied example illustrates practical calibration and sensitivity, which are essential for accurate measurement.
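The non-parametric half of the GOF statistic is a k-NN Tsallis entropy estimate. Below is a minimal sketch of one standard construction of such an estimator, the Leonenko–Pronzato–Savani form; the paper's exact estimator, weighting, and bootstrap calibration may differ, and the toy data and parameter choices (q = 1.5, k = 4) are hypothetical.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import gamma

def tsallis_entropy_knn(x, q, k=4):
    """k-NN estimator of the Tsallis entropy H_q = (1 - int f^q) / (q - 1),
    following the Leonenko-Pronzato-Savani construction. Requires q != 1
    and q < k + 1 so the bias-correction constant is defined."""
    x = np.atleast_2d(x)
    n, d = x.shape
    # distance to the k-th nearest neighbour
    # (query k+1 neighbours: the first hit is the point itself)
    rho = cKDTree(x).query(x, k=k + 1)[0][:, k]
    V_d = np.pi ** (d / 2) / gamma(d / 2 + 1)             # volume of the unit d-ball
    C_k = (gamma(k) / gamma(k + 1 - q)) ** (1 / (1 - q))  # bias correction
    zeta = (n - 1) * C_k * V_d * rho ** d
    I_q_hat = np.mean(zeta ** (1 - q))                    # estimates int f^q
    return (1.0 - I_q_hat) / (q - 1.0)

# Toy check on a 2-D Gaussian sample (values illustrative)
rng = np.random.default_rng(1)
sample = rng.normal(size=(5000, 2))
print(tsallis_entropy_knn(sample, q=1.5, k=4))
```

In a test of the kind the abstract describes, this estimate would be compared against the closed-form Tsallis entropy of the hypothesized null model, with critical values for the difference calibrated by parametric bootstrap.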
