Stochastic Models and Statistical Inference: Analysis and Applications

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: closed (31 December 2024) | Viewed by 10936

Special Issue Editors


Dr. Yousri Slaoui
Guest Editor
Laboratoire de Mathématiques et Applications, University of Poitiers, 86000 Poitiers, France
Interests: nonparametric statistics; stochastic algorithms; statistics applied to life sciences; statistical learning of networks; artificial intelligence

Dr. Solym Mawaki Manou-Abi
Assistant Guest Editor
Institut Montpelliérain Alexander Grothendieck, University of Montpellier, CNRS, Montpellier, France
Interests: parameter estimation; stochastic models; stochastic models applied to life sciences; applied probability

Special Issue Information

Dear Colleagues,

Databases are becoming increasingly accessible, voluminous, and complex. To make the best use of them, nonparametric statistical methods, stochastic algorithms, statistical learning of networks, and stochastic analysis are frequently employed.

There has been growing interest in Stochastic Models and Statistical Inference, including correlation analyses for spatial and temporal data and classification techniques for complex data. Progress has often been driven by application areas such as neuroscience, environmetrics, chemometrics, biometrics, medicine, and econometrics.

The application of Stochastic Models and Statistical Inference to data from real-world complex systems is often hindered by convergence problems and by the lack of sufficient asymptotic mathematical properties. Contributions addressing any of these issues are very welcome.

This Special Issue aims to be a forum for the presentation of new and improved techniques in the area of Stochastic Models and Statistical Inference. In particular, the analysis and interpretation of real-world natural and engineered complex systems with the help of nonparametric statistical methods, stochastic algorithms, statistical learning of networks, stochastic models, and parameter estimation falls within its scope.

Dr. Yousri Slaoui
Dr. Solym Mawaki Manou-Abi
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • stochastic model
  • parameter estimation
  • data analysis
  • stochastic algorithms
  • simulation

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (9 papers)

Research

20 pages, 630 KiB  
Article
Overcoming Dilution of Collision Probability in Satellite Conjunction Analysis via Confidence Distribution
by Hangbin Lee and Youngjo Lee
Entropy 2025, 27(4), 329; https://doi.org/10.3390/e27040329 - 21 Mar 2025
Viewed by 263
Abstract
In satellite conjunction analysis, dilution of collision probability has been recognized as a significant deficiency of probabilistic inference. A recent study identified the false confidence problem as another limitation and suggested a possible causal link between the two, arguing that addressing false confidence could be necessary to prevent dilution of collision probability. However, this paper clarifies the distinction between probability dilution and false confidence by investigating a confidence distribution (CD) with a point mass at zero. Although the point mass in CD has often been perceived as paradoxical, we demonstrate that it plays an essential role in satellite conjunction analysis by capturing the uncertainty in the data. Consequently, the CD resolves the probability dilution of probabilistic inference and the ambiguity in Neymanian confidence intervals, while addressing the false confidence for the hypothesis of interest in satellite conjunction analysis. Furthermore, the confidence derived from the CD offers a direct interpretation as a p-value for hypothesis testing related to collision risk.

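The probability dilution described above is easy to reproduce numerically: with the nominal miss distance and the combined hard-body radius held fixed, inflating the positional uncertainty eventually drives the estimated collision probability toward zero. The sketch below uses a simple isotropic-Gaussian encounter-plane model with made-up geometry; it illustrates the dilution effect only and is not the paper's confidence-distribution method.

```python
from scipy.stats import ncx2

def collision_probability(d, R, sigma):
    """2D encounter-plane collision probability under isotropic Gaussian
    position error with standard deviation sigma: the squared miss distance,
    scaled by sigma^2, is noncentral chi-square with 2 degrees of freedom,
    so Pc is its CDF at (R / sigma)^2.  (Illustrative model only.)"""
    return ncx2.cdf((R / sigma) ** 2, df=2, nc=(d / sigma) ** 2)

# Probability dilution: with the nominal miss distance d and hard-body
# radius R fixed, growing positional uncertainty first raises and then
# drives Pc toward zero, which a purely probabilistic analysis reads as "safe".
d, R = 1000.0, 10.0                       # metres, made-up geometry
for sigma in (100.0, 700.0, 5_000.0, 50_000.0):
    pc = collision_probability(d, R, sigma)
    print(f"sigma = {sigma:8.0f} m  ->  Pc = {pc:.3e}")
```
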
42 pages, 3013 KiB  
Article
Optimal Power Procurement for Green Cellular Wireless Networks Under Uncertainty and Chance Constraints
by Nadhir Ben Rached, Shyam Mohan Subbiah Pillai and Raúl Tempone
Entropy 2025, 27(3), 308; https://doi.org/10.3390/e27030308 - 14 Mar 2025
Viewed by 421
Abstract
Given the increasing global emphasis on sustainable energy usage and the rising energy demands of cellular wireless networks, this work seeks an optimal short-term, continuous-time power-procurement schedule to minimize operating expenditure and the carbon footprint of cellular wireless networks equipped with energy-storage capacity, and hybrid energy systems comprising uncertain renewable energy sources. Despite the stochastic nature of wireless fading channels, the network operator must ensure a certain quality-of-service (QoS) constraint with high probability. This probabilistic constraint prevents using the dynamic programming principle to solve the stochastic optimal control problem. This work introduces a novel time-continuous Lagrangian relaxation approach tailored for real-time, near-optimal energy procurement in cellular networks, overcoming tractability problems associated with the probabilistic QoS constraint. The numerical solution procedure includes an efficient upwind finite-difference solver for the Hamilton–Jacobi–Bellman equation corresponding to the relaxed problem, and an effective combination of the limited memory bundle method (LMBM) for handling nonsmooth optimization and the stochastic subgradient method (SSM) to navigate the stochasticity of the dual problem. Numerical results, based on the German power system and daily cellular traffic data, demonstrate the computational efficiency of the proposed numerical approach, providing a near-optimal policy in a practical timeframe.

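As a toy companion to the probabilistic QoS constraint mentioned in the abstract, the sketch below uses Monte Carlo to check which candidate transmit-power levels satisfy P(rate ≥ r_min) ≥ 1 − ε under Rayleigh fading. Every name and number here is an illustrative assumption; the paper's continuous-time control formulation, Lagrangian relaxation, and solvers are not reproduced.

```python
import numpy as np

def qos_violation_prob(p_tx, r_min, noise=1.0, n=200_000, seed=1):
    """Monte Carlo estimate of P(log2(1 + |h|^2 * p_tx / noise) < r_min)
    under Rayleigh fading, i.e. with the channel gain |h|^2 exponential."""
    rng = np.random.default_rng(seed)
    gain = rng.exponential(scale=1.0, size=n)
    rate = np.log2(1.0 + gain * p_tx / noise)
    return np.mean(rate < r_min)

# Smallest power on a coarse grid meeting the chance constraint
# P(rate >= r_min) >= 1 - eps.
r_min, eps = 2.0, 0.05
for p in (5.0, 10.0, 20.0, 40.0, 80.0):
    viol = qos_violation_prob(p, r_min)
    print(f"p_tx = {p:5.1f}: violation prob ~ {viol:.3f}"
          f"  ({'ok' if viol <= eps else 'violates'})")
```
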
19 pages, 4184 KiB  
Article
An Online Evaluation Method for Random Number Entropy Sources Based on Time-Frequency Feature Fusion
by Qian Sun, Kainan Ma, Yiheng Zhou, Zhaoyuxuan Wang, Chaoxing You and Ming Liu
Entropy 2025, 27(2), 136; https://doi.org/10.3390/e27020136 - 27 Jan 2025
Viewed by 627
Abstract
Traditional entropy source evaluation methods rely on statistical analysis and are hard to deploy on-chip or online. However, online detection of entropy source quality is necessary in some applications with high encryption levels. To address these issues, our experimental results demonstrate a significant negative correlation between minimum entropy values and prediction accuracy, with a Pearson correlation coefficient of −0.925 (p-value = 1.07 × 10⁻⁷). This finding offers a novel approach for assessing entropy source quality, achieving an accurate rate in predicting the next bit of a random sequence using neural networks. To further improve prediction capabilities, we also propose a novel deep learning architecture, Fast Fourier Transform-Attention Mechanism-Long Short-Term Memory Network (FFT-ATT-LSTM), that integrates a simplified soft attention mechanism with Fast Fourier Transform (FFT), enabling effective fusion of time-domain and frequency-domain features. The FFT-ATT-LSTM improves prediction accuracy by 4.46% and 8% over baseline networks when predicting random numbers. Additionally, FFT-ATT-LSTM maintains a compact parameter size of 33.90 KB, significantly smaller than Temporal Convolutional Networks (TCN) at 41.51 KB and Transformers at 61.51 KB, while retaining comparable prediction performance. This optimal balance between accuracy and resource efficiency makes FFT-ATT-LSTM suitable for online deployment, demonstrating considerable application potential.

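To make the time-frequency fusion idea concrete, here is a minimal PyTorch sketch of an LSTM whose attention-pooled hidden state is concatenated with FFT-magnitude features of the same bit window before predicting the next bit. The layer sizes, the form of the soft attention, and the fusion point are assumptions made for illustration; the authors' FFT-ATT-LSTM may differ in detail.

```python
import torch
import torch.nn as nn

class FFTAttLSTM(nn.Module):
    """Illustrative sketch, not the authors' exact architecture: an LSTM over
    a bit window, simple soft-attention pooling of its hidden states, and
    FFT-magnitude features concatenated before the prediction head."""
    def __init__(self, seq_len=64, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.att = nn.Linear(hidden, 1)            # per-step attention scores
        n_freq = seq_len // 2 + 1                  # length of rfft output
        self.head = nn.Linear(hidden + n_freq, 1)  # logit for the next bit

    def forward(self, x):                          # x: (batch, seq_len) in {0,1}
        h, _ = self.lstm(x.unsqueeze(-1))          # (batch, seq_len, hidden)
        w = torch.softmax(self.att(h), dim=1)      # attention weights over time
        context = (w * h).sum(dim=1)               # time-domain summary
        freq = torch.fft.rfft(x, dim=1).abs()      # frequency-domain features
        return self.head(torch.cat([context, freq], dim=1)).squeeze(-1)

# Usage: logits for the next bit of each 64-bit window.
model = FFTAttLSTM()
x = torch.randint(0, 2, (8, 64)).float()
print(model(x).shape)                              # torch.Size([8])
```
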
22 pages, 1347 KiB  
Article
Semi-Empirical Approach to Evaluating Model Fit for Sea Clutter Returns: Focusing on Future Measurements in the Adriatic Sea
by Bojan Vondra
Entropy 2024, 26(12), 1069; https://doi.org/10.3390/e26121069 - 9 Dec 2024
Cited by 1 | Viewed by 679
Abstract
A method for evaluating Kullback–Leibler (KL) divergence and Squared Hellinger (SH) distance between empirical data and a model distribution is proposed. This method exclusively utilises the empirical Cumulative Distribution Function (CDF) of the data and the CDF of the model, avoiding data processing such as histogram binning. The proposed method converges almost surely, with the proof based on the use of exponentially distributed waiting times. An example demonstrates convergence of the KL divergence and SH distance to their true values when utilising the Generalised Pareto (GP) distribution as empirical data and the K distribution as the model. Another example illustrates the goodness of fit of these (GP and K-distribution) models to real sea clutter data from the widely used Intelligent PIxel processing X-band (IPIX) measurements. The proposed method can be applied to assess the goodness of fit of various models (not limited to GP or K distribution) to clutter measurement data such as those from the Adriatic Sea. Distinctive features of this small and immature sea, like the presence of over 1300 islands that affect local wind and wave patterns, are likely to result in an amplitude distribution of sea clutter returns that differs from predictions of models designed for oceans or open seas. However, to the author’s knowledge, no data on this specific topic are currently available in the open literature, and such measurements have yet to be conducted.

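For reference, the two discrepancies being estimated are the standard Kullback–Leibler divergence and squared Hellinger distance; written with densities p and q with respect to a common dominating measure, they are:

```latex
D_{\mathrm{KL}}(P \,\|\, Q) = \int p(x)\,\log\frac{p(x)}{q(x)}\,\mathrm{d}x,
\qquad
H^{2}(P, Q) = \frac{1}{2}\int \bigl(\sqrt{p(x)} - \sqrt{q(x)}\bigr)^{2}\mathrm{d}x
            = 1 - \int \sqrt{p(x)\,q(x)}\,\mathrm{d}x .
```

The paper's contribution is to estimate these quantities from the empirical and model CDFs directly, without the binning step that these density-based formulas might suggest.
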
20 pages, 430 KiB  
Article
On the Stress–Strength Reliability of Transmuted GEV Random Variables with Applications to Financial Assets Selection
by Melquisadec Oliveira, Felipe S. Quintino, Dióscoros Aguiar, Pushpa N. Rathie, Helton Saulo, Tiago A. da Fonseca and Luan Carlos de Sena Monteiro Ozelim
Entropy 2024, 26(6), 441; https://doi.org/10.3390/e26060441 - 23 May 2024
Viewed by 987
Abstract
In reliability contexts, probabilities of the type R=P(X<Y), where X and Y are random variables, have been shown to be useful tools for comparing the performance of these stochastic entities. By considering that both X and Y follow a transmuted generalized extreme-value (TGEV) distribution, new analytical relationships were derived for R in terms of special functions. The results hereby obtained are more flexible when compared to similar results found in the literature. To highlight the applicability and correctness of our results, we conducted a Monte-Carlo simulation study and investigated the use of the reliability measure P(X<Y) to select among financial assets whose returns were characterized by the random variables X and Y. Our results highlight that R is an interesting alternative to modern portfolio theory, which usually relies on the contrast of involved random variables by a simple comparison of their means and standard deviations.

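The reliability measure itself is straightforward to approximate by simulation. The sketch below samples from a transmuted GEV via the quadratic rank transmutation map G(x) = (1 + λ)F(x) − λF(x)², a common definition that may not match the paper's exact parameterisation, and estimates R = P(X < Y) by Monte Carlo; all parameter values are illustrative.

```python
import numpy as np
from scipy.stats import genextreme

def tgev_sample(n, c, loc, scale, lam, rng):
    """Inverse-transform sampling from a transmuted GEV defined by
    G(x) = (1 + lam) * F(x) - lam * F(x)^2, |lam| <= 1, with F a GEV CDF:
    solve the quadratic for v = F(x), then apply the GEV quantile function."""
    u = rng.uniform(size=n)
    if abs(lam) < 1e-12:
        v = u
    else:
        v = ((1 + lam) - np.sqrt((1 + lam) ** 2 - 4 * lam * u)) / (2 * lam)
    return genextreme.ppf(v, c=c, loc=loc, scale=scale)

# Monte Carlo estimate of the stress-strength reliability R = P(X < Y)
# for two illustrative TGEV-distributed returns.
rng = np.random.default_rng(42)
x = tgev_sample(200_000, c=0.1, loc=0.00, scale=1.0, lam=0.5, rng=rng)
y = tgev_sample(200_000, c=0.1, loc=0.25, scale=1.0, lam=-0.3, rng=rng)
print("R = P(X < Y) ~", np.mean(x < y))
```
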
13 pages, 539 KiB  
Article
Estimation of a Simple Structure in a Multidimensional IRT Model Using Structure Regularization
by Ryosuke Shimmura and Joe Suzuki
Entropy 2024, 26(1), 44; https://doi.org/10.3390/e26010044 - 31 Dec 2023
Viewed by 1496
Abstract
We develop a method for estimating a simple matrix for a multidimensional item response theory model. Our proposed method allows each test item to correspond to a single latent trait, making the results easier to interpret. It also enables clustering of test items based on their corresponding latent traits. The basic idea of our approach is to use the prenet (product-based elastic net) penalty, as proposed in factor analysis. For optimization, we show that combining stochastic EM algorithms, proximal gradient methods, and coordinate descent methods efficiently yields solutions. Furthermore, our numerical experiments demonstrate its effectiveness, especially in cases where the number of test subjects is small, compared to methods using the existing L1 penalty.

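The optimization combines stochastic EM, proximal gradient, and coordinate descent steps. As a generic illustration of the proximal-gradient building block, the sketch below runs ISTA with the familiar L1 (soft-thresholding) proximal operator on a toy lasso problem; the paper's prenet penalty has a different proximal map, so this shows only the surrounding machinery, not the authors' estimator.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def proximal_gradient(grad, prox, x0, step, n_iter=500):
    """Generic proximal gradient loop: x <- prox(x - step * grad(x), step)."""
    x = x0.copy()
    for _ in range(n_iter):
        x = prox(x - step * grad(x), step)
    return x

# Toy usage: 0.5 * ||Ax - b||^2 + lam * ||x||_1.
rng = np.random.default_rng(0)
A, b, lam = rng.normal(size=(50, 20)), rng.normal(size=50), 0.5
grad = lambda x: A.T @ (A @ x - b)
step = 1.0 / np.linalg.norm(A, 2) ** 2          # 1 / Lipschitz constant of grad
x_hat = proximal_gradient(grad, lambda z, s: soft_threshold(z, lam * s),
                          np.zeros(20), step)
print("nonzero coefficients:", np.count_nonzero(np.abs(x_hat) > 1e-8))
```
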
14 pages, 770 KiB  
Article
Bridging Extremes: The Invertible Bimodal Gumbel Distribution
by Cira G. Otiniano, Eduarda B. Silva, Raul Y. Matsushita and Alan Silva
Entropy 2023, 25(12), 1598; https://doi.org/10.3390/e25121598 - 29 Nov 2023
Viewed by 1312
Abstract
This paper introduces a novel three-parameter invertible bimodal Gumbel distribution, addressing the need for a versatile statistical tool capable of simultaneously modeling maximum and minimum extremes in various fields such as hydrology, meteorology, finance, and insurance. Unlike previous bimodal Gumbel distributions available in the literature, our proposed model features a simple closed-form cumulative distribution function, enhancing its computational attractiveness and applicability. This paper elucidates the behavior and advantages of the invertible bimodal Gumbel distribution through detailed mathematical formulations, graphical illustrations, and exploration of distributional characteristics. We illustrate using financial data to estimate Value at Risk (VaR) from our suggested model, considering maximum and minimum blocks simultaneously.

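For context, a standard block-maxima Value-at-Risk workflow with the classical (unimodal) Gumbel looks like the sketch below on synthetic losses; the paper's point is to replace the unimodal law with its invertible bimodal Gumbel so that maximum and minimum blocks can be modeled simultaneously. The data, block size, and confidence level are assumptions for illustration.

```python
import numpy as np
from scipy.stats import gumbel_r

# Synthetic heavy-tailed daily losses (~10 years), grouped into monthly blocks.
rng = np.random.default_rng(7)
daily_losses = rng.standard_t(df=4, size=2520)
block_max = daily_losses.reshape(120, 21).max(axis=1)   # monthly maxima

# Fit a classical Gumbel to the block maxima and read off a 99% VaR.
loc, scale = gumbel_r.fit(block_max)
var_99 = gumbel_r.ppf(0.99, loc=loc, scale=scale)
print(f"99% VaR of the monthly maximum loss ~ {var_99:.2f}")
```
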
13 pages, 1184 KiB  
Article
Gaussian and Lerch Models for Unimodal Time Series Forecasting
by Azzouz Dermoune, Daoud Ounaissi and Yousri Slaoui
Entropy 2023, 25(10), 1474; https://doi.org/10.3390/e25101474 - 22 Oct 2023
Viewed by 1623
Abstract
We consider unimodal time series forecasting. We propose Gaussian and Lerch models for this forecasting problem. The Gaussian model depends on three parameters and the Lerch model depends on four parameters. We estimate the unknown parameters by minimizing the sum of the absolute values of the residuals. We solve these minimizations with and without a weighted median and we compare both approaches. As a numerical application, we consider the daily infections of COVID-19 in China using the Gaussian and Lerch models. We derive a confidence interval for the daily infections from each local minimum.

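The L1 fitting criterion described in the abstract can be sketched as follows for a three-parameter Gaussian-shaped curve. The parameterisation, the synthetic data, and the Nelder-Mead optimizer are assumptions for illustration; the paper's Gaussian and Lerch models and its weighted-median solver are not reproduced.

```python
import numpy as np
from scipy.optimize import minimize

def gauss_curve(theta, t):
    """Three-parameter unimodal curve: amplitude a, mode m, width s."""
    a, m, s = theta
    return a * np.exp(-0.5 * ((t - m) / s) ** 2)

def l1_loss(theta, t, y):
    """Sum of absolute residuals, the criterion minimised in the abstract."""
    return np.sum(np.abs(y - gauss_curve(theta, t)))

# Synthetic unimodal series with heavy-tailed noise.
rng = np.random.default_rng(3)
t = np.arange(100.0)
y = gauss_curve((120.0, 45.0, 12.0), t) + rng.laplace(scale=5.0, size=t.size)

res = minimize(l1_loss, x0=(100.0, 50.0, 10.0), args=(t, y), method="Nelder-Mead")
print("estimated (a, m, s):", np.round(res.x, 2))
```
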
33 pages, 804 KiB  
Article
Kolmogorov Entropy for Convergence Rate in Incomplete Functional Time Series: Application to Percentile and Cumulative Estimation in High Dimensional Data
by Ouahiba Litimein, Fatimah Alshahrani, Salim Bouzebda, Ali Laksaci and Boubaker Mechab
Entropy 2023, 25(7), 1108; https://doi.org/10.3390/e25071108 - 24 Jul 2023
Cited by 1 | Viewed by 1596
Abstract
The convergence rate for distribution-free functional data analysis is challenging. It requires advanced tools from functional analysis. This paper aims to bring several contributions to the existing functional data analysis literature. First, we prove in this work that Kolmogorov entropy is a fundamental tool in characterizing the convergence rate of the local linear estimation. Precisely, we use this tool to derive the uniform convergence rate of the local linear estimation of the conditional cumulative distribution function and of the local linear estimation of the conditional quantile function. Second, a central limit theorem for the proposed estimators is established. These results are proved under general assumptions, allowing the incomplete functional time series case to be covered. Specifically, we model the correlation using the ergodic assumption and assume that the response variable is collected under a missing-at-random mechanism. Finally, we conduct Monte Carlo simulations to assess the finite sample performance of the proposed estimators.

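To fix ideas, a kernel estimator of the conditional CDF and conditional quantile for a real-valued covariate can be sketched as below. The paper's estimators are local linear, handle functional covariates, and allow missing responses, so this local-constant, scalar-covariate version is only an illustrative simplification.

```python
import numpy as np

def cond_cdf(y_grid, x0, X, Y, h):
    """Local-constant (Nadaraya-Watson) estimate of F(y | X = x0) on a grid;
    the paper studies the local linear version with functional covariates."""
    w = np.exp(-0.5 * ((X - x0) / h) ** 2)        # Gaussian kernel weights
    w = w / w.sum()
    return np.array([(w * (Y <= y)).sum() for y in y_grid])

def cond_quantile(p, x0, X, Y, h, y_grid):
    """Invert the estimated conditional CDF to get the p-th conditional quantile."""
    F = cond_cdf(y_grid, x0, X, Y, h)
    return y_grid[np.searchsorted(F, p)]

# Toy usage on simulated data with a known conditional median.
rng = np.random.default_rng(11)
X = rng.uniform(0, 1, 2000)
Y = np.sin(2 * np.pi * X) + 0.3 * rng.normal(size=2000)
grid = np.linspace(-2.0, 2.0, 400)
print("estimated median at x = 0.25:", cond_quantile(0.5, 0.25, X, Y, 0.05, grid))
print("true median at x = 0.25:     ", np.sin(2 * np.pi * 0.25))
```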