Search Results (1,114)

Search Parameters:
Keywords = statistical moments

19 pages, 456 KB  
Article
The Alpha Power Topp–Leone Dagum Distribution: Theory and Applications
by Hadeel S. Klakattawi and Wedad H. Aljuhani
Symmetry 2026, 18(1), 132; https://doi.org/10.3390/sym18010132 - 9 Jan 2026
Abstract
This article introduces a new flexible distribution, called the alpha power Topp–Leone Dagum (APTLDa) distribution, which extends the classical Dagum model by combining the Topp–Leone generator with the alpha power transformation (APT). The proposed distribution is capable of modeling data with symmetrical and asymmetrical shapes for the probability density and hazard rate functions. This makes it suitable for lifetime and reliability data analysis. Several important statistical properties of the new distribution are derived, including the quantile function, moments, entropy measures, order statistics, and reliability-related functions. Parameter estimation is carried out using the maximum likelihood method, and the performance of the estimators is examined through an extensive simulation study under different sample sizes and parameter settings. The simulation results demonstrate the consistency and good finite-sample behavior of the estimators. The practical usefulness of the proposed distribution is illustrated through applications to two real datasets, where its performance is compared with several competing models. The results show that the APTLDa distribution provides a flexible and effective alternative for modeling lifetime data. Full article
(This article belongs to the Section Mathematics)
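As a point of reference for the construction described above, the sketch below records commonly cited forms of the three ingredients named in the abstract: the Dagum baseline CDF, the Topp–Leone generator, and the alpha power transformation (APT). The notation is my own and the paper's exact composition order and parameterization may differ.

    \[
    G_{\text{Dagum}}(x) = \bigl(1 + (x/b)^{-a}\bigr)^{-p}, \quad x > 0, \qquad
    G_{\text{TL}}(x) = \bigl[1 - \{1 - G(x)\}^{2}\bigr]^{\nu}, \qquad
    F_{\text{APT}}(x) = \frac{\alpha^{G(x)} - 1}{\alpha - 1}, \quad \alpha > 0,\ \alpha \neq 1,
    \]
    with \(F_{\text{APT}} = G\) in the limit \(\alpha \to 1\); the quantile function and moments of the composite model follow from these building blocks.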
23 pages, 7558 KB  
Article
Instrumented Assessment of Gait in Pediatric Cancer Survivors: Identifying Functional Impairments After Oncological Treatment—A Pilot Study
by María Carratalá-Tejada, Diego Fernández-Vázquez, Víctor Navarro-López, Juan Aboitiz-Cantalapiedra, Francisco Molina-Rueda, Blanca López-Ibor Aliño and Alicia Cuesta-Gómez
Children 2026, 13(1), 96; https://doi.org/10.3390/children13010096 - 9 Jan 2026
Abstract
Background/Objectives: Pediatric cancer survivors frequently experience neuromuscular sequelae related to chemotherapy-induced neurotoxicity. Agents such as vincristine, methotrexate, and platinum compounds can lead to persistent gait alterations and sensorimotor deficits that impair mobility and quality of life. This study aimed to objectively assess gait in pediatric cancer survivors after the completion of oncological pharmacological treatment to identify specific spatiotemporal, kinematic, and kinetic alterations and characterize neuromechanical patterns associated with neurotoxic exposure. Methods: A cross-sectional observational study was conducted including pediatric cancer survivors (6–18 years) who had completed chemotherapy and age- and sex-matched healthy controls. Gait was analyzed using a Vicon®3D motion capture system, with reflective markers placed on standardized anatomical landmarks. Spatiotemporal, kinematic, and kinetic variables were compared between groups using parametric tests and statistical parametric mapping (SPM) with Holm–Bonferroni correction (α = 0.05). Results: Pediatric cancer survivors showed slower gait velocity (Mean Difference (MD) = 0.17, p = 0.018, Confidence Interval CI95% = 0.04; 0.4), shorter step (MD = 0.1, p = 0.015, CI95% = 0.01; 0.19) and stride length (MD = 0.17, p = 0.018, CI95% = 0.03; 0.31), as well as reduced single support time (MD = 0.1, p = 0.043, CI95% = 0.01; 0.19), along with significant alterations in pelvic, hip, knee, and ankle kinematics compared with controls. Increased pelvic elevation (MD = 0.92, p = 0.018, CI95% = 0.25; 1.58), reduced hip extension during stance (MD = −2.99, p = 0.039, CI95% = −5.19; −0.74), knee hyperextension in mid-stance (MD = −3.84, p < 0.001, CI95% = −6.18; −0.72), and limited ankle dorsiflexion (MAS MD = −4.04, p < 0.001, CI95% = −6.79; −0.86, LAS MD = −3.16, p < 0.001) and plantarflexor moments in terminal stance (MAS MD = −149.65, p = 0.018, CI95% = −259.35; −48.25, LAS MD = −191.81, p = 0.008, CI95% = −323.81; −57.31) were observed. Ground reaction force peaks during loading response (MAS MD = −16.86, p < 0.001, CI95% = −26.12; −0.72 LAS MD = −11.74, p = 0.001, CI95% = −19.68; −3.94) and foot-off (MAS MD = 10.38, p = 0.015, CI95% = 0.41; 20.53, LAS MD = 11.88, p = 0.01, CI95% = 3.15; 22.38) were also reduced. Conclusions: Children who have completed chemotherapy present measurable gait deviations reflecting persistent neuromechanical impairment, likely linked to chemotherapy-induced neurotoxicity and deconditioning. Instrumented gait analysis allows early detection of these alterations and may support the design of targeted rehabilitation strategies to optimize functional recovery and long-term quality of life in pediatric cancer survivors. Full article
(This article belongs to the Special Issue Movement Disorders in Children: Challenges and Opportunities)
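For readers unfamiliar with the correction step named in the Methods, the following is a minimal Python sketch of the Holm–Bonferroni step-down procedure applied to a set of p-values. The function name, alpha level, and example p-values are illustrative assumptions, not taken from the study's SPM pipeline.

    import numpy as np

    def holm_bonferroni(pvals, alpha=0.05):
        # Step-down Holm-Bonferroni: test p-values from smallest to largest
        # against thresholds alpha/m, alpha/(m-1), ..., alpha/1 and stop at
        # the first failure; every hypothesis tested before that is rejected.
        p = np.asarray(pvals, dtype=float)
        order = np.argsort(p)
        m = p.size
        reject = np.zeros(m, dtype=bool)
        for rank, idx in enumerate(order):
            if p[idx] <= alpha / (m - rank):
                reject[idx] = True
            else:
                break
        return reject

    print(holm_bonferroni([0.001, 0.012, 0.018, 0.040, 0.200]))
    # -> [ True  True False False False]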
38 pages, 8851 KB  
Article
Numerical Investigation of Rim Seal Flow in a Single-Stage Axial Turbine
by Tuong Linh Nha, Duc Anh Nguyen, Phan Anh Trinh, Gia-Diem Pham and Cong Truong Dinh
Eng 2026, 7(1), 31; https://doi.org/10.3390/eng7010031 - 7 Jan 2026
Viewed by 62
Abstract
This study investigates rim seal flow in axial turbine configurations through a combined experimental–numerical approach, with the objective of identifying sealing-flow conditions that minimize ingestion while limiting aerodynamic losses. Experimental measurements from the University of BATH are used to validate computational methodology, ensuring consistency with established sealing-effectiveness trends. The work places particular emphasis on the influence of computational domain selection and interface treatment, which is shown to strongly affect the prediction of ingestion mechanisms. A key contribution of this study is the systematic assessment of multiple domain configurations, demonstrating that a frozen rotor MRF formulation provides the most reliable steady-state representation of pressure-driven ingress, whereas stationary and non-interface domains tend to overpredict sealing effectiveness. A simplified thin-seal model is also evaluated and found to offer an efficient alternative for global performance predictions. Furthermore, a statistical orifice-based model is introduced to estimate minimum sealing flow for different rim seal geometries, providing a practical engineering tool for purge-flow scaling. The effects of pre-swirl injection are examined and shown to substantially reduce rotor wall shear and moment coefficient, contributing to lower windage losses without significantly modifying sealing characteristics. Unsteady flow features are explored using a harmonic balance method, revealing Kelvin–Helmholtz-type instabilities that drive large-scale structures within the rim seal cavity, particularly near design-speed operation. Finally, results highlight a clear trade-off between sealing-flow rate and turbine isentropic efficiency, underlining the importance of optimized purge-flow management. Full article
18 pages, 343 KB  
Article
The Bidirectional Relationship Between Audit Fees and Corporate Reputation: Panel Evidence from South African Listed Firms
by Omobolade Stephen Ogundele and Lethiwe Nzama-Sithole
J. Risk Financial Manag. 2026, 19(1), 35; https://doi.org/10.3390/jrfm19010035 - 4 Jan 2026
Viewed by 179
Abstract
As corporate accountability, credibility, transparency, and stakeholders’ trust continue to attract global attention, this study examines how corporate reputation influences audit fees and whether audit fees, in turn, shape reputation. Specifically, it investigates the bidirectional relationship between audit fees and corporate reputation in South African listed firms. The study draws on three theories, namely capital reputation, stakeholder, and agency theory. Using panel data from sixteen listed firms over a period of ten years (2015–2024), it employs panel regression analysis and two-step system generalised method of moments (System GMM) estimation to account for endogeneity, heterogeneity, and dynamic relationships. Data were sourced from the annual reports and accounts of the selected firms. The results from the fixed effects model indicate that corporate reputation exerts a statistically significant and positive influence on audit fees. Conversely, findings from the random effects model reveal that audit fees positively influence corporate reputation. The two-step GMM results confirm a bidirectional causal relationship: lagged corporate reputation significantly influences subsequent audit fees, while lagged audit fees significantly influence future corporate reputation. This study recommends that boards of directors view the audit not as an expense but as a strategic investment in sustaining stakeholder trust and reputation. Policymakers and regulators should also strengthen audit market transparency to ensure that audit pricing reflects genuine reputational considerations and audit quality. Full article
(This article belongs to the Section Business and Entrepreneurship)
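One schematic way to write the dynamic-panel system underlying such an analysis is sketched below; the symbols (AF for audit fees, REP for corporate reputation, x for controls) are illustrative notation rather than the paper's own, and System GMM instruments the differenced equations with lagged levels and the level equations with lagged differences.

    \[
    \mathrm{AF}_{it} = \alpha_1\, \mathrm{AF}_{i,t-1} + \beta_1\, \mathrm{REP}_{i,t-1} + \boldsymbol{\gamma}_1'\mathbf{x}_{it} + \eta_i + \varepsilon_{it},
    \qquad
    \mathrm{REP}_{it} = \alpha_2\, \mathrm{REP}_{i,t-1} + \beta_2\, \mathrm{AF}_{i,t-1} + \boldsymbol{\gamma}_2'\mathbf{x}_{it} + \mu_i + u_{it},
    \]
    where statistically significant \(\beta_1\) and \(\beta_2\) are read as evidence of a bidirectional relationship.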
40 pages, 1118 KB  
Article
FORCE: Fast Outlier-Robust Correlation Estimation via Streaming Quantile Approximation for High-Dimensional Data Streams
by Sooyoung Jang and Changbeom Choi
Mathematics 2026, 14(1), 191; https://doi.org/10.3390/math14010191 - 4 Jan 2026
Viewed by 144
Abstract
The estimation of correlation matrices in high-dimensional data streams presents a fundamental conflict between computational efficiency and statistical robustness. Moment-based estimators, such as Pearson’s correlation, offer linear O(N) complexity but lack robustness. In contrast, high-breakdown methods like the minimum covariance determinant (MCD) are computationally prohibitive (O(Np² + p³)) for real-time applications. This paper introduces Fast Outlier-Robust Correlation Estimation (FORCE), a streaming algorithm that performs adaptive coordinate-wise trimming using the P² algorithm for streaming quantile approximation, requiring only O(p) memory independent of stream length. We evaluate FORCE against six baseline algorithms, including exact trimmed methods (TP-Exact, TP-TER) that use O(N log N) sorting with O(Np) storage, across five benchmark datasets spanning synthetic, financial, medical, and genomic domains. FORCE achieves speedups of approximately 470× over FastMCD and 3.9× over Spearman’s rank correlation. On S&P 500 financial data, coordinate-wise trimmed methods substantially outperform FastMCD: TP-Exact achieves the best RMSE (0.0902), followed by TP-TER (0.0909) and FORCE (0.1186), compared to FastMCD’s 0.1606. This result demonstrates that coordinate-wise trimming better accommodates volatility clustering in financial time series than multivariate outlier exclusion. FORCE achieves 76% of TP-Exact’s accuracy while requiring 104× less memory, enabling robust estimation in true streaming environments where data cannot be retained for batch processing. We validate the 25% breakdown point shared by all IQR-based trimmed methods using the ODDS-satellite benchmark (31.7% contamination), confirming identical degradation for FORCE, TP-Exact, and TP-TER. For memory-constrained streaming applications with contamination below 25%, FORCE provides the only viable path to robust correlation estimation with bounded memory. Full article
(This article belongs to the Special Issue Modeling and Simulation for Optimizing Complex Dynamical Systems)
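To make the trimming idea concrete, here is a minimal Python sketch of coordinate-wise IQR trimming followed by Pearson correlation on the retained rows. This is not the authors' implementation: np.quantile on the full sample stands in for the streaming P² quantile estimator described in the abstract, and the function name, trimming constant k = 1.5, and synthetic data are assumptions.

    import numpy as np

    def trimmed_correlation(X, k=1.5):
        """Pearson correlation computed only on rows that fall inside
        [Q1 - k*IQR, Q3 + k*IQR] in every coordinate."""
        q1 = np.quantile(X, 0.25, axis=0)
        q3 = np.quantile(X, 0.75, axis=0)
        iqr = q3 - q1
        lo, hi = q1 - k * iqr, q3 + k * iqr
        keep = np.all((X >= lo) & (X <= hi), axis=1)   # coordinate-wise trimming
        return np.corrcoef(X[keep], rowvar=False)

    rng = np.random.default_rng(0)
    X = rng.normal(size=(10_000, 5))
    X[:300] += 25.0                                    # inject gross outliers
    print(np.round(trimmed_correlation(X), 2))

A streaming variant would replace the batch quantiles with per-coordinate running quantile estimates and incremental moment updates, which is the role the P² algorithm plays in FORCE.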
30 pages, 8862 KB  
Article
Kalman Filter-Based Reconstruction of Power Trajectories for IoT-Based Photovoltaic System Monitoring
by Jorge Salvador Valdez-Martínez, Guillermo Ramirez-Zuñiga, Heriberto Adamas Pérez, Alberto Miguel Beltrán-Escobar, Estela Sarmiento-Bustos, Manuela Calixto-Rodriguez and Gustavo Delgado-Reyes
Mathematics 2026, 14(1), 144; https://doi.org/10.3390/math14010144 - 30 Dec 2025
Viewed by 297
Abstract
This paper presents the reconstruction of signal paths acquired from a power electronics system for energy conversion and management. This reconstruction is performed using the Kalman filter (KF) for monitoring photovoltaic (PV) systems enabled for Internet of Things (IoT) systems. This proposal is motivated by the fact that the global energy transition towards renewable sources makes PV systems a crucial alternative. To guarantee the efficiency and stability of these systems, monitoring critical electrical parameters using IoT technology is essential. However, the measurements acquired are frequently corrupted by stochastic noise, which obscures the true behavior of the system and limits its accurate characterization. Based on this problem, the main objective of this work is explicitly defined as evaluating the effectiveness of the KF as a power-path reconstruction method capable of recovering accurate electrical trajectories from noisy measurements in IoT-monitored photovoltaic networks. To achieve this goal, the system is modeled as a discrete-time stochastic process and the KF is implemented as a real-time estimator of power flow behavior. The experiment was conducted using real-world generation and consumption data from a proprietary two-layer IoT platform: an Edge Layer (acquisition with ESP8266 and PZEM-004T-100A sensors) and a Cloud Layer (visualization on Things-Board). To validate the results, quantitative metrics including the mean squared error (MSE), statistical moments, and probability distributions were computed. The MSE values were found to be nearly zero across all reconstructed power-paths. The statistical moments exhibited near-perfect agreement with those of the actual power signals, approaching 100% correspondence. Additionally, the probability distributions were compared visually and assessed statistically using the Kolmogorov–Smirnov (KS) test. The resulting KS values were very low, confirming the high accuracy of the reconstruction for all power-paths. The proposed research concluded that the KF successfully reconstructed the power trajectories, demonstrating high agreement with the measured steady-state behavior. This study thus confirms that integrating Kalman filtering with IoT monitoring delivers a practically viable and statistically accurate method for power trajectory reconstruction, which is fundamental for enhancing the observability and reliability of photovoltaic energy systems. Full article
(This article belongs to the Section C2: Dynamical Systems)
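As an illustration of the filtering step, the following is a minimal Python sketch of a scalar Kalman filter recovering a power trajectory from noisy samples. It assumes a simple random-walk state model and synthetic data; the noise variances, function name, and signal are placeholders, not the paper's model or measurements.

    import numpy as np

    def kalman_1d(z, q=1e-3, r=0.5, x0=0.0, p0=1.0):
        """q: process-noise variance, r: measurement-noise variance."""
        x, p = x0, p0
        out = np.empty_like(z, dtype=float)
        for k, zk in enumerate(z):
            p = p + q                 # predict (random-walk state transition)
            g = p / (p + r)           # Kalman gain
            x = x + g * (zk - x)      # update with the measurement residual
            p = (1.0 - g) * p
            out[k] = x
        return out

    t = np.linspace(0, 1, 500)
    true_power = 800 * np.sin(np.pi * t) ** 2          # synthetic PV power (W)
    noisy = true_power + np.random.default_rng(1).normal(0, 60, t.size)
    est = kalman_1d(noisy, r=60.0**2)
    print(f"MSE noisy: {np.mean((noisy - true_power)**2):.1f}, "
          f"MSE filtered: {np.mean((est - true_power)**2):.1f}")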
31 pages, 552 KB  
Article
The Impact of Metropolitan Integration on Land Use Efficiency and Its Mechanism
by Jiaxi Xiao and Fan Dong
Land 2026, 15(1), 52; https://doi.org/10.3390/land15010052 - 27 Dec 2025
Viewed by 278
Abstract
Against the backdrop of accelerating global spatial restructuring, metropolitan areas have become crucial spatial units for enhancing regional competitiveness and securing industrial chains. Although China has continuously advanced metropolitan area development, low land use efficiency remains a key constraint on sustainable progress. Metropolitan integration presents a new approach to addressing this challenge. This study constructs an analytical framework of “direct effects–indirect effects–dynamic evolution” and measures metropolitan integration and land use efficiency using a multidimensional indicator system and a super-efficiency slacks-based measure (SBM) model incorporating undesirable outputs. Employing the system generalized method of moments (System GMM) estimator, this study conducts both baseline and mediation analyses using balanced panel data for 32 Chinese metropolitan areas from 2016 to 2022. The results show that both metropolitan integration and land use efficiency improved steadily during the study period. The coefficient on metropolitan integration is positive and statistically significant, and the lagged dependent variable is also positive and statistically significant, indicating substantial persistence over time. Heterogeneity analyses further indicate that the estimated association is more pronounced in eastern metropolitan areas and nationally designated metropolitan areas. In addition, industrial agglomeration and industrial specialization operate as important mediating channels in this relationship. Based on these findings, the study proposes policy recommendations to strengthen metropolitan integration and industrial collaboration, thereby improving land use efficiency. Full article
(This article belongs to the Section Land Use, Impact Assessment and Sustainability)
26 pages, 3125 KB  
Article
Advancing Sustainable Development and the Net-Zero Emissions Transition: The Role of Green Technology Innovation, Renewable Energy, and Environmental Taxation
by Xiwen Zhou, Haining Chen and Guoping Ding
Sustainability 2026, 18(1), 221; https://doi.org/10.3390/su18010221 - 25 Dec 2025
Viewed by 315
Abstract
In the macro context of promoting sustainable development and achieving net zero emissions, the role of green technology innovation, renewable energy utilization and environmental policy is crucial. However, there is still a lack of consistent empirical evidence regarding the combined emission reduction effect of these three factors in OECD countries. This study aims to empirically examine the combined impact of green technology innovation (GTI), renewable energy consumption (REC), and environmental taxes (ETAX) on carbon dioxide emissions. We expect that the former two will effectively reduce emissions, while the effect of environmental taxes depends on their design. Based on panel data for 35 OECD economies from 1990 to 2019, this study adopts the augmented mean group (AMG) estimator as the main estimation method and uses the common correlated effects mean group (CCEMG) estimator as a robustness test. To control for potential endogeneity, the difference generalized method of moments (GMM) is also employed. The causal relationships between variables are tested using the Dumitrescu–Hurlin method. The results show that, as expected, GTI and REC have a significant negative impact on carbon dioxide emissions. However, ETAX is positively but insignificantly correlated with carbon emissions, which deviates from the ideal policy effect and suggests that there may be efficiency bottlenecks in current tax design. The causality test further reveals a significant two-way causal relationship between CO₂ emissions and GTI, REC, ETAX, GDP, and fossil fuel consumption (FEC). Therefore, it is recommended that OECD countries give priority to expanding investment in green technologies and renewable energy infrastructure and re-evaluate and optimize environmental tax policies to effectively promote the transition to a low-carbon economy. Full article
16 pages, 336 KB  
Article
Concomitants of Order Statistics from a Bivariate Generalized Linear Exponential Distribution: Theory and Practice
by Areej M. AL-Zayd
Entropy 2026, 28(1), 18; https://doi.org/10.3390/e28010018 - 24 Dec 2025
Viewed by 241
Abstract
This paper investigates the concomitants of order statistics from the bivariate generalized linear exponential (BGLE) distribution. We obtain the probability density function of a single concomitant and the joint probability density function of two concomitants of order statistics from the BGLE distribution. In addition, expressions for the single and product moments of concomitants of order statistics are derived. Furthermore, we find the best linear unbiased estimator of a scale parameter related to a study variable using various ranked set sampling techniques. Finally, we apply the findings to a real-life dataset. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)
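For orientation, the standard representation of a concomitant's distribution is sketched below in generic notation (not the paper's BGLE-specific expressions): if \((X_i, Y_i)\) are i.i.d. pairs and \(X_{r:n}\) denotes the \(r\)-th order statistic of the \(X\)-sample, the concomitant \(Y_{[r:n]}\) has density

    \[
    f_{[r:n]}(y) = \int_{-\infty}^{\infty} f_{Y \mid X}(y \mid x)\, f_{r:n}(x)\, dx,
    \qquad
    \mathbb{E}\!\left[Y_{[r:n]}^{k}\right] = \int y^{k}\, f_{[r:n]}(y)\, dy,
    \]
    where \(f_{r:n}\) is the density of \(X_{r:n}\); the paper evaluates these objects explicitly for the BGLE distribution.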
27 pages, 3196 KB  
Article
Reliability-Based Robust Design Optimization Using Data-Driven Polynomial Chaos Expansion
by Zhaowang Li, Zhaozhan Li, Jufang Jia and Xiangdong He
Machines 2026, 14(1), 20; https://doi.org/10.3390/machines14010020 - 23 Dec 2025
Viewed by 391
Abstract
As the complexity of modern engineering systems continues to increase, traditional reliability analysis methods still face challenges regarding computational efficiency and reliability in scenarios where the distribution information of random variables is incomplete and samples are sparse. Therefore, this study develops a data-driven polynomial chaos expansion (DD-PCE) model for scenarios with limited samples and applies it to reliability-based robust design optimization (RBRDO). The model directly constructs orthogonal polynomial basis functions from input data by matching statistical moments, thereby avoiding the need for original data or complete statistical information as required by traditional PCE methods. To address the statistical moment estimation bias caused by sparse samples, kernel density estimation (KDE) is employed to augment the data derived from limited samples. Furthermore, to enhance computational efficiency, after determining the DD-PCE coefficients, the first four moments of the DD-PCE are obtained analytically, and reliability is computed based on the maximum entropy principle (MEP), thereby eliminating the additional step of solving reliability as required by traditional PCE methods. The proposed approach is validated through a mechanical structure and five mathematical functions, with RBRDO studies conducted on three typical structures and one practical engineering case. The results demonstrate that, while ensuring computational accuracy, this method saves approximately 90% of the time compared to the Monte Carlo simulation (MCS) method, significantly improving computational efficiency. Full article
(This article belongs to the Section Machine Design and Theory)
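The analytical moment step mentioned above rests on a standard property of chaos expansions, sketched below in generic notation; the orthonormal-basis convention (basis \(\Phi_i\), coefficients \(c_i\)) is an assumption rather than the paper's exact normalization.

    \[
    Y \approx \sum_{i=0}^{P} c_i\, \Phi_i(\boldsymbol{\xi}),
    \qquad
    \mathbb{E}[\Phi_i \Phi_j] = \delta_{ij},\ \Phi_0 \equiv 1
    \;\Rightarrow\;
    \mathbb{E}[Y] = c_0, \quad \operatorname{Var}[Y] = \sum_{i=1}^{P} c_i^{2}.
    \]
    Higher-order moments follow from expectations of products of the basis polynomials, and the resulting first four moments feed the maximum entropy principle step used for the reliability evaluation.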
25 pages, 4458 KB  
Article
Quantifying Knowledge Production Efficiency with Thermodynamics: A Data-Driven Study of Scientific Concepts
by Artem Chumachenko and Brett Buttliere
Entropy 2026, 28(1), 11; https://doi.org/10.3390/e28010011 - 22 Dec 2025
Viewed by 393
Abstract
We develop a data-driven framework for analyzing how scientific concepts evolve through their empirical in-text frequency distributions in large text corpora. For each concept, the observed distribution is paired with a maximum entropy equilibrium reference, which takes a generalized Boltzmann form determined by two measurable statistical moments. Using data from more than 500,000 physics papers (about 13,000 concepts, 2000–2018), we reconstruct the temporal trajectories of the associated MaxEnt parameters and entropy measures, and we identify two characteristic regimes of concept dynamics, stable and driven, separated by a transition point near criticality. Departures from equilibrium are quantified using a residual-information measure that captures how much structure a concept exhibits beyond its equilibrium baseline. To analyze temporal change, we adapt the Hatano–Sasa and Esposito–Van den Broeck decomposition to discrete time and separate maintenance-like contributions from externally driven reorganization. The proposed efficiency indicators describe how concepts sustain or reorganize their informational structure under a finite representational capacity. Together, these elements provide a unified and empirically grounded description of concept evolution in scientific communication, based on equilibrium references, nonequilibrium structure, and informational work. Full article
(This article belongs to the Special Issue The Thermodynamics of Social Processes)
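As background for the equilibrium reference described above, a maximum entropy distribution constrained by two moments has the quadratic-exponential ("generalized Boltzmann") form sketched below; the symbols (\(x_i\) for the quantity whose moments are fixed, \(m_1, m_2\) for the moments, \(\lambda_1, \lambda_2\) for the multipliers) are generic notation and may differ from the paper's.

    \[
    \max_{p} \Bigl(-\sum_i p_i \ln p_i\Bigr)
    \ \text{s.t.}\ \sum_i p_i = 1,\ \sum_i p_i x_i = m_1,\ \sum_i p_i x_i^{2} = m_2
    \;\Rightarrow\;
    p_i^{\mathrm{eq}} = \frac{1}{Z}\exp\!\bigl(-\lambda_1 x_i - \lambda_2 x_i^{2}\bigr).
    \]
    Departures of the observed distribution from \(p^{\mathrm{eq}}\) are then summarized by the residual-information measure described in the abstract.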
10 pages, 815 KB  
Article
Decline in Renal Function, Measured by Annual Estimated Glomerular Filtration Rate Based on Cystatin C in Patients with Rheumatoid Arthritis, Is Linked to Disease Activity Level and Duration: Small Retrospective Cohort Study
by Ichiro Yoshii, Naoya Sawada and Tatsumi Chijiwa
Rheumato 2026, 6(1), 1; https://doi.org/10.3390/rheumato6010001 - 19 Dec 2025
Viewed by 316
Abstract
Background/Objectives: Associations between renal function, as measured by the estimated glomerular filtration rate (eGFR) or its decline (dGFR), and clinical parameters in patients with rheumatoid arthritis (RA) were evaluated using a retrospective case–control series dataset. Methods: Patients with RA who were followed up for one or more consecutive years were recruited for the study. Cystatin C (CysC) was used to calculate the eGFR, and the time of the CysC measurement was set as the baseline. The association between the eGFR and baseline clinical parameters, including disease activity in RA as measured by the simplified disease activity index (SDAI), was statistically evaluated, as was the association between the mean annual decline in the eGFR from baseline and clinical parameters. Results: A total of 513 patients were enrolled, with a mean age of 70.9 years, a mean follow-up of 52.5 months, a mean BMI of 22.9, a mean eGFR of 68.7, and a mean annual dGFR of 2.74. Parameters significantly correlated with the eGFR at baseline were age, rheumatoid factor titer, C-reactive protein, and the presence of hypertension, chronic heart failure, chronic obstructive pulmonary disease, type 2 diabetes mellitus, methotrexate administration, and polypharmacy. The annual dGFR was correlated with the follow-up length, and the mean SDAI score multiplied by the follow-up length in years was also significantly correlated with it. Conclusions: Many factors confound the determination of the eGFR in RA patients. The disease activity score and length of time are the key factors for declining eGFRs. Full article
30 pages, 1497 KB  
Article
A New Flexible Integrated Linear–Weibull Lifetime Model: Analytical Characterization and Real-Data Studies
by Isyaku Muhammad, Mustapha Muhammad, Zeineb Klai, Badamasi Abba and Zoalnoon Ahmed Abeid Allah Saad
Symmetry 2025, 17(12), 2163; https://doi.org/10.3390/sym17122163 - 16 Dec 2025
Viewed by 230
Abstract
In this work, we introduce a new four-parameter distribution, called the integrated linear–Weibull (ILW) model, constructed by embedding a dynamic linear component within the Weibull framework. The ILW distribution is capable of capturing a wide variety of symmetric and asymmetric density shapes and accommodates diverse failure-rate behaviors. We derive several of its key mathematical and statistical properties, including moments, extropy, cumulative residual entropy, order statistics, and their asymptotic forms. The mean residual life function and its reciprocal relationship with the failure rate are also obtained. An algorithm for generating random samples from the ILW distribution is provided, and model identifiability is examined numerically through the Kullback–Leibler divergence. Parameter estimation is carried out via maximum likelihood, and a simulation study is conducted to assess the accuracy of the estimators; the results show improvements in estimator performance as sample size increases. Finally, three real datasets involving failure-time observations and one describing hydrological and epidemiological data are analyzed to demonstrate the practical usefulness of the ILW model. In these applications, the proposed model exhibits competitive or superior performance relative to several existing lifetime distributions based on standard model selection criteria and goodness-of-fit measures. Full article
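The reciprocal relationship between the mean residual life and the failure rate referred to above is, in its generic form, the standard identity sketched below; the paper derives the ILW-specific expressions.

    \[
    m(t) = \mathbb{E}[T - t \mid T > t] = \frac{1}{S(t)} \int_t^{\infty} S(u)\, du,
    \qquad
    h(t) = \frac{1 + m'(t)}{m(t)},
    \]
    where \(S\) is the survival function and \(h\) the failure (hazard) rate.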
35 pages, 4673 KB  
Article
Advances in Discrete Lifetime Modeling: A Novel Discrete Weibull Mixture Distribution with Applications to Medical and Reliability Studies
by Doha R. Salem, Mai A. Hegazy, Hebatalla H. Mohammad, Zakiah I. Kalantan, Gannat R. AL-Dayian, Abeer A. EL-Helbawy and Mervat K. Abd Elaal
Symmetry 2025, 17(12), 2140; https://doi.org/10.3390/sym17122140 - 12 Dec 2025
Viewed by 258
Abstract
In recent years, there has been growing interest in discrete probability distributions due to their ability to model the complex behavior of real-world count data. In this paper, a new discrete mixture distribution based on two Weibull components is introduced, constructed using the general discretization approach. Several important statistical properties of the proposed distribution, including the survival function, hazard rate function, alternative hazard rate function, moments, quantile function, and order statistics are derived. It was concluded from the descriptive measures that the discrete mixture of two Weibull distributions transitions from being positively skewed with heavy tails to a more symmetric and light-tailed form. This demonstrates the high flexibility of the discrete mixture of two Weibull distributions in capturing a wide range of shapes as its parameter values vary. Estimation of the parameters is performed via maximum likelihood under Type II censoring scheme. A simulation study assesses the performance of the maximum likelihood estimators. Furthermore, the applicability of the proposed distribution is demonstrated using two real-life datasets. In summary, this paper constructs the discrete mixture of two Weibull distributions, investigates its statistical characteristics, and estimates its parameters, demonstrating its flexibility and practical applicability. These results highlight its potential as a powerful tool for modeling complex discrete data. Full article
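As a sketch of the general discretization approach named above (using an assumed standard Weibull parameterization, which may differ from the paper's), a continuous survival function \(S\) is discretized via \(P(X = k) = S(k) - S(k+1)\), so a two-component discrete Weibull mixture with weight \(\pi\) has probability mass function

    \[
    P(X = k) = \pi\left[e^{-(k/b_1)^{a_1}} - e^{-\left((k+1)/b_1\right)^{a_1}}\right]
    + (1 - \pi)\left[e^{-(k/b_2)^{a_2}} - e^{-\left((k+1)/b_2\right)^{a_2}}\right],
    \quad k = 0, 1, 2, \ldots
    \]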
26 pages, 855 KB  
Article
Regulation, Disclosure, and the Displacement of Internal Governance in Saudi Banks
by Ali Al-Sari
J. Risk Financial Manag. 2025, 18(12), 705; https://doi.org/10.3390/jrfm18120705 - 11 Dec 2025
Viewed by 741
Abstract
This study examines whether strengthened prudential supervision reduces the marginal influence of internal governance mechanisms on the performance of Saudi banks during the Vision 2030 reform period. Using a panel of ten listed Saudi banks from 2018 to 2024, governance measures are hand collected to align with Saudi Central Bank definitions, focusing on insider ownership and board independence. To address endogeneity arising from performance persistence and reverse causality, two-step system generalized method of moments with collapsed lagged internal instruments and Windmeijer-corrected standard errors are employed. The results reveal that insider ownership and board independence are statistically and economically insignificant for accounting performance and market valuation, whereas lagged performance remains the dominant predictor. Hansen J and Arellano–Bond AR(2) diagnostics support instrument validity, and robustness checks using alternative estimators and variable specifications produce consistent findings. The results suggest that in contexts where prudential oversight is comprehensive and consistently enforced, internal governance mechanisms may provide limited incremental monitoring value. However, they do not imply that boards or insiders are irrelevant during crises or when enforcement is uneven. Therefore, refining supervisory tools and disclosure practices should be prioritized over imposing additional structural mandates on boards or ownership configurations. Full article
(This article belongs to the Special Issue Financial Markets and Institutions and Financial Crises)