Search Results (617)

Search Parameters:
Keywords = empirical likelihood

17 pages, 788 KB  
Article
Service Urgency for Children and Youth: The Development of an Algorithm to Identify Urgent and Emergent Service Users in Children’s Mental Health
by Shannon L. Stewart, Abigail Withers and Jeffrey W. Poss
Int. J. Environ. Res. Public Health 2026, 23(5), 603; https://doi.org/10.3390/ijerph23050603 - 2 May 2026
Abstract
Timely access to children’s mental health services depends on accurate identification of service urgency; however, triage practices in Ontario, Canada, vary widely, contributing to prolonged wait times and inconsistent pathways to care. This study aimed to develop and validate an empirically based decision-support algorithm to support standardized triaging and prioritization in Ontario-based children’s mental health agencies. Data were drawn from 17,564 children and youth aged 4–18 years assessed with the interRAI Child and Youth Mental Health Screener (ChYMH-S) as part of routine clinical practice. Interactive decision tree modelling was used to identify combinations of clinical indicators associated with high service urgency, with age-stratified models for children 7 years and younger, 8–11 years, and 12 years and older. The resulting interRAI Children’s Algorithm for Mental Health and Psychiatric Services (ChAMhPS) classified individuals into seven urgency levels. The algorithm demonstrated good discrimination for services required within seven days (c-statistic = 0.70) and for the urgency of a comprehensive assessment (c-statistic = 0.73), with stable performance across derivation and testing samples. Higher algorithm levels were associated with an increased likelihood of urgent assessment or service need. The ChAMhPS algorithm offers a standardized, empirically derived tool to support clinical decision-making and improve consistency in triage and prioritization of children and youth with urgent mental health needs. Full article
(This article belongs to the Special Issue Health Promotion Among People with Psychiatric Disorders)

11 pages, 465 KB  
Article
Assessment and Appraisal of Drug Innovativeness in Italy: Ultimate Evidence on Key Drivers and Consistency
by Alvise Verde, Federica Turati, Clara Trimarchi, Carlotta Galeone and Claudio Jommi
J. Mark. Access Health Policy 2026, 14(2), 28; https://doi.org/10.3390/jmahp14020028 - 2 May 2026
Abstract
This study aims to update and integrate empirical evidence on the key drivers and consistency of the appraisals of drug innovativeness in Italy by the Italian Medicines Agency (AIFA), and discuss if this evidence is supportive of the reform and requirements implemented in 2025. Appraisals from July 2017 to December 2024 were retrieved from the AIFA website. The association between the innovativeness appraisal, the innovativeness domains (unmet need/added therapeutic value/quality of evidence) and disease/drug/evidence-specific variables was assessed using odds ratios (ORs) from binary/multinomial logistic regression models. Innovativeness status was strongly associated with added therapeutic value (OR > 70). Medicines for rare diseases were more likely to receive conditional innovativeness (OR = 2.95). Full innovativeness was more frequently recognized for indications including paediatric patients (OR = 3.60). References to severe diseases and patient-reported outcomes (PROs) had a higher, not statistically significant, likelihood of innovativeness, whereas reference to indirect treatment comparisons had a lower likelihood (OR = 0.18). The appraisal process showed high internal consistency, but its regulation needs more specific guidance. The innovativeness regulation was reformed in July 2025, including specific recommendations on the criteria to identify the alternative treatments; the role and robustness of indirect comparisons; and the role and requirements for PROs. Our evidence provides an empirical rationale for this reform. Full article

21 pages, 479 KB  
Article
On Simple EM Acceleration Schemes Suitable for Mixture Modelling with High Overlap Between Components
by Branislav Panić, Jernej Klemenc, Marko Nagode and Simon Oman
Mathematics 2026, 14(9), 1543; https://doi.org/10.3390/math14091543 - 1 May 2026
Abstract
The Expectation-Maximisation (EM) algorithm is widely used for maximum likelihood estimation in incomplete data problems such as mixture modelling, but it often converges slowly, particularly when mixture components overlap substantially. This study presents a comprehensive empirical evaluation of simple EM acceleration schemes for Gaussian mixture models, comparing linear (STEM), quadratic (SQUAREM), and greedy (line search, golden section) methods across 240 simulated mixture configurations spanning three dimensionalities, four component counts, five overlap levels, and four sample sizes. A key contribution is the first systematic comparison of the three acceleration parameter estimates (α1, α2, α3) in the mixture modelling context: we show that only α3, which is derived as the geometric mean estimate of α1 and α2, provides genuine acceleration, while α1 and α2 consistently increase iteration counts by 50–110% relative to α3, effectively acting as deceleration. With α3, SQUAREM reduces iterations by up to 48% with negligible computational overhead, while greedy methods achieve similar iteration reductions but at 50–110% greater wall-clock time due to repeated log-likelihood evaluations. Crucially, acceleration does not degrade parameter estimation quality under any tested combination of initialisation, overlap, dimensionality, or number of components. We further examine the interaction between acceleration and initialisation, finding that k-means benefits most from acceleration (up to 50% time savings), while the REBMIX (Rough-Enhanced-Bayes MIXture estimation) algorithm benefits least as it already starts near the optimum. Among REBMIX configurations, histogram preprocessing with the outliers mode traversing strategy offers the best trade-off between quality and computational cost. The findings are validated on a real-world Backblaze hard drive failure dataset, confirming the practical utility of EM acceleration. 
All methods are implemented in the free and open-source R package rebmix, accompanied by full source code. Full article
(This article belongs to the Section E1: Mathematics and Computer Science)
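The SQUAREM update with the geometric-mean step length α3 is compact enough to sketch. The toy below (Python rather than the authors' R package rebmix, and a simplified 1-D mixture rather than their experimental setup) applies it to the EM fixed-point map of a two-component Gaussian mixture with known weights and variances:

```python
import numpy as np

def em_map(theta, x, sigma=1.0, w=0.5):
    """One EM iteration for a 1-D two-component Gaussian mixture with
    known, equal variances and weights; theta = (mu1, mu2)."""
    mu1, mu2 = theta
    p1 = w * np.exp(-0.5 * ((x - mu1) / sigma) ** 2)
    p2 = (1 - w) * np.exp(-0.5 * ((x - mu2) / sigma) ** 2)
    r = p1 / (p1 + p2)                              # E-step responsibilities
    return np.array([np.sum(r * x) / np.sum(r),     # M-step: weighted means
                     np.sum((1 - r) * x) / np.sum(1 - r)])

def squarem_alpha3(theta, fixed_point, tol=1e-10, max_iter=500):
    """SQUAREM acceleration using the geometric-mean step length alpha_3."""
    for _ in range(max_iter):
        t1 = fixed_point(theta)
        t2 = fixed_point(t1)
        r = t1 - theta
        v = (t2 - t1) - r
        if np.sqrt(np.dot(v, v)) < tol:
            return t2
        alpha = -np.sqrt(np.dot(r, r) / np.dot(v, v))       # alpha_3
        theta = fixed_point(theta - 2 * alpha * r + alpha ** 2 * v)
    return theta

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 300), rng.normal(5.0, 1.0, 300)])
mu_hat = squarem_alpha3(np.array([-1.0, 6.0]), lambda t: em_map(t, x))
```

The extra `fixed_point` application after each extrapolated step is the standard SQUAREM safeguard: it pulls the iterate back onto the EM trajectory, which helps keep the accelerated sequence stable.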
20 pages, 861 KB  
Article
Fault Diagnosis for Active Distribution Network Based on Colored and Fuzzy Colored Petri Net
by Yulong Qin, Yifan Hou, Han Zhang and Ding Liu
Energies 2026, 19(9), 2162; https://doi.org/10.3390/en19092162 - 30 Apr 2026
Abstract
Accurate and rapid fault diagnosis is critical for active distribution networks characterized by growing structural complexity and diverse load profiles. This paper proposes a two-stage fault diagnosis framework that synergistically combines colored Petri nets (CPN) and fuzzy colored Petri nets (FCPN). In the first stage, a CPN fault zone search model employing a breadth-first search (BFS) strategy is developed to identify suspected faulty components by processing circuit breaker operation information and grid topology. In the second stage, an FCPN diagnosis model is constructed by extending hierarchical fuzzy Petri nets through color assignment to confidence tokens. A key feature of this model is a dedicated initial confidence assessment module that dynamically evaluates the reliability of protection and circuit breaker actions by synthesizing device self-check alarms and operational timing information, thereby overcoming the limitation of empirical, static confidence assignment in existing methods. The resulting initial confidence values are then propagated through a hierarchical confidence inference module to determine the fault likelihood of each suspected component. Comparative simulations across four fault scenarios demonstrate that the proposed method achieves higher diagnostic accuracy and stronger fault tolerance than state-of-the-art approaches, correctly identifying all faulty components even under degraded alarm conditions. Full article

23 pages, 740 KB  
Article
Development and Psychometric Validation of the Emotional Intelligence Scale for Youth in the Conflict-Affected Southern Border Provinces of Thailand
by Kasetchai Laeheem
Psychiatry Int. 2026, 7(3), 90; https://doi.org/10.3390/psychiatryint7030090 - 29 Apr 2026
Abstract
This study developed and validated a specialised emotional intelligence (EI) scale for youth in the conflict-affected southern border provinces of Thailand. The primary objective was to establish a psychometric instrument tailored to this unique multicultural and sensitive context. Utilizing a sample of 500 local youth leaders, the instrument’s quality was rigorously evaluated through Second-order Confirmatory Factor Analysis (CFA) using Maximum Likelihood estimation. The final validated model comprises 25 indicators categorized into five dimensions: Self-Awareness, Self-Regulation, Self-Motivation, Social Awareness/Empathy, and Relationship Management. Results indicated an excellent model fit with empirical data (χ2 = 284.15, df = 265, p = 0.198, CFI = 0.99, GFI = 0.97, RMSEA = 0.02). Factor loadings ranged from 0.72 to 0.92, while composite reliability (CR) and average variance extracted (AVE) values exceeded 0.88 and 0.61, respectively, confirming high internal consistency and construct validity. Social Awareness/Empathy emerged as the most significant dimension (B = 0.91). This study suggests that the scale is a robust tool for assessing EI in conflict zones, providing a critical foundation for targeted psychosocial interventions and sustainable peace-building initiatives among youth in the region. Full article

20 pages, 511 KB  
Article
Estimation of Two-States Proportional Hazard Rates Models with Unobserved Heterogeneity
by Emilio Congregado, David Troncoso-Ponce, Nicola Rubino and Alejandro Morales-Kirioukhina
Econometrics 2026, 14(2), 22; https://doi.org/10.3390/econometrics14020022 - 28 Apr 2026
Abstract
This article examines two-state proportional hazard rate models with unobserved heterogeneity specific to each state, a framework that is especially relevant for labor market transitions. To make estimation feasible in large longitudinal datasets, we implement hshaz2s, a Stata routine that uses analytical expressions for the gradient vector and Hessian matrix of the log-likelihood function through the dual second-order moment (d2 ml) method. The empirical application estimates a discrete-time duration model for transitions between employment and unemployment using Spanish labor market microdata for young low-skilled workers over 2000–2019. The results show that apprenticeship contracts are associated with lower exit rates from employment than other temporary contracts, but not with faster transitions from unemployment back into employment. The estimates also reveal substantial state-specific unobserved heterogeneity, with a large latent group characterized by persistent spells in both states. Analytical second-order information also markedly reduces convergence time under richer heterogeneity structures. Overall, the article makes this class of two-state hazard models operational for applied research and provides new evidence on apprenticeship and temporary contracts in Spain. Full article
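The discrete-time machinery behind such models is easy to sketch. The minimal Python illustration below (not the authors' Stata routine hshaz2s, and without the state-specific unobserved-heterogeneity mass points) expands completed spells into person-period records and fits a logit hazard by maximum likelihood on synthetic data:

```python
import numpy as np
from scipy.optimize import minimize

def person_period(durations, covs):
    """Expand completed spells into one row per period at risk."""
    t, y, z = [], [], []
    for d, zi in zip(durations, covs):
        for s in range(1, d + 1):
            t.append(s)
            y.append(1.0 if s == d else 0.0)   # exit occurs in the final period
            z.append(zi)
    return np.array(t), np.array(y), np.array(z, dtype=float)

def neg_loglik(beta, t, y, z):
    """Logit discrete-time hazard h(t|z) = sigmoid(b0 + b1*log(t) + b2*z)."""
    eta = beta[0] + beta[1] * np.log(t) + beta[2] * z
    h = np.clip(1.0 / (1.0 + np.exp(-eta)), 1e-12, 1.0 - 1e-12)
    return -np.sum(y * np.log(h) + (1.0 - y) * np.log(1.0 - h))

# Simulate geometric spells whose exit hazard rises with a binary covariate z
# (true b0 = -1, b1 = 0, b2 = 1; no censoring, for brevity).
rng = np.random.default_rng(1)
z = rng.integers(0, 2, 800)
dur = rng.geometric(1.0 / (1.0 + np.exp(-(-1.0 + z))))
t, y, zz = person_period(dur, z)
fit = minimize(neg_loglik, np.zeros(3), args=(t, y, zz), method="BFGS")
```

Supplying the analytical gradient and Hessian of this log-likelihood, as the article does, is what makes estimation tractable once flexible heterogeneity is added.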

30 pages, 1078 KB  
Article
Risk Assessment of Dams and Reservoirs to Climate Change in the Mediterranean Region: The Case of Almopeos Dam in Northern Greece
by Anastasios I. Stamou, Georgios Mitsopoulos, Athanasios Sfetsos, Athanasia Tatiana Stamou, Aristeidis Bloutsos, Konstantinos V. Varotsos, Christos Giannakopoulos and Aristeidis Koutroulis
Water 2026, 18(9), 1031; https://doi.org/10.3390/w18091031 - 26 Apr 2026
Abstract
Climate change poses significant challenges to the operation and safety of dam and reservoir (D&R) systems, particularly in regions characterized by water scarcity and high climate variability. This study presents a structured methodology for climate risk assessment that integrates regional climate projections, system-specific thresholds, and a semi-quantitative risk matrix approach. A key innovation is the explicit linkage between climate indicators and system performance through physically based thresholds, combined with empirically derived exceedance probabilities from high-resolution climate projections. The methodology is applied to the Almopeos D&R system in northern Greece, using an ensemble of statistically downscaled CMIP6 simulations under two emission scenarios (SSP2-4.5 and SSP5-8.5) and two future periods (2041–2060 and 2081–2100). Three climate indicators are analyzed: TX35 (temperature extremes), CDD (consecutive dry days), and Rx1day (extreme precipitation). Results indicate that temperature increase is the dominant climate risk hazard, leading to increased irrigation demand and reduced system reliability, with risks classified as high to very high. Drought conditions represent a secondary but important risk, becoming critical during prolonged dry periods affecting reservoir storage, while extreme precipitation events exhibit low likelihood but potentially high consequences for dam safety. Adaptation measures are prioritized using a qualitative multi-criteria approach, highlighting the effectiveness of operational measures, while structural and monitoring interventions remain essential for ensuring system safety. The proposed methodology provides a transparent and transferable framework for climate-resilient planning of water infrastructure systems. Full article

27 pages, 440 KB  
Article
In-Hospital Mortality Predictors and a Bayesian Weighted-Incidence Antibiogram in Infective Endocarditis: A Seven-Year Cohort Study from a Mexican Tertiary University Hospital
by Itzel Elizabeth Garibay-Padilla, Jorge Eduardo Hernandez-Del Río, Dayana Estefania Orozco-Sepulveda, Christian Gonzalez-Padilla, Tomas Miranda-Aquino, Vanessa Salas-Bonales, Judith Carolina De Arcos-Jiménez and Jaime Briseño-Ramírez
Med. Sci. 2026, 14(2), 214; https://doi.org/10.3390/medsci14020214 - 26 Apr 2026
Abstract
Background/Objectives: Infective endocarditis (IE) carries substantial mortality, particularly in middle-income settings where patient profiles and microbial ecology differ from those of cohorts used to derive international prognostic scores. Syndrome-specific, locally grounded decision aids for empirical therapy are also scarce. We aimed to identify predictors of in-hospital mortality, externally evaluate the RiskE and ICE scores, and construct a Bayesian weighted-incidence syndromic combination antibiogram (WISCA) for IE. Methods: We conducted a retrospective cohort study of consecutive adults with definite or possible IE admitted between January 2019 and January 2026. Candidate predictors were screened in two phases, and a clinically specified model was estimated with maximum-likelihood and Firth penalization, with 1000-replicate bootstrap optimism correction. Calibration was assessed with bootstrap calibration plots and the Hosmer–Lemeshow test. Discrimination was compared against RiskE and ICE using DeLong’s test and reclassification metrics. For empirical coverage, we built a WISCA using identified pathogens, reporting both non-Bayesian bootstrap estimates and Bayesian hierarchical partial-pooling estimates with species- and antibiotic-level random intercepts; analyses were also stratified by IE type. Results: In-hospital mortality was 22.9% in a young cohort (median 37 years) characterized by high hemodialysis prevalence (47.4%), substantial right-sided IE (46.4%), and Staphylococcus aureus predominance (32%) with no methicillin-resistant isolates. Vasopressor-requiring shock (Firth OR 9.23, 95% CI 2.40–40.61) and acute heart failure (OR 10.01, 95% CI 2.78–41.07) were the strongest predictors; the final model achieved an AUC of 0.922 (optimism-corrected 0.908), significantly outperforming RiskE (0.598) and ICE (0.632). 
The Bayesian WISCA identified multiple carbapenem-sparing and anti-MRSA–sparing regimens with adequate coverage (≥80%), particularly for community-acquired IE, supporting stewardship-oriented empirical selection. Coverage was consistently lower in healthcare-associated IE. Conclusions: A parsimonious three-variable model provided strong, locally valid mortality prediction in this hemodialysis-predominant, MRSA-free cohort, substantially outperforming European-derived scores. External validation in independent cohorts is required before clinical adoption. The Bayesian WISCA demonstrated that adequate empirical coverage is achievable without routine broad-spectrum agents, offering institution-specific guidance for stewardship-compatible regimen selection; multicenter validation is warranted. Full article
(This article belongs to the Section Cardiovascular Disease)
19 pages, 3497 KB  
Article
A Python-Based Workflow for Asbestos Roof Mapping and Temporal Monitoring Using Satellite Imagery
by Giuseppe Bonifazi, Alice Aurigemma, José Salas-Cáceres, Javier Lorenzo-Navarro, Silvia Serranti, Federica Paglietti, Sergio Bellagamba and Sergio Malinconico
Geomatics 2026, 6(3), 41; https://doi.org/10.3390/geomatics6030041 - 25 Apr 2026
Abstract
The detection and monitoring of asbestos–cement roofing remain a critical public health and environmental challenge, especially in urban and suburban areas where asbestos-containing materials are still widespread due to their extensive use in the 20th century. Although hyperspectral and high-resolution multispectral remote sensing have proven effective for mapping asbestos–cement roofs, many existing approaches rely on proprietary software, limiting transparency, reproducibility, and large-scale adoption. This study presents a fully reproducible, cost-free Python-based workflow for the detection and temporal monitoring of asbestos–cement roofing using high-resolution multispectral WorldView-3 imagery. The workflow integrates atmospheric correction (using the Py6S radiative transfer model), spatial preprocessing, supervised pixel-based classification, postprocessing, and building-level aggregation within an open framework. A Maximum Likelihood Classifier is applied to VNIR and SWIR data using empirically defined roof typologies to enhance class separability. Pixel-level results are aggregated to the building scale through adaptive thresholding enabling the translation of spectral classifications into meaningful building-level information. Tested over the city of Mantua (Italy), the approach achieved reliable classification performance and enabled multi-temporal comparison to identify changes potentially due to roof remediation. Evaluation metrics (precision, recall, and F1-score) highlight the importance of carefully choosing the building-level threshold. By relying exclusively on open-source tools, the workflow enhances transparency, reproducibility, and scalability for long-term monitoring. Full article
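The pixel-based classification step can be sketched directly in NumPy. The snippet below is a minimal Gaussian Maximum Likelihood Classifier on synthetic 4-band "reflectances" (hypothetical class means, not WorldView-3 data; the published workflow additionally includes atmospheric correction, postprocessing, and building-level aggregation):

```python
import numpy as np

class GaussianMLC:
    """Per-class multivariate Gaussian maximum likelihood classifier
    (equal priors; a small ridge keeps covariances invertible)."""

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.params_ = {}
        for c in self.classes_:
            Xc = X[y == c]
            mu = Xc.mean(axis=0)
            cov = np.cov(Xc, rowvar=False) + 1e-6 * np.eye(X.shape[1])
            self.params_[c] = (mu, np.linalg.inv(cov), np.linalg.slogdet(cov)[1])
        return self

    def predict(self, X):
        scores = []
        for c in self.classes_:
            mu, prec, logdet = self.params_[c]
            d = X - mu
            # log-likelihood up to a constant: -0.5 * (log|S| + d' S^-1 d)
            scores.append(-0.5 * (logdet + np.einsum("ij,jk,ik->i", d, prec, d)))
        return self.classes_[np.argmax(np.stack(scores), axis=0)]

# Two hypothetical roof typologies, 200 training pixels each
rng = np.random.default_rng(2)
roof = rng.normal([0.35, 0.30, 0.28, 0.45], 0.03, size=(200, 4))
other = rng.normal([0.20, 0.25, 0.33, 0.30], 0.03, size=(200, 4))
X = np.vstack([roof, other])
y = np.array([1] * 200 + [0] * 200)
acc = np.mean(GaussianMLC().fit(X, y).predict(X) == y)
```

Aggregating these per-pixel labels to buildings then reduces to thresholding the fraction of pixels assigned to the asbestos class within each footprint.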

21 pages, 484 KB  
Article
Balancing Work and Life Among Manufacturing Employees: The Role of Job Conditions, Support and Well-Being
by Rasa Balvočiūtė and Rasa Švėgždienė
Sustainability 2026, 18(9), 4239; https://doi.org/10.3390/su18094239 - 24 Apr 2026
Abstract
Work–life balance (WLB) has become a critical component of social sustainability, yet empirical evidence remains uneven across economic sectors. While existing research predominantly focuses on service-oriented and public-sector occupations, comparatively little is known about the determinants of WLB in manufacturing, where high job demands, limited flexibility, and structural constraints on autonomy often characterize work. Addressing this gap, the present study examines how job characteristics, support mechanisms, and individual resources shape the likelihood of achieving WLB among manufacturing employees in a rapidly developing European economy. Drawing on the Job Demands–Resources (JD–R) framework, the study employs survey data from 361 manufacturing employees and estimates a series of Probit regression models. To facilitate a meaningful analysis, composite indices were constructed to capture job demands, job flexibility, organizational and social support, psychological boundaries, and overall well-being. Predicted probabilities were used to evaluate both direct effects and interaction patterns in the Probit models. The findings indicate that manageable job demands and individual resources, particularly well-being and effective self-management, are the strongest predictors of WLB. Job flexibility demonstrates a slight positive effect; however, when accounting for individual and structural factors, formal organizational and social support mechanisms do not show statistically significant direct effects. Furthermore, our analysis provides no empirical support for moderating effects, as the interaction terms between job characteristics and support variables are not statistically significant. This suggests that support mechanisms do not consistently modify the relationship between job demands, flexibility, and WLB within the analyzed sample. 
Overall, the findings underscore the importance of combining supportive organizational contexts with manageable work demands and individual resources to promote sustainable work–life balance in manufacturing. The study contributes sector-specific empirical evidence to sustainability research and offers practical insights for designing socially sustainable work environments in industrial settings. Full article
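A probit model with predicted probabilities of the kind used here is straightforward to reproduce. The sketch below fits a probit by maximum likelihood on synthetic stand-ins for two of the study's indices (hypothetical variable names and effect sizes, not the authors' survey data or estimation setup):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def probit_nll(beta, X, y):
    """Negative log-likelihood of the probit model P(y=1|x) = Phi(x'beta)."""
    p = np.clip(norm.cdf(X @ beta), 1e-12, 1.0 - 1e-12)
    return -np.sum(y * np.log(p) + (1.0 - y) * np.log(1.0 - p))

# Synthetic "job demands" (negative effect) and "well-being" (positive effect)
# indices; true beta = (0.2, -0.8, 1.0).
rng = np.random.default_rng(3)
n = 600
demands, wellbeing = rng.normal(size=n), rng.normal(size=n)
X = np.column_stack([np.ones(n), demands, wellbeing])
y = (X @ np.array([0.2, -0.8, 1.0]) + rng.normal(size=n) > 0).astype(float)

fit = minimize(probit_nll, np.zeros(3), args=(X, y), method="BFGS")
# Predicted probability of balance at average demands and well-being one
# standard deviation above the mean.
p_hat = norm.cdf(np.array([1.0, 0.0, 1.0]) @ fit.x)
```

Evaluating `p_hat` over a grid of covariate values is exactly how predicted-probability plots of direct and interaction effects are produced from such a fit.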
(This article belongs to the Section Social Ecology and Sustainability)

34 pages, 20484 KB  
Article
A Fast-Fourier-Transform-Based Dynamic Likelihood Ratio Framework for Controlling False Positives in DNA Database Matching
by François-Xavier Laurent, Willem Burgers, Wim Wiegerinck, Cyril Gout and Susan Hitchin
Genes 2026, 17(5), 499; https://doi.org/10.3390/genes17050499 - 23 Apr 2026
Abstract
Background/Objectives: Operational DNA databases traditionally rely on static locus-count thresholds to determine search eligibility and report matches. While computationally straightforward, these rigid criteria routinely discard high-value investigative leads from degraded forensic profiles while simultaneously permitting adventitious matches when common alleles are involved. To overcome the limitations of static rules, this study introduces an automated framework for dynamic likelihood ratio (LR) thresholding. Methods: Utilizing a Fast Fourier Transform (FFT) algorithm, the system calculates the Probability Mass Function (PMF) for any specific combination of shared loci in real-time, natively incorporating the Balding–Nichols model to account for population substructure. Instead of applying an arbitrary locus count or fixed LR cutoff, the framework defines admissibility based on a user-defined maximum upper bound of acceptable false positives at a specified confidence (probability) level (e.g., 95%). Results: This empowers database custodians to precisely predict and adapt their search criteria to match an acceptable administrative workload, dynamically adjusting the required LR threshold to the exact size of the searched database. This approach was validated through massive-scale empirical simulations across five reference population groups. Receiver Operating Characteristic (ROC) and Poisson distribution analyses reveal that static thresholds inevitably collapse under the multiplicity effect of large-scale comparisons; for instance, a static locus rule that maintains safety within a small DNA database yields an unmanageable false positive risk when scaled to larger DNA databases or international networks like the Prüm DNA Exchange. Conclusions: By explicitly coupling the decision threshold to the database size and the genetic rarity of the evidence, this dynamic framework provides a mathematically rigorous and scalable solution. 
Most notably, it identifies rare, low-locus matches that static rules typically discard, offering a method to maintain a predefined expected false positive rate. Full article
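The core idea — computing the exact distribution of the number of matching loci and inverting a false-positive budget into a threshold — can be sketched as follows. This is a deliberately simplified independent-Bernoulli version (no Balding–Nichols substructure correction, hypothetical per-locus match probabilities, and a fixed expected false-positive count rather than a confidence level):

```python
import numpy as np

def poisson_binomial_pmf(p):
    """Exact PMF of the number of matching loci for independent per-locus
    match probabilities p_j, via the DFT of the characteristic function."""
    p = np.asarray(p, dtype=float)
    n = p.size
    omega = np.exp(2j * np.pi * np.arange(n + 1) / (n + 1))
    chi = np.prod(1.0 - p[:, None] + p[:, None] * omega, axis=0)
    return np.clip(np.real(np.fft.fft(chi)) / (n + 1), 0.0, 1.0)

def min_locus_threshold(p, db_size, max_expected_fp):
    """Smallest shared-locus count whose expected number of adventitious
    matches across db_size comparisons stays within the budget."""
    tail = np.cumsum(poisson_binomial_pmf(p)[::-1])[::-1]   # P(K >= k)
    ok = np.nonzero(db_size * tail <= max_expected_fp)[0]
    return int(ok[0]) if ok.size else None
```

Because the tail probability is multiplied by the database size, the same budget yields a stricter threshold as the database grows — the multiplicity effect the abstract describes.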
(This article belongs to the Special Issue Advances and Challenges in Forensic Genetics)

21 pages, 3370 KB  
Article
An Innovative Semiparametric Density Model for the Statistical Characterization of Ground-Vehicle Radar Cross Sections
by Zengcan Liu, Shuhao Wen, Houjun Sun and Ming Deng
Sensors 2026, 26(9), 2572; https://doi.org/10.3390/s26092572 - 22 Apr 2026
Abstract
Accurately characterizing the statistical fluctuations of vehicle radar cross sections (RCSs) across polarization states and azimuthal sectors is essential for evaluating detection performance, conducting probabilistic simulations, and analyzing target features in millimeter-wave radar systems. Existing one-dimensional RCS statistical models, including Weibull, Chi-square, Lognormal, Rice, and Gaussian distributions, are often limited by their restricted functional expressiveness, making it difficult to simultaneously capture skewness, tail thickness, and azimuthal dependence under narrow angular-domain conditions. In addition, purely nonparametric approaches tend to produce spurious modes under finite-sample conditions and lack interpretable structural priors. To address these limitations, this paper proposes a Unimodal RCS Semiparametric Density Estimator (URCS-SDE) tailored for ground-vehicle targets. The proposed approach adopts kernel density estimation (KDE) as a data-driven baseline representation and incorporates physically plausible structural constraints through unimodal shape projection. Then a beta-type tail template is further introduced in the normalized amplitude domain to regulate boundary decay behavior. Finally, weighted least-squares calibration is performed on the histogram grid of the empirical probability density function (PDF), achieving a balanced trade-off between fitting accuracy and stability in both the peak and tail regions. Using multi-azimuth RCS measurements of two representative ground vehicles, the URCS-SDE is systematically compared with five classical parametric distributions and a representative regularized mixture density network (MDN) baseline. Performance is evaluated under both full-azimuth and directional-window conditions using the sum of squared errors (SSE), root mean squared error (RMSE), coefficient of determination (R-square) and held-out negative log-likelihood (NLL). 
The results show that the URCS-SDE consistently provides the most accurate and stable density estimates, especially in narrow angular windows. In addition, a threshold-based detection-support example derived from the fitted PDFs demonstrates that the advantage of the URCS-SDE transfers from density reconstruction to a directly engineering-relevant downstream quantity. Full article
(This article belongs to the Section Radar Sensors)
30 pages, 558 KB  
Article
The Impact of Digitalization on Farmers’ Recycling Behavior of Pesticide Packaging Waste: Evidence from Rural China
by Congying Zhang and Xinrui Feng
Sustainability 2026, 18(8), 4054; https://doi.org/10.3390/su18084054 - 19 Apr 2026
Abstract
The recycling of pesticide packaging waste is crucial for the sustainable development of agriculture and the advancement of ecological civilization. However, current recycling-management practices still face challenges. This study adopts a dynamic analytical framework of “ex-ante behavioral cognition and post-event outcome perception” to investigate the impact of digitalization on farmers’ recycling behavior of pesticide packaging waste. The analysis draws on data from the 2020 China Rural Revitalization Survey and examines two dimensions of digitalization: digital technology access and digital technology usage. The findings indicate that integrating digital technologies into farming practices significantly increases the likelihood of farmers participating in pesticide packaging waste recycling programs. These results hold after a series of robustness checks and after addressing potential endogeneity issues. A heterogeneity analysis reveals that the promotional effect of digitalization varies significantly across different categories of rural elite status, cooperative membership, education level, pesticide spraying methods, and income structure. Mechanism testing further indicates that hazard cognition regarding pesticide packaging serves as a mediating factor in the impact of both digital technology access and usage on farmers’ recycling behavior. In contrast, farmers’ satisfaction with their living environment mediates only the effect of digital technology usage on recycling behavior. Overall, these findings provide both theoretical and empirical support for the hypothesis that digitalization can facilitate the recycling of pesticide packaging waste and enhance the ecological effectiveness of agricultural policy governance. Full article
31 pages, 551 KB  
Article
Frequentist and Bayesian Predictive Inference for the Log-Logistic Distribution Under Progressive Type-II Censoring
by Ziteng Zhang and Wenhao Gui
Entropy 2026, 28(4), 466; https://doi.org/10.3390/e28040466 - 18 Apr 2026
Abstract
This paper investigates the prediction of unobserved future failure times for the heavy-tailed Log-Logistic distribution under Progressive Type-II censoring. We first develop point and interval estimates for the unknown parameters using both frequentist maximum likelihood and Bayesian approaches. For predicting future failures, we derive three distinct point predictors: the Best Unbiased Predictor (BUP), the Conditional Median Predictor (CMP), and the Bayesian Predictor (BP). Corresponding prediction intervals are constructed using frequentist pivotal quantities, Bayesian Equal-Tailed Intervals (ETIs), and Highest Posterior Density (HPD) methods. The Bayesian procedures are implemented via Markov chain Monte Carlo (MCMC) sampling. We evaluate the finite-sample performance of the proposed methodologies through a Monte Carlo simulation study and further validate them using two real-world datasets, namely bladder cancer remission times and guinea pig survival times. The numerical results indicate that the proposed BP, particularly under the empirical prior, provides the most accurate and stable overall performance for point prediction, while the frequentist predictors become less reliable in extreme heavy-tailed settings. For interval prediction, the Bayesian HPD method consistently outperforms the alternatives, substantially reducing interval lengths for right-skewed data while maintaining the nominal coverage probability. Full article
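The censoring scheme underlying these predictors can be sketched as follows. The snippet writes down the standard progressive Type-II censored log-likelihood for the Log-Logistic distribution and maximizes it with a coarse-to-fine grid search on a simulated sample; the grid search, the random-withdrawal simulation, and the names (`loglik`, `fit_mle`) are generic stand-ins, not the paper's estimators, priors, or datasets.

```python
import numpy as np

def loglik(x, R, a, b):
    """Progressive Type-II censored log-likelihood for Log-Logistic(a, b):
    sum log f(x_i) + sum R_i log S(x_i), with S(t) = 1 / (1 + (t/a)^b)."""
    z = (x / a) ** b
    logf = np.log(b) - np.log(a) + (b - 1.0) * np.log(x / a) - 2.0 * np.log1p(z)
    return np.sum(logf) - np.sum(np.asarray(R) * np.log1p(z))

def fit_mle(x, R, iters=3, grid=40):
    """Coarse-to-fine grid maximization (stand-in for a Newton-type solver)."""
    a_lo, a_hi = np.median(x) / 3.0, np.median(x) * 3.0
    b_lo, b_hi = 0.3, 12.0
    for _ in range(iters):
        A = np.linspace(a_lo, a_hi, grid)
        B = np.linspace(b_lo, b_hi, grid)
        ll = np.array([[loglik(x, R, a, b) for b in B] for a in A])
        i, j = np.unravel_index(np.argmax(ll), ll.shape)
        da, db = (a_hi - a_lo) / grid, (b_hi - b_lo) / grid
        a_lo, a_hi = max(A[i] - 2 * da, 1e-6), A[i] + 2 * da
        b_lo, b_hi = max(B[j] - 2 * db, 1e-3), B[j] + 2 * db
    return A[i], B[j]

# Simulate a progressive Type-II censored sample: n units on test, m observed
# failures; after the i-th failure, R_i randomly chosen survivors are withdrawn.
rng = np.random.default_rng(1)
a_true, b_true = 2.0, 3.0
n, m = 200, 100
R = [1] * m                         # m + sum(R) = n
u = rng.uniform(size=n)
pool = list(a_true * (u / (1.0 - u)) ** (1.0 / b_true))  # Log-Logistic draws
obs = []
for r in R:
    obs.append(pool.pop(int(np.argmin(pool))))  # next observed failure
    for _ in range(r):                          # withdraw r survivors at random
        pool.pop(rng.integers(len(pool)))
x = np.array(obs)

alpha_hat, beta_hat = fit_mle(x, R)
```

With the likelihood in hand, the frequentist pieces the abstract lists (pivotal intervals, BUP/CMP predictors) condition on this censored-data likelihood, and the Bayesian predictors replace the maximization with MCMC sampling over the same expression.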
39 pages, 542 KB  
Article
A Novel Extension of the Weibull Distribution with Application in Quantitative and Reliability Sciences
by Shoaib Iqbal, Bassant Elkalzah, Zawar Hussain and Farrukh Jamal
Symmetry 2026, 18(4), 659; https://doi.org/10.3390/sym18040659 - 15 Apr 2026
Abstract
The main focus of this paper is to introduce a new probability model. Specifically, this paper presents a modified form of the Weibull distribution and investigates its various statistical properties, such as moments, moment-generating functions, reliability functions, quantile functions, and inequality measures such as Bonferroni and Lorenz curves. It also investigates the mean absolute deviation and entropy. Distributions of order statistics, reversed order statistics, and upper record values are also obtained. Additionally, univariate and bivariate moment structures are considered. The model parameters are estimated via the maximum likelihood method under simple random sampling and ranked set sampling, allowing an empirical evaluation of efficiency and reliability. Graphical representations exhibit the flexibility of the model, capturing various shapes in the probability density and hazard rate functions. To measure the practical quality of the model, actuarial metrics are used. A comparative analysis based on insurance, biomedical, and reliability datasets demonstrates the empirically improved performance and stability of the proposed new model for these specific datasets. Full article
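As a baseline for the estimation setting this abstract describes, the sketch below fits an ordinary (unmodified) Weibull by profile-likelihood MLE and draws a balanced ranked set sample (RSS). The authors' extended distribution and their RSS likelihood are not given in the abstract, so this is only a generic illustration; note that a balanced RSS, pooled across ranks, still follows the parent distribution, which is why the plain MLE remains consistent here even though it ignores the rank information a full RSS likelihood would exploit.

```python
import numpy as np

def weibull_mle(x):
    """Profile-likelihood MLE for Weibull(shape b, scale a): solve the
    shape equation by bisection, then recover the scale in closed form."""
    lx = np.log(x)
    def g(b):  # monotone-increasing profile score in b
        xb = x ** b
        return np.sum(xb * lx) / np.sum(xb) - 1.0 / b - lx.mean()
    lo, hi = 0.1, 20.0
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if g(mid) < 0:
            lo = mid
        else:
            hi = mid
    b = 0.5 * (lo + hi)
    a = np.mean(x ** b) ** (1.0 / b)
    return b, a

def rss_sample(k, cycles, rng, shape, scale):
    """Balanced ranked set sample: for each rank i = 1..k, draw a set of
    k Weibull values and keep the i-th smallest; repeat for each cycle."""
    out = []
    for _ in range(cycles):
        for i in range(k):
            s = np.sort(scale * rng.weibull(shape, size=k))
            out.append(s[i])
    return np.array(out)

rng = np.random.default_rng(7)
shape_true, scale_true = 2.0, 3.0

srs = scale_true * rng.weibull(shape_true, size=120)   # simple random sample
rss = rss_sample(3, 40, rng, shape_true, scale_true)    # 3 * 40 = 120 values

b_srs, a_srs = weibull_mle(srs)
b_rss, a_rss = weibull_mle(rss)
```

Repeating the draw-and-fit step over many Monte Carlo replications is the usual way to compare the SRS and RSS estimators' efficiency, which is the kind of empirical evaluation the abstract refers to.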
(This article belongs to the Section Mathematics)