Search Results (10,737)

Search Parameters:
Keywords = uncertainty data

18 pages, 1308 KB  
Article
A New Chaotic Interval-Based Multi-Objective Honey Badger Algorithm for Real-Time Fire Localization
by Khedija Arour, Hadhami Kaabi, Mohamed Ben Farah and Raouf Abozariba
Information 2026, 17(2), 144; https://doi.org/10.3390/info17020144 (registering DOI) - 2 Feb 2026
Abstract
Real-time fire localization in urban environments remains a significant challenge due to sparse IoT sensor deployments, measurement uncertainties, and the computational demands of AI-based estimation techniques. To address these limitations, this paper proposes a Chaotic Interval-Based Multi-Objective Honey Badger Algorithm (CI-MOHBA) designed to improve the accuracy and reliability of fire source localization under uncertain and limited sensor data. The approach formulates localization as a multi-objective optimization problem that simultaneously minimizes source estimation error, false alarm rates, and computation time. CI-MOHBA integrates a new chaotic map to improve global search capability and interval arithmetic to manage sensor uncertainty within sparse measurement environments. Experimental evaluation of the proposed chaotic map, supported by entropy convergence analysis and Lyapunov exponent verification, demonstrates the stability and robustness of the technique. Results indicate that CI-MOHBA achieves an average localization error of 0.73 m and a false alarm rate of 8.2% while maintaining high computational efficiency, showing that the algorithm is well-suited for real-time fire localization in urban IoT-based monitoring systems. Full article
(This article belongs to the Special Issue AI and Data Analysis in Smart Cities)
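As an illustration of the Lyapunov-exponent verification mentioned in this abstract, the sketch below estimates the exponent of the classic logistic map. The paper's own (new) chaotic map is not given here, so the logistic map, the parameter values, and the iteration counts are stand-in assumptions, not the authors' implementation.

```python
# Illustrative sketch only: the abstract mentions Lyapunov exponent verification
# of an unspecified new chaotic map, so the classic logistic map is used as a
# stand-in. A positive exponent confirms chaotic behaviour before the map is
# used to perturb a metaheuristic search such as a honey badger algorithm.
import math

def lyapunov_logistic(r=4.0, x0=0.2, n_iter=10_000, n_discard=1_000):
    """Estimate the Lyapunov exponent of x -> r*x*(1-x)."""
    x, acc = x0, 0.0
    for i in range(n_iter + n_discard):
        x = r * x * (1.0 - x)
        if i >= n_discard:                               # skip the transient
            acc += math.log(abs(r * (1.0 - 2.0 * x)) + 1e-300)
    return acc / n_iter

print(f"estimated Lyapunov exponent: {lyapunov_logistic():.3f}  (ln 2 ~ 0.693 expected)")
```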
26 pages, 2003 KB  
Article
Global Use of Casein Glycomacropeptide Protein Substitutes for Phenylketonuria (PKU): Health Professional Perspectives
by Sharon Evans, Rani Singh, Kirsten Ahring, Catherine Ashmore, Anne Daly, Suzanne Ford, Maria Ines Gama, Maria Giżewska, Melanie Hill, Fatma Ilgaz, Richard Jackson, Camille Newby, Alex Pinto, Martina Tosi, Ozlem Yilmaz Nas, Juri Zuvadelli and Anita MacDonald
Nutrients 2026, 18(3), 488; https://doi.org/10.3390/nu18030488 (registering DOI) - 2 Feb 2026
Abstract
Background/Objectives: Casein glycomacropeptide (cGMP) has been modified to enable its suitability as a low phenylalanine (Phe) protein substitute (PS) in phenylketonuria (PKU). No data is available about its global usage. Methods: A 60-item multiple choice and short answer/extended response questionnaire examining the use of modified cGMP in PKU was distributed globally to dietitians and physicians via web-based professional inherited metabolic disorder groups. Results: Respondents (n = 208) from 45 countries across 6 continents completed the questionnaire. Of these, 83.7% (n = 174) were dietitians/nutritionists, 14.9% (n = 31) medical doctors/physicians and 1.4% (n = 3) other health professionals, caring for both paediatric and adult patients (59.1%), paediatrics only (25.0%) or adults only (15.9%). cGMP PS were reported as not available in their centre/hospital by 19.7% (n = 41), mostly in Africa, South America, and southern and western Asia. The main reasons included lack of regulatory approval (65.8%), not promoted by manufacturers (41.5%), and cost (29.3%). An estimated 25% of represented patients globally were using cGMP PS; 78.4% (n = 163) following refusal/poor adherence with Phe-free amino acids and 54.8% (n = 114) for adult patients recommencing dietary treatment. There were concerns about the residual Phe in cGMP negatively impacting blood Phe levels in children <12y (66.3%), adolescents (48.0%), adults (34.6%), and the first trimester of pregnancy (53.1%). Sixty nine percent (n = 145) adjusted dietary Phe prescription according to the cGMP Phe content, particularly in regions with a higher percentage of severe PKU variants. Commonly perceived clinical advantages with cGMP were improved taste/palatability (93.2%, n = 194) and fewer gastrointestinal symptoms (55.8%, n = 116). Perceived clinical disadvantages were residual Phe (72.1%, n = 150), lack of data in children < 3 years (48.1%, n = 100), and the high energy content of some brands (45.2%, n = 94). There were concerns that cGMP PS were too high in sugar (34.1%, n = 71) and dissatisfaction or uncertainty about the adequacy of its Phe (66.3%) and amino acid (34.1%) content. Conclusions: There is global inconsistency in access to cGMP PS suitable for PKU, and in the interpretation of evidence-based research. Some professionals have significant concerns about its nutritional composition particularly residual Phe, limiting its estimated use to approximately 25% of PKU patients globally. Full article
(This article belongs to the Special Issue Dietary Management for Patients with Inborn Errors of Metabolism)
21 pages, 2047 KB  
Article
Thermographic Diagnosis of Corrosion-Driven Contact Degradation in Power Equipment Using Infrared Imaging and Color-Channel Decomposition
by Milton Ruiz and Carlos Betancourt
Energies 2026, 19(3), 766; https://doi.org/10.3390/en19030766 (registering DOI) - 1 Feb 2026
Abstract
This study presents a measurement–modeling pathway for diagnosing corrosion-driven contact degradation in power equipment using infrared thermography and color-channel analysis. Thermal data were acquired with a Fluke Ti450 (LWIR, 7.5–14 μm) under typical high-altitude, temperate conditions in Quito, Ecuador. Radiometric parameters (emissivity, distance, ambient/reflected temperature, and humidity) are reported explicitly, and images are processed with a reproducible pipeline that combines adaptive thresholding, morphology, and region-of-interest statistics, including ΔT relative to a reference region. A worked example links an observed hotspot to emissivity-corrected temperature and discusses qualitative implications for the effective contact resistance Reff. Uncertainty is summarized through a per-case template that propagates uΔT to u(Reff) and Weibull characteristic life η. Environmental influences (solar load, wind, and emissivity variability) are acknowledged and mitigated. Two field cases illustrate the approach to substation assets. Because the dataset comprises single-visit inspections, formal parameter estimation (e.g., EIS-validated Reff and full Weibull/Arrhenius fits) is reserved for longitudinal follow-up. By making radiometry, processing steps, and limitations explicit, the study reduces ambiguity in the transition from temperature contrast to physics-based interpretation and supports auditable maintenance decisions. Full article
(This article belongs to the Section F: Electrical Engineering)
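For readers unfamiliar with the emissivity correction step the abstract refers to, the sketch below applies the simplified total-radiation (Stefan–Boltzmann) correction and a region-of-interest ΔT. The numeric readings, the emissivity value, and the function name are hypothetical; the paper's actual Fluke Ti450 processing pipeline is not reproduced.

```python
# Minimal sketch of an emissivity-corrected temperature, assuming the simplified
# total-radiation model (radiance ~ T^4) and ignoring atmospheric transmission.
def emissivity_corrected_temp(t_apparent_c, emissivity, t_reflected_c):
    """Return the corrected surface temperature in deg C.

    t_apparent_c : apparent (blackbody-equivalent) temperature read by the camera
    emissivity   : assumed surface emissivity (0 < eps <= 1)
    t_reflected_c: reflected ambient temperature
    """
    t_app = t_apparent_c + 273.15          # to kelvin
    t_ref = t_reflected_c + 273.15
    t_obj4 = (t_app**4 - (1.0 - emissivity) * t_ref**4) / emissivity
    return t_obj4**0.25 - 273.15

# Hypothetical hotspot vs. reference region-of-interest contrast (delta-T)
t_hot = emissivity_corrected_temp(68.0, 0.85, 22.0)
t_refzone = emissivity_corrected_temp(41.0, 0.85, 22.0)
print(f"corrected hotspot: {t_hot:.1f} C, delta-T vs reference: {t_hot - t_refzone:.1f} K")
```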
24 pages, 3539 KB  
Article
Novel Approach Using Multi-Source Features and Attention Mechanism for Crude Oil Futures Price Prediction
by Xin-Ying Liu, Ming-Ge Yang, Xiao-Zhen Liang and Juan Zhang
Computers 2026, 15(2), 88; https://doi.org/10.3390/computers15020088 (registering DOI) - 1 Feb 2026
Abstract
As an emerging trading market, the crude oil futures market has exhibited substantial uncertainty since its inception. Influenced by macroeconomic and geopolitical factors, its price movements are highly nonlinear and nonstationary, making accurate forecasting challenging. Therefore, it is vital to develop a powerful forecasting model for crude oil futures prices. However, conventional forecasting models rely solely on historical data and fail to capture the intrinsic patterns of complex sequences. This work presents a hybrid deep learning framework that incorporates multi-source features and a state-of-the-art attention mechanism. Specifically, search engine data were collected and integrated into the explanatory variables. By using lagged historical prices and search engine data to forecast future crude oil futures closing prices, the proposed framework effectively avoids lookahead bias. To reduce forecasting difficulty, the initial time series were then decomposed and reconstructed into several sub-sequences. Thereafter, traditional time series models (ARIMA) and attention-enhanced deep learning models were selected to forecast the reconstructed sub-sequences based on their distinct data features. The empirical study conducted on the INE crude oil futures price proves that the proposed model outperforms other benchmark models. The findings help fill the gap in the quantitative literature on crude oil futures price forecasting and offer valuable theoretical insights for affiliated policymakers, enterprises, and investors. Full article
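The abstract's claim of avoiding lookahead bias rests on using only lagged inputs. A minimal sketch of that feature construction follows; the synthetic price series, the placeholder search-engine index, and the lag depth are assumptions standing in for the paper's actual data and decomposition pipeline.

```python
# Sketch of lag-only feature construction: every explanatory value used to
# predict the close at time t is observed at t-1 or earlier, which is what
# prevents lookahead bias in the forecasting setup described in the abstract.
import numpy as np

def build_lagged_dataset(prices, search_index, n_lags=5):
    """Return (X, y) where X holds lagged prices and lagged search volumes."""
    prices = np.asarray(prices, dtype=float)
    search_index = np.asarray(search_index, dtype=float)
    X, y = [], []
    for t in range(n_lags, len(prices)):
        past_prices = prices[t - n_lags:t]           # strictly before t
        past_search = search_index[t - n_lags:t]
        X.append(np.concatenate([past_prices, past_search]))
        y.append(prices[t])                          # target is the close at t
    return np.array(X), np.array(y)

# toy usage with synthetic series
rng = np.random.default_rng(0)
p = np.cumsum(rng.normal(0, 1, 200)) + 500
s = rng.normal(50, 10, 200)
X, y = build_lagged_dataset(p, s)
print(X.shape, y.shape)   # (195, 10) (195,)
```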
17 pages, 1356 KB  
Article
Application of Homomorphic Encryption for a Secure-by-Design Approach to Protect the Confidentiality of Data in Proficiency Testing and Interlaboratory Comparisons
by Davor Vinko, Mirko Köhler, Kruno Miličević and Ivica Lukić
Telecom 2026, 7(1), 14; https://doi.org/10.3390/telecom7010014 (registering DOI) - 1 Feb 2026
Abstract
Accredited laboratories participating in Proficiency Testing (PT) and Interlaboratory Comparison (ILC) typically submit measurement results (and associated uncertainties) to an organizer for performance evaluation using statistics such as the z-score and the En value. This requirement can undermine confidentiality when the disclosed plaintext values reveal commercially sensitive methods or client-related information. This paper proposes a secure-by-design PT/ILC workflow based on fully homomorphic encryption (FHE), enabling the required scoring computations to be executed directly on ciphertexts. Using the CKKS scheme (Microsoft SEAL), the organizer distributes encrypted assigned values and a public/evaluation key set; each participant locally encrypts pre-processed measurement data, evaluates encrypted z-score and En value, and returns only encrypted performance metrics. The organizer decrypts the metrics without receiving the ciphertexts of participants’ raw measurement values. We quantify feasibility via execution time, run-to-run variability across fresh key generations (coefficient of variation), and relative calculation error versus plaintext scoring. On commodity hardware, end-to-end score computation takes 1 to 8 s, the coefficient of variation can be reduced below 1e−10, and the relative error remains below 1e−6, indicating practical deployability and numerical stability for PT/ILC decision-making. Given that PT/ILC reporting cycles are typically on the order of days to weeks, a per-participant computation time of seconds is operationally negligible, while the observed coefficient of variation and relative error indicate that the CKKS approximation and key-dependent variability are far below typical decision thresholds used for pass/fail classification. Full article
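For reference, the two performance statistics that this work evaluates homomorphically are, in plaintext form, the z-score and the En value. A minimal plaintext sketch follows; the CKKS/Microsoft SEAL ciphertext evaluation itself is not reproduced, and the example numbers are hypothetical.

```python
# Plaintext reference for the PT/ILC scores named in the abstract, following the
# usual ISO 13528-style definitions.
import math

def z_score(x_lab, assigned_value, sigma_pt):
    """z = (x - x_pt) / sigma_pt; |z| <= 2 is commonly read as satisfactory."""
    return (x_lab - assigned_value) / sigma_pt

def en_value(x_lab, x_ref, U_lab, U_ref):
    """En = (x_lab - x_ref) / sqrt(U_lab^2 + U_ref^2), with expanded uncertainties U."""
    return (x_lab - x_ref) / math.sqrt(U_lab**2 + U_ref**2)

print(z_score(10.12, 10.00, 0.08))         # ~1.50
print(en_value(10.12, 10.00, 0.10, 0.06))  # ~1.03
```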
26 pages, 1183 KB  
Review
From Production to Application: Postbiotics in Meat, Meat Products, Other Food Matrices, and Bioactive Packaging
by Miłosz Trymers, Patryk Wiśniewski, Katarzyna Tkacz and Arkadiusz Zakrzewski
Foods 2026, 15(3), 501; https://doi.org/10.3390/foods15030501 (registering DOI) - 1 Feb 2026
Abstract
Postbiotics represent a promising strategy for reconciling increasing consumer demand for clean-label foods with the need to maintain high microbiological safety standards. The present review analyzed the applications of postbiotics in meat products, other food matrices, and bioactive packaging, with particular emphasis on their production methods, compositional analysis, and antimicrobial properties. Available evidence indicates that postbiotics offer important technological advantages over live probiotics, including enhanced stability during processing and storage and the absence of viable cells, which facilitates their integration into established food quality and safety control systems. The reviewed studies show that postbiotics produced mainly via fermentation with selected lactic acid bacteria and subsequently stabilized, most often by freeze-drying, exhibit pronounced antimicrobial activity in diverse food matrices, particularly meat and dairy products. Their ability to inhibit the growth of major foodborne pathogens, such as Listeria monocytogenes, Staphylococcus aureus, Escherichia coli, and Salmonella spp., highlights their potential as effective biopreservatives contributing to shelf-life extension and improved microbiological safety. From an industrial perspective, postbiotics can be implemented within the framework of hurdle technology and incorporated into active packaging systems and edible coatings. The wider use of postbiotics in industry remains limited by regulatory uncertainty and methodological diversity. Key challenges include inconsistent taxonomic/strain reporting, divergent methods of inactivation and final processing (which alter bioactive profiles), lack of standardized composition and potency testing, and limited food matrix validation and toxicological data. To close these gaps, regulatory definitions and labelling should be harmonized, and guidelines for production and reporting (strain identity, inactivation parameters, preservation method) as well as targeted safety and shelf-life testing are recommended. These steps are necessary to translate the documented antibacterial and antioxidant properties of postbiotics into industrial applications. Full article
(This article belongs to the Special Issue Feature Review on Food Analytical Methods)
61 pages, 1035 KB  
Article
Sustainable Cross-Cultural Service Management: Cultural Intelligence as a Mediating Mechanism Between Cultural Values and Influence Tactics in International Civil Aviation
by Ercan Ergün, Tunay Sever Elüstün and Yavuz Selim Balcıoğlu
Sustainability 2026, 18(3), 1443; https://doi.org/10.3390/su18031443 (registering DOI) - 1 Feb 2026
Abstract
Sustainable service excellence in globalized industries requires organizations to develop workforce capabilities that support long-term relationship-building, cultural respect, and effective cross-cultural communication. This study examines how cultural intelligence functions as a mechanism for sustainable cross-cultural workforce development by investigating relationships among individual cultural values, cultural intelligence dimensions, and influence tactics among airline cabin crew members. Integrating Hofstede’s cultural dimensions framework, Ang and colleagues’ cultural intelligence model, and Yukl’s influence tactics taxonomy, we test a comprehensive mediation model using survey data from six hundred and sixty-three cabin crew members employed by international airlines operating in Turkey. The findings reveal that collectivism, long-term orientation, and uncertainty avoidance positively predict cultural intelligence development, creating foundations for sustainable cross-cultural competence. Cultural intelligence dimensions demonstrate differentiated effects on influence tactics, with metacognitive and behavioral cultural intelligence enhancing rational persuasion, behavioral cultural intelligence exclusively predicting relational tactics, and complex competitive mediation patterns for coercive tactics wherein motivational cultural intelligence reduces pressure-based influence while cognitive and behavioral dimensions increase strategic assertiveness. Cultural values directly influence tactics beyond cultural intelligence effects, with uncertainty avoidance most strongly predicting both rational and relational approaches that support relationship sustainability, while masculinity and power distance drive coercive tactics that may undermine long-term service relationships. These findings demonstrate that cultural intelligence functions as a multidimensional mediating mechanism with sometimes opposing effects, challenging assumptions that cross-cultural competencies uniformly produce sustainable outcomes. The research contributes to sustainable human resource management theory by illuminating how cultural socialization influences behavioral outcomes through complex psychological pathways, while offering practical guidance for aviation industry recruitment, training, and performance management systems seeking to build sustainable cross-cultural service capabilities. By revealing that certain cultural intelligence dimensions can enable both relationship-building and strategic coercion, the study highlights the importance of coupling cross-cultural skill development with ethical frameworks and motivational engagement to ensure that enhanced cultural capabilities support rather than undermine sustainable, respectful cross-cultural service relationships. Full article
23 pages, 2667 KB  
Review
Physics-Informed Decision Framework for Reuse of Reclaimed Steel Members Under Uncertainty
by Sina Sarfarazi, Marcello Fulgione and Francesco Fabbrocino
Metals 2026, 16(2), 171; https://doi.org/10.3390/met16020171 (registering DOI) - 1 Feb 2026
Abstract
Structural steel reuse can deliver large embodied-carbon savings, yet it is still not widely adopted because approval depends on the quality of the evidence, how uncertainty is handled, and whether the design requirements are followed, not just on resistance. Reclaimed members frequently lack dependable documentation regarding material grade, loading history, boundary conditions, connection status, and degradation. For reuse decisions, conservative default assumptions protect safety but frequently eliminate qualified reuse options. This research examines data-driven and physics-informed computational methods from a decision-making standpoint, contending that their significance resides in facilitating an auditable approval process, not in supplanting deterministic verification. We differentiate feasibility, acceptability, and approval as distinct engineering phases. Data-driven models are treated as tools for quickly screening candidates, surrogate evaluation, inverse reasoning, and stock-to-demand matching; their goal is to shorten the list of candidates and prioritize evidence collection. Physics-informed approaches are examined as admissibility filters that impose restrictions of equilibrium, compatibility, stability, and plausible boundary-condition envelopes, thereby minimizing mechanically invalid predictions under partial information. We further consider uncertainty quantification and explainability to be essential for reuse decisions and suggest practical outputs for approval packages, such as resistance bounds within specified assumption envelopes, sensitivity rankings of decision-critical unknowns, low-support flags, and evidence actions for conditional acceptance. The review is organized as a process from audit to approval and identifies open issues in reuse-specific datasets, standardized evidence capture, decision-relevant validation under degradation, and regulatory acceptance. The resulting framework clarifies how advanced computational tools can enable adaptable, conservative, and transparent steel reuse in practice. Full article
(This article belongs to the Special Issue Novel Insights and Advances in Steels and Cast Irons (2nd Edition))
35 pages, 928 KB  
Article
Cyber Risk Management of API-Enabled Financial Crime in Open Banking Services
by Odion Gift Ojehomon, Joanna Cichorska and Jerzy Michnik
Entropy 2026, 28(2), 163; https://doi.org/10.3390/e28020163 (registering DOI) - 31 Jan 2026
Abstract
Open banking reshapes the financial sector by enabling regulated third-party providers to access bank data through APIs, fostering innovation but amplifying operational and financial-crime risks due to increased ecosystem interdependence. To address these challenges, this study proposes an integrated risk-management framework combining System Dynamics, Agent-Based Modelling, and Monte Carlo simulation. This hybrid approach captures feedback effects, heterogeneous agent behaviour, and loss uncertainty within a simulated PSD2-style environment. Simulation experiments, particularly those modelling credential-stuffing waves, demonstrate that stricter onboarding thresholds, tighter API rate limits, and enhanced anomaly detection reduce operational tail losses by approximately 20–30% relative to baseline scenarios. Beyond these specific findings, the proposed framework exhibits significant universality; its modular design facilitates adaptation to broader contexts, including cross-border regulatory variations or emerging BigTech interactions. Ultimately, this multi-method approach translates complex open-banking dynamics into actionable risk metrics, providing a robust basis for targeted resource allocation and supervisory stress testing in evolving financial ecosystems. Full article
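A toy version of the Monte Carlo layer described in the abstract is sketched below: Poisson incident counts with lognormal severities, comparing a baseline and a mitigated scenario at the 99th-percentile annual loss. All parameter values are hypothetical placeholders and are not calibrated to the paper's PSD2-style System Dynamics / agent-based simulation.

```python
# Frequency-severity Monte Carlo sketch of operational tail losses, assuming
# hypothetical Poisson frequencies and lognormal severities for credential-
# stuffing incidents; the mitigated scenario stands in for tighter onboarding,
# API rate limits, and anomaly detection.
import numpy as np

def simulate_annual_loss(lam, mu, sigma, n_years=20_000, seed=42):
    """Aggregate annual loss: Poisson incident count, lognormal severities."""
    rng = np.random.default_rng(seed)
    counts = rng.poisson(lam, n_years)
    return np.array([rng.lognormal(mu, sigma, c).sum() for c in counts])

baseline  = simulate_annual_loss(lam=12.0, mu=11.0, sigma=1.2)
mitigated = simulate_annual_loss(lam=8.0,  mu=10.8, sigma=1.1)   # hypothetical controls
for name, losses in (("baseline", baseline), ("mitigated", mitigated)):
    print(f"{name:>9}: mean {losses.mean():,.0f}   99th-pct tail {np.percentile(losses, 99):,.0f}")
```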
11 pages, 253 KB  
Article
Determinants of Severe Financial Distress in U.S. Acute Care Hospitals: A National Longitudinal Study
by James R. Langabeer, Francine R. Vega, Audrey Sarah Cohen, Tiffany Champagne-Langabeer, Andrea J. Yatsco and Karima Lalani
Healthcare 2026, 14(3), 366; https://doi.org/10.3390/healthcare14030366 (registering DOI) - 31 Jan 2026
Abstract
Background: Financial sustainability remains a central challenge for U.S. hospitals as rising operating costs, shifting federal reimbursement, and policy uncertainty intensify economic pressures. This study estimates the prevalence and recent changes in financial distress among U.S. short-term acute care hospitals. Methods: We conducted a national longitudinal analysis of all U.S. short-term acute care hospitals from 2021 to 2023 using financial and operational data from Medicare cost reports linked with community-level data from the American Community Survey. Financial distress was measured using the Altman Z-score, with severe distress defined as Z ≤ 1.8. Logistic regression models were used to identify organizational, operational, and market characteristics associated with distress. Results: The proportion of hospitals classified as severely financially distressed increased from 18.6% in 2021 to 22.0% in 2023. Operating margins and returns on assets declined significantly over the study period, while mean Z-scores showed a modest but non-significant downward trend. In adjusted models, urban hospitals had higher odds of distress (OR 1.27, 95% CI 1.15–1.40, p < 0.001), as did hospitals with longer average lengths of stay (OR 1.07 per day, 95% CI 1.04–1.09, p < 0.001) and higher debt-to-equity ratios (OR 1.05 per unit, 95% CI 1.05–1.06, p < 0.001). Higher occupancy rates were protective (OR 0.31, 95% CI 0.25–0.40, p < 0.001). Larger market population was also associated with increased distress risk (OR 1.61, 95% CI 1.21–2.14, p = 0.001), while other market characteristics were not significant. Conclusions: Financial distress remains widespread and appears to be increasing among U.S. acute care hospitals. Operational efficiency, capital structure, and local market scale are key drivers of financial vulnerability, highlighting the need for targeted strategies to strengthen hospital resilience and preserve access to essential acute care services. Full article
(This article belongs to the Section Healthcare Organizations, Systems, and Providers)
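The outcome variable in this study is a severe-distress flag at Z ≤ 1.8. The sketch below computes a Z-score with the classic Altman (1968) coefficients, which may differ from any hospital-adapted variant the authors applied to Medicare cost-report data; the input figures are hypothetical.

```python
# Sketch of the distress flag used as the outcome variable, assuming the classic
# Altman Z-score coefficients (working capital, retained earnings, EBIT, equity,
# and sales, each scaled by assets or liabilities).
def altman_z(working_capital, retained_earnings, ebit,
             equity_value, sales, total_assets, total_liabilities):
    return (1.2 * working_capital / total_assets
            + 1.4 * retained_earnings / total_assets
            + 3.3 * ebit / total_assets
            + 0.6 * equity_value / total_liabilities
            + 1.0 * sales / total_assets)

def severely_distressed(z, threshold=1.8):
    """Severe distress as defined in the abstract: Z <= 1.8."""
    return z <= threshold

z = altman_z(5e6, 12e6, 3e6, 40e6, 90e6, 100e6, 60e6)  # hypothetical hospital figures
print(round(z, 2), severely_distressed(z))              # ~1.63, True
```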
41 pages, 3116 KB  
Review
An In-Depth Review on Sensing, Heat-Transfer Dynamics, and Predictive Modeling for Aircraft Wheel and Brake Systems
by Lusitha S. Ramachandra, Ian K. Jennions and Nicolas P. Avdelidis
Sensors 2026, 26(3), 921; https://doi.org/10.3390/s26030921 (registering DOI) - 31 Jan 2026
Abstract
Accurate prediction of aircraft wheel and brake (W&B) temperatures is increasingly important for ensuring landing gear safety, supporting turnaround decision-making, and allowing for more effective condition monitoring. Although the thermal behavior of brake assemblies has been studied through component-level testing, analytical formulations, and numerical simulation, the current understanding remains fragmented and of limited operational relevance. This paper discusses research across landing gear sensing, thermal modeling, and data-driven prediction to evaluate the state of knowledge supporting a non-intrusive, temperature-centric monitoring framework. Methods surveyed include optical, electromagnetic, acoustic, and infrared sensing techniques as well as traditional machine-learning methods, sequence-based models, and emerging hybrid physics–data approaches. The review synthesizes findings on conduction, convection, and radiation pathways; phase-dependent cooling behavior during landing roll, taxi, and wheel-well retraction; and the capabilities and limitations of existing numerical and empirical models. This study highlights four core gaps: the scarcity of real-flight thermal datasets, insufficient multi-physics integration, limited use of infrared thermography for spatial temperature mapping, and the absence of advanced predictive models for transient brake temperature evolution. Opportunities arise from emissivity-aware infrared thermography, multi-modal dataset development, and machine learning models capable of capturing transient thermal dynamics, while notable challenges relate to measurement uncertainty, environmental sensitivity, model generalization, and deployment constraints. Overall, this review establishes a coherent foundation for a thermography-enabled temperature prediction framework for aircraft wheels and brakes. Full article
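As a minimal illustration of the phase-dependent cooling behaviour the review discusses, the sketch below runs a lumped-capacitance (Newton cooling) model with a different convective coefficient per flight phase. The thermal mass, coefficients, and phase durations are illustrative assumptions, not values from the surveyed studies.

```python
# Lumped-capacitance sketch: dT/dt = -hA * (T - T_amb) / (m*cp), integrated
# phase by phase (landing roll -> taxi -> wheel well) with placeholder values.
def cool_brake(T0, phases, m_cp=250e3, dt=1.0):
    """phases: list of (duration_s, hA_W_per_K, T_ambient_C) tuples; returns final temp."""
    T = T0
    for duration, hA, T_amb in phases:
        for _ in range(int(duration / dt)):
            T += -hA * (T - T_amb) / m_cp * dt
    return T

final_T = cool_brake(
    T0=420.0,                       # brake temperature at the end of braking, deg C
    phases=[(120, 900.0, 30.0),     # landing roll / fast taxi: strong airflow
            (600, 400.0, 30.0),     # slow taxi
            (1800, 150.0, 15.0)],   # gear retracted into the wheel well
)
print(f"brake temperature after ~42 min: {final_T:.0f} C")
```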
16 pages, 491 KB  
Article
Insuring Algorithmic Operations: Liability Risk, Pricing, and Risk Control
by Zhiyong (John) Liu, Jin Park, Mengying Wang and He Wen
Risks 2026, 14(2), 26; https://doi.org/10.3390/risks14020026 (registering DOI) - 31 Jan 2026
Abstract
Businesses increasingly rely on algorithmic systems and machine learning models to make operational decisions about customers, employees, and counterparties. These “algorithmic operations” can improve efficiency but also concentrate liability in a small number of technically complex, drifting models. Algorithmic operations liability (AOL) risk arises when these systems generate legally cognizable harm. We develop a simple taxonomy of AOL risk sources: model error and bias, data quality failures, distribution shift and concept drift, miscalibration, machine learning operations (MLOps) and integration failures, governance gaps, and ecosystem-level externalities. Building on this taxonomy, we outline a simple analysis of AOL risk pricing using some basic actuarial building blocks: (i) a confusion-matrix-based expected-loss model for false positives and false negatives; (ii) drift-adjusted error rates and stress scenarios; and (iii) credibility-weighted rates when insureds have limited experience data. We then introduce capital and loss surcharges that incorporate distributional uncertainty and tail risk. Finally, we link the framework to AOL risk controls by identifying governance, documentation, model-monitoring, and MLOps practices that both reduce loss frequency and severity and serve as underwriting prerequisites. Full article
(This article belongs to the Special Issue AI for Financial Risk Perception)
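A compact sketch of the actuarial building blocks named in the abstract, namely a confusion-matrix expected loss, a drift stress multiplier, and a credibility-weighted rate, is given below. The function names, the Bühlmann-style credibility constant, and all numeric inputs are hypothetical.

```python
# Sketch of AOL pricing building blocks: E[L] from false-positive/false-negative
# rates, a drift-stressed variant, and a credibility-weighted rate for insureds
# with limited experience data. All parameters are placeholders.
def expected_annual_loss(n_decisions, base_rate, fpr, fnr, cost_fp, cost_fn,
                         drift_multiplier=1.0):
    """E[L] = N * [ (1-p)*FPR*c_FP + p*FNR*c_FN ], with drift-stressed error rates."""
    fpr_d, fnr_d = fpr * drift_multiplier, fnr * drift_multiplier
    return n_decisions * ((1 - base_rate) * fpr_d * cost_fp
                          + base_rate * fnr_d * cost_fn)

def credibility_weighted_rate(own_rate, own_exposure, book_rate, k=5_000):
    """Buhlmann-style weight Z = n / (n + k): lean on book data when experience is thin."""
    z = own_exposure / (own_exposure + k)
    return z * own_rate + (1 - z) * book_rate

loss_base  = expected_annual_loss(1_000_000, 0.02, 0.01, 0.15, 40.0, 2_500.0)
loss_drift = expected_annual_loss(1_000_000, 0.02, 0.01, 0.15, 40.0, 2_500.0,
                                  drift_multiplier=1.5)
print(f"baseline E[L]: {loss_base:,.0f}   drift-stressed: {loss_drift:,.0f}")
print(f"credibility-weighted rate: {credibility_weighted_rate(0.012, 800, 0.02):.4f}")
```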
19 pages, 657 KB  
Article
Entropy-Based Patent Valuation: Decoding “Costly Signals” in the Food Industry via a Robust Entropy–TOPSIS Framework
by Xiaoman Li, Wei Liu, Xiaohe Liang and Ailian Zhou
Entropy 2026, 28(2), 159; https://doi.org/10.3390/e28020159 (registering DOI) - 31 Jan 2026
Abstract
Accurate patent valuation remains a persistent challenge in intellectual property management, particularly in the food industry, where technological homogeneity and rapid innovation cycles introduce substantial noise into observable performance indicators. Traditional valuation approaches, whether based on subjective expert judgment or citation-based metrics, often struggle to effectively reduce information uncertainty in this context. To address this limitation, this study proposes an objective, data-driven patent valuation framework grounded in information theory. We construct a multidimensional evaluation system comprising nine indicators across technological, legal, and economic dimensions and apply it to a large-scale dataset of 100,648 invention patents. To address the heavy-tailed nature of patent indicators without sacrificing the information contained in high-impact outliers, we introduce a square-root transformation strategy that stabilizes dispersion while preserving ordinal relationships. Indicator weights are determined objectively via Shannon entropy, capturing the relative scarcity and discriminatory information content of each signal, after which comprehensive value scores are derived using the TOPSIS method. Empirical results reveal that the entropy-based model assigns dominant weights to so-called “costly signals”, specifically PCT applications (29.53%) and patent transfers (24.36%). Statistical correlation analysis confirms that these selected indicators are significantly associated with patent value (p < 0.001), while bootstrapping tests demonstrate the robustness of the resulting weight structure. The model’s validity is further evaluated using an external benchmark (“ground truth”) dataset comprising 55 patents recognized by the China Patent Award. The proposed framework demonstrates substantially stronger discriminatory capability than baseline methods: awarded patents achieve an average score 2.64 times higher than that of ordinary patents, and the enrichment factor for award-winning patents within the Top-100 ranking reaches 91.5. Additional robustness analyses, including benchmarking against the Weighted Sum Model (WSM), further confirm the methodological stability of the framework, with sensitivity analysis revealing an exceptional enrichment factor of 183.1 for the Top-50 patents. These findings confirm that the Entropy–TOPSIS framework functions as an effective information-filtering mechanism, amplifying high-value patent signals in noise-intensive environments. Consequently, the proposed model serves as a generalizable and theoretically grounded tool for objective patent valuation, with particular relevance to industries characterized by heavy-tailed data and high information uncertainty. Full article
(This article belongs to the Section Multidisciplinary Applications)
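The valuation pipeline described above (square-root transform, Shannon-entropy weights, TOPSIS closeness) can be sketched compactly in numpy. The toy decision matrix stands in for the paper's nine indicators and 100,648 patents, and all indicators are treated as benefit criteria for simplicity.

```python
# Compact entropy-TOPSIS sketch: tame heavy tails with a square-root transform,
# weight indicators by (1 - normalized Shannon entropy), then rank alternatives
# by their TOPSIS closeness coefficient.
import numpy as np

def entropy_topsis(X):
    X = np.sqrt(np.asarray(X, dtype=float))                  # stabilize dispersion, keep order
    P = X / X.sum(axis=0)                                    # column-wise proportions
    n = X.shape[0]
    with np.errstate(divide="ignore", invalid="ignore"):
        e = -np.nansum(P * np.log(P), axis=0) / np.log(n)    # normalized entropy per indicator
    w = (1 - e) / (1 - e).sum()                              # scarcer signal -> larger weight
    V = w * (X / np.linalg.norm(X, axis=0))                  # weighted, vector-normalized matrix
    ideal, anti = V.max(axis=0), V.min(axis=0)               # all indicators as benefit criteria
    d_plus = np.linalg.norm(V - ideal, axis=1)
    d_minus = np.linalg.norm(V - anti, axis=1)
    return d_minus / (d_plus + d_minus)                      # closeness coefficient in [0, 1]

# five toy patents x three toy indicators (e.g. citations, PCT filings, transfers)
scores = entropy_topsis([[12, 1, 0], [3, 0, 0], [45, 2, 1], [7, 0, 1], [2, 0, 0]])
print(np.round(scores, 3))
```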
21 pages, 1294 KB  
Systematic Review
Characteristics of Digital Health Interventions Associated with Improved Glycemic Control in T2DM: A Systematic Review and Meta-Analysis
by Oscar Eduardo Rodríguez-Montes, María del Carmen Gogeascoechea-Trejo and Clara Bermúdez-Tamayo
J. Clin. Med. 2026, 15(3), 1123; https://doi.org/10.3390/jcm15031123 (registering DOI) - 31 Jan 2026
Abstract
Background/Objective: Type 2 Diabetes Mellitus (T2DM) represents a major increasing burden for primary care systems worldwide. Digital health interventions (DHIs) have been proposed as scalable tools to improve glycemic control, yet uncertainty remains regarding which intervention characteristics yield the greatest benefit. To evaluate the effectiveness of DHIs on HbA1c levels in adults with T2DM and to examine whether intervention duration, engagement intensity, glucometer integration, and healthcare provider involvement modify glycemic outcomes. Data Sources: PubMed, Embase, Cochrane Library, and JMIR databases were systematically searched for relevant studies published between January 2020 and May 2025. Study Eligibility Criteria: Randomized controlled trials comparing DHIs plus usual care versus usual care alone in adults with T2DM and reporting HbA1c as the primary outcome. Methods: Data were extracted using the Jadad scale and TIDieR framework. Random-effects meta-analysis estimated pooled mean differences (MD) in HbA1c with 95% CIs. Subgroup analyses examined effects by intervention characteristics. Heterogeneity and sources of variance were assessed through Cochran’s Q, I2, meta-regression, and sensitivity analyses (leave-one-out and trim-and-fill). Results: Thirteen RCTs (n ≈ 20,000) met inclusion criteria. DHIs achieved significant HbA1c reductions (range 0.01% to 1.57%; pooled MD −1.08%; 95% CI −1.18 to −0.99; p = 0.001). Short-term (≤6 months), low-intensity interventions showed the largest effect sizes (MD −1.16%, 95% CI 0.94 to 1.39). Glucometer integration and healthcare provider involvement contributed minimally to additional benefit. Meta-regression confirmed substantial heterogeneity, but no single factor explained variance across studies. Limitations: Considerable heterogeneity across interventions and variability in engagement measurement may limit the generalizability of findings. Conclusions: Short-term, low-intensity DHIs significantly improve glycemic control in primary care populations with T2DM. Advanced meta-analytic techniques confirm the robustness of these effects, providing practical guidance for selecting and implementing effective digital interventions in routine diabetes care. Full article
(This article belongs to the Section Endocrinology & Metabolism)
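For readers who want to see the pooling step concretely, the sketch below applies a DerSimonian–Laird random-effects model to made-up HbA1c mean differences. It reproduces the type of output reported (pooled MD, 95% CI, I²) but none of the review's actual trial data; the three study entries are purely illustrative.

```python
# DerSimonian-Laird random-effects pooling of mean differences, with Cochran's Q
# and I^2 as heterogeneity summaries. Inputs are hypothetical, not the 13 RCTs.
import numpy as np

def dersimonian_laird(md, se):
    md, se = np.asarray(md, float), np.asarray(se, float)
    w = 1.0 / se**2                                            # fixed-effect weights
    q = np.sum(w * (md - np.sum(w * md) / w.sum())**2)         # Cochran's Q
    df = len(md) - 1
    c = w.sum() - np.sum(w**2) / w.sum()
    tau2 = max(0.0, (q - df) / c)                              # between-study variance
    w_re = 1.0 / (se**2 + tau2)                                # random-effects weights
    pooled = np.sum(w_re * md) / w_re.sum()
    se_pooled = np.sqrt(1.0 / w_re.sum())
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled), i2

pooled, ci, i2 = dersimonian_laird(md=[-1.2, -0.8, -1.4], se=[0.15, 0.20, 0.25])
print(f"pooled MD {pooled:.2f}%  95% CI ({ci[0]:.2f}, {ci[1]:.2f})  I2 {i2:.0f}%")
```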
23 pages, 387 KB  
Article
Least Absolute Deviation Estimation for Uncertain Regression Model via Uncertainty Distribution and Its Application in Sport Statistics
by Yichen Dong
Symmetry 2026, 18(2), 260; https://doi.org/10.3390/sym18020260 - 30 Jan 2026
Abstract
Uncertain regression analysis is a powerful tool for analyzing and interpreting the complex relationships between explanatory and response variables under uncertain environments, and a crucial step in analyzing datasets containing complex uncertainties is statistical inference based on uncertain parameter estimation methods. However, the existing parameter estimation studies of uncertain regression models all fail to effectively avoid the negative impact of outliers on the estimation results. To solve the above problem and further enrich the parameter estimation research, this paper constructs a symmetric statistical invariant for the uncertain regression model based on observed data and uncertain disturbance terms. Based on this statistical invariant, the least absolute deviation criterion is applied to propose a least absolute deviation estimation for the uncertain regression model. Finally, two numerical examples are provided to illustrate the advantages of the proposed method compared to existing methods, and the comparative results show that in certain scenarios, the least absolute deviation estimation method exhibits superior performance compared to other existing methods in terms of mean squared error, mean absolute error, and mean absolute percentage error. Furthermore, as a byproduct of this paper, the proposed method is applied to sports statistics, and two empirical cases are also provided to demonstrate the effectiveness of this application. Full article
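The least absolute deviation criterion at the heart of this method can be illustrated on crisp data: the sketch below fits an intercept and slope by minimizing the sum of absolute residuals and compares the result with least squares when one gross outlier is present. The uncertainty-distribution machinery of uncertain regression is not reproduced here; the data and starting values are hypothetical.

```python
# Ordinary LAD fit as an illustration of the outlier resistance the paper exploits:
# coefficients minimize sum(|y - (b0 + b1*x)|) rather than the squared residuals.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
x = np.linspace(0, 10, 40)
y = 2.0 + 0.8 * x + rng.normal(0, 0.3, x.size)
y[5] += 15.0                                    # a single gross outlier

def lad_loss(beta):
    return np.abs(y - (beta[0] + beta[1] * x)).sum()

lad = minimize(lad_loss, x0=[0.0, 0.0], method="Nelder-Mead").x
ols = np.polyfit(x, y, 1)[::-1]                 # least squares for comparison
print(f"LAD:  intercept {lad[0]:.2f}, slope {lad[1]:.2f}")
print(f"OLS:  intercept {ols[0]:.2f}, slope {ols[1]:.2f}  (pulled by the outlier)")
```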