Search Results (320)

Search Parameters:
Keywords = calibrated confidence

13 pages, 793 KiB  
Communication
Gamma-Ray Bursts Calibrated by Using Artificial Neural Networks from the Pantheon+ Sample
by Zhen Huang, Xin Luo, Bin Zhang, Jianchao Feng, Puxun Wu, Yu Liu and Nan Liang
Universe 2025, 11(8), 241; https://doi.org/10.3390/universe11080241 - 23 Jul 2025
Viewed by 137
Abstract
In this paper, we calibrate the luminosity relation of gamma-ray bursts (GRBs) by employing artificial neural networks (ANNs) to analyze the Pantheon+ sample of type Ia supernovae (SNe Ia) in a manner independent of cosmological assumptions. The A219 GRB dataset is used to calibrate the Amati relation (Ep–Eiso) at low redshift with the ANN framework, facilitating the construction of the Hubble diagram at higher redshifts. Cosmological models are constrained with GRBs at high redshift and the latest observational Hubble data (OHD) via the Markov chain Monte Carlo numerical approach. For the Chevallier–Polarski–Linder (CPL) model within a flat universe, we obtain Ωm = 0.321 (+0.078/−0.069), h = 0.654 (+0.053/−0.071), w0 = −1.02 (+0.67/−0.50), and wa = −0.98 (+0.58/−0.58) at the 1σ confidence level, which indicates a preference for dark energy with potential redshift evolution (wa ≠ 0). These findings using ANNs align closely with those derived from GRBs calibrated using Gaussian processes (GPs). Full article
17 pages, 1316 KiB  
Article
A Low-Cost IoT-Based Bidirectional Torque Measurement System with Strain Gauge Technology
by Cosmin Constantin Suciu, Virgil Stoica, Mariana Ilie, Ioana Ionel and Raul Ionel
Appl. Sci. 2025, 15(15), 8158; https://doi.org/10.3390/app15158158 - 22 Jul 2025
Viewed by 333
Abstract
The scope of this paper is the development of a cost-effective wireless torque measurement system for vehicle drivetrain shafts. The prototype integrates strain gauges, an HX711 conditioner, a Wemos D1 Mini ESP8266, and a rechargeable battery directly on the rotating shaft, forming a self-contained sensor node. Calibration against a certified dynamometric wrench confirmed an operating span of ±5–50 N·m. Within this range, the device achieved a mean absolute error of 0.559 N·m. It also maintained precision better than ±2.5 N·m at 95% confidence, while real-time data were transmitted via Wi-Fi. The total component cost is below EUR 30 at current prices. This proof-of-concept implementation demonstrates that reliable, IoT-enabled torque sensing can be realized with low-cost, readily available parts. The paper details assembly, calibration, and deployment procedures, providing a transparent pathway for replication. By aligning with Industry 4.0 requirements for smart, connected equipment, the proposed torque measurement system offers an affordable solution for process monitoring and predictive maintenance in automotive and industrial settings. Full article
(This article belongs to the Section Electrical, Electronics and Communications Engineering)
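The calibration step described in the abstract above (raw strain-gauge readings checked against a certified reference wrench) reduces to a linear least-squares fit. A minimal sketch; all counts and torque values below are hypothetical, not taken from the paper:

```python
# Minimal sketch: map raw HX711 ADC counts to torque (N·m) with an
# ordinary least-squares fit against reference readings from a
# dynamometric wrench. All numbers are illustrative, not from the paper.

def fit_linear(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    b = my - a * mx
    return a, b

# hypothetical calibration points: (raw counts, reference torque in N·m)
raw    = [1000, 2000, 3000, 4000, 5000]
torque = [5.0, 16.2, 27.4, 38.6, 49.8]

a, b = fit_linear(raw, torque)

def counts_to_torque(counts):
    return a * counts + b

# mean absolute error over the calibration points
mae = sum(abs(counts_to_torque(x) - t) for x, t in zip(raw, torque)) / len(raw)
```

In practice the fit would be repeated in both load directions (the system is bidirectional) and re-checked periodically for drift.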
15 pages, 1991 KiB  
Article
Hybrid Deep–Geometric Approach for Efficient Consistency Assessment of Stereo Images
by Michał Kowalczyk, Piotr Napieralski and Dominik Szajerman
Sensors 2025, 25(14), 4507; https://doi.org/10.3390/s25144507 - 20 Jul 2025
Viewed by 453
Abstract
We present HGC-Net, a hybrid pipeline for assessing geometric consistency between stereo image pairs. Our method integrates classical epipolar geometry with deep learning components to compute an interpretable scalar score A, reflecting the degree of alignment. Unlike traditional techniques, which may overlook subtle miscalibrations, HGC-Net reliably detects both severe and mild geometric distortions, such as sub-degree tilts and pixel-level shifts. We evaluate the method on the Middlebury 2014 stereo dataset, using synthetically distorted variants to simulate misalignments. Experimental results show that our score degrades smoothly with increasing geometric error and achieves high detection rates even at minimal distortion levels, outperforming baseline approaches based on disparity or calibration checks. The method operates in real time (12.5 fps on 1080p input) and does not require access to internal camera parameters, making it suitable for embedded stereo systems and quality monitoring in robotic and AR/VR applications. The approach also supports explainability via confidence maps and anomaly heatmaps, aiding human operators in identifying problematic regions. Full article
(This article belongs to the Special Issue Feature Papers in Physical Sensors 2025)
49 pages, 763 KiB  
Review
A Comprehensive Review on Sensor-Based Electronic Nose for Food Quality and Safety
by Teodora Sanislav, George D. Mois, Sherali Zeadally, Silviu Folea, Tudor C. Radoni and Ebtesam A. Al-Suhaimi
Sensors 2025, 25(14), 4437; https://doi.org/10.3390/s25144437 - 16 Jul 2025
Viewed by 737
Abstract
Food quality and safety are essential for ensuring public health, preventing foodborne illness, reducing food waste, maintaining consumer confidence, and supporting regulatory compliance and international trade. This has led to the emergence of many research works that focus on automating and streamlining the assessment of food quality. Electronic noses have become of paramount importance in this context. We analyze the current state of research in the development of electronic noses for food quality and safety. We examined research papers published in three different scientific databases in the last decade, leading to a comprehensive review of the field. Our review found that most of the efforts use portable, low-cost electronic noses, coupled with pattern recognition algorithms, for evaluating the quality levels in certain well-defined food classes, reaching accuracies exceeding 90% in most cases. Despite these encouraging results, key challenges remain, particularly in diversifying the sensor response across complex substances, improving odor differentiation, compensating for sensor drift, and ensuring real-world reliability. These limitations indicate that a complete device mimicking the flexibility and selectivity of the human olfactory system is not yet available. To address these gaps, our review recommends solutions such as the adoption of adaptive machine learning models to reduce calibration needs and enhance drift resilience and the implementation of standardized protocols for data acquisition and model validation. We introduce benchmark comparisons and a future roadmap for electronic noses that demonstrate their potential to evolve from controlled studies to scalable industrial applications. In doing so, this review aims not only to assess the state of the field but also to support its transition toward more robust, interpretable, and field-ready electronic nose technologies. Full article
(This article belongs to the Special Issue Sensors in 2025)
42 pages, 2145 KiB  
Article
Uncertainty-Aware Predictive Process Monitoring in Healthcare: Explainable Insights into Probability Calibration for Conformal Prediction
by Maxim Majlatow, Fahim Ahmed Shakil, Andreas Emrich and Nijat Mehdiyev
Appl. Sci. 2025, 15(14), 7925; https://doi.org/10.3390/app15147925 - 16 Jul 2025
Viewed by 422
Abstract
In high-stakes decision-making environments, predictive models must deliver not only high accuracy but also reliable uncertainty estimations and transparent explanations. This study explores the integration of probability calibration techniques with Conformal Prediction (CP) within a predictive process monitoring (PPM) framework tailored to healthcare analytics. CP is renowned for its distribution-free prediction regions and formal coverage guarantees under minimal assumptions; however, its practical utility critically depends on well-calibrated probability estimates. We compare a range of post-hoc calibration methods—including parametric approaches like Platt scaling and Beta calibration, as well as non-parametric techniques such as Isotonic Regression and Spline calibration—to assess their impact on aligning raw model outputs with observed outcomes. By incorporating these calibrated probabilities into the CP framework, our multilayer analysis evaluates improvements in prediction region validity, including tighter coverage gaps and reduced minority error contributions. Furthermore, we employ SHAP-based explainability to examine how calibration influences feature attribution for both high-confidence and ambiguous predictions. Experimental results on process-driven healthcare data indicate that the integration of calibration with CP not only enhances the statistical robustness of uncertainty estimates but also improves the interpretability of predictions, thereby supporting safer and more robust clinical decision-making. Full article
(This article belongs to the Special Issue Digital Innovations in Healthcare)
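The pairing of calibrated probabilities with Conformal Prediction described above can be illustrated with plain split-conformal classification: compute nonconformity scores (one minus the probability assigned to the true class) on a held-out calibration set, take a finite-sample-corrected quantile, and keep every label whose score clears that threshold. A minimal sketch; the two-class setup, probabilities, and labels are invented for illustration, not the paper's healthcare data:

```python
import math

# Minimal split-conformal sketch for classification (not the paper's full
# pipeline): nonconformity = 1 - predicted probability of the true class.
# The probability vectors below stand in for a calibrated model's outputs.

alpha = 0.1  # target miscoverage rate

# calibration set: (probability vector over 2 classes, true label)
cal = [
    ([0.9, 0.1], 0), ([0.8, 0.2], 0), ([0.3, 0.7], 1),
    ([0.6, 0.4], 0), ([0.2, 0.8], 1), ([0.7, 0.3], 0),
    ([0.1, 0.9], 1), ([0.85, 0.15], 0), ([0.4, 0.6], 1),
]

scores = sorted(1.0 - p[y] for p, y in cal)

# conformal quantile with the finite-sample correction ceil((n+1)(1-alpha))
n = len(scores)
k = math.ceil((n + 1) * (1 - alpha))
qhat = scores[min(k, n) - 1]

def prediction_set(probs):
    """All labels whose nonconformity does not exceed the threshold."""
    return [c for c, p in enumerate(probs) if 1.0 - p <= qhat]
```

The better calibrated the underlying probabilities, the tighter `qhat` and the smaller the resulting prediction sets, which is the effect the study measures.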
13 pages, 887 KiB  
Article
Substantiation of Prostate Cancer Risk Calculator Based on Physical Activity, Lifestyle Habits, and Underlying Health Conditions: A Longitudinal Nationwide Cohort Study
by Jihwan Park
Appl. Sci. 2025, 15(14), 7845; https://doi.org/10.3390/app15147845 - 14 Jul 2025
Viewed by 224
Abstract
Purpose: Despite increasing rates of prostate cancer among men, prostate cancer risk assessments continue to rely on invasive laboratory tests like prostate-specific antigen and Gleason score tests. This study aimed to develop a noninvasive, data-driven risk model for patients to evaluate themselves before deciding whether to visit a hospital. Materials and Methods: To train the model, data from the National Health Insurance Sharing Service cohort datasets, comprising 347,575 individuals, including 1928 with malignant neoplasms of the prostate, 5 with malignant neoplasms of the penis, 18 with malignant neoplasms of the testis, and 14 with malignant neoplasms of the epididymis, were used. The risk model harnessed easily accessible inputs, such as history of treatment for diseases including stroke, heart disease, and cancer; height; weight; exercise days per week; and duration of smoking. An additional 286,727 public datasets were obtained from the National Health Insurance Sharing Service, which included 434 (0.15%) prostate cancer incidences. Results: The risk calculator was built based on Cox proportional hazards regression, and I validated the model by calibration using predictions and observations. The concordance index was 0.573. Additional calibration of the risk calculator was performed to ensure confidence in accuracy verification. Ultimately, the actual proof showed a sensitivity of 60 (60.5) for identifying a high-risk population. Conclusions: The feasibility of the model to evaluate prostate cancer risk without invasive tests was demonstrated using a public dataset. As a tool for individuals to use before hospital visits, this model could improve public health and reduce social expenses for medical treatment. Full article
15 pages, 17572 KiB  
Article
High-Resolution Mapping and Biomass Estimation of Suaeda salsa in Coastal Wetlands Using UAV Visible-Light Imagery and Hue Angle Inversion
by Lin Wang, Xiang Wang, Xiu Su, Shiyong Wen, Xinxin Wang, Qinghui Meng and Lingling Jiang
Appl. Sci. 2025, 15(13), 7423; https://doi.org/10.3390/app15137423 - 2 Jul 2025
Viewed by 227
Abstract
Unmanned Aerial Vehicles (UAVs) have become powerful tools for high-resolution, quantitative remote sensing in ecological and environmental studies. In this study, we present a novel approach to accurately mapping and estimating the biomass of Suaeda salsa using UAV-based visible-light imagery combined with hue angle inversion modeling. By integrating diffuse reflectance standard plates into the flight protocol, we converted RGB pixel values into reflectance and derived hue angle metrics with enhanced radiometric accuracy. A hue angle threshold of 249.01° was identified as the optimal cutoff for distinguishing Suaeda salsa from the surrounding land cover types with high confidence. To estimate biomass, we developed an exponential inversion model based on hue angle data calibrated through extensive field measurements. The resulting model—Biomass = 3.57639 × 10^−15 × e^(0.12201×α)—achieved exceptional performance (R2 = 0.99696; MAPE = 3.616%; RMSE = 0.02183 kg/m2), indicating strong predictive accuracy and robustness. This study highlights a cost-effective, non-destructive, and scalable method for the real-time monitoring of coastal vegetation, offering a significant advancement in remote sensing applications for wetland ecosystem management. Full article
(This article belongs to the Section Environmental Sciences)
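Since the fitted inversion model is quoted in closed form in the abstract, it can be evaluated (and inverted) directly. A small sketch using only the constants quoted above; the round-trip consistency check is our own addition, not from the paper:

```python
import math

# Evaluate the hue-angle inversion model quoted in the abstract:
#   Biomass [kg/m^2] = 3.57639e-15 * exp(0.12201 * alpha)
# with alpha the hue angle in degrees. Inverting it tells you which hue
# angle corresponds to a given biomass density, a handy sanity check.

A = 3.57639e-15
K = 0.12201

def biomass(alpha_deg):
    return A * math.exp(K * alpha_deg)

def hue_angle(biomass_kg_m2):
    return math.log(biomass_kg_m2 / A) / K

# evaluate near the paper's classification cutoff angle of 249.01 degrees
b = biomass(249.01)
```

The steep exponent means small hue-angle errors translate into large relative biomass errors, which is why the radiometric calibration with reflectance plates matters.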
25 pages, 2723 KiB  
Article
A Human-Centric, Uncertainty-Aware Event-Fused AI Network for Robust Face Recognition in Adverse Conditions
by Akmalbek Abdusalomov, Sabina Umirzakova, Elbek Boymatov, Dilnoza Zaripova, Shukhrat Kamalov, Zavqiddin Temirov, Wonjun Jeong, Hyoungsun Choi and Taeg Keun Whangbo
Appl. Sci. 2025, 15(13), 7381; https://doi.org/10.3390/app15137381 - 30 Jun 2025
Cited by 1 | Viewed by 336
Abstract
Face recognition systems often falter when deployed in uncontrolled settings, grappling with low light, unexpected occlusions, motion blur, and the degradation of sensor signals. Most contemporary algorithms chase raw accuracy yet overlook the pragmatic need for uncertainty estimation and multispectral reasoning rolled into a single framework. This study introduces HUE-Net—a Human-centric, Uncertainty-aware, Event-fused Network—designed specifically to thrive under severe environmental stress. HUE-Net marries the visible RGB band with near-infrared (NIR) imagery and high-temporal-event data through an early-fusion pipeline, proven more responsive than serial approaches. A custom hybrid backbone that couples convolutional networks with transformers keeps the model nimble enough for edge devices. Central to the architecture is the perturbed multi-branch variational module, which distills probabilistic identity embeddings while delivering calibrated confidence scores. Complementing this, an Adaptive Spectral Attention mechanism dynamically reweights each stream to amplify the most reliable facial features in real time. Unlike previous efforts that compartmentalize uncertainty handling, spectral blending, or computational thrift, HUE-Net unites all three in a lightweight package. Benchmarks on the IJB-C and N-SpectralFace datasets illustrate that the system not only secures state-of-the-art accuracy but also exhibits unmatched spectral robustness and reliable probability calibration. The results indicate that HUE-Net is well-positioned for forensic missions and humanitarian scenarios where trustworthy identification cannot be deferred. Full article
20 pages, 327 KiB  
Article
Gauging the Impact of Digital Finance on Financial Stability in the Presence of Multiple Unknown Structural Breaks: Evidence from Developing Economies
by Tochukwu Timothy Okoli
Economies 2025, 13(7), 187; https://doi.org/10.3390/economies13070187 - 28 Jun 2025
Viewed by 395
Abstract
The implications of digital finance for financial stability have come under serious scrutiny since the aftermath of the 2008 global financial crisis (GFC). Empirical evidence on this nexus is somewhat inconsistent and ambiguous. This study therefore attributes this puzzle to multiple structural breaks (MSBs), which were long neglected by previous studies. Consequently, this study aims to identify possible MSBs in the digital finance–stability nexus and examine whether its impact is consistent/weakened in the presence of MSBs in a sample of 41 developing African economies for the 2004–2023 period. Results from the PCA index generation report that instability is more susceptible to bank crisis/Z-score. Again, the panel extension of BP98 MSBs detection identified three breaks, with their confidence intervals overlapping the periods of the 2006–2011 GFC/subprime mortgage crisis, the 2012–2016 Brexit referendum, and the 2017–2021 COVID-19 pandemic/Ukraine war. The quantile regression methodology also shows that these breaks weaken the impact of digital finance (i.e., mobile banking and internet banking) on financial stability, particularly for economies at lower quantiles of financial stability, but with marginal effects for economies at higher quantiles. The study concludes that digital finance can stabilize the financial system of developing economies when shocks from structural breaks are controlled. The study therefore contributes to knowledge by developing a new econometric model for the BP98 panel extension of MSBs detection, calibrating an index for financial stability, and detecting valid break dates for three major breaks. Structural and financial development through policy coordination to forestall the effects of structural breaks are recommended. Full article
(This article belongs to the Section Macroeconomics, Monetary Economics, and Financial Markets)
12 pages, 32009 KiB  
Article
A Confidence Calibration Based Ensemble Method for Oriented Electrical Equipment Detection in Thermal Images
by Ying Lin, Zhuangzhuang Li, Bo Song, Ning Ge, Yiwei Sun and Xiaojin Gong
Energies 2025, 18(12), 3191; https://doi.org/10.3390/en18123191 - 18 Jun 2025
Viewed by 325
Abstract
Detecting oriented electrical equipment plays a fundamental role in enabling intelligent defect diagnosis in power systems. However, existing oriented object detection methods each have their own limitations, making it challenging to achieve robust and accurate detection under varying conditions. This work proposes a model ensemble approach that leverages the complementary strengths of two representative detectors—Oriented R-CNN and S2A-Net—to enhance detection performance. Recognizing that discrepancies in confidence score distributions may negatively impact ensemble results, this work first designs a calibration method to align the confidence levels of predictions from each model. Following calibration, a soft non-maximum suppression (Soft-NMS) strategy is employed to fuse the outputs, effectively refining the final detections by jointly considering spatial overlap and the calibrated confidence scores. The proposed method is evaluated on an infrared image dataset for electric power equipment detection. Experimental results demonstrate that our approach not only improves the performance of each individual model by 1.95 points in mean Average Precision (mAP) but also outperforms other state-of-the-art methods. Full article
(This article belongs to the Section F5: Artificial Intelligence and Smart Energy)
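The fusion step above names Soft-NMS explicitly. A minimal sketch of Soft-NMS with Gaussian score decay, simplified to axis-aligned boxes for brevity (the paper fuses oriented boxes after first calibrating each model's confidence scores); the detections below are hypothetical:

```python
import math

# Sketch of Soft-NMS with Gaussian score decay: instead of discarding
# boxes that overlap the current top detection, decay their scores in
# proportion to the overlap. Boxes are axis-aligned (x1, y1, x2, y2).

def iou(a, b):
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def soft_nms(dets, sigma=0.5, score_thresh=0.001):
    """dets: list of (box, score); returns kept (box, score) pairs."""
    dets = sorted(dets, key=lambda d: d[1], reverse=True)
    kept = []
    while dets:
        box, score = dets.pop(0)
        kept.append((box, score))
        # decay overlapping boxes' scores rather than suppressing them
        dets = [(b, s * math.exp(-iou(box, b) ** 2 / sigma)) for b, s in dets]
        dets = [(b, s) for b, s in dets if s > score_thresh]
        dets.sort(key=lambda d: d[1], reverse=True)
    return kept

# two near-duplicate detections of one piece of equipment, plus a distant one
fused = soft_nms([((0, 0, 10, 10), 0.9), ((1, 1, 11, 11), 0.8),
                  ((50, 50, 60, 60), 0.7)])
```

Fusing two models this way only works if their scores are comparable, which is exactly why the paper calibrates confidence distributions before this step.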
28 pages, 7802 KiB  
Article
Anomalous Behavior in Weather Forecast Uncertainty: Implications for Ship Weather Routing
by Marijana Marjanović, Jasna Prpić-Oršić, Anton Turk and Marko Valčić
J. Mar. Sci. Eng. 2025, 13(6), 1185; https://doi.org/10.3390/jmse13061185 - 17 Jun 2025
Viewed by 1105
Abstract
Ship weather routing is heavily dependent on weather forecasts. However, the predictive nature of meteorological models introduces an unavoidable level of uncertainty which, if not accounted for, can compromise navigational safety, operational efficiency, and environmental impact. This study examines the temporal degradation of forecast accuracy across certain oceanographic and atmospheric variables, using a six-month dataset for the area of North Atlantic provided by the National Oceanic and Atmospheric Administration (NOAA). The analysis reveals distinct variable-specific uncertainty trends with wind speed forecasts exhibiting significant temporal fluctuation (RMSE increasing from 0.5 to 4.0 m/s), while significant wave height forecasts degrade in a more stable and predictable pattern (from 0.2 to 0.9 m). Confidence intervals also exhibit non-monotonic evolution, narrowing by up to 15% between 96–120-h lead times. To address these dynamics, a Python-based framework combines distribution-based modeling with calibrated confidence intervals to generate uncertainty bounds that evolve with forecast lead time (R2 = 0.87–0.93). This allows uncertainty to be quantified not as a static estimate, but as a function sensitive to both variable type and prediction horizon. When integrated into routing algorithms, such representations allow for route planning strategies that are not only more reflective of real-world meteorological limitations but also more robust to evolving weather conditions, demonstrated by a 3–7% increase in travel time in exchange for improved safety margins across eight test cases. Full article
(This article belongs to the Section Ocean Engineering)
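The core idea above, uncertainty bounds that widen with forecast lead time rather than a single static margin, can be sketched with a simple linear error-growth model. The RMSE values below are illustrative stand-ins loosely echoing the wind-speed range quoted in the abstract, not NOAA data, and the Gaussian-error assumption is ours:

```python
# Sketch of a lead-time-dependent uncertainty bound: fit rmse(t) = a*t + b
# to forecast-error estimates, then widen a 95% interval accordingly
# (assuming roughly Gaussian errors). All numbers are illustrative.

lead_h = [0, 24, 48, 72, 96, 120]        # forecast lead time, hours
rmse   = [0.5, 1.2, 1.9, 2.6, 3.3, 4.0]  # wind-speed RMSE, m/s

n = len(lead_h)
mx = sum(lead_h) / n
my = sum(rmse) / n
a = sum((x - mx) * (y - my) for x, y in zip(lead_h, rmse)) / \
    sum((x - mx) ** 2 for x in lead_h)
b = my - a * mx

def interval(forecast, t):
    """95% interval for a forecast value at lead time t hours."""
    half = 1.96 * (a * t + b)
    return (forecast - half, forecast + half)

lo, hi = interval(10.0, 72)  # e.g. a 10 m/s wind forecast, three days out
```

A routing algorithm consuming `interval` instead of the point forecast can then trade a few percent of travel time for margins that reflect how stale the forecast actually is.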
16 pages, 980 KiB  
Article
Statistical Analysis of Temperature Sensors Applied to a Biological Material Transport System: Challenges, Discrepancies, and a Proposed Monitoring Methodology
by Felipe Roque de Albuquerque Neto, José Eduardo Ferreira de Oliveira, Rodrigo Gustavo Dourado da Silva, Andrezza Carolina Carneiro Tomás, Alvaro Antonio Villa Ochoa, José Ângelo Peixoto da Costa, Alisson Cocci de Souza and Paula Suemy Arruda Michima
Processes 2025, 13(6), 1904; https://doi.org/10.3390/pr13061904 - 16 Jun 2025
Viewed by 499
Abstract
Conventional methods for transporting biological materials typically use dry ice or ice for preservation but often overlook important aspects of temperature monitoring and metrological control. These methods generally do not include temperature sensors to track the thermal conditions of the materials during transport, nor do they apply essential metrological practices such as regular sensor calibration and stability checks. This lack of precise monitoring poses significant risks to the integrity of temperature-sensitive biological materials. This study presents a statistical analysis of DS18B20 digital temperature sensors used in an experimental refrigeration system based on thermoelectric modules. The aim was to verify sensor consistency and investigate sources of measurement error. The research was motivated by a prior phase of study, which revealed significant discrepancies of approximately 3 °C between experimental temperature data and numerical simulations. To investigate a potential cause, we conducted a case study analyzing measurements from three identical temperature sensors (same model, brand, and manufacturer). Statistical analyses included ANOVA (analysis of variance) and Tukey’s test with a 95% confidence interval. Since the data did not follow a normal distribution (p-value < 0.05), non-parametric methods such as the Kruskal–Wallis and Levene’s procedures were also applied. The results showed that all sensors recorded statistically significant different temperature values (p-value < 0.05). Although experimental conditions were kept consistent, temperature differences of up to 0.37 °C were observed between sensors. This finding demonstrates an inherent inter-sensor variability that, while within manufacturer specifications, represents a source of systematic error that can contribute to larger discrepancies in complex systems, highlighting the need for individual calibration. Full article
(This article belongs to the Special Issue Multiscale Modeling and Control of Biomedical Systems)
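The non-parametric comparison of the three sensors rests on the Kruskal–Wallis H statistic, which is straightforward to compute by hand. A sketch with hypothetical temperature readings and no tie correction; in practice scipy.stats.kruskal handles ties and returns the p-value as well:

```python
# Minimal Kruskal-Wallis H statistic (no tie correction), the kind of
# rank-based test the study applies to readings from three nominally
# identical DS18B20 sensors. The temperatures below are hypothetical.

def kruskal_h(*groups):
    # pool all observations, rank them, then compare per-group rank sums
    data = sorted((v, gi) for gi, g in enumerate(groups) for v in g)
    n = len(data)
    ranks = {}
    for rank, (v, gi) in enumerate(data, start=1):
        ranks.setdefault(gi, []).append(rank)
    return 12.0 / (n * (n + 1)) * sum(
        sum(r) ** 2 / len(r) for r in ranks.values()
    ) - 3 * (n + 1)

sensor_a = [19.81, 19.83, 19.85, 19.84]
sensor_b = [20.02, 20.05, 20.01, 20.04]
sensor_c = [20.18, 20.21, 20.19, 20.22]

h = kruskal_h(sensor_a, sensor_b, sensor_c)
```

With perfectly separated groups like these, H clears the chi-squared critical value for 2 degrees of freedom at the 5% level (5.99), mirroring the study's finding that the sensors differ significantly even within manufacturer tolerance.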
19 pages, 4785 KiB  
Article
A Deep Equilibrium Model for Remaining Useful Life Estimation of Aircraft Engines
by Spyridon Plakias and Yiannis S. Boutalis
Electronics 2025, 14(12), 2355; https://doi.org/10.3390/electronics14122355 - 9 Jun 2025
Viewed by 462
Abstract
Estimating Remaining Useful Life (RUL) is crucial in modern Prognostic and Health Management (PHM) systems, providing valuable information for planning the maintenance strategy of critical components in complex systems such as aircraft engines. Deep Learning (DL) models have shown great performance in the accurate prediction of RUL, building hierarchical representations by the stacking of multiple explicit neural layers. In the current research paper, we follow a different approach, presenting a Deep Equilibrium Model (DEM) that effectively captures the spatial and temporal information of sequential sensor data. The DEM, which incorporates convolutional layers and a novel dual-input interconnection mechanism to capture sensor information effectively, estimates the degradation representation implicitly as the equilibrium solution of an equation, rather than explicitly computing it through multiple layer passes. The convergence representation of the DEM is estimated by a fixed-point equation solver, while the gradients in the backward pass are computed using the Implicit Function Theorem (IFT). The Monte Carlo Dropout (MCD) technique under calibration is the final key component of the framework; it enhances regularization and performance, providing a confidence interval for each prediction and contributing to a more robust and reliable outcome. Simulation experiments on the widely used NASA Turbofan Jet Engine Data Set show consistent improvements, with the proposed framework offering a competitive alternative for RUL prediction under diverse conditions. Full article
(This article belongs to the Special Issue Advances in Condition Monitoring and Fault Diagnosis)
21 pages, 5234 KiB  
Article
Calibration of Integrated Low-Cost Environmental Sensors for Urban Air Temperature Based on Machine Learning
by Fang Nan, Chao Zeng, Huanfeng Shen and Liupeng Lin
Sensors 2025, 25(11), 3398; https://doi.org/10.3390/s25113398 - 28 May 2025
Viewed by 589
Abstract
Monitoring urban microenvironments using low-cost sensors effectively addresses the spatiotemporal limitations of conventional monitoring networks. However, their widespread adoption is hindered by concerns regarding data quality. Calibrating these sensors is crucial for enabling their large-scale deployment and increasing confidence among researchers and users. This study focuses on an internet of things (IoT) application in Wuhan, China, aiming to enhance the quality of long-term hourly air temperature data collected by low-cost sensors through on-site calibration. Multiple linear regression (MLR) and light gradient boosting machine (LightGBM) algorithms were employed for calibration, with leave-one-out cross-validation (LOOCV) being used for model evaluation. Factors, such as multiple scenarios, spatial distances, and seasonal variations, were also examined for their influence on long-term data calibration. The experimental findings revealed that the LightGBM method consistently outperformed MLR. Calibration using this approach markedly improved the sensor data quality, with the R-squared (R2) value of the sensor with the poorest raw data increasing from 0.416 to 0.957, its mean absolute error (MAE) decreasing from 6.255 to 1.680, and its root mean square error (RMSE) being reduced from 7.881 to 2.148. This study demonstrates the application potential of using LightGBM as an advanced machine learning (ML) method in innovative low-cost sensors, thereby providing a method of obtaining high-quality and real-time information for urban environmental and public health research. Full article
(This article belongs to the Special Issue Integrated Sensor Systems for Environmental Applications)
23 pages, 1586 KiB  
Article
GOMFuNet: A Geometric Orthogonal Multimodal Fusion Network for Enhanced Prediction Reliability
by Yi Guo and Rui Zhong
Mathematics 2025, 13(11), 1791; https://doi.org/10.3390/math13111791 - 27 May 2025
Viewed by 544
Abstract
Integrating information from heterogeneous data sources poses significant mathematical challenges, particularly in ensuring the reliability and reducing the uncertainty of predictive models. This paper introduces the Geometric Orthogonal Multimodal Fusion Network (GOMFuNet), a novel mathematical framework designed to address these challenges. GOMFuNet synergistically combines two core mathematical principles: (1) It utilizes geometric deep learning, specifically Graph Convolutional Networks (GCNs), within its Cross-Modal Label Fusion Module (CLFM) to perform fusion in a high-level semantic label space, thereby preserving inter-sample topological relationships and enhancing robustness to inconsistencies. (2) It incorporates a novel Label Confidence Learning Module (LCLM) derived from optimization theory, which explicitly enhances prediction reliability by enforcing mathematical orthogonality among the predicted class probability vectors, directly minimizing output uncertainty. We demonstrate GOMFuNet’s effectiveness through comprehensive experiments, including confidence calibration analysis and robustness tests, and validate its practical utility via a case study on educational performance prediction using structured, textual, and audio data. Results show GOMFuNet achieves significantly improved performance (90.17% classification accuracy and a regression R2 of 88.03%) and enhanced reliability compared to baseline and state-of-the-art multimodal methods, validating its potential as a robust framework for reliable multimodal learning. Full article
(This article belongs to the Special Issue Deep Neural Network: Theory, Algorithms and Applications)