Search Results (1,396)

Search Parameters:
Keywords = interval uncertainty

26 pages, 4687 KiB  
Article
Geant4-Based Logging-While-Drilling Gamma Gas Detection for Quantitative Inversion of Downhole Gas Content
by Xingming Wang, Xiangyu Wang, Qiaozhu Wang, Yuanyuan Yang, Xiong Han, Zhipeng Xu and Luqing Li
Processes 2025, 13(8), 2392; https://doi.org/10.3390/pr13082392 - 28 Jul 2025
Viewed by 252
Abstract
Downhole kick is one of the most severe safety hazards in deep and ultra-deep well drilling operations. Traditional monitoring methods, which rely on surface flow rate and fluid level changes, are limited by their delayed response and insufficient sensitivity, making them inadequate for early warning. This study proposes a real-time monitoring technique for gas content in drilling fluid based on the attenuation principle of Ba-133 γ-rays. By integrating laboratory static/dynamic experiments and Geant4-11.2 Monte Carlo simulations, the influence mechanism of gas–liquid two-phase media on γ-ray transmission characteristics is systematically elucidated. Firstly, Ba-133 (main peak at 356 keV, half-life of 10.6 years) is identified as the optimal downhole nuclear measurement source through a comparative analysis of penetration capability, detection efficiency, and regulatory compliance against alternative sources such as Am-241 and Cs-137. Compared to these alternatives, Ba-133 provides an optimal energy range for detecting drilling fluid density variations while also meeting the exemption activity limit (1 × 10⁶ Bq) for field deployment. Subsequently, an experimental setup with drilling fluids of varying densities (1.2–1.8 g/cm³) is constructed to quantify the inverse-square attenuation relationship between source-to-detector distance and counting rate, and to acquire counting data over the full gas content range (0–100%). The Monte Carlo simulation results exhibit a mean relative error of 5.01% compared to the experimental data, validating the physical correctness of the model. On this basis, a nonlinear inversion model coupling a first-order density term with a cubic gas content term is proposed, achieving a mean absolute percentage error of 2.3% across the full range and R² = 0.999. Geant4-based simulation validation demonstrates that this technique can achieve a measurement accuracy of ±2.5% for gas content within the range of 0–100% (at a 95% confidence level). The anticipated field accuracy of ±5% is estimated by accounting for additional uncertainties due to temperature effects, vibration, and mud composition variations under downhole conditions, significantly outperforming current surface monitoring methods. This enables high-frequency, high-precision early detection of kick events during the shut-in period. The present study provides both theoretical and technical support for the engineering application of nuclear measurement techniques in well control safety.
(This article belongs to the Section Chemical Processes and Systems)
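As an illustration of the inversion step described in this abstract, the sketch below fits a hypothetical count-rate model with a first-order density term and a cubic gas-content term and then inverts it for gas content. The functional form, coefficient values, and synthetic data are assumptions for demonstration only, not the paper's calibrated model.

```python
import numpy as np

# Assumed functional form (the abstract only states "first-order density term
# coupled with a cubic gas-content term"; this specific expression is illustrative):
#   N(rho, alpha) = c0 + c1*rho + c2*alpha + c3*alpha**2 + c4*alpha**3
def design_matrix(rho, alpha):
    return np.column_stack([np.ones_like(rho), rho, alpha, alpha**2, alpha**3])

rng = np.random.default_rng(0)
rho = rng.uniform(1.2, 1.8, 200)        # drilling-fluid density, g/cm^3
alpha = rng.uniform(0.0, 1.0, 200)      # gas volume fraction (0-100%)
true_c = np.array([1200.0, -350.0, 240.0, 90.0, 60.0])   # made-up coefficients
counts = design_matrix(rho, alpha) @ true_c + rng.normal(0.0, 5.0, 200)

# Calibrate the forward model by least squares (a stand-in for the lab/Geant4 fit).
c, *_ = np.linalg.lstsq(design_matrix(rho, alpha), counts, rcond=None)

def invert_gas_content(count_rate, rho_known):
    """Solve the calibrated cubic in alpha for a measured count rate at known density."""
    c0, c1, c2, c3, c4 = c
    roots = np.roots([c4, c3, c2, c0 + c1 * rho_known - count_rate])
    real = roots[np.isreal(roots)].real
    feasible = real[(real >= 0.0) & (real <= 1.0)]
    return float(feasible[0]) if feasible.size else float("nan")

test_count = (design_matrix(np.array([1.5]), np.array([0.3])) @ true_c)[0]
print(f"recovered gas content: {invert_gas_content(test_count, 1.5):.3f}")  # ~0.300
```

In practice the coefficients would of course come from the laboratory and Geant4 calibration data rather than synthetic counts.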

26 pages, 3625 KiB  
Article
Deep-CNN-Based Layout-to-SEM Image Reconstruction with Conformal Uncertainty Calibration for Nanoimprint Lithography in Semiconductor Manufacturing
by Jean Chien and Eric Lee
Electronics 2025, 14(15), 2973; https://doi.org/10.3390/electronics14152973 - 25 Jul 2025
Viewed by 237
Abstract
Nanoimprint lithography (NIL) has emerged as a promising sub-10 nm patterning technique at low cost; yet, robust process control remains difficult because physics-based simulators are time-consuming and labeled SEM data are scarce. We propose a data-efficient, two-stage deep-learning framework that directly reconstructs post-imprint SEM images from binary design layouts and simultaneously delivers calibrated pixel-by-pixel uncertainty. First, a shallow U-Net is trained with conformalized quantile regression (CQR) to output 90% prediction intervals with statistically guaranteed coverage. Per-level errors on a small calibration dataset then drive an outlier-weighted, encoder-frozen transfer fine-tuning phase that refines only the decoder, focusing its capacity on regions of high spatial uncertainty. On independent test layouts, the fine-tuned model reduces the mean absolute error (MAE) from 0.0365 to 0.0255 and raises coverage from 0.904 to 0.926, while cutting the labeled data and GPU time by 80% and 72%, respectively. The resulting uncertainty maps highlight spatial regions associated with error hotspots and support defect-aware optical proximity correction (OPC) with fewer guard-band iterations. Beyond OPC, the model-agnostic, modular design of the pipeline allows flexible integration into other critical stages of the semiconductor manufacturing workflow, such as imprinting, etching, and inspection, where such predictions are critical for achieving higher precision, efficiency, and overall process robustness, which is the ultimate motivation of this study.
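The conformal calibration step mentioned above follows the standard split-CQR recipe (Romano et al., 2019); the sketch below shows that step in isolation with synthetic per-pixel values. The variable names and toy numbers are assumptions, and the U-Net itself is omitted.

```python
import numpy as np

def cqr_margin(lo_cal, hi_cal, y_cal, alpha=0.10):
    """Split-conformal step of CQR: widen (or shrink) the raw quantile predictions
    so the interval covers 1 - alpha of calibration residuals with a
    finite-sample guarantee."""
    scores = np.maximum(lo_cal - y_cal, y_cal - hi_cal)   # conformity scores
    n = len(y_cal)
    level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)  # finite-sample correction
    return np.quantile(scores, level, method="higher")

# Toy calibration data standing in for per-pixel SEM gray levels.
rng = np.random.default_rng(1)
mu = rng.uniform(0.3, 0.7, 500)              # "predicted" pixel value
y_cal = mu + rng.normal(0.0, 0.08, 500)      # observed pixel value
lo_cal, hi_cal = mu - 0.10, mu + 0.10        # raw (miscalibrated) quantile outputs

margin = cqr_margin(lo_cal, hi_cal, y_cal, alpha=0.10)
lo_test, hi_test = 0.48, 0.62                # raw interval for one test pixel
print(f"calibrated 90% interval: [{lo_test - margin:.3f}, {hi_test + margin:.3f}]")
```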

15 pages, 2432 KiB  
Article
A Comparison Index for Costs of Interval Linear Programming Models
by Maria Letizia Guerra, Laerte Sorini and Luciano Stefanini
Axioms 2025, 14(8), 569; https://doi.org/10.3390/axioms14080569 - 24 Jul 2025
Viewed by 291
Abstract
Interval Linear Programming (ILP) presents several compelling challenges when applied to real-world problems that cannot be easily captured by traditional robust uncertainty models. In this paper, we propose a novel solution method that employs a comparison index for interval ordering based on the generalized Hukuhara difference. This approach proves to be highly effective in comparing solutions within ILP frameworks. Additionally, we discuss the robustness of the proposed methodology and its implications for decision-making under uncertainty.
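To make the interval arithmetic concrete, here is a minimal sketch of the generalized Hukuhara difference of two intervals together with a simple midpoint-based ordering score derived from it. The score is illustrative only and is not the specific comparison index defined in the paper.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float

    @property
    def mid(self) -> float:
        return 0.5 * (self.lo + self.hi)

    @property
    def rad(self) -> float:
        return 0.5 * (self.hi - self.lo)

def gh_difference(a: Interval, b: Interval) -> Interval:
    """Generalized Hukuhara difference A gH- B:
    midpoint = difference of midpoints, radius = |difference of radii|."""
    m = a.mid - b.mid
    r = abs(a.rad - b.rad)
    return Interval(m - r, m + r)

def compare(a: Interval, b: Interval) -> float:
    """Illustrative gH-based comparison score (not the paper's index):
    positive when A tends to exceed B, damped by the width of the difference."""
    d = gh_difference(a, b)
    return d.mid / (1.0 + d.rad)

c1, c2 = Interval(2.0, 6.0), Interval(3.0, 5.0)   # two interval-valued costs
print(gh_difference(c1, c2))   # Interval(lo=-1.0, hi=1.0)
print(compare(c1, c2))         # 0.0: same midpoint, so the score does not separate them
```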

28 pages, 9894 KiB  
Article
At-Site Versus Regional Frequency Analysis of Sub-Hourly Rainfall for Urban Hydrology Applications During Recent Extreme Events
by Sunghun Kim, Kyungmin Sung, Ju-Young Shin and Jun-Haeng Heo
Water 2025, 17(15), 2213; https://doi.org/10.3390/w17152213 - 24 Jul 2025
Viewed by 195
Abstract
Accurate rainfall quantile estimation is critical for urban flood management, particularly given the escalating climate change impacts. This study comprehensively compared at-site frequency analysis and regional frequency analysis for sub-hourly rainfall quantile estimation, using data from 27 sites across Seoul. The analysis focused on Seoul’s disaster prevention framework (30-year and 100-year return periods). L-moment statistics and Monte Carlo simulations were employed to estimate rainfall quantiles, evaluate methodological performance, and assess Seoul’s current disaster prevention standards. The analysis revealed significant spatio-temporal variability in Seoul’s precipitation, causing considerable uncertainty in individual site estimates. A performance evaluation, including the relative root mean square error and confidence interval, consistently showed the superiority of regional frequency analysis over at-site frequency analysis. While at-site frequency analysis demonstrated better performance only for short return periods (e.g., 2 years), regional frequency analysis exhibited a substantially lower relative root mean square error and significantly narrower confidence intervals for larger return periods (e.g., 10, 30, 100 years). This methodology reduced the average 95% confidence interval width by a factor of approximately 2.7 (26.98 mm versus 73.99 mm). This enhanced reliability stems from the information-pooling capabilities of regional frequency analysis, mitigating uncertainties due to limited record lengths and localized variabilities. Critically, regionally derived 100-year rainfall estimates consistently exceeded Seoul’s 100 mm disaster prevention threshold across most areas, suggesting that the current infrastructure may be substantially under-designed. The use of minute-scale data underscored its necessity for urban hydrological modeling, highlighting the inadequacy of conventional daily rainfall analyses.
(This article belongs to the Special Issue Urban Flood Frequency Analysis and Risk Assessment)
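For readers unfamiliar with the L-moment machinery underlying regional frequency analysis, the sketch below computes sample L-moment ratios from probability-weighted moments and pools them across sites weighted by record length. The Gumbel toy series and site sizes are assumptions, not the Seoul data, and the full regional growth-curve fit is not shown.

```python
import numpy as np

def sample_l_moments(x):
    """First three sample L-moments via unbiased probability-weighted moments;
    returns (mean, L-CV, L-skewness)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
    l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
    return l1, l2 / l1, l3 / l2

# Index-flood style pooling: average site L-moment ratios weighted by record length.
rng = np.random.default_rng(2)
sites = [rng.gumbel(30, 8, size=n) for n in (25, 32, 41)]   # toy sub-hourly maxima (mm)
weights = np.array([len(s) for s in sites], dtype=float)
ratios = np.array([sample_l_moments(s)[1:] for s in sites])
regional_lcv, regional_lskew = (weights @ ratios) / weights.sum()
print(f"regional L-CV = {regional_lcv:.3f}, regional L-skewness = {regional_lskew:.3f}")
```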

16 pages, 678 KiB  
Article
Evaluating the Gaps in the Diagnosis and Treatment in Extra-Pulmonary Tuberculosis Patients Under National Tuberculosis Elimination Programme (NTEP) Guidelines: A Multicentric Cohort Study
by Sanjeev Sinha, Renuka Titiyal, Prasanta R. Mohapatra, Rajesh K. Palvai, Itishree Kar, Baijayantimala Mishra, Anuj Ajayababu, Akanksha Sinha, Sourin Bhuniya and Shivam Pandey
Trop. Med. Infect. Dis. 2025, 10(8), 206; https://doi.org/10.3390/tropicalmed10080206 - 24 Jul 2025
Viewed by 255
Abstract
Extra-pulmonary tuberculosis (EPTB) can affect any organ of the body, producing a wide variety of clinical manifestations that make the diagnosis and treatment of EPTB challenging. The optimum treatment varies depending on the site of EPTB, its severity, and response to treatment. There is often uncertainty about the best management practices, with a significant departure from national guidelines. This study aims to identify gaps and barriers in adhering to the national guidelines for the diagnosis and treatment of EPTB. We included 433 patients with EPTB and followed them up at predefined intervals of 2, 6, 9, and 12 months. Questionnaire-based interviews of the treating physicians and the patients in different departments were conducted. For confirmatory diagnosis, heavy dependence on clinical-radiological diagnosis without microbiological support was observed, which is a deviation from National Tuberculosis Elimination Programme (NTEP) guidelines and raises concerns about the potential for misdiagnosis and overtreatment. Apart from patient delays, long health system delays in EPTB were observed. The median patient delay, health system delay, and total treatment delay times were 4.2, 4, and 10.1 weeks, respectively. To enhance EPTB diagnosis and management, there is a pressing need for improved access to microbiological testing, enhanced physician training on adherence to NTEP guidelines, and greater utilisation of imaging and histopathological techniques.
(This article belongs to the Special Issue Tuberculosis Control in Africa and Asia)

15 pages, 1262 KiB  
Article
Epidemiology and Future Burden of Vertebral Fractures: Insights from the Global Burden of Disease 1990–2021
by Youngoh Bae, Minyoung Kim, Woonyoung Jeong, Suho Jang and Seung Won Lee
Healthcare 2025, 13(15), 1774; https://doi.org/10.3390/healthcare13151774 - 22 Jul 2025
Viewed by 247
Abstract
Background/Objectives: Vertebral fractures (VFs) are a global health issue caused by traumatic or pathological factors that compromise spinal integrity. The burden of VFs is increasing, particularly in older adults. Methods: Data from the Global Burden of Disease 2021 were analyzed to estimate the prevalence, mortality, and years lived with disability due to VFs from 1990 to 2021. Estimates were stratified according to age, sex, and region. Bayesian meta-regression models were used to generate age-standardized rates, and projections for 2050 were calculated using demographic trends and the sociodemographic index. Das Gupta’s decomposition assessed the relative contributions of population growth, aging, and prevalence changes to future case numbers. Results: In 2021, approximately 5.37 million people (95% Uncertainty Interval [UI]: 4.70–6.20 million) experienced VFs globally, with an age-standardized prevalence of 65 per 100,000. Although the rates have declined slightly since 1990, the absolute burden has increased owing to population aging. VF prevalence was highest in Eastern and Western Europe and in high-income regions. Males had higher VF rates until 70 years of age, after which females surpassed them, reflecting postmenopausal osteoporosis. Falls and road injuries were the leading causes of VFs. By 2050, the number of VF cases is expected to increase to 8.01 million (95% UI: 6.57–8.64 million). Conclusions: While the age-standardized VF rates have decreased slightly, the global burden continues to increase. Targeted strategies for early diagnosis, osteoporosis management, and fall prevention are necessary to reduce the impact of VFs.
(This article belongs to the Topic Public Health and Healthcare in the Context of Big Data)

20 pages, 4503 KiB  
Article
Comparative Validation of the fBrake Method with the Conventional Brake Efficiency Test Under UNE 26110 Using Roller Brake Tester Data
by Víctor Romero-Gómez and José Luis San Román
Sensors 2025, 25(14), 4522; https://doi.org/10.3390/s25144522 - 21 Jul 2025
Viewed by 202
Abstract
In periodic technical inspections (PTIs), evaluating the braking efficiency of light passenger vehicles at their Maximum Authorized Mass (MAM) presents a practical challenge, as bringing laden vehicles to inspection is often unfeasible due to logistical and infrastructure limitations. The fBrake method is proposed to overcome this issue by estimating braking efficiency at MAM based on measurements taken from vehicles in more accessible loading conditions. In this study, the fBrake method is validated by demonstrating the equivalence of its efficiency estimates extrapolated from two distinct configurations: an unladen state near the curb weight and a partially laden condition closer to MAM. Following the UNE 26110 standard (Road vehicles. Criteria for the assessment of the equivalence of braking efficiency test methods in relation to the methods defined in ISO 21069), roller brake tester measurements were used to obtain force data under both conditions. The analysis showed that the extrapolated efficiencies agree within combined uncertainty limits, with normalized errors below 1 in all segments tested. Confidence intervals were reduced by up to 74% after an electronics update. These results confirm the reliability of the fBrake method for M1 and N1 vehicles and support its adoption as an equivalent procedure in compliance with UNE 26110, particularly when fully laden testing is impractical.
(This article belongs to the Special Issue Advanced Sensing and Analysis Technology in Transportation Safety)
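The equivalence criterion quoted above ("normalized errors below 1") is commonly computed as the E_n statistic used in metrology; the sketch below shows that computation with hypothetical braking-efficiency values and expanded uncertainties. The numbers are illustrative, not the paper's data.

```python
from math import sqrt

def normalized_error(x1, u1, x2, u2):
    """E_n number for two results with expanded uncertainties U1, U2
    (coverage factor already applied): |E_n| <= 1 means the results agree
    within their combined uncertainty."""
    return abs(x1 - x2) / sqrt(u1**2 + u2**2)

# Hypothetical braking-efficiency estimates extrapolated to MAM (in %),
# one from the unladen configuration and one from the partially laden one.
eff_unladen, U_unladen = 58.4, 2.1
eff_partial, U_partial = 57.1, 1.6
en = normalized_error(eff_unladen, U_unladen, eff_partial, U_partial)
print(f"E_n = {en:.2f} -> {'equivalent' if en <= 1 else 'not equivalent'}")
```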

12 pages, 607 KiB  
Article
A Modified Two-Temperature Calibration Method and Facility for Emissivity Measurement
by Shufang He, Shuai Li, Caihong Dai, Jinyuan Liu, Yanfei Wang, Ruoduan Sun, Guojin Feng and Jinghui Wang
Materials 2025, 18(14), 3392; https://doi.org/10.3390/ma18143392 - 19 Jul 2025
Viewed by 217
Abstract
Measuring the emissivity of an infrared radiant sample with high accuracy is important. Previous studies reported on multi- or two-temperature calibration methods, which used a reference blackbody (or blackbodies) to eliminate the background radiation and assumed that the background radiation was independent of temperature. However, in practical measurements, this assumption does not hold. To solve this problem, this study proposes a modified two-temperature calibration method and facility. The two temperature points are set within a small interval determined by the proposed calculation method; under the approximation that the emissivity of the sample and the background radiation remain the same at these two temperatures, the emissivity can be calculated from the measurement signals at the two temperatures, and a reference blackbody is not needed. An experimental facility was built, and three samples with emissivities around 0.100, 0.500, and 0.900 were measured in the (8–14) μm band. The relative expanded uncertainties were 9.6%, 4.0%, and 1.5% at 60 °C, respectively, and 8.8%, 5.8%, and 1.2% at 85 °C (k = 2), respectively. The experimental results showed consistency with the results obtained using other methods, indicating the effectiveness of the developed method. The developed method might be suitable for samples whose emissivities are temperature insensitive.
(This article belongs to the Section Advanced Materials Characterization)

26 pages, 3112 KiB  
Article
Pre-Warning for the Remaining Time to Alarm Based on Variation Rates and Mixture Entropies
by Zijiang Yang, Jiandong Wang, Honghai Li and Song Gao
Entropy 2025, 27(7), 736; https://doi.org/10.3390/e27070736 - 9 Jul 2025
Viewed by 221
Abstract
Alarm systems play crucial roles in industrial process safety. To support tackling an accident that is about to occur after an alarm, a pre-warning method is proposed for a special class of industrial process variables to alert operators about the remaining time to alarm. The main idea of the proposed method is to estimate the remaining time to alarm based on variation rates and mixture entropies of qualitative trends in univariate variables. If the remaining time to alarm is no longer than the pre-warning threshold and its mixture entropy is small enough, then a warning is generated to alert the operators. One challenge for the proposed method is how to determine an optimal pre-warning threshold by considering the uncertainties induced by the sample distribution of the remaining time to alarm, subject to the constraint of the required false warning rate. This challenge is addressed by utilizing Bayesian estimation theory to estimate the confidence intervals for all candidates of the pre-warning threshold; the optimal candidate is selected as the one whose upper confidence bound is nearest to the required false warning rate. Another challenge is how to measure the possibility of the current trend segment increasing to the alarm threshold, and this challenge is overcome by adopting the mixture entropy as a possibility measurement. Numerical and industrial examples illustrate the effectiveness of the proposed method and its advantages over existing methods.
(This article belongs to the Special Issue Failure Diagnosis of Complex Systems)
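As a rough illustration of the threshold-selection idea (not the paper's exact estimator), the sketch below places a Beta posterior on each candidate's false-warning rate, takes the upper bound of the credible interval, and picks the candidate whose bound lies nearest the required rate. The counts, trial numbers, and the uniform-prior Beta model are all assumptions; SciPy supplies the Beta quantile.

```python
import numpy as np
from scipy.stats import beta

def select_threshold(candidates, false_counts, trials, required_fwr=0.05, cred=0.95):
    """For each candidate pre-warning threshold, use a Beta(1 + k, 1 + n - k)
    posterior on its false-warning rate (uniform prior), take the upper bound of
    the credible interval, and pick the candidate whose bound is nearest the
    required false-warning rate. A simplified stand-in for the Bayesian step."""
    upper = beta.ppf(cred, 1 + false_counts, 1 + trials - false_counts)
    best = int(np.argmin(np.abs(upper - required_fwr)))
    return candidates[best], upper[best]

# Toy numbers: larger thresholds warn earlier and produce more false warnings.
candidates   = np.array([10, 20, 30, 40, 50])   # remaining time to alarm, s
false_counts = np.array([0, 1, 3, 6, 11])       # false warnings observed
trials       = np.full(5, 120)                  # evaluation segments per candidate
thr, ub = select_threshold(candidates, false_counts, trials, required_fwr=0.05)
print(f"selected pre-warning threshold: {thr} s (credible upper bound {ub:.3f})")
```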

38 pages, 1738 KiB  
Article
AI-Driven Bayesian Deep Learning for Lung Cancer Prediction: Precision Decision Support in Big Data Health Informatics
by Natalia Amasiadi, Maria Aslani-Gkotzamanidou, Leonidas Theodorakopoulos, Alexandra Theodoropoulou, George A. Krimpas, Christos Merkouris and Aristeidis Karras
BioMedInformatics 2025, 5(3), 39; https://doi.org/10.3390/biomedinformatics5030039 - 9 Jul 2025
Viewed by 586
Abstract
Lung-cancer incidence is projected to rise by 50% by 2035, underscoring the need for accurate yet accessible risk-stratification tools. We trained a Bayesian neural network on 300 annotated chest-CT scans from the public LIDC–IDRI cohort, integrating clinical metadata. Hamiltonian Monte Carlo sampling (10,000 posterior draws) captured parameter uncertainty; performance was assessed with stratified five-fold cross-validation and on three independent multi-centre cohorts. On the locked internal test set, the model achieved 99.0% accuracy, AUC = 0.990, and macro-F1 = 0.987. External validation across 824 scans yielded a mean AUC of 0.933 and an expected calibration error <0.034, while eliminating false positives for benign nodules and providing voxel-level uncertainty maps. Uncertainty-aware Bayesian deep learning delivers state-of-the-art, well-calibrated lung-cancer risk predictions from a single CT scan, supporting personalised screening intervals and safe deployment in clinical workflows.
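The calibration claim above is stated in terms of expected calibration error (ECE); the sketch below shows the standard binned ECE computation on synthetic malignancy scores. The toy scores and labels are invented for illustration and do not reflect the study's data or model.

```python
import numpy as np

def expected_calibration_error(probs, labels, n_bins=10):
    """Binned ECE for a binary classifier: weighted average of
    |accuracy - mean confidence| over equal-width confidence bins."""
    probs, labels = np.asarray(probs, float), np.asarray(labels, int)
    pred = (probs >= 0.5).astype(int)
    conf = np.where(pred == 1, probs, 1.0 - probs)     # confidence of predicted class
    edges = np.linspace(0.5, 1.0, n_bins + 1)
    idx = np.clip(np.digitize(conf, edges[1:-1]), 0, n_bins - 1)
    ece = 0.0
    for b in range(n_bins):
        mask = idx == b
        if mask.any():
            acc = (pred[mask] == labels[mask]).mean()
            ece += mask.mean() * abs(acc - conf[mask].mean())
    return ece

# Toy malignancy scores standing in for posterior predictive means.
rng = np.random.default_rng(3)
labels = rng.integers(0, 2, 1000)
probs = np.clip(labels * 0.8 + rng.normal(0.1, 0.15, 1000), 0.01, 0.99)
print(f"ECE = {expected_calibration_error(probs, labels):.3f}")
```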

33 pages, 2352 KiB  
Article
A Hybrid Approach for Battery Selection Based on Green Criteria in Electric Vehicles: DEMATEL-QFD-Interval Type-2 Fuzzy VIKOR
by Müslüm Öztürk
Sustainability 2025, 17(14), 6277; https://doi.org/10.3390/su17146277 - 9 Jul 2025
Viewed by 237
Abstract
Production involves processes such as raw material extraction, energy consumption, and waste management, which can lead to significant environmental consequences. Therefore, supplier selection based not only on technical performance but also on environmental sustainability criteria has become a fundamental component of eco-friendly manufacturing strategies. Moreover, in the selection of electric vehicle batteries, it is essential to consider customer demands alongside environmental factors. Accordingly, selected suppliers should fulfill company expectations while also reflecting the “voice” of the customer. The objective of this study is to propose an integrated approach for green supplier selection by taking into account various environmental performance requirements and criteria. The proposed approach evaluates battery suppliers with respect to both customer requirements and green criteria. To construct the relational structure, the DEMATEL method was employed to analyze the interrelationships among customer requirements (CRs). Subsequently, the Quality Function Deployment (QFD) model was used to establish a central relational matrix that captures the degree of correlation between each pair of supplier selection criteria and CRs. Finally, to evaluate and rank alternative suppliers, the Interval Type-2 Fuzzy VIKOR (IT2 F-VIKOR) method was applied. The proposed hybrid approach, integrating DEMATEL, QFD, and IT2 F-VIKOR, offers significant improvements over traditional methods. Unlike previous approaches that focus independently on customer preferences or supplier criteria, our model provides a unified evaluation by considering both dimensions simultaneously. Furthermore, the use of Interval Type-2 Fuzzy Logic enables the model to better manage uncertainty and ambiguity in expert judgments, yielding more reliable results compared to conventional fuzzy approaches. Additionally, the applicability of the model has been demonstrated through a real-world case study, confirming its practical relevance and robustness in the selection of green suppliers for electric vehicle battery procurement.
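The DEMATEL step described here reduces to a small matrix computation; the sketch below builds the total-relation matrix T = X(I - X)^(-1) from a hypothetical direct-influence matrix among four customer requirements. The influence scores are invented, and the fuzzy QFD and IT2 F-VIKOR stages are not shown.

```python
import numpy as np

# Hypothetical direct-influence matrix among four customer requirements
# (0 = no influence ... 3 = high influence); scores are illustrative only.
D = np.array([
    [0, 3, 2, 1],
    [1, 0, 3, 1],
    [2, 1, 0, 3],
    [1, 2, 1, 0],
], dtype=float)

# DEMATEL: normalize by the largest row/column sum, then compute the total
# relation matrix T = X (I - X)^-1, which accumulates direct and indirect influence.
s = max(D.sum(axis=1).max(), D.sum(axis=0).max())
X = D / s
T = X @ np.linalg.inv(np.eye(len(D)) - X)

prominence = T.sum(axis=1) + T.sum(axis=0)   # overall involvement of each CR
relation = T.sum(axis=1) - T.sum(axis=0)     # net cause (+) or effect (-) role
for i, (p, r) in enumerate(zip(prominence, relation), start=1):
    print(f"CR{i}: prominence = {p:.2f}, relation = {r:+.2f}")
```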

17 pages, 18340 KiB  
Article
Physics-Informed Deep Learning for Karst Spring Prediction: Integrating Variational Mode Decomposition and Long Short-Term Memory with Attention
by Liangjie Zhao, Stefano Fazi, Song Luan, Zhe Wang, Cheng Li, Yu Fan and Yang Yang
Water 2025, 17(14), 2043; https://doi.org/10.3390/w17142043 - 8 Jul 2025
Viewed by 501
Abstract
Accurately forecasting karst spring discharge remains a significant challenge due to the inherent nonstationarity and multi-scale hydrological dynamics of karst hydrological systems. This study presents a physics-informed variational mode decomposition long short-term memory (VMD-LSTM) model, enhanced with an attention mechanism and Monte Carlo dropout for uncertainty quantification. Hourly discharge data (2013–2018) from the Zhaidi karst spring in southern China were decomposed using VMD to extract physically interpretable temporal modes. These decomposed modes, alongside precipitation data, were input into an attention-augmented LSTM incorporating physics-informed constraints. The model was rigorously evaluated against a baseline standalone LSTM using an 80% training, 15% validation, and 5% testing data partitioning strategy. The results demonstrate substantial improvements in prediction accuracy for the proposed framework compared to the standard LSTM model. Compared to the baseline LSTM, the RMSE during testing decreased dramatically from 0.726 to 0.220, and the NSE improved from 0.867 to 0.988. The performance gains were most significant during periods of rapid conduit flow (the peak RMSE decreased by 67%) and prolonged recession phases. Additionally, Monte Carlo dropout, using 100 stochastic realizations, effectively quantified predictive uncertainty, achieving over 96% coverage in the 95% confidence interval (CI). The developed framework provides robust, accurate, and reliable predictions under complex hydrological conditions, highlighting substantial potential for supporting karst groundwater resource management and enhancing flood early-warning capabilities. Full article
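The uncertainty-quantification step (Monte Carlo dropout with 100 realizations and a 95% CI coverage check) can be summarized in a few lines of NumPy; the synthetic discharge series and noise levels below are assumptions for illustration, not the Zhaidi spring data or the trained LSTM.

```python
import numpy as np

def mc_dropout_interval(realizations, level=0.95):
    """Pointwise interval from stochastic forward passes
    (rows = Monte Carlo realizations, columns = time steps)."""
    lo = np.percentile(realizations, 100 * (1 - level) / 2, axis=0)
    hi = np.percentile(realizations, 100 * (1 + level) / 2, axis=0)
    return lo, hi

def coverage(y_obs, lo, hi):
    """Fraction of observations falling inside the interval."""
    return float(np.mean((y_obs >= lo) & (y_obs <= hi)))

# Toy stand-in for 100 stochastic LSTM passes over a test window.
rng = np.random.default_rng(4)
t = np.arange(500)
truth = 5 + 2 * np.sin(2 * np.pi * t / 120)                 # "spring discharge"
passes = truth + rng.normal(0, 0.3, size=(100, t.size))     # 100 MC-dropout draws
y_obs = truth + rng.normal(0, 0.25, size=t.size)

lo, hi = mc_dropout_interval(passes, level=0.95)
print(f"95% CI coverage on the test window: {coverage(y_obs, lo, hi):.1%}")
```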

20 pages, 1261 KiB  
Article
Risk Analysis of Five-Axis CNC Water Jet Machining Using Fuzzy Risk Priority Numbers
by Ufuk Cebeci, Ugur Simsir and Onur Dogan
Symmetry 2025, 17(7), 1086; https://doi.org/10.3390/sym17071086 - 7 Jul 2025
Viewed by 350
Abstract
The reliability and safety of five-axis CNC abrasive water jet machining are critical for many industries. This study employs Failure Mode and Effects Analysis (FMEA) to identify and mitigate potential failures in this machining system. Traditional FMEA, which relies on crisp numerical values, often struggles with handling uncertainty in risk assessment. To address this limitation, this paper introduces an Interval-Valued Spherical Fuzzy FMEA (IVSF-FMEA) approach, which enhances risk evaluation by incorporating membership, non-membership, and hesitancy degrees. The IVSF-FMEA method leverages the inherent rotational symmetry of interval-valued spherical fuzzy sets and the permutation symmetry among severity, occurrence, and detectability criteria, resulting in a transformation-invariant and unbiased risk assessment framework. Applying IVSF-FMEA to seven periodic failure (PF) modes in five-axis CNC water jet machining achieves a more precise prioritization of risks, leading to improved decision-making and resource allocation. The findings highlight improper fixturing of the workpiece (PF6) as the most critical failure mode, with the highest RPN value of −0.54, followed by mechanical vibrations (PF2) and tool wear and breakage (PF1); these failure modes necessitate targeted risk mitigation strategies. This indicates that ensuring proper fixturing stability is essential for maintaining machining accuracy and preventing defects. Comparative analysis with traditional FMEA demonstrates the superiority of the proposed fuzzy-based approach in handling subjective assessments and reducing ambiguity. This research contributes to advancing risk assessment methodologies in complex manufacturing environments.
(This article belongs to the Special Issue Recent Developments on Fuzzy Sets Extensions)

19 pages, 4784 KiB  
Article
Accurate and Fast Numerical Estimation of Pattern Uncertainty for Mechanical Alignment Errors in High-Accuracy Spherical Near-Field Antenna Measurements
by Kyriakos Kaslis, Samel Arslanagic and Olav Breinbjerg
Sensors 2025, 25(13), 4227; https://doi.org/10.3390/s25134227 - 7 Jul 2025
Viewed by 266
Abstract
Every experimental measurement is affected by random and/or systematic error sources, causing the measurand to have an associated uncertainty quantified in terms of a confidence interval and confidence level. For high-accuracy spherical near-field antenna measurements, there are approximately 20 error sources whose individual contributions to the measurand uncertainty must be estimated for each antenna under test; thus, this uncertainty estimation is a required task in each measurement project. The error sources associated with the mechanical alignment of the antenna under test are of particular importance, not only because the consequent pattern uncertainty differs significantly for different antennas under test, but also because the common practice of experimental uncertainty estimation, with its separate uncertainty measurements, is very time-consuming and requires both the antenna under test and the measurement facility. We propose a numerical pattern uncertainty estimation method for mechanical alignment errors based on a nominal full-sphere measurement, without the need for separate uncertainty measurements. Thus, it occupies neither the antenna under test nor the measurement facility. In addition, numerical uncertainty estimation enables the isolation of individual error sources and their contributions to pattern uncertainties.
(This article belongs to the Special Issue Recent Advances in Antenna Measurement Techniques)

16 pages, 3163 KiB  
Article
Quality Control of Asphalt Mixes Using EM Density Gauge: A Statistical Evaluation of Field Durability
by M. Ariel Villanueva-Guzmán, Hugo L. Chávez-García, Elia M. Alonso-Guzmán, Wilfrido Martínez-Molina, Horacio Delgado-Alamilla, Juan F. Mendoza-Sanchez, Marco Antonio Navarrete-Seras and Mauricio Arreola-Sánchez
Appl. Sci. 2025, 15(13), 7586; https://doi.org/10.3390/app15137586 - 7 Jul 2025
Viewed by 785
Abstract
This study proposes reducing statistical uncertainty for informed decision-making in pavement construction by using a non-destructive method to determine the density (ρ) of asphalt mixtures, a decisive parameter for assessing the quality of the material and its air-void content, and by contrasting the results with destructive and physical tests on specimens extracted at the test site. The fieldwork was carried out with an electromagnetic (EM) density gauge on a 71.2 km stretch of road. The results of the non-destructive tests were compared with the AASHTO standards. The study focused on a representative sample of 25.9% of the total population, obtained using intentional stratified statistical sampling; the standard deviation was taken as the decisive measure of dispersion in the determination of the density ρ of the mixtures. The AASHTO T343 standard establishes that the permissible standard deviation for asphalt mixtures should be 0.050 g/cm³. Supplementary statistical analysis shows that the measurement error of the EM densitometer and the core-sampling method is ±1.8%, and the correlation coefficient within the 95% confidence interval reaches 0.91. The results of the analysis show a convincing trend towards the implementation of non-destructive methods, such as the EM density gauge, to ensure reliable determination of the quality of asphalt mixtures in the field while reducing the time required for quality control.
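The acceptance logic described above (a standard-deviation limit plus a correlation check against cores) is easy to reproduce; the sketch below uses hypothetical paired gauge and core densities, so the numbers are placeholders rather than the project's field data.

```python
import numpy as np

# Hypothetical paired densities (g/cm^3): EM gauge readings vs. extracted cores.
em_gauge = np.array([2.31, 2.28, 2.35, 2.30, 2.33, 2.27, 2.34, 2.29, 2.32, 2.36])
cores    = np.array([2.33, 2.27, 2.36, 2.31, 2.32, 2.28, 2.36, 2.28, 2.33, 2.38])

std_gauge = em_gauge.std(ddof=1)                  # sample standard deviation
r = np.corrcoef(em_gauge, cores)[0, 1]            # Pearson correlation with cores
rel_err = np.abs(em_gauge - cores) / cores * 100  # per-point relative error, %

print(f"std dev = {std_gauge:.3f} g/cm^3 "
      f"({'within' if std_gauge <= 0.050 else 'exceeds'} the AASHTO T343 0.050 limit)")
print(f"correlation with cores r = {r:.2f}, mean |relative error| = {rel_err.mean():.1f}%")
```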