Search Results (611)

Search Parameters:
Keywords = evaluation decision rules

15 pages, 464 KB  
Article
RAPID-CARE: Rapid Antibiotic Optimization in the ICU After Implementation of a Pneumonia Multiplex PCR Test—A Real-World Evaluation
by Montserrat Rodríguez-Gómez, Fernando Martínez-Sagasti, María Calle-Romero, Andrea Prieto-Cabrera, Patricia De La Montaña-Díaz, Irene Díaz-De la Torre, Alberto Delgado-Iribarren García-Campero, Sara Domingo-Marín, Miguel Sánchez-García and Ignacio Martín-Loeches
Antibiotics 2025, 14(11), 1084; https://doi.org/10.3390/antibiotics14111084 - 27 Oct 2025
Abstract
Background/Objectives: Lower respiratory tract infections (LRTIs) are frequent in the intensive care unit (ICU) and drive empiric broad-spectrum antibiotic use. Rapid multiplex PCR assays may improve pathogen detection and stewardship compared with conventional culture. We evaluated the real-world impact of the BioFire® FilmArray® Pneumonia Panel Plus (FA-PNEU®) on antimicrobial management in suspected nosocomial LRTI. Methods: This was a single-centre, prospective observational cohort study conducted in a tertiary ICU (Madrid, Spain) between April 2021 and March 2025. Adult patients with suspected hospital-acquired pneumonia (HAP), ventilator-associated pneumonia (VAP), or ventilator-associated tracheobronchitis (VAT) were included if paired respiratory samples underwent FA-PNEU® and conventional culture (CC). Diagnostic accuracy and prescribing changes were analysed. Results: A total of 344 samples from 236 patients were included. FA-PNEU® demonstrated high sensitivity (93.4%) and negative predictive value (97.9%) but moderate specificity (65.0%) and low positive predictive value (36.5%). False positives occurred in 85.8% of patients with prior antibiotic therapy targeting the detected organism. Antibiotic management was considered directly influenced by FA-PNEU® when any prescribing decision (initiation, escalation, de-escalation, or discontinuation) explicitly followed the panel’s results rather than other clinical or microbiological information. Using this definition, FA-PNEU® directly influenced antibiotic therapy in 57.6% of cases, while in 17.7%, prescribing was instead guided by a suspected alternative infection. In patients without prior antibiotics, treatment initiation or withholding was fully concordant with FA-PNEU® results, while in those already receiving therapy, 60.8% underwent modification, two-thirds in agreement with the panel. 
Conclusions: In critically ill patients with suspected nosocomial LRTI, FA-PNEU® provided rapid, high-sensitivity diagnostics that substantially influenced antimicrobial prescribing. Its greatest value lies in ruling out bacterial infection and guiding stewardship, though results must be interpreted within the full clinical and microbiological context. Full article
(This article belongs to the Section Antibiotics Use and Antimicrobial Stewardship)

18 pages, 1562 KB  
Article
NS-Assist: Nephrotic Syndrome Assistance System for Pediatric Decision-Making in Pandemic Situations
by Nada Zendaoui, Nardjes Bouchemal, Naila Bouchemal, Imane Boussebough and Galina Ivanova
Appl. Sci. 2025, 15(21), 11433; https://doi.org/10.3390/app152111433 - 26 Oct 2025
Abstract
The COVID-19 pandemic has underscored the need for telemedicine to ensure continuity of pediatric care during health emergencies. This paper presents NS-Assist, a hybrid web–mobile decision support system for managing Idiopathic Nephrotic Syndrome (INS) in children. The system combines rule-based reasoning and fuzzy inference to assist clinicians in diagnosis, treatment adjustment, and relapse monitoring, while enabling caregivers to record and track daily health data. Implemented using Spring Boot, ReactJS, and Flutter with a secure MySQL database, NS-Assist integrates medical expertise with computational intelligence to support remote decision-making. A pilot evaluation involving 40 participants, including clinicians and caregivers, showed improved communication, reduced consultation time, and enhanced follow-up continuity. These results highlight the system’s potential as a reliable and adaptable framework for pediatric telemedicine in resource-constrained and emergency settings. Full article
(This article belongs to the Special Issue Applications in Neural and Symbolic Artificial Intelligence)

20 pages, 611 KB  
Article
Efficient Evaluation of Sobol’ Sensitivity Indices via Polynomial Lattice Rules and Modified Sobol’ Sequences
by Venelin Todorov and Petar Zhivkov
Mathematics 2025, 13(21), 3402; https://doi.org/10.3390/math13213402 - 25 Oct 2025
Abstract
Accurate and efficient estimation of Sobol’ sensitivity indices is a cornerstone of variance-based global sensitivity analysis, providing critical insights into how uncertainties in input parameters affect model outputs. This is particularly important for large-scale environmental, engineering, and financial models, where understanding parameter influence is essential for improving model reliability, guiding calibration, and supporting informed decision-making. However, computing Sobol’ indices requires evaluating high-dimensional integrals, presenting significant numerical and computational challenges. In this study, we present a comparative analysis of two of the best available Quasi-Monte Carlo (QMC) techniques: polynomial lattice rules (PLRs) and modified Sobol’ sequences. Both approaches are systematically assessed in terms of efficiency and accuracy. Extensive numerical experiments demonstrate that the proposed PLR-based framework achieves superior precision for several sensitivity measures, while modified Sobol’ sequences remain competitive for lower-dimensional indices. Our results show that IPLR-α3 outperforms traditional QMC methods in estimating both dominant and weak sensitivity indices, offering a robust framework for high-dimensional models. These findings provide practical guidelines for selecting optimal QMC strategies, contributing to more reliable sensitivity analysis and enhancing the predictive power of complex computational models. Full article
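For orientation, first-order Sobol’ indices are commonly estimated with the Saltelli pick-freeze scheme. The sketch below uses plain pseudorandom sampling (not the paper’s polynomial lattice rules or modified Sobol’ sequences) and a toy additive model whose analytic indices are S1 = 0.2 and S2 = 0.8; function and sample-size choices are illustrative only.

```python
import random

def sobol_first_order(f, dim, n=50_000, seed=0):
    """Saltelli pick-freeze estimator of first-order Sobol' indices,
    using plain pseudorandom points for illustration."""
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(dim)] for _ in range(n)]
    B = [[rng.random() for _ in range(dim)] for _ in range(n)]
    fA = [f(x) for x in A]
    fB = [f(x) for x in B]
    mean = sum(fA) / n
    var = sum((y - mean) ** 2 for y in fA) / n
    indices = []
    for i in range(dim):
        # A_B^i: copy of A with column i taken from B
        fABi = [f(a[:i] + [b[i]] + a[i + 1:]) for a, b in zip(A, B)]
        num = sum(yb * (yab - ya) for yb, yab, ya in zip(fB, fABi, fA)) / n
        indices.append(num / var)
    return indices

# Toy additive model y = x1 + 2*x2 on [0,1]^2: analytically S1 = 0.2, S2 = 0.8
S1, S2 = sobol_first_order(lambda x: x[0] + 2 * x[1], dim=2)
```

Because the model is additive, the first-order indices sum to one; the QMC point sets compared in the article replace the pseudorandom `A` and `B` matrices to reduce the estimator’s error.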
26 pages, 1644 KB  
Article
Context-Aware Alerting in Elderly Care Facilities: A Hybrid Framework Integrating LLM Reasoning with Rule-Based Logic
by Nazmun Nahid, Md Atiqur Rahman Ahad and Sozo Inoue
Sensors 2025, 25(21), 6560; https://doi.org/10.3390/s25216560 (registering DOI) - 24 Oct 2025
Abstract
The rising demand for elderly care amid ongoing nursing shortages has highlighted the limitations of conventional alert systems, which frequently generate excessive alerts and contribute to alarm fatigue. The objective of this study is to develop a hybrid, context-aware nurse alerting framework for long-term care (LTC) facilities that minimizes redundant alarms, reduces alarm fatigue, and enhances patient safety and caregiving balance during multi-person care scenarios such as mealtimes. To do so, we aimed to intelligently suppress, delay, and validate alerts by integrating rule-based logic with Large Language Model (LLM)-driven semantic reasoning. We conducted an experimental study in a real-world LTC environment involving 28 elderly residents (6 high, 8 medium, and 14 low care levels) and four nurses across three rooms over seven days. The proposed system utilizes video-derived skeletal motion, care-level annotations, and dynamic nurse–elderly proximity for decision making. Statistical analyses were performed using F1 score, accuracy, false positive rate (FPR), and false negative rate (FNR) to evaluate performance improvements. Compared to the baseline where all nurses were notified (100% alarm load), the proposed method reduced average alarm load to 27.5%, achieving a 72.5% reduction, with suppression rates reaching 100% in some rooms for some nurses. Performance metrics further validate the system’s effectiveness: the macro F1 score improved from 0.18 (baseline) to 0.97, while accuracy rose from 0.21 (baseline) to 0.98. Compared to the baseline error rates (FPR 0.20, FNR 0.79), the proposed method achieved drastically lower values (FPR 0.005, FNR 0.023). Across both spatial (room-level) and temporal (day-level) validations, the proposed approach consistently outperformed baseline and purely rule-based methods. These findings demonstrate that the proposed approach effectively minimizes false alarms while maintaining strong operational efficiency. 
By integrating rule-based mechanisms with LLM-based contextual reasoning, the framework significantly enhances alert accuracy, mitigates alarm fatigue, and promotes safer, more sustainable, and human-centered care practices, making it suitable for practical deployment within real-world long-term care environments. Full article
(This article belongs to the Section Biomedical Sensors)
22 pages, 4162 KB  
Article
Evolutionary Algorithm Approaches for Cherry Fruit Classification Based on Pomological Features
by Erhan Akyol, Bilal Alatas and Inanc Ozgen
Agriculture 2025, 15(21), 2207; https://doi.org/10.3390/agriculture15212207 - 24 Oct 2025
Abstract
The cherry fruit fly (Rhagoletis cerasi L.) poses a major threat to global cherry production, with significant economic implications. This study presents an innovative approach to assist pest control strategies by classifying cherry fruit samples based on pomological data using evolutionary rule-based classification algorithms. A unique dataset comprising 396 samples from five different coloring periods was collected, focusing particularly on the second pomological period when pest activity peaks. Three evolutionary algorithms, CORE (Evolutionary Rule Extractor for Classification), DMEL (Data Mining with Evolutionary Learning for Classification) and OCEC (Organizational Evolutionary Classification), were applied to extract interpretable classification rules that determine whether an incoming cherry sample belongs to the second pomological period or to other periods. Two distinct fitness functions were used to evaluate the algorithms’ performance. The algorithms’ results and metric values are compared using visual graphs. The findings highlight the potential of explainable AI models in enhancing agricultural decision-making and offer a novel, data-based methodology for integrated pest management in cherry production based on the prediction of cherry fruit phenology class. Full article
(This article belongs to the Section Artificial Intelligence and Digital Agriculture)

6 pages, 3663 KB  
Interesting Images
A Multi-Modality Approach to the Assessment of a Right Atrium Mass in a Female Patient with Breast Cancer Undergoing Neoadjuvant Chemotherapy
by Małgorzata Chlabicz, Paweł Muszyński, Joanna Kruszyńska, Piotr Kazberuk, Magdalena Róg-Makal, Magdalena Lipowicz, Urszula Matys, Anna Tomaszuk-Kazberuk, Marcin Kożuch and Sławomir Dobrzycki
Diagnostics 2025, 15(21), 2683; https://doi.org/10.3390/diagnostics15212683 - 23 Oct 2025
Abstract
Echocardiography remains a vital part of the initial assessment and monitoring of oncological patients. It allows for proper treatment selection but can also reveal life-threatening complications, including impaired left ventricular function or thromboembolism. It can rarely detect intracardiac masses that require further investigation. In the presented case, a 51-year-old female patient with left-sided breast cancer, who had undergone neoadjuvant chemotherapy, was hospitalised due to a right atrial mass identified via routine transthoracic echocardiography (TTE). Initial anticoagulation therapy showed no clinical improvement. Follow-up TTE revealed a 12 × 19 mm hyperechogenic, mobile mass in the right atrium (RA). Computed tomography angiography (CTA) ruled out pulmonary embolism and revealed that the mass was located close to the tip of the vascular access port. Transoesophageal echocardiography showed that the lesion was not connected to the vascular port. Based on location and mobility, the lesion was most consistent with a cardiac myxoma. After the Heart Team made a decision, endovascular intervention using a vacuum-assisted device was performed without complications. Histopathological examination excluded thrombosis and myxoma, revealing a fibro-inflammatory lesion. A multimodality approach is necessary to assess RA masses. However, even an extensive evaluation could be misleading, so treatment options should always be subject to the Heart Team’s decision. Full article
(This article belongs to the Special Issue The Future of Cardiac Imaging in the Diagnosis, 2nd Edition)

10 pages, 473 KB  
Article
Framework for In Silico Toxicity Screening of Novel Odorants
by Isaac Mohar, Brad C. Hansen, Destiny M. Hollowed and Joel D. Mainland
Toxics 2025, 13(10), 902; https://doi.org/10.3390/toxics13100902 - 21 Oct 2025
Abstract
Toxicological risk assessment of chemicals without experimental toxicity data often relies on in silico predictions. However, models designed to predict inhalation toxicity associated with exposure to volatile chemicals in solution are unavailable. The aim of this research was to develop an approach to estimate toxicology-based maximum solution concentrations for novel odorants using in silico structure-based predictions. The decision trees were adapted from established open-source models for assessing mutagenicity (rule-based, ISS in vitro mutagenicity decision tree) and systemic toxicity (revised Cramer decision tree). These were implemented using Toxtree (v3.1.0), a freely available program. Thresholds of toxicologic concern (TTC) were then assigned based on the predicted hazard classification. We then used predicted vapor pressure derived from MPBPWIN™ using US EPA EPI Suite to calculate a solution concentration where inhalation exposure to a defined headspace volume would not exceed the TTC. The approach was evaluated using a published dataset of 143 chemicals with repeat exposure inhalation toxicity data, yielding health-protective predictions for 98.6% of the test set. This demonstrates that the proposed in silico approach enables the estimation of safe toxicology-based maximum solution concentrations for chemicals using open-source models and software. Full article
(This article belongs to the Collection Predictive Toxicology)

20 pages, 1492 KB  
Article
Interpretable Diagnostics with SHAP-Rule: Fuzzy Linguistic Explanations from SHAP Values
by Alexandra I. Khalyasmaa, Pavel V. Matrenin and Stanislav A. Eroshenko
Mathematics 2025, 13(20), 3355; https://doi.org/10.3390/math13203355 - 21 Oct 2025
Abstract
This study introduces SHAP-Rule, a novel explainable artificial intelligence method that integrates Shapley additive explanations with fuzzy logic to automatically generate interpretable linguistic IF-THEN rules for diagnostic tasks. Unlike purely numeric SHAP vectors, which are difficult for decision-makers to interpret, SHAP-Rule translates feature attributions into concise explanations that humans can understand. The method was rigorously evaluated and compared with baseline SHAP and AnchorTabular explanations across three distinct and representative datasets: the CWRU Bearing dataset for industrial predictive maintenance, a dataset for failure analysis in power transformers, and the medical Pima Indians Diabetes dataset. Experimental results demonstrated that SHAP-Rule consistently provided clearer and more easily comprehensible explanations, achieving high expert ratings for simplicity and understanding. Additionally, SHAP-Rule exhibited superior computational efficiency and robust consistency compared to alternative methods, making it particularly suitable for real-time diagnostic applications. Although SHAP-Rule showed minor trade-offs in coverage, it maintained high global fidelity, often approaching 100%. These findings highlight the significant practical advantages of linguistic fuzzy explanations generated by SHAP-Rule, emphasizing its strong potential for enhancing interpretability, efficiency, and reliability in diagnostic decision-support systems. Full article

19 pages, 1977 KB  
Article
Research on the Evaluation Model for Natural Gas Pipeline Capacity Allocation Under Fair and Open Access Mode
by Xinze Li, Dezhong Wang, Yixun Shi, Jiaojiao Jia and Zixu Wang
Energies 2025, 18(20), 5544; https://doi.org/10.3390/en18205544 - 21 Oct 2025
Abstract
Compared with other fossil energy sources, natural gas is characterized by compressibility, low energy density, high storage costs, and imbalanced usage. Natural gas pipeline supply systems possess unique attributes such as closed transportation and a highly integrated upstream, midstream, and downstream structure. Moreover, pipelines are almost the only economical means of onshore natural gas transportation. Given that the upstream of the pipeline features multi-entity and multi-channel supply including natural gas, coal-to-gas, and LNG vaporized gas, while the downstream presents a competitive landscape with multi-market and multi-user segments (e.g., urban residents, factories, power plants, and vehicles), there is an urgent social demand for non-discriminatory and fair opening of natural gas pipeline network infrastructure to third-party entities. However, after the fair opening of natural gas pipeline networks, the original “point-to-point” transaction model will be replaced by market-driven behaviors, making the verification and allocation of gas transmission capacity a key operational issue. Currently, neither pipeline operators nor government regulatory authorities have issued corresponding rules, regulations, or evaluation plans. To address this, this paper proposes a multi-dimensional quantitative evaluation model based on the Analytic Hierarchy Process (AHP), integrating both commercial and technical indicators. The model comprehensively considers six indicators: pipeline transportation fees, pipeline gas line pack, maximum gas storage capacity, pipeline pressure drop, energy consumption, and user satisfaction, and constructs a quantitative evaluation system. Through the consistency check of the judgment matrix (CR = 0.06213 < 0.1), the weights of the respective indicators are determined as follows: 0.2584, 0.2054, 0.1419, 0.1166, 0.1419, and 0.1357. 
The specific score of each indicator is determined based on the deviation between each evaluation indicator and the theoretical optimal value under different gas volume allocation schemes. Combined with the weight proportion, the total score of each gas volume allocation scheme is finally calculated, thereby obtaining the recommended gas volume allocation scheme. The evaluation model was applied to a practical pipeline project. The evaluation results show that the AHP-based evaluation model can effectively quantify the advantages and disadvantages of different gas volume allocation schemes. Notably, the gas volume allocation scheme under normal operating conditions is not the optimal one; instead, it ranks last according to the scores, with a score 0.7 points lower than that of the optimal scheme. In addition, to facilitate rapid decision-making for gas volume allocation schemes, this paper designs a program using HTML and develops a gas volume allocation evaluation program with JavaScript based on the established model. This self-developed program has the function of automatically generating scheme scores once the proposed gas volume allocation for each station is input, providing a decision support tool for pipeline operators, shippers, and regulatory authorities. The evaluation model provides a theoretical and methodological basis for the dynamic optimization of natural gas pipeline gas volume allocation schemes under the fair opening model. It is expected to, on the one hand, provide a reference for transactions between pipeline network companies and shippers, and on the other hand, offer insights for regulatory authorities to further formulate detailed and fair gas transmission capacity transaction methods. Full article
(This article belongs to the Special Issue New Advances in Oil, Gas and Geothermal Reservoirs—3rd Edition)
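The consistency check quoted in the abstract (CR = 0.06213 < 0.1) follows Saaty’s standard AHP procedure. A minimal sketch, using the row-geometric-mean prioritization and a hypothetical 3 × 3 judgment matrix rather than the paper’s 6-indicator matrix:

```python
import math

RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}  # Saaty's random consistency index

def ahp_weights_cr(A):
    """Priority weights via the row geometric mean, plus the
    consistency ratio CR = CI / RI, CI = (lambda_max - n) / (n - 1)."""
    n = len(A)
    geo = [math.prod(row) ** (1 / n) for row in A]
    total = sum(geo)
    w = [g / total for g in geo]
    Aw = [sum(a * x for a, x in zip(row, w)) for row in A]
    lam_max = sum(y / x for y, x in zip(Aw, w)) / n
    ci = (lam_max - n) / (n - 1)
    return w, ci / RI[n]

# Hypothetical pairwise comparisons of three indicators (Saaty 1-9 scale)
A = [[1,   3,   5],
     [1/3, 1,   3],
     [1/5, 1/3, 1]]
w, cr = ahp_weights_cr(A)
```

A CR below 0.1 indicates the pairwise judgments are acceptably consistent; otherwise the matrix is revised before the weights are used for scoring.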

17 pages, 414 KB  
Article
DQMAF—Data Quality Modeling and Assessment Framework
by Razan Al-Toq and Abdulaziz Almaslukh
Information 2025, 16(10), 911; https://doi.org/10.3390/info16100911 - 17 Oct 2025
Abstract
In today’s digital ecosystem, where millions of users interact with diverse online services and generate vast amounts of textual, transactional, and behavioral data, ensuring the trustworthiness of this information has become a critical challenge. Low-quality data—manifesting as incompleteness, inconsistency, duplication, or noise—not only undermines analytics and machine learning models but also exposes unsuspecting users to unreliable services, compromised authentication mechanisms, and biased decision-making processes. Traditional data quality assessment methods, largely based on manual inspection or rigid rule-based validation, cannot cope with the scale, heterogeneity, and velocity of modern data streams. To address this gap, we propose DQMAF (Data Quality Modeling and Assessment Framework), a generalized machine learning–driven approach that systematically profiles, evaluates, and classifies data quality to protect end-users and enhance the reliability of Internet services. DQMAF introduces an automated profiling mechanism that measures multiple dimensions of data quality—completeness, consistency, accuracy, and structural conformity—and aggregates them into interpretable quality scores. Records are then categorized into high, medium, and low quality, enabling downstream systems to filter or adapt their behavior accordingly. A distinctive strength of DQMAF lies in integrating profiling with supervised machine learning models, producing scalable and reusable quality assessments applicable across domains such as social media, healthcare, IoT, and e-commerce. The framework incorporates modular preprocessing, feature engineering, and classification components using Decision Trees, Random Forest, XGBoost, AdaBoost, and CatBoost to balance performance and interpretability. We validate DQMAF on a publicly available Airbnb dataset, showing its effectiveness in detecting and classifying data issues with high accuracy. 
The results highlight its scalability and adaptability for real-world big data pipelines, supporting user protection, document and text-based classification, and proactive data governance while improving trust in analytics and AI-driven applications. Full article
(This article belongs to the Special Issue Machine Learning and Data Mining for User Classification)
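The profiling step described above can be illustrated with a toy record scorer. This is a sketch of the idea only: the field names, validators, thresholds, and the equal weighting of the two dimensions are hypothetical, and DQMAF’s actual pipeline layers learned classifiers on top of such profiles.

```python
def profile_record(record, required, validators):
    """Score one record on completeness (required fields present) and
    conformity (present fields pass their validators), average the two
    into a quality score, and bucket it into high/medium/low."""
    present = [k for k in required if record.get(k) not in (None, "")]
    completeness = len(present) / len(required)
    checked = [fn(record[k]) for k, fn in validators.items() if k in record]
    conformity = sum(checked) / len(checked) if checked else 0.0
    score = (completeness + conformity) / 2
    tier = "high" if score >= 0.8 else "medium" if score >= 0.5 else "low"
    return score, tier

# Hypothetical listing record: host_id missing, rating out of range
listing = {"price": 120, "city": "Paris", "rating": 11}
score, tier = profile_record(
    listing,
    required=["price", "city", "rating", "host_id"],
    validators={"price": lambda v: v > 0, "rating": lambda v: 0 <= v <= 10},
)
```

Downstream consumers can then filter on the tier, which is the behavior the framework’s high/medium/low categorization enables.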

34 pages, 3860 KB  
Article
Sensor-Level Anomaly Detection in DC–DC Buck Converters with a Physics-Informed LSTM: DSP-Based Validation of Detection and a Simulation Study of CI-Guided Deception
by Jeong-Hoon Moon, Jin-Hong Kim and Jung-Hwan Lee
Appl. Sci. 2025, 15(20), 11112; https://doi.org/10.3390/app152011112 - 16 Oct 2025
Abstract
Digitally controlled DC–DC converters are vulnerable to sensor-side spoofing, motivating plant-level anomaly detection that respects the converter physics. We present a physics-informed LSTM (PI–LSTM) autoencoder for a 24→12 V buck converter. The model embeds discrete-time circuit equations as residual penalties and uses a fixed decision rule (τ=μ+3σ, N=3 consecutive samples). We study three voltage-sensing attacks (DC bias, fixed-sample delay, and narrowband noise) in MATLAB/Simulink. We then validate the detection path on a TMS320F28379 DSP. The detector attains F1 scores of 96.12%, 91.91%, and 97.50% for bias, delay, and noise (simulation); on hardware, it achieves 2.9–4.2 ms latency with an alarm-wise FPR of ≤1.2%. We also define a unified safety box for DC rail quality and regulation. In simulations, we evaluate a confusion index (CI) policy for safety-bounded performance adjustment. An operating point yields CI ≈ 0.25 while remaining within the safety limits. In hardware experiments without CI actuation, the Vr,pp and IRR stayed within the limits, whereas the ±2% regulation window was occasionally exceeded under the delay attack (up to ≈2.8%). These results indicate that physics-informed detection is deployable on resource-constrained controllers with millisecond-scale latency and a low alarm-wise FPR, while the full hardware validation of CI-guided deception (safety-bounded performance adjustment) under the complete safety box is left to future work. Full article
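The fixed decision rule stated in the abstract (alarm when the reconstruction residual exceeds τ = μ + 3σ for N = 3 consecutive samples) is simple to express directly. In the sketch below, μ and σ are calibration-time statistics of the healthy residual; the numeric values are illustrative, not from the paper.

```python
def alarm_stream(residuals, mu, sigma, n_consec=3):
    """Flag sample t as an alarm when residual > tau = mu + 3*sigma
    for n_consec consecutive samples (the paper's fixed rule)."""
    tau = mu + 3 * sigma
    run, alarms = 0, []
    for r in residuals:
        run = run + 1 if r > tau else 0
        alarms.append(run >= n_consec)
    return alarms

# Illustrative healthy statistics: mu = 0.10, sigma = 0.02 -> tau = 0.16
flags = alarm_stream([0.10, 0.20, 0.21, 0.22, 0.11], mu=0.10, sigma=0.02)
```

Requiring N consecutive exceedances is what keeps the alarm-wise false-positive rate low: a single noisy residual spike resets nothing downstream, while a sustained spoofing attack trips the alarm within a few samples.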

32 pages, 25136 KB  
Article
Efficiency Evaluation of Sampling Density for Indoor Building LiDAR Point-Cloud Segmentation
by Yiquan Zou, Wenxuan Chen, Tianxiang Liang and Biao Xiong
Sensors 2025, 25(20), 6398; https://doi.org/10.3390/s25206398 - 16 Oct 2025
Abstract
Prior studies on indoor LiDAR point-cloud semantic segmentation consistently report that sampling density strongly affects segmentation accuracy as well as runtime and memory, establishing an accuracy–efficiency trade-off. Nevertheless, in practice, the density is often chosen heuristically and reported under heterogeneous protocols, which limits quantitative guidance. We present a unified evaluation framework that treats density as the sole independent variable. To control architectural variability, three representative backbones—PointNet, PointNet++, and DGCNN—are each augmented with an identical Point Transformer module, yielding PointNet-Trans, PointNet++-Trans, and DGCNN-Trans trained and tested under one standardized protocol. The framework couples isotropic voxel-guided uniform down-sampling with a decision rule integrating three signals: (i) accuracy sufficiency, (ii) the onset of diminishing efficiency, and (iii) the knee of the accuracy–density curve. Experiments on scan-derived indoor point clouds (with BIM-derived counterparts for contrast) quantify the accuracy–runtime trade-off and identify an engineering-feasible operating band of 1600–2900 points/m2, with a robust setting near 2400 points/m2. Planar components saturate at moderate densities, whereas beams are more sensitive to down-sampling. By isolating density effects and enforcing one protocol, the study provides reproducible, model-agnostic guidance for scan planning and compute budgeting in indoor mapping and Scan-to-BIM workflows. Full article
(This article belongs to the Special Issue Application of LiDAR Remote Sensing and Mapping)
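One of the three signals in the decision rule above is the knee of the accuracy–density curve. A common geometric heuristic (an assumption here, not necessarily the paper’s exact method) locates it as the point lying farthest above the chord joining the curve’s endpoints after both axes are scaled to [0, 1]; the sample curve values below are illustrative.

```python
def knee_index(xs, ys):
    """Knee of a rising, saturating curve: index of the point with the
    largest vertical gap above the endpoint chord, in normalized axes."""
    nx = [(x - xs[0]) / (xs[-1] - xs[0]) for x in xs]
    ny = [(y - ys[0]) / (ys[-1] - ys[0]) for y in ys]
    # After normalization the chord is y = x, so the gap is ny - nx
    return max(range(len(xs)), key=lambda i: ny[i] - nx[i])

# Hypothetical accuracy-vs-density curve (points/m^2 vs. accuracy)
density = [400, 800, 1600, 2400, 3200, 4000]
accuracy = [0.55, 0.70, 0.80, 0.84, 0.85, 0.86]
k = knee_index(density, accuracy)
```

On this toy curve the knee lands at 1600 points/m², consistent with the lower end of the 1600–2900 points/m² operating band the study reports; the full rule also checks accuracy sufficiency and the onset of diminishing runtime efficiency before fixing a density.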

19 pages, 1396 KB  
Article
Sparse Keyword Data Analysis Using Bayesian Pattern Mining
by Sunghae Jun
Computers 2025, 14(10), 436; https://doi.org/10.3390/computers14100436 - 14 Oct 2025
Abstract
Keyword data analysis aims to extract and interpret meaningful relationships from large collections of text documents. A major challenge in this process arises from the extreme sparsity of document–keyword matrices, where the majority of elements are zeros due to zero inflation. To address this issue, this study proposes a probabilistic framework called Bayesian Pattern Mining (BPM), which integrates Bayesian inference into association rule mining (ARM). The proposed method estimates both the expected values and credible intervals of interestingness measures such as confidence and lift, providing a probabilistic evaluation of keyword associations. Experiments conducted on 9436 quantum computing patent documents, from which 175 representative keywords were extracted, demonstrate that BPM yields more stable and interpretable associations than conventional ARM. By incorporating credible intervals, BPM reduces the risk of biased decisions under sparsity and enhances the reliability of keyword-based technology analysis, offering a rigorous approach for knowledge discovery in zero-inflated text data.
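The credible-interval idea can be illustrated for the confidence measure of a rule X => Y. This is a hedged sketch, not the paper's method: the function name and the Beta(1, 1) prior are assumptions, and Monte Carlo sampling stands in for exact Beta quantiles so only the standard library is needed.

```python
import random

def bayes_confidence(n_xy, n_x, alpha=1.0, beta=1.0,
                     n_draws=20000, ci=0.95, seed=0):
    """Posterior summary of confidence(X => Y) = P(Y | X).

    With a Beta(alpha, beta) prior and n_xy co-occurrences of X and Y
    out of n_x transactions containing X, the posterior is
    Beta(alpha + n_xy, beta + n_x - n_xy). Returns the posterior mean
    and an equal-tailed credible interval estimated by sampling."""
    rng = random.Random(seed)
    a = alpha + n_xy
    b = beta + n_x - n_xy
    draws = sorted(rng.betavariate(a, b) for _ in range(n_draws))
    lo = draws[int((1 - ci) / 2 * n_draws)]
    hi = draws[int((1 + ci) / 2 * n_draws)]
    return a / (a + b), (lo, hi)
```

Under zero inflation the point estimate alone is misleading: a keyword pair seen 9 times out of 12 and one seen 900 times out of 1200 share the same raw confidence of 0.75, but the sparse pair gets a much wider credible interval, which is exactly the signal BPM uses to discount unreliable associations.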

18 pages, 1475 KB  
Article
Sentiment Analysis of Tourist Reviews About Kazakhstan Using a Hybrid Stacking Ensemble Approach
by Aslanbek Murzakhmetov, Maxatbek Satymbekov, Arseniy Bapanov and Nurbol Beisov
Computation 2025, 13(10), 240; https://doi.org/10.3390/computation13100240 - 13 Oct 2025
Abstract
Tourist reviews provide essential insights into travellers' experiences and public perceptions of destinations. In Kazakhstan, however, sentiment analysis, particularly using ensemble learning, remains underexplored for evaluating such reviews. This study proposes a hybrid stacking ensemble for sentiment analysis of English-language tourist reviews about Kazakhstan, integrating four complementary approaches: VADER, TextBlob, Stanza, and Local Context Focus Mechanism with Bidirectional Encoder Representations from Transformers (LCF-BERT). Each model contributes distinct analytical capabilities, including lexicon-based polarity detection, rule-based subjectivity evaluation, generalised star-rating estimation, and contextual aspect-oriented sentiment classification. The evaluation utilised a cleaned dataset of 11,454 TripAdvisor reviews collected between February 2022 and June 2025. The ensemble aggregates model outputs through majority and weighted voting strategies to enhance robustness. Experimental results (accuracy 0.891, precision 0.838, recall 0.891, and F1-score 0.852) demonstrate that the proposed method, KazSATR, outperforms individual models in overall classification accuracy and exhibits superior capacity for aspect-level sentiment detection. These findings underscore the potential of the hybrid ensemble as a practical and scalable tool for the tourism sector in Kazakhstan. By leveraging multiple analytical paradigms, the model enables tourism professionals and policymakers to better understand traveller preferences, identify service strengths and weaknesses, and inform strategic decision-making. The proposed approach contributes to advancing sentiment analysis applications in tourism research, particularly in underrepresented geographic contexts.
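The two aggregation strategies the ensemble uses can be sketched as follows. This is a minimal illustration under assumptions: the function names are invented here, and the weights in the usage example stand in for per-model validation scores, which the abstract does not report.

```python
from collections import Counter

def majority_vote(labels):
    # Plain majority over the per-model sentiment labels; Counter breaks
    # ties by first insertion order (Python 3.7+).
    return Counter(labels).most_common(1)[0][0]

def weighted_vote(labels, weights):
    # Each model's label counts with its weight (e.g. its validation
    # F1-score), so a strong model can outvote a numerical majority.
    scores = Counter()
    for label, w in zip(labels, weights):
        scores[label] += w
    return max(scores, key=scores.get)
```

For example, with per-model labels ["negative", "negative", "positive"] and weights [0.2, 0.2, 0.9], majority voting returns "negative" while weighted voting returns "positive", which is the behavior that lets a contextual model like LCF-BERT override weaker lexicon-based votes.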
(This article belongs to the Section Computational Social Science)

36 pages, 3154 KB  
Article
A Decision Support Framework for Solar PV System Selection in SMMEs Using a Multi-Objective Optimization by Ratio Analysis Technique
by Bonginkosi A. Thango and Fanny Saruchera
Information 2025, 16(10), 889; https://doi.org/10.3390/info16100889 - 13 Oct 2025
Abstract
South African small, medium and micro enterprises, particularly township-based spaza shops, face barriers to adopting solar photovoltaic systems due to upfront costs, regulatory uncertainty, and limited technical capacity. This article presents a reproducible methodology for evaluating and selecting solar photovoltaic systems that jointly considers economic, technological, and legal/policy criteria for such enterprises. We apply multi-criteria decision making using the Multi-Objective Optimization by the Ratio Analysis method, integrating simulation-derived techno-economic metrics with a formal policy-alignment score that reflects registration requirements, tax incentives, and access to green finance. Ten representative system configurations are assessed across cost and benefit criteria using vector normalization and weighted aggregation to enable transparent, like-for-like comparison. The analysis indicates that configurations aligned with interconnection and incentive frameworks are preferred over non-compliant options, reflecting the practical influence of policy eligibility on investability and risk. The framework is lightweight and auditable, designed so that institutional actors can prepare shared inputs while installers, lenders, and shop owners apply the ranking to guide decisions. Although demonstrated in a South African context, the procedure generalizes by substituting local tariffs, irradiance, load profiles, and jurisdiction-specific rules, providing a portable decision aid for small enterprise energy transitions.
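The ranking step (vector normalization followed by weighted aggregation of benefit and cost criteria) can be sketched as below. The function name and the toy data in the example are illustrative assumptions; the paper's actual criteria, weights, and ten configurations are not given in the abstract.

```python
import math

def moora_rank(matrix, weights, is_benefit):
    """Rank alternatives with the MOORA ratio system: vector-normalize
    each criterion column, apply its weight, then score each alternative
    as the weighted sum of benefit criteria minus the weighted sum of
    cost criteria. Returns (ranking of row indices, raw scores)."""
    m, n = len(matrix), len(matrix[0])
    # Vector normalization: divide each column by its Euclidean norm
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m)))
             for j in range(n)]
    scores = []
    for row in matrix:
        s = 0.0
        for j, x in enumerate(row):
            term = weights[j] * x / norms[j]
            s += term if is_benefit[j] else -term  # costs are subtracted
        scores.append(s)
    ranking = sorted(range(m), key=lambda i: scores[i], reverse=True)
    return ranking, scores
```

With three hypothetical configurations scored on [capital cost, annual yield, policy-alignment score] and `is_benefit = [False, True, True]`, a configuration that is cheapest, highest-yielding, and best policy-aligned ranks first, matching the abstract's observation that incentive-framework alignment drives the preferred options.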
