
Search Results (151)

Search Parameters:
Keywords = electronic alerts

26 pages, 14884 KB  
Review
A Review on Forest Fire Detection Techniques: Past, Present, and Sustainable Future
by Alimul Haque Khan, Ali Newaz Bahar and Khan Wahid
Sensors 2026, 26(5), 1609; https://doi.org/10.3390/s26051609 - 4 Mar 2026
Viewed by 566
Abstract
Forest fires are a major concern due to their significant impact on the environment, economy, and wildlife habitats. Efficient early detection systems can significantly mitigate their devastating effects. This paper provides a comprehensive review of forest fire detection (FFD) techniques and traces their evolution from basic lookout-based methods to sophisticated remote sensing technologies, including recent Internet of Things (IoT)- and Unmanned Aerial Vehicle (UAV)-based sensor network systems. Historical methods, characterized primarily by human surveillance and basic electronic sensors, laid the foundation for modern techniques. Recently, there has been a noticeable shift toward ground-based sensors, automated camera systems, aerial surveillance using drones and aircraft, and satellite imaging. Moreover, the rise of Artificial Intelligence (AI), Machine Learning (ML), and the IoT introduces a new era of advanced detection capabilities. These detection systems are being actively deployed in wildfire-prone regions, where early alerts have proven critical in minimizing damage and aiding rapid response. All FFD techniques follow a common path of data collection, pre-processing, data compression, transmission, and post-processing. Providing sufficient power to complete these tasks is also an important area of research. Recent research focuses on image compression techniques, data transmission, the application of ML and AI at edge nodes and servers, and the minimization of energy consumption, among other emerging directions. However, to build a sustainable FFD model, proper sensor deployment is essential. Sensors can be either fixed at specific geographic locations or attached to UAVs. In some cases, a combination of fixed and UAV-mounted sensors may be used. Careful planning of sensor deployment is essential for the success of the model. Moreover, ensuring adequate energy supply for both ground-based and UAV-based sensors is important. 
Replacing sensor batteries or recharging UAVs in remote areas is highly challenging, particularly in the absence of an operator. Hence, future FFD systems must prioritize not only detection accuracy but also long-term energy autonomy and strategic sensor placement. Integrating renewable energy sources, optimizing data processing, and ensuring minimal human intervention will be key to developing truly sustainable and scalable solutions. This review aims to guide researchers and developers in designing next-generation FFD systems aligned with practical field demands and environmental resilience. Full article
(This article belongs to the Section Environmental Sensing)
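The common FFD path the review describes (data collection, pre-processing, compression, transmission, post-processing) can be sketched as a chain of small stages. The stage functions, field names, smoke threshold, and sample values below are illustrative assumptions, not code from the review.

```python
# Hypothetical sketch of the collection -> pre-processing -> compression
# -> transmission -> post-processing path; all names and values invented.
import json
import zlib

def collect(node_id):
    # Stand-in for an edge-node sensor read (temperature in C, smoke index 0-1).
    return {"node": node_id, "temp": 41.5, "smoke": 0.82}

def preprocess(sample):
    # Simple range clamp at the edge node before further processing.
    sample["smoke"] = min(max(sample["smoke"], 0.0), 1.0)
    return sample

def compress(sample):
    # Byte-level compression before transmission, saving radio energy.
    return zlib.compress(json.dumps(sample).encode())

def transmit(payload):
    # Placeholder for a radio or UAV uplink; here it just passes data through.
    return payload

def postprocess(payload, smoke_threshold=0.7):
    # Server-side decision: raise an alert above the smoke threshold.
    sample = json.loads(zlib.decompress(payload))
    return {"node": sample["node"], "alert": sample["smoke"] >= smoke_threshold}

result = postprocess(transmit(compress(preprocess(collect("n1")))))
print(result)  # {'node': 'n1', 'alert': True}
```

Chaining the stages keeps each one independently replaceable, which is how such pipelines typically split work between edge nodes and servers.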

18 pages, 1735 KB  
Article
A High-Precision Time-Varying Survival Model for Early Prediction of Patient Deterioration: A Retrospective Cohort Study
by Nishchay Joshi, Brian Wood, David Chapman, Martin Farrier and Thomas Ingram
J. Clin. Med. 2026, 15(5), 1690; https://doi.org/10.3390/jcm15051690 - 24 Feb 2026
Viewed by 377
Abstract
Background: Clinicians rely on clinical judgement and vital sign monitoring to identify patient deterioration, commonly supported by systems such as the National Early Warning Score 2 (NEWS2). However, NEWS2 is associated with a high false-positive burden, contributing to alert fatigue in increasingly pressured clinical environments. Consequently, there is a growing need for early warning systems (EWS) that not only detect deterioration but do so with higher precision to prioritise clinically meaningful alerts. We aimed to develop and validate a prognostic EWS capable of predicting real-time clinical deterioration in hospitalised adult patients. Methods: We conducted a retrospective observational cohort study using routinely collected Electronic Patient Record (EPR) data. A Cox proportional hazards model with time-varying covariates was developed to estimate dynamic risk of deterioration. Deterioration was defined as unplanned transfer to intensive care, unplanned surgery, or in-hospital death. Data for model development comprised 37,989 adult inpatient episodes admitted between January 2022 and October 2024, and were initially split into training, temporal validation and test datasets. An extended evaluation period included 11,048 patients admitted through September 2025. Model performance was compared with NEWS2 at the emergency-response threshold (≥7). Results: The final model produced a tiered “traffic-light” risk profile and demonstrated substantially higher precision than NEWS2 while maintaining comparable recall in our test data. At the red alert threshold, precision was 60% compared with 16% for NEWS2 ≥7, with 82% versus 43% of alerts occurring within 24 h of deterioration. Performance remained consistent across the extended evaluation period. Conclusions: A survival-based EWS incorporating time-varying covariates achieved higher precision and improved temporal alignment with deterioration events compared with NEWS2. 
A tiered amber–red alert framework may support more targeted escalation, reduce alert fatigue, and enhance early identification of clinical deterioration. Full article
(This article belongs to the Section Intensive Care)
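The precision and recall figures quoted above follow the standard alert-level definitions. A minimal sketch, using made-up alert counts rather than the study's data:

```python
# Illustrative computation of alert precision and recall, the metrics used
# when comparing the tiered EWS against NEWS2 >= 7. The tallies below are
# hypothetical, not taken from the study.
def precision_recall(true_pos, false_pos, false_neg):
    precision = true_pos / (true_pos + false_pos)  # fraction of alerts that were real
    recall = true_pos / (true_pos + false_neg)     # fraction of events that alerted
    return precision, recall

# Hypothetical red-alert tallies for a high-precision tiered system:
p, r = precision_recall(true_pos=60, false_pos=40, false_neg=25)
print(round(p, 2), round(r, 2))  # 0.6 0.71
```

Higher precision at comparable recall is exactly the trade-off the abstract reports: fewer false alarms per true deterioration event, hence less alert fatigue.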

19 pages, 6054 KB  
Article
A Smart App for the Prevention of Gender-Based Violence Using Artificial Intelligence
by Agostino Giorgio
Electronics 2026, 15(1), 197; https://doi.org/10.3390/electronics15010197 - 1 Jan 2026
Viewed by 623
Abstract
Gender-based violence is a widespread and persistent social scourge. The most effective strategy to reduce its impact is prevention, which has led to the adoption of a hand gesture conventionally recognized as a request for help. In addition, in cases of confirmed risk, a judge may order the potential aggressor to wear an electronic bracelet to prevent them from approaching the victim. However, these measures have proven largely insufficient, as incidents of gender-based violence continue to recur. To address this limitation, the author developed an application, named “no pAIn app”, based on artificial intelligence (AI), designed to create a virtual shield for potential victims. The app, which can run on both smartphones and smartwatches, automatically sends help requests with geolocation data when the AI detects a genuine danger situation. The process is fully autonomous and does not require any user intervention, ensuring fast, discreet, and reliable assistance even when the victim cannot act directly. Scenario-based tests in realistic domestic environments showed that configured danger keywords were reliably detected in the vast majority of test cases, with end-to-end alert delivery typically completed within two seconds. Preliminary battery profiling indicated approximately 5% consumption over 24 h of continuous operation, confirming the feasibility of long-term daily use. Full article
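The keyword-triggered alerting described above can be sketched as a simple matcher over transcribed speech. The function name, keyword set, and payload fields below are assumptions for illustration, not the app's real API.

```python
# Minimal sketch of keyword-triggered alerting in the spirit of the
# "no pAIn" app: scan a transcribed utterance for configured danger
# keywords and emit a help request carrying a location payload.
DANGER_KEYWORDS = {"help", "stop", "leave me alone"}  # hypothetical user config

def check_utterance(text, location):
    lowered = text.lower()
    hits = [k for k in DANGER_KEYWORDS if k in lowered]
    if hits:
        # In the real app this would be sent autonomously over the network.
        return {"alert": True, "matched": sorted(hits), "location": location}
    return {"alert": False}

msg = check_utterance("Please stop, you are scaring me", (41.12, 16.87))
print(msg["alert"], msg["matched"])  # True ['stop']
```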

22 pages, 5131 KB  
Review
Nurses’ Experience Using Telehealth in the Follow-Up Care of Patients with Inflammatory Bowel Disease—A Scoping Review
by Nanda Kristin Sæterøy-Hansen and Marit Hegg Reime
Nurs. Rep. 2026, 16(1), 11; https://doi.org/10.3390/nursrep16010011 - 29 Dec 2025
Viewed by 1141
Abstract
Background: Due to the lack of curative treatments for inflammatory bowel disease (IBD), patients need lifelong follow-up care. Telehealth offers a valuable solution to balance routine visits with necessary monitoring. Objectives: To map what is known about the benefits and barriers encountered by nurses in their use of telehealth for the follow-up care of patients with IBD. Methods: Following the methodology from the Joanna Briggs Institute, we conducted a scoping review across four electronic databases from June 2024 to September 2025. Key search terms included “inflammatory bowel disease,” “nurse experience,” and “telehealth.” A content analysis was employed to summarize the key findings. Results: We screened 1551 records, ultimately including four original research articles from four countries. Benefits identified were as follows: (1) the vital contributions of IBD telenursing in empowering patients by bridging health literacy and self-care skills; (2) optimal use of staffing time supports patient-centred care; and (3) ease of use. Barriers included the following: (1) increased workload and task imbalances; (2) the need for customized interventions; (3) technical issues and concerns regarding the security of digital systems; (4) telehealth as a supplementary option or a standard procedure; and (5) concerns related to the patient–nurse relationship. Conclusions: Nurses view telehealth as a promising approach that enhances patients’ health literacy and self-care skills and improves patient outcomes through effective monitoring. To fully realize telehealth’s potential, implementing strategies like triage protocols, algorithmic alerts, electronic health record integration, and comprehensive nurse training to enhance patient care and engagement may be beneficial. This scoping review highlights the need for more research on nurses’ experiences with telehealth in IBD due to limited publications. Full article

45 pages, 6429 KB  
Review
RTIMS: Real-Time Indoor Monitoring Systems: A Comprehensive Review
by Mohammed Faeik Ruzaij Al-Okby, Steffen Junginger, Thomas Roddelkopf and Kerstin Thurow
Appl. Sci. 2025, 15(24), 13217; https://doi.org/10.3390/app152413217 - 17 Dec 2025
Viewed by 2060
Abstract
Real-time indoor monitoring systems (RTIMS) are a key component of modern technological infrastructures in smart and automated buildings and facilities. They enable the continuous collection, analysis, and response to environmental data under strict time constraints, ensuring optimal system performance. These systems are designed to operate with high accuracy and low latency, making them essential in situations and events where timely decision-making is critical. Their applications range from industrial automation and production line monitoring to smart cities, smart homes, and healthcare for the elderly and disabled. The significant advances in electronics, communications, and software—particularly in Internet of Things (IoT) technologies and data transfer protocols—are reflected in the diversity of real-time monitoring systems, in terms of the parameters that can be monitored, the control and command systems that can be used, and the actuators that respond to commands. In this paper, the concepts, design, components, and working methods of these systems are discussed in detail. The latest research on real-time indoor monitoring systems published over the past five years is reviewed, resulting in the selection of 143 studies that met the inclusion criteria. This review synthesizes the technologies used for data capture, transmission, processing, storage, and visualization, as well as the approaches employed for alerts and system integration. By presenting these technical insights in a structured manner, the article provides a practical reference for researchers and practitioners aiming to design and implement real-time monitoring systems more efficiently and effectively. Full article
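The capture, processing, and alerting loop that RTIMS designs share can be sketched in a few lines; the readings, window size, and threshold below are invented for illustration.

```python
# Sketch of a real-time monitoring loop: smooth each incoming reading with
# a short moving average, then raise an alert when the average crosses a
# threshold. Values are illustrative (a CO2-like series in ppm).
from collections import deque

def monitor(readings, threshold, window=3):
    """Return (timestep, moving_average) pairs where the moving average of
    the last `window` readings exceeds `threshold`."""
    buf = deque(maxlen=window)
    alerts = []
    for t, value in enumerate(readings):
        buf.append(value)
        avg = sum(buf) / len(buf)
        if avg > threshold:
            alerts.append((t, round(avg, 2)))
    return alerts

print(monitor([650, 800, 1200, 1400, 900], threshold=1000))
# [(3, 1133.33), (4, 1166.67)]
```

The small smoothing window trades a little latency for robustness against single-sample spikes, the central tension in any low-latency alerting design.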

16 pages, 1776 KB  
Review
Artificial Intelligence and the Future of Cardiac Implantable Electronic Devices: Diagnostics, Monitoring, and Therapy
by Ibrahim Antoun, Alkassem Alkhayer, Ahmed Abdelrazik, Mahmoud Eldesouky, Kaung Myat Thu, Harshil Dhutia, Riyaz Somani and G. André Ng
J. Clin. Med. 2025, 14(24), 8824; https://doi.org/10.3390/jcm14248824 - 13 Dec 2025
Cited by 3 | Viewed by 1115
Abstract
Cardiac implantable electronic devices (CIEDs) such as pacemakers, implantable cardioverter-defibrillators (ICDs), and cardiac resynchronisation therapy (CRT) devices are generating unprecedented volumes of data in both inpatient and remote settings. Artificial intelligence (AI) techniques are increasingly being applied to enhance the management of these devices and the patients who rely on them. Recent advances demonstrate that machine learning (ML) and deep learning (DL) can improve diagnostic capabilities (for example, by detecting arrhythmias and predicting clinical events), streamline remote monitoring workflows, and optimise device-based therapies. Key applications include AI-driven algorithms that accurately detect true arrhythmias while filtering out false alerts from pacemakers and implantable monitors, neural network models that predict ventricular arrhythmias weeks before ICD shocks, and personalised models that forecast which heart failure patients will respond to CRT. Moreover, novel approaches such as natural language processing (NLP) and reinforcement learning are being explored to integrate diverse data sources and to enable devices to self-adjust their programming. This narrative review summarises the major applications of AI in the CIED domain—diagnostics, remote monitoring, and therapy optimisation—with an emphasis on the recent literature over the past five years. The review highlights important studies and randomised trials in each area, discusses the variety of AI techniques employed, and outlines future directions and challenges (including data standardisation, validation in clinical trials, and regulatory considerations) for translating these innovations into routine clinical care. Full article
(This article belongs to the Special Issue Application of Artificial Intelligence in Cardiology)

15 pages, 1493 KB  
Study Protocol
Protocol for a Single-Arm Pilot Clinical Trial: Developing and Evaluating a Machine Learning Opioid Prediction & Risk-Stratification E-Platform (DEMONSTRATE)
by Je-Won J. Hong, Debbie L. Wilson, Khoa Nguyen, Walid F. Gellad, Julie Diiulio, Laura Militello, Shunhua Yan, Christopher A. Harle, Danielle Nelson, Eric I. Rosenberg, Siegfried Schmidt, Chung-Chou Ho Chang, Gerald Cochran, Yonghui Wu, Stephanie A. S. Staras, Courtney Kuza and Wei-Hsuan Lo-Ciganic
J. Clin. Med. 2025, 14(23), 8522; https://doi.org/10.3390/jcm14238522 - 1 Dec 2025
Cited by 1 | Viewed by 793
Abstract
Background/Objectives: The Developing and Evaluating a Machine Learning Opioid Prediction & Risk-Stratification E-Platform (DEMONSTRATE) trial aims to assess the usability, acceptability, feasibility, and effectiveness of implementing a machine learning (ML)-based clinical decision support (CDS) tool—the Overdose Prevention Alert—which predicts a patient’s risk of opioid overdose within three months. Methods: This single-arm study uses a pre–post implementation design with mixed-methods evaluation in 13 University of Florida Health, Gainesville, internal medicine and family medicine clinics. Eligible patients are aged ≥18 years, received an opioid prescription within the year prior to their upcoming primary care visit, are not receiving hospice care, do not have a malignant cancer diagnosis, and are identified by the ML algorithm as high risk for overdose. The Overdose Prevention Alert triggers when a primary care provider (PCP) signs an opioid order in electronic health records. We will evaluate effectiveness by comparing pre- and post-implementation outcomes using a composite patient-level measure defined by the presence of any of the following 6 favorable indicators: (1) evidence of naloxone access; (2) absence of opioid overdose diagnoses and naloxone administration; (3) absence of emergency department (ED) visits or hospitalizations due to opioid overdose or opioid use disorder (OUD); (4) absence of overlapping opioid and benzodiazepine use within a 7-day window; (5) absence of opioid use ≥50 morphine milligram equivalent daily average; (6) receipt of referrals to non-pharmacological pain management. Additional quantitative metrics will include alert penetration, usage patterns, and clinical actions taken. Usability and acceptability will be assessed using a 12-item questionnaire for PCPs and semi-structured interviews. 
Expected Results: The trial will provide insights into real-world ML-driven CDS implementation and inform future strategies to reduce opioid-related harm. Full article
(This article belongs to the Section Pharmacology)
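The composite outcome defined in the protocol is favorable if any one of the six indicators is present; a sketch of that logic follows. The field names are illustrative, not the trial's actual data dictionary.

```python
# Sketch of the DEMONSTRATE composite patient-level outcome: favorable if
# ANY of the six indicators holds. Keys below are invented stand-ins.
def favorable_outcome(ind):
    indicators = [
        ind["naloxone_access"],                      # (1) evidence of naloxone access
        ind["no_overdose_dx_and_no_naloxone_admin"], # (2) no overdose dx / naloxone given
        ind["no_ed_or_hosp_for_overdose_or_oud"],    # (3) no overdose/OUD ED or admission
        ind["no_opioid_benzo_overlap_7d"],           # (4) no opioid-benzo overlap in 7 days
        ind["opioid_below_50_mme_daily_avg"],        # (5) opioid use < 50 MME daily average
        ind["referred_nonpharm_pain_mgmt"],          # (6) non-pharmacological referral
    ]
    return any(indicators)

patient = {
    "naloxone_access": False,
    "no_overdose_dx_and_no_naloxone_admin": False,
    "no_ed_or_hosp_for_overdose_or_oud": False,
    "no_opioid_benzo_overlap_7d": True,   # indicator (4) present
    "opioid_below_50_mme_daily_avg": False,
    "referred_nonpharm_pain_mgmt": False,
}
print(favorable_outcome(patient))  # True
```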

19 pages, 2710 KB  
Article
Internet of Things-Based Electromagnetic Compatibility Monitoring (IEMCM) Architecture for Biomedical Devices
by Chiedza Hwata, Gerard Rushingabigwi, Omar Gatera, Didacienne Mukalinyigira, Celestin Twizere, Bolaji N. Thomas and Diego H. Peluffo-Ordóñez
Appl. Sci. 2025, 15(22), 12337; https://doi.org/10.3390/app152212337 - 20 Nov 2025
Cited by 1 | Viewed by 818
Abstract
Electromagnetic compatibility is the capability of electrical and electronic equipment to function properly around devices radiating electromagnetic energy, without mutual disturbance. Hospital environments contain numerous devices operating simultaneously and sharing resources. Undetected electromagnetic interference can cause medical device malfunctions, putting patients and staff at risk. Traditional monitoring is time-consuming and relies on expert interpretation. An Internet of Things-enabled embedded system architecture for remote, real-time monitoring of electromagnetic fields from medical devices is proposed. It integrates frequency probes, a Raspberry Pi 4, and a communication module. A three-month study conducted at Muhima District Hospital, Kigali, Rwanda, demonstrated the system’s effectiveness in monitoring electromagnetic field levels and transmitting them to the cloud. The signals were benchmarked against International Electrotechnical Commission and Rwanda Standards Board standards. Alerts are triggered when thresholds are exceeded, with results plotted on web and mobile interfaces. Emissions were highest at noon, when equipment was most active, and lower after 1:30 PM, indicating reduced activity. The recorded electric field statistics for the sample include a mean of 1.0028, a minimum of 0.7228, and a maximum of 1.3515. Among the five filters evaluated, the Savitzky–Golay filter performed best, with an MSE of 0.235 and an SNR of 9.308. A 412 ms average latency and 24 h operation were achieved, offering a portable solution for hospital safety and equipment optimization. Full article
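The Savitzky–Golay smoothing the study found best among its five filters can be sketched with the classic quadratic, window-5 coefficient set (-3, 12, 17, 12, -3)/35; the input series below is invented, and edge samples are left unfiltered for brevity.

```python
# Stand-alone Savitzky-Golay sketch using the standard quadratic,
# window-5 coefficients; not the paper's implementation.
COEFFS = (-3, 12, 17, 12, -3)  # divide by 35 after the weighted sum

def savgol5(signal):
    out = list(signal)  # copy; the two samples at each edge stay as-is
    for i in range(2, len(signal) - 2):
        window = signal[i - 2:i + 3]
        out[i] = sum(c * v for c, v in zip(COEFFS, window)) / 35.0
    return out

# Sanity check: the coefficients sum to 35, so a constant signal is
# passed through unchanged.
print(savgol5([1.0, 1.0, 1.0, 1.0, 1.0]))  # [1.0, 1.0, 1.0, 1.0, 1.0]
```

Unlike a plain moving average, the polynomial fit preserves peak heights better, which matters when the peaks themselves (emission spikes) are what trigger alerts.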

14 pages, 1478 KB  
Article
Autoimmune Metaplastic Atrophic Gastritis Reporting: Are Pathologists and Endoscopists on the Same Page?
by Nicole Vienneau, Hwajeong Lee, Xulang Zhang, Eundong Park, Madeline Cleary, Jing Zhou, Shunsa Tarar, Meng Liu and Micheal Tadros
Diagnostics 2025, 15(22), 2906; https://doi.org/10.3390/diagnostics15222906 - 17 Nov 2025
Viewed by 1440
Abstract
Background/Objectives: Autoimmune metaplastic atrophic gastritis (AMAG) is a chronic, autoimmune-mediated condition associated with increased risk of malignancy and nutritional deficiencies, yet diagnostic and follow-up processes remain inconsistent and unclear. This study investigates follow-up testing performance in patients with AMAG and neuroendocrine tumors (NET), as well as the correlation between endoscopic impressions and histologic findings. Methods: We retrospectively analyzed 65 gastric biopsies with final diagnoses or comments mentioning the possibility of AMAG, 12 of which included well-differentiated WHO grade 1 NET arising in AMAG. H&E slides were reviewed to assess atrophy severity, the presence or absence of enterochromaffin-like (ECL) cell hyperplasia, and Helicobacter organisms. The final diagnostic line or comments made were scored from 1 to 5, based on the strength of the language used to alert the treating clinician to the likelihood of AMAG. Corresponding endoscopy reports were scored from 1 to 5 based on the likelihood of the reports documenting AMAG features. Data regarding follow-up laboratory testing relevant to AMAG and biopsy performance were collected from the electronic medical records. Results: Endoscopy scores showed no significant associations with the histology comment score or atrophy grade. The histology comment score was positively associated with performing at least a total of three laboratory tests (p = 0.03). No association was found between the presence or absence of follow-up biopsy and histology comment score (p = 0.60). Follow-up biopsy was more common in patients with NET than those with AMAG without NET (p < 0.001). Conclusions: Poor endoscopic–histologic correlation with variable follow-up practices highlights the need for standardized protocols in AMAG management. 
Enhanced adherence to biopsy guidelines, standardized pathology reporting, and consistent surveillance, particularly for patients with AMAG without NET, are imperative to improve diagnosis and outcomes. Future research should focus on optimizing endoscopic techniques, standardizing serological tests, and establishing evidence-based surveillance protocols for AMAG patients. Full article

9 pages, 1301 KB  
Proceeding Paper
IoT-Based System for Detecting and Monitoring LPG Leaks in Residential Settings
by E. Freddy Robalino P., Andrés Llerena, Luis Antonio Flores, Fabricio Trujillo, Luigi O. Freire and Fernando Lara
Eng. Proc. 2025, 115(1), 9; https://doi.org/10.3390/engproc2025115009 - 15 Nov 2025
Viewed by 2316
Abstract
This project presents the design and implementation of an IoT-based system for early detection of Liquefied Petroleum Gas (LPG) leaks in residential environments. Three functional prototypes were developed, integrating gas sensors, microcontrollers, actuators, and wireless modules. The system achieved 89.48% service availability and response times under 2 s from leak detection to data storage and visualization. Operating under a distributed architecture, it enables continuous monitoring, automatic shut-off, and real-time alerts. The novelty lies in the applied integration of electronics, embedded systems, and automation into an affordable, replicable solution that enhances household safety, promotes preventive behavior, and supports the adoption of such technologies in vulnerable domestic settings. Full article
(This article belongs to the Proceedings of The XXXIII Conference on Electrical and Electronic Engineering)
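The detect, shut-off, and alert sequence the prototypes implement can be sketched as a per-reading decision. The ppm values and threshold below are illustrative, not the paper's calibration.

```python
# Hedged sketch of the LPG leak response path: for each gas-sensor
# reading, close the valve and send an alert when the concentration
# crosses a (hypothetical) threshold.
def handle_reading(ppm, threshold=400):
    """Return the actions taken for one gas-sensor reading."""
    if ppm >= threshold:
        # Real hardware would drive an actuator and push a network alert here.
        return {"valve": "closed", "alert_sent": True, "ppm": ppm}
    return {"valve": "open", "alert_sent": False, "ppm": ppm}

for ppm in (120, 380, 450):
    print(handle_reading(ppm))
```

Keeping the decision stateless per reading is the simplest design; a production system would likely add hysteresis so the valve does not chatter near the threshold.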

5 pages, 420 KB  
Proceeding Paper
Low-Cost IoT-Based Smart Grain Monitoring System for Sustainable Storage Management
by Saleimah Alyammahi, Aisha Alhmoudi, Maryam Alawadhi and Fatima Alqaydi
Eng. Proc. 2025, 118(1), 90; https://doi.org/10.3390/ECSA-12-26545 - 7 Nov 2025
Viewed by 1234
Abstract
Efficient grain storage is critical for ensuring food security, particularly in regions with hot and humid climates where environmental fluctuations can accelerate spoilage. This study presents the development of a low-cost, Arduino-based microcontroller platform Smart Grain Monitoring System designed to continuously monitor key storage parameters. The system integrates sensors to measure temperature, relative humidity, air quality, and the weight of stored grains—factors essential for the early detection of microbial activity, fermentation, or structural degradation. Data is transmitted wirelessly in real time to a mobile application via the Blynk Internet of Things (IoT) platform, allowing for remote access, alerts, and trend analysis. The system is designed to be affordable, scalable, and easy to deploy in agricultural settings with limited infrastructure. To enhance mechanical performance and usability, the sensor system is housed in a reflective glass silo enclosure that provides both thermal insulation and visual grain access. A three-dimensional computer-aided design (3D CAD) model was developed to optimize the placement of electronics and ensure structural integrity. Key features include custom mounts for sensors and electronics, a top lid for grain refill and hygiene, and a stable base for load cell installation. This integrated framework offers a reliable, real-time monitoring solution that supports proactive grain management and reduces post-harvest losses in rural storage environments. Full article

8 pages, 485 KB  
Proceeding Paper
IoT-Enabled Sensor Glove for Communication and Health Monitoring in Paralysed Patients
by Angshuman Khan, Uttam Narendra Thakur and Sikta Mandal
Eng. Proc. 2025, 118(1), 28; https://doi.org/10.3390/ECSA-12-26518 - 7 Nov 2025
Viewed by 668
Abstract
Due to their limited mobility and vocal limitations, paralysed individuals frequently struggle with communication and health monitoring. This work introduces an Internet of Things (IoT)-based system that combines continuous health monitoring with a sensor-based smart glove to enhance patient care. The glove detects falls, sends emergency messages via hand gestures, and monitors vital indicators, including SpO2, heart rate, and body temperature. The smart glove uses Arduino UNO (RoboCraze, Bengaluru, India) and ESP8266 (RoboCraze, Bengaluru, India) modules with MPU6050 (RoboCraze, Bengaluru, India), MAX30100 (RoboCraze, Bengaluru, India), LM35 (Bombay Electronics, Mumbai, India), and flex sensors for these functions. The MPU6050 detects falls precisely, while the MAX30100, LM35, and flex sensors measure SpO2, heart rate, body temperature, and gestures. The flex sensor interprets hand motions as emergency alerts sent via Wi-Fi to a cloud platform for remote monitoring. The experimental results validated the efficacy of the proposed module. IoT integration provides scalability, data logging, and real-time access. Test cases were classified using a Support Vector Machine, achieving an average accuracy of 81.98%. The proposed module is affordable, non-invasive, easy to use, and appropriate for clinical and residential use. The system meets the essential needs of disabled people, enhancing both their quality of life and carer connectivity. Advanced machine learning for dynamic gesture detection and telemedicine integration are potential future improvements. Full article
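The flex-sensor gesture path can be sketched as a threshold lookup from per-finger bend values to messages. The thresholds and message table below are assumptions for illustration, not the paper's calibration, and the classifier here is a simple rule table rather than the SVM the paper uses.

```python
# Simplified sketch of mapping flex-sensor readings to emergency messages
# in the spirit of the glove described above. All values are invented.
GESTURES = {
    # (finger index, minimum bend in ADC counts) -> message
    (0, 600): "NEED WATER",
    (1, 600): "CALL NURSE",
    (2, 600): "EMERGENCY",
}

def decode_gesture(readings):
    """readings: per-finger flex values; return the first matching message,
    or None if no finger is bent past its threshold."""
    for (finger, min_bend), message in sorted(GESTURES.items()):
        if readings[finger] >= min_bend:
            return message
    return None

print(decode_gesture([120, 710, 130]))  # CALL NURSE
```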

18 pages, 709 KB  
Article
Machine Learning Models for Point-of-Care Diagnostics of Acute Kidney Injury
by Chun-You Chen, Te-I Chang, Cheng-Hsien Chen, Shih-Chang Hsu, Yen-Ling Chu, Nai-Jen Huang, Yuh-Mou Sue, Tso-Hsiao Chen, Feng-Yen Lin, Chun-Ming Shih, Po-Hsun Huang, Hui-Ling Hsieh and Chung-Te Liu
Diagnostics 2025, 15(21), 2801; https://doi.org/10.3390/diagnostics15212801 - 5 Nov 2025
Viewed by 832
Abstract
Background/Objectives: Computerized diagnostic algorithms can achieve early detection of acute kidney injury (AKI) only when baseline serum creatinine (SCr) is available. To address this weakness, we constructed machine learning models for AKI diagnosis based on point-of-care clinical features, regardless of baseline SCr. Methods: Patients with SCr > 1.3 mg/dL were recruited retrospectively from Wan Fang Hospital, Taipei. Dataset A (n = 2846) served as the training dataset and Dataset B (n = 1331) as the testing dataset. Point-of-care features, including laboratory data and physical readings, were input into the machine learning models. The repeated machine learning models randomly split Dataset A into 70% training and 30% testing subsets for 1000 rounds; the single machine learning models used Dataset A for training and Dataset B for testing. A computerized diagnostic algorithm based on a 1.5× increase in SCr and the clinicians' AKI diagnoses were compared with the machine learning models. Results: On the independent, unbalanced testing dataset (n = 1331), the machine learning models achieved AUROC values ranging from 0.67 to 0.74, and all significantly outperformed the routine clinician's diagnosis (AUROC up to ~0.74 vs. 0.53, p < 0.05). For context, the pre-existing computerized algorithm, which requires available baseline SCr data, achieved an AUROC of 0.94 on a relevant subset of the data, highlighting the performance benchmark when baseline data are available. Formal statistical comparisons showed that the top-performing models (e.g., Random Forest, SVM) were often statistically indistinguishable. Model performance was highly dependent on the test scenario, with precision and F1 scores improving markedly on a balanced dataset. 
Conclusions: In the absence of baseline SCr, machine learning models can diagnose AKI with significantly greater accuracy than routine clinical diagnoses. Our robust statistical analysis suggests that several advanced algorithms achieve a similarly high level of performance. Full article
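The repeated-split evaluation described above can be sketched as random 70/30 train/test splits of "Dataset A", scoring AUROC each round. The synthetic features, the logistic-regression baseline, and 20 rounds (rather than the paper's 1000) are illustrative assumptions.

```python
# Sketch of repeated 70/30 split evaluation with AUROC scoring.
# Synthetic data and the logistic-regression model are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
# Synthetic stand-in for point-of-care features and AKI labels.
X = rng.normal(size=(500, 6))
y = (X[:, 0] + rng.normal(scale=1.0, size=500) > 0).astype(int)

aurocs = []
for round_ in range(20):  # the paper uses 1000 rounds
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.3, random_state=round_, stratify=y
    )
    clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    aurocs.append(roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))

print(f"mean AUROC over {len(aurocs)} rounds: {np.mean(aurocs):.3f}")
```

Averaging AUROC across many random splits, as the study does, reduces the variance of a single train/test partition and makes model comparisons more stable.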
(This article belongs to the Section Machine Learning and Artificial Intelligence in Diagnostics)

24 pages, 1582 KB  
Article
Future Internet Applications in Healthcare: Big Data-Driven Fraud Detection with Machine Learning
by Konstantinos P. Fourkiotis and Athanasios Tsadiras
Future Internet 2025, 17(10), 460; https://doi.org/10.3390/fi17100460 - 8 Oct 2025
Cited by 2 | Viewed by 1652
Abstract
Hospital fraud detection has often relied on periodic audits that miss evolving, internet-mediated patterns in electronic claims. An artificial intelligence and machine learning pipeline is developed that is leakage-safe, imbalance-aware, and aligned with operational capacity for large healthcare datasets. The preprocessing stack integrates four tables, engineers 13 features, and applies imputation, categorical encoding, power transformation, Boruta selection, and denoising autoencoder representations, with class balancing via SMOTE-ENN evaluated inside cross-validation folds. Eight algorithms are compared under a fraud-oriented composite productivity index that weights recall, precision, MCC, F1, ROC-AUC, and G-Mean, with per-fold threshold calibration and explicit reporting of Type I and Type II errors. The multilayer perceptron attains the highest composite index, while CatBoost offers the strongest control of false positives with high accuracy. SMOTE-ENN provides limited gains once the representations regularize class geometry. The calibrated scores support prepayment triage, postpayment audit, and provider-level profiling, linking alert volume to expected recovery and protecting investigator workload. Situated in the Future Internet context, this work targets internet-mediated claim flows and web-accessible provider registries. Governance procedures for drift monitoring, fairness assessment, and change control complete an internet-ready deployment path. The results indicate that disciplined preprocessing and evaluation, more than classifier choice alone, translate AI improvements into measurable economic value and sustainable fraud prevention in digital health ecosystems. Full article
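A composite index over recall, precision, MCC, F1, ROC-AUC, and G-Mean, as described above, can be sketched as a weighted mean of the six metrics. The equal weights, the MCC rescaling to [0, 1], and the toy labels below are assumptions; the paper's exact weighting scheme is not reproduced here.

```python
# Hedged sketch of a fraud-oriented composite index combining six metrics.
# Equal weights and the toy claim labels are illustrative assumptions.
import numpy as np
from sklearn.metrics import (
    f1_score, matthews_corrcoef, precision_score, recall_score, roc_auc_score
)

def composite_index(y_true, y_pred, y_score, weights=None):
    """Weighted mean of six fraud-detection metrics (MCC rescaled to [0, 1])."""
    recall = recall_score(y_true, y_pred)
    precision = precision_score(y_true, y_pred)
    mcc = (matthews_corrcoef(y_true, y_pred) + 1) / 2  # map [-1, 1] -> [0, 1]
    f1 = f1_score(y_true, y_pred)
    auroc = roc_auc_score(y_true, y_score)
    specificity = recall_score(y_true, y_pred, pos_label=0)
    g_mean = np.sqrt(recall * specificity)
    metrics = np.array([recall, precision, mcc, f1, auroc, g_mean])
    w = np.ones(6) / 6 if weights is None else np.asarray(weights, float)
    return float(metrics @ (w / w.sum()))

# Toy example: 1 = fraudulent claim, 0 = legitimate.
y_true = np.array([0, 0, 0, 1, 1, 0, 1, 0, 1, 0])
y_pred = np.array([0, 0, 1, 1, 1, 0, 0, 0, 1, 0])
y_score = np.array([.1, .2, .6, .8, .9, .3, .4, .2, .7, .1])
print("composite index:", round(composite_index(y_true, y_pred, y_score), 3))
```

Passing a non-uniform `weights` vector lets the index emphasize recall over precision (or vice versa), which is how a fraud-oriented weighting would tune the trade-off between missed fraud and investigator workload.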

18 pages, 267 KB  
Article
‘Making the System Work’: A Multi-Site Qualitative Study of Dietitians’ Use of iEMR to Support Nutrition Care Transitions for Older Adults with Malnutrition
by Kristin Gomes, Shelley Roberts, Ben Desbrow and Jack Bell
Healthcare 2025, 13(17), 2227; https://doi.org/10.3390/healthcare13172227 - 5 Sep 2025
Cited by 1 | Viewed by 1211
Abstract
Background: Older adults with malnutrition (≥65 years) require coordinated nutrition care during hospital-to-home transitions. A key purpose of integrated electronic medical record (iEMR) systems is to support clinicians in ensuring continuity of care across settings, yet little is known about their use in nutrition care discharge practices. This study explored how clinical dietitians use the iEMR to support nutrition care discharge practices for older adults with malnutrition and identified opportunities for optimisation to enhance care continuity. Methods: Semi-structured interviews were conducted with 16 clinical dietitians (11 frontline clinicians, 5 senior leaders) from 10 public hospitals across Queensland, Australia. Analysis combined deductive coding using the Consolidated Framework for Implementation Research 2.0 with inductive thematic analysis to identify system-level, organisational and behavioural influences on iEMR use and optimisation opportunities. Results: Four themes and ten subthemes were identified. System fragmentation, policy constraints and documentation burden limited dietitians’ ability to coordinate discharge care. Workarounds were common and reflected both practical adaptation and conditional trust in iEMR. Discharge practices were also shaped by local culture, professional norms and variable expectations for iEMR use. Despite these constraints, participants expressed aspirations for an optimised iEMR with embedded referral tools, real-time alerts and analytics to support improved service delivery. Conclusions: This study identified key factors influencing iEMR use by clinical dietitians to support nutrition care transitions for older adults with malnutrition. While current systems present significant challenges, optimising iEMR alongside organisational and policy enablers holds potential to strengthen nutrition care discharge practices and care continuity. Full article
(This article belongs to the Special Issue Nutrition in Patient Care)