Search Results (505)

Search Parameters:
Keywords = cloud computing scalability

18 pages, 3461 KB  
Article
Real Time IoT Low-Cost Air Quality Monitoring System
by Silvian-Marian Petrică, Ioana Făgărășan, Nicoleta Arghira and Iulian Munteanu
Sustainability 2026, 18(2), 1074; https://doi.org/10.3390/su18021074 - 21 Jan 2026
Abstract
This paper proposes a complete solution implementing a low-cost, energy-independent, network-connected, and scalable environmental air parameter monitoring system. It features a remote sensing module that provides environmental data to a cloud-based server, and a software application for real-time and historical data processing, standardized air quality index computation, and comprehensive visualization of the evolution of environmental parameters. A fully operational prototype was built around a low-cost microcontroller connected to low-cost air parameter sensors and a GSM modem, powered by a stand-alone renewable-energy-based power supply. The associated software platform was developed using Microsoft Power Platform technologies. The collected data is transmitted from the sensors to a remote server via the GSM modem using custom-built JSON structures; from there, it is extracted and forwarded to a database accessible to users through a dedicated application. The overall accuracy of the air quality monitoring system has been thoroughly validated both in a controlled indoor environment and against a trusted outdoor air quality reference station. The proposed air parameter monitoring solution paves the way for future research, such as the classification of polluted sites or the prediction of air parameter variations at the site of interest. Full article
(This article belongs to the Section Air, Climate Change and Sustainability)
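The abstract above mentions that sensor readings travel from the GSM modem to the cloud as custom-built JSON structures. A minimal sketch of what such a payload round-trip could look like; the paper's actual schema is not given here, so all field names are hypothetical:

```python
import json
from datetime import datetime, timezone

def build_payload(station_id, readings):
    # All field names are hypothetical; the paper's JSON schema is not shown.
    return json.dumps({
        "station": station_id,
        "ts": datetime.now(timezone.utc).isoformat(),
        "readings": readings,  # e.g. {"pm2_5": 12.4, "co2": 415.0}
    })

def ingest(payload):
    # Server side: parse and sanity-check before forwarding to the database.
    msg = json.loads(payload)
    if not {"station", "ts", "readings"} <= msg.keys():
        raise ValueError("malformed payload")
    return msg["station"], msg["readings"]

station, readings = ingest(build_payload("st-01", {"pm2_5": 12.4, "co2": 415.0}))
```

Keeping the schema flat and self-describing like this makes it easy for a cloud-side pipeline to validate messages before they reach the database.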

25 pages, 7167 KB  
Article
Edge-Enhanced YOLOV8 for Spacecraft Instance Segmentation in Cloud-Edge IoT Environments
by Ming Chen, Wenjie Chen, Yanfei Niu, Ping Qi and Fucheng Wang
Future Internet 2026, 18(1), 59; https://doi.org/10.3390/fi18010059 - 20 Jan 2026
Abstract
The proliferation of smart devices and the Internet of Things (IoT) has led to massive data generation, particularly in complex domains such as aerospace. Cloud computing provides essential scalability and advanced analytics for processing these vast datasets. However, relying solely on the cloud introduces significant challenges, including high latency, network congestion, and substantial bandwidth costs, which are critical for real-time on-orbit spacecraft services. Cloud-edge Internet of Things (cloud-edge IoT) computing emerges as a promising architecture to mitigate these issues by pushing computation closer to the data source. This paper proposes an improved YOLOV8-based model specifically designed for edge computing scenarios within a cloud-edge IoT framework. By integrating the Cross Stage Partial Spatial Pyramid Pooling Fast (CSPPF) module and the WDIOU loss function, the model achieves enhanced feature extraction and localization accuracy without significantly increasing computational cost, making it suitable for deployment on resource-constrained edge devices. Meanwhile, by processing image data locally at the edge and transmitting only the compact segmentation results to the cloud, the system effectively reduces bandwidth usage and supports efficient cloud-edge collaboration in IoT-based spacecraft monitoring systems. Experimental results show that, compared to the original YOLOV8 and other mainstream models, the proposed model demonstrates superior accuracy and instance segmentation performance at the edge, validating its practicality in cloud-edge IoT environments. Full article
(This article belongs to the Special Issue Convergence of IoT, Edge and Cloud Systems)
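The bandwidth argument above (transmit compact segmentation results rather than raw frames) can be made concrete with a back-of-the-envelope sketch; the instance encoding and all sizes below are illustrative assumptions, not the paper's actual format:

```python
import json

RAW_FRAME_BYTES = 1920 * 1080 * 3   # one uncompressed 1080p RGB frame

def result_payload(instances):
    # Encode only the segmentation output (class, score, polygon) as JSON --
    # the compact representation the cloud actually needs.
    return json.dumps({"instances": instances}).encode()

# One hypothetical spacecraft component with a 20-vertex outline.
instances = [{
    "cls": "solar_panel",
    "score": 0.97,
    "polygon": [[i * 10, i * 7] for i in range(20)],
}]
payload = result_payload(instances)

# Sending the result instead of the frame shrinks the uplink traffic.
savings_factor = RAW_FRAME_BYTES / len(payload)
```

Even before image compression enters the comparison, shipping per-instance polygons instead of pixels cuts the uplink by several orders of magnitude, which is the core of the cloud-edge collaboration argument.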

26 pages, 3132 KB  
Article
An Unsupervised Cloud-Centric Intrusion Diagnosis Framework Using Autoencoder and Density-Based Learning
by Suresh K. S, Thenmozhi Elumalai, Radhakrishnan Rajamani, Anubhav Kumar, Balamurugan Balusamy, Sumendra Yogarayan and Kaliyaperumal Prabu
Future Internet 2026, 18(1), 54; https://doi.org/10.3390/fi18010054 - 19 Jan 2026
Abstract
Cloud computing environments generate high-dimensional, large-scale, and highly dynamic network traffic, making intrusion diagnosis challenging due to evolving attack patterns, severe traffic imbalance, and limited availability of labeled data. To address these challenges, this study presents an unsupervised, cloud-centric intrusion diagnosis framework that integrates autoencoder-based representation learning with density-based attack categorization. A dual-stage autoencoder is trained exclusively on benign traffic to learn compact latent representations and to identify anomalous flows using reconstruction-error analysis, enabling effective anomaly detection without prior attack labels. The detected anomalies are subsequently grouped using density-based learning to uncover latent attack structures and support fine-grained multiclass intrusion diagnosis under varying attack densities. Experiments conducted on the large-scale CSE-CIC-IDS2018 dataset demonstrate that the proposed framework achieves an anomaly detection accuracy of 99.46%, with high recall and low false-negative rates in the optimal latent-space configuration. The density-based classification stage achieves an overall multiclass attack classification accuracy of 98.79%, effectively handling both majority and minority attack categories. Clustering quality evaluation reports a Silhouette Score of 0.9857 and a Davies–Bouldin Index of 0.0091, indicating strong cluster compactness and separability. Comparative analysis against representative supervised and unsupervised baselines confirms the framework’s scalability and robustness under highly imbalanced cloud traffic, highlighting its suitability for future Internet cloud security ecosystems. Full article
(This article belongs to the Special Issue Cloud and Edge Computing for the Next-Generation Networks)
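The first stage described above, training on benign traffic only and flagging flows whose reconstruction error exceeds a threshold, can be sketched with synthetic error values; the distributions, threshold percentile, and split sizes are illustrative assumptions, not the paper's configuration:

```python
import random

random.seed(0)

# Stand-ins for per-flow reconstruction errors ||x - decoder(encoder(x))||:
# benign flows reconstruct well, attack flows do not. Values are synthetic.
benign_err = [random.gauss(0.05, 0.01) for _ in range(10_000)]
test_benign = [random.gauss(0.05, 0.01) for _ in range(900)]
test_attack = [random.gauss(0.60, 0.10) for _ in range(100)]

# Threshold chosen from benign traffic only (no attack labels needed):
# here, the 99.5th percentile of benign reconstruction error.
benign_sorted = sorted(benign_err)
threshold = benign_sorted[int(0.995 * len(benign_sorted))]

recall = sum(e > threshold for e in test_attack) / len(test_attack)
fpr = sum(e > threshold for e in test_benign) / len(test_benign)
```

The flagged flows would then feed the second, density-based stage, which groups them into attack categories without supervision.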

50 pages, 3712 KB  
Article
Explainable AI and Multi-Agent Systems for Energy Management in IoT-Edge Environments: A State of the Art Review
by Carlos Álvarez-López, Alfonso González-Briones and Tiancheng Li
Electronics 2026, 15(2), 385; https://doi.org/10.3390/electronics15020385 - 15 Jan 2026
Abstract
This paper reviews Artificial Intelligence techniques for distributed energy management, focusing on integrating machine learning, reinforcement learning, and multi-agent systems within IoT-Edge-Cloud architectures. As energy infrastructures become increasingly decentralized and heterogeneous, AI must operate under strict latency, privacy, and resource constraints while remaining transparent and auditable. The study examines predictive models ranging from statistical time series approaches to machine learning regressors and deep neural architectures, assessing their suitability for embedded deployment and federated learning. Optimization methods—including heuristic strategies, metaheuristics, model predictive control, and reinforcement learning—are analyzed in terms of computational feasibility and real-time responsiveness. Explainability is treated as a fundamental requirement, supported by model-agnostic techniques that enable trust, regulatory compliance, and interpretable coordination in multi-agent environments. The review synthesizes advances in MARL for decentralized control, communication protocols enabling interoperability, and hardware-aware design for low-power edge devices. Benchmarking guidelines and key performance indicators are introduced to evaluate accuracy, latency, robustness, and transparency across distributed deployments. Key challenges remain in stabilizing explanations for RL policies, balancing model complexity with latency budgets, and ensuring scalable, privacy-preserving learning under non-stationary conditions. The paper concludes by outlining a conceptual framework for explainable, distributed energy intelligence and identifying research opportunities to build resilient, transparent smart energy ecosystems. Full article

34 pages, 10017 KB  
Article
U-H-Mamba: An Uncertainty-Aware Hierarchical State-Space Model for Lithium-Ion Battery Remaining Useful Life Prediction Using Hybrid Laboratory and Real-World Datasets
by Zhihong Wen, Xiangpeng Liu, Wenshu Niu, Hui Zhang and Yuhua Cheng
Energies 2026, 19(2), 414; https://doi.org/10.3390/en19020414 - 14 Jan 2026
Abstract
Accurate prognosis of the remaining useful life (RUL) for lithium-ion batteries is critical for mitigating range anxiety and ensuring the operational safety of electric vehicles. However, existing data-driven methods often struggle to maintain robustness when transferring from controlled laboratory conditions to complex, sensor-limited, real-world environments. To bridge this gap, this study presents U-H-Mamba, a novel uncertainty-aware hierarchical framework trained on a massive hybrid repository comprising over 146,000 charge–discharge cycles from both laboratory benchmarks and operational electric vehicle datasets. The proposed architecture employs a two-level design to decouple degradation dynamics, where a Multi-scale Temporal Convolutional Network functions as the base encoder to extract fine-grained electrochemical fingerprints, including derived virtual impedance proxies, from high-frequency intra-cycle measurements. Subsequently, an enhanced Pressure-Aware Multi-Head Mamba decoder models the long-range inter-cycle degradation trajectories with linear computational complexity. To guarantee reliability in safety-critical applications, a hybrid uncertainty quantification mechanism integrating Monte Carlo Dropout with Inductive Conformal Prediction is implemented to generate calibrated confidence intervals. Extensive empirical evaluations demonstrate the framework’s superior performance, achieving an RMSE of 3.2 cycles on the NASA dataset and 5.4 cycles on the highly variable NDANEV dataset, thereby outperforming state-of-the-art baselines by 20–40%. Furthermore, SHAP-based interpretability analysis confirms that the model correctly identifies physics-informed pressure dynamics as critical degradation drivers, validating its zero-shot generalization capabilities. With high accuracy and linear scalability, the U-H-Mamba model offers a viable and physically interpretable solution for cloud-based prognostics in large-scale electric vehicle fleets. Full article
(This article belongs to the Section F5: Artificial Intelligence and Smart Energy)
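The Inductive Conformal Prediction component mentioned above can be sketched in a few lines; the calibration data and miscoverage level `alpha` below are synthetic stand-ins, not the paper's setup:

```python
import math
import random

random.seed(1)

# Synthetic calibration set: true RUL vs. point predictions (cycles).
y_true = [random.uniform(100, 500) for _ in range(500)]
y_pred = [y + random.gauss(0, 5) for y in y_true]   # ~5-cycle model error

# Inductive conformal prediction: the nonconformity score is the absolute
# error on a held-out calibration set; its (1 - alpha) quantile (with the
# usual finite-sample correction) is the prediction-interval half-width.
alpha = 0.1
scores = sorted(abs(t - p) for t, p in zip(y_true, y_pred))
k = math.ceil((len(scores) + 1) * (1 - alpha))
half_width = scores[k - 1]

def predict_interval(point_pred):
    # Calibrated (1 - alpha) interval for a new battery's RUL.
    return point_pred - half_width, point_pred + half_width

lo, hi = predict_interval(300.0)
```

The appeal of this construction is that the coverage guarantee holds without distributional assumptions, which is why it pairs well with Monte Carlo Dropout as a second, model-internal uncertainty signal.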

14 pages, 617 KB  
Article
Integrating ESP32-Based IoT Architectures and Cloud Visualization to Foster Data Literacy in Early Engineering Education
by Jael Zambrano-Mieles, Miguel Tupac-Yupanqui, Salutar Mari-Loardo and Cristian Vidal-Silva
Computers 2026, 15(1), 51; https://doi.org/10.3390/computers15010051 - 13 Jan 2026
Abstract
This study presents the design and implementation of a full-stack IoT ecosystem based on ESP32 microcontrollers and web-based visualization dashboards to support scientific reasoning in first-year engineering students. The proposed architecture integrates a four-layer model—perception, network, service, and application—enabling students to deploy real-time environmental monitoring systems for agriculture and beekeeping. Through a sixteen-week Project-Based Learning (PBL) intervention with 91 participants, we evaluated how this technological stack influences technical proficiency. Results indicate that the transition from local code execution to cloud-based telemetry increased perceived learning confidence from μ=3.9 (Challenge phase) to μ=4.6 (Reflection phase) on a 5-point scale. Furthermore, 96% of students identified the visualization dashboards as essential Human–Computer Interfaces (HCI) for debugging, effectively bridging the gap between raw sensor data and evidence-based argumentation. These findings demonstrate that integrating open-source IoT architectures provides a scalable mechanism to cultivate data literacy in early engineering education. Full article

44 pages, 7079 KB  
Editorial
Mobile Network Softwarization: Technological Foundations and Impact on Improving Network Energy Efficiency
by Josip Lorincz, Amar Kukuruzović and Dinko Begušić
Sensors 2026, 26(2), 503; https://doi.org/10.3390/s26020503 - 12 Jan 2026
Abstract
This paper provides a comprehensive overview of mobile network softwarization, emphasizing the technological foundations and its transformative impact on the energy efficiency of modern and future mobile networks. In the paper, a detailed analysis of communication concepts known as software-defined networking (SDN) and network function virtualization (NFV) is presented, with a description of their architectural principles, operational mechanisms, and the associated interfaces and management frameworks that enable programmability, virtualization, and centralized control in modern mobile networks. The study further explores the role of cloud computing, virtualization platforms, distributed SDN controllers, and resource orchestration systems, outlining how they collectively support mobile network scalability, automation, and service agility. To assess the maturity and evolution of mobile network softwarization, the paper reviews contemporary research directions, including SDN security, machine-learning-assisted traffic management, dynamic service function chaining, virtual network function (VNF) placement and migration, blockchain-based trust mechanisms, and artificial intelligence (AI)-enabled self-optimization. The analysis also evaluates the relationship between mobile network softwarization and energy consumption, presenting the main SDN- and NFV-based techniques that contribute to reducing mobile network power usage, such as traffic-aware control, rule placement optimization, end-host-aware strategies, VNF consolidation, and dynamic resource scaling. Findings indicate that although fifth-generation (5G) mobile network standalone deployments capable of fully exploiting softwarization remain limited, softwarized SDN/NFV-based architectures provide measurable benefits in reducing network operational costs and improving energy efficiency, especially when combined with AI-driven automation. 
The paper concludes that mobile network softwarization represents an essential enabler for sustainable 5G and future beyond-5G systems, while highlighting the need for continued research into scalable automation, interoperable architectures, and energy-efficient softwarized network designs. Full article
(This article belongs to the Special Issue Energy-Efficient Communication Networks and Systems: 2nd Edition)

32 pages, 3734 KB  
Article
A Hierarchical Framework Leveraging IIoT Networks, IoT Hub, and Device Twins for Intelligent Industrial Automation
by Cornelia Ionela Bădoi, Bilge Kartal Çetin, Kamil Çetin, Çağdaş Karataş, Mehmet Erdal Özbek and Savaş Şahin
Appl. Sci. 2026, 16(2), 645; https://doi.org/10.3390/app16020645 - 8 Jan 2026
Abstract
Industrial Internet of Things (IIoT) networks, Microsoft Azure Internet of Things (IoT) Hub, and device twins (DvT) are increasingly recognized as core enablers of adaptive, data-driven manufacturing. This paper proposes a hierarchical IIoT framework that integrates industrial IoT networking, DvT for asset-level virtualisation, system-level digital twins (DT) for cell orchestration, and cloud-native services to support the digital transformation of brownfield, programmable logic controller (PLC)-centric modular automation (MA) environments. Traditional PLC/supervisory control and data acquisition (SCADA) paradigms struggle to meet interoperability, observability, and adaptability requirements at scale, motivating architectures in which DvT and IoT Hub underpin real-time orchestration, virtualisation, and predictive-maintenance workflows. Building on and extending a previously introduced conceptual model, the present work instantiates a multilayered, end-to-end design that combines a federated Message Queuing Telemetry Transport (MQTT) mesh on the on-premises side, a ZigBee-based backup mesh, and a secure bridge to Azure IoT Hub, together with a systematic DvT modelling and orchestration strategy. The methodology is supported by a structured analysis of relevant IIoT and DvT design choices and by a concrete implementation in a nine-cell MA laboratory featuring a robotic arm predictive-maintenance scenario. The resulting framework sustains closed-loop monitoring, anomaly detection, and control under realistic workloads, while providing explicit envelopes for telemetry volume, buffering depth, and latency budgets in edge-cloud integration. 
Overall, the proposed architecture offers a transferable blueprint for evolving PLC-centric automation toward more adaptive, secure, and scalable IIoT systems and establishes a foundation for future extensions toward full DvT ecosystems, tighter artificial intelligence/machine learning (AI/ML) integration, and fifth/sixth generation (5G/6G) and time-sensitive networking (TSN) support in industrial networks. Full article
(This article belongs to the Special Issue Novel Technologies of Smart Manufacturing)

30 pages, 1905 KB  
Article
A System-Based Framework for Reducing the Digital Divide in Critical Mineral Supply Chains
by Shibo Xu, Nan Bai, Keun-sik Park and Miao Su
Systems 2026, 14(1), 53; https://doi.org/10.3390/systems14010053 - 5 Jan 2026
Abstract
The widening digital divide within the Global Critical Mineral Resource Supply Chain (GCMRS) 4.0 creates significant barriers to cross-border governance and operational efficiency. To quantify and address this disparity, this study identifies 20 Critical Success Factors (CSFs) through expert interviews with 15 industry specialists in South Korea. A hybrid multi-criteria decision-making framework integrating Fuzzy DEMATEL, Analytic Network Process (ANP), and the Choquet integral is developed to map causal relationships and determine factor weights. The empirical results reveal a distinct ‘technology-first’ dependency. Specifically, Scalable Technical Solutions and Cloud Computing Access emerge as the primary driving forces with the highest global weights, while Digital Investment Subsidies serve as the central hub for resource allocation. Unlike generic governance models, this study provides a quantifiable decision-making basis for policymakers. It demonstrates that bridging the hard infrastructure gap is a prerequisite for the effectiveness of soft collaborative mechanisms in the critical mineral sector. Full article
(This article belongs to the Section Supply Chain Management)
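The Choquet integral used in the hybrid framework above aggregates criteria whose importance interacts, which a plain weighted mean cannot express. A minimal sketch with hypothetical criterion scores and capacity values, not the weights elicited in the study:

```python
def choquet(scores, capacity):
    # Discrete Choquet integral of criterion scores w.r.t. a capacity
    # (a set function with capacity[empty] = 0 and capacity[all] = 1).
    # Sort criteria by ascending score: x_(1) <= ... <= x_(n).
    items = sorted(scores.items(), key=lambda kv: kv[1])
    total, prev = 0.0, 0.0
    for i, (_, x) in enumerate(items):
        upper = frozenset(c for c, _ in items[i:])   # criteria scoring >= x
        total += (x - prev) * capacity[upper]
        prev = x
    return total

# Three illustrative CSFs; the capacity values are hypothetical, and only
# the subsets actually visited by this score ordering are listed.
scores = {"tech": 0.9, "cloud": 0.8, "subsidy": 0.4}
capacity = {
    frozenset({"tech", "cloud", "subsidy"}): 1.0,
    frozenset({"tech", "cloud"}): 0.85,   # strong joint "technology-first" pair
    frozenset({"tech"}): 0.5,
}
value = choquet(scores, capacity)
```

Because the capacity of `{"tech", "cloud"}` exceeds any additive combination of singleton weights, the integral can reward the two infrastructure factors jointly, mirroring the study's 'technology-first' dependency.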

28 pages, 1463 KB  
Article
PUF-Based Secure Authentication Protocol for Cloud-Assisted Wireless Medical Sensor Networks
by Minsu Kim, Taehun Kim, Deokkyu Kwon and Youngho Park
Electronics 2026, 15(1), 240; https://doi.org/10.3390/electronics15010240 - 5 Jan 2026
Abstract
Wireless medical sensor networks (WMSNs) have evolved alongside the development of communication systems, and the integration of cloud computing has enabled scalable and efficient medical data management. However, since messages in WMSNs are transmitted over open channels, they are vulnerable to eavesdropping, replay, impersonation, and various other attacks. In response to these security concerns, Keshta et al. proposed an authentication protocol to establish secure communication in cloud-assisted WMSNs. However, our analysis reveals that their protocol cannot prevent session key disclosure, impersonation of the user and sensor node, or denial-of-service (DoS) attacks. Moreover, Keshta et al.’s protocol cannot support user untraceability due to a fixed hidden identity. To address these weaknesses, we propose a physical unclonable function (PUF)-based secure authentication protocol for cloud-assisted WMSNs. The protocol uses lightweight operations, provides mutual authentication between the user, cloud server, and sensor node, and supports user anonymity and untraceability. We validate the proposed protocol’s security through informal analysis of various security attacks and formal analysis, including Burrows–Abadi–Needham (BAN) logic, the Real-or-Random (RoR) model for session key security, and Automated Validation of Internet Security Protocols and Applications (AVISPA) simulations. Performance evaluation demonstrates lower communication cost and computation overhead compared with existing protocols, making the proposed protocol suitable for WMSN environments. Full article
(This article belongs to the Special Issue Trends in Information Systems and Security)
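A PUF-based challenge-response exchange of the kind described above can be sketched as follows. A real PUF derives its response from uncloneable physical variation, so the keyed hash here is only a software stand-in, and the message flow is an illustrative fragment rather than the proposed protocol:

```python
import hashlib
import hmac
import secrets

def puf_response(device_fabric, challenge):
    # Software stand-in for a PUF: on silicon the response comes from
    # uncloneable physical variation, not from a stored key.
    return hmac.new(device_fabric, challenge, hashlib.sha256).digest()

# Enrollment: the verifier stores one challenge-response pair per device.
device_fabric = secrets.token_bytes(32)    # models this chip's "physics"
challenge = secrets.token_bytes(16)
stored_response = puf_response(device_fabric, challenge)

# Authentication: the verifier sends the challenge and a fresh nonce; the
# device proves possession of the PUF without revealing the raw response,
# and the nonce prevents straightforward replay of an old proof.
nonce = secrets.token_bytes(16)
device_proof = hashlib.sha256(puf_response(device_fabric, challenge) + nonce).digest()
expected = hashlib.sha256(stored_response + nonce).digest()
authenticated = hmac.compare_digest(device_proof, expected)
```

Only hash operations run on the sensor node, which matches the abstract's emphasis on lightweight device-side computation.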

42 pages, 5531 KB  
Article
DRL-TinyEdge: Energy- and Latency-Aware Deep Reinforcement Learning for Adaptive TinyML at the 6G Edge
by Saad Alaklabi and Saleh Alharbi
Future Internet 2026, 18(1), 31; https://doi.org/10.3390/fi18010031 - 4 Jan 2026
Abstract
TinyML models running on emerging sixth-generation (6G) edge networks face a constantly challenging environment: volatile wireless conditions, limited computing power, and highly constrained energy budgets. This paper introduces DRL-TinyEdge, a latency- and energy-aware deep reinforcement learning (DRL) platform optimised for adaptive TinyML at the 6G edge. The proposed on-device DRL controller autonomously decides on the execution venue (local, partial, or cloud) and model configuration (depth, quantization, and frequency) in real time to trade off accuracy, latency, and power savings. To ensure safety while adapting to changing conditions, the multi-objective reward combines p95 latency, per-inference energy, accuracy preservation, and policy stability. The system is tested under two workloads representative of classical applications, image classification (CIFAR-10) and sensor analytics in an industrial IoT system, on low-power platforms (ESP32, Jetson Nano) connected to a simulated 6G mmWave testbed. Findings indicate uniform improvements: up to a 28% decrease in p95 latency and a 43% decrease in energy per inference, with accuracy differences of less than 1% compared to baseline models. DRL-TinyEdge offers better adaptability, stability, and scalability than Static-Offload, Heuristic-QoS, or TinyNAS/QAT, with CPU usage below 5% and a decision latency below 10 ms. Code, hyperparameter settings, and measurement programmes will be published at the time of acceptance to enable reproducibility and open benchmarking. Full article
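A scalarized multi-objective reward in the spirit described above (p95 latency, per-inference energy, accuracy preservation, policy stability) might look like this; all budgets and weights are illustrative assumptions, not the paper's tuned values:

```python
def reward(p95_latency_ms, energy_mj, accuracy, prev_accuracy,
           action_changed, *,
           lat_budget_ms=50.0, energy_budget_mj=5.0,
           w_lat=0.4, w_energy=0.4, w_acc=0.15, w_stab=0.05):
    # Hypothetical scalarization: reward low tail latency and energy,
    # penalize accuracy drift and action thrashing. All budgets and
    # weights here are illustrative, not the paper's values.
    r_lat = 1.0 - min(p95_latency_ms / lat_budget_ms, 1.0)
    r_energy = 1.0 - min(energy_mj / energy_budget_mj, 1.0)
    r_acc = -abs(accuracy - prev_accuracy)       # preserve accuracy
    r_stab = -1.0 if action_changed else 0.0     # discourage oscillation
    return w_lat * r_lat + w_energy * r_energy + w_acc * r_acc + w_stab * r_stab

good = reward(20.0, 2.0, 0.91, 0.915, False)   # fast, frugal, stable step
bad = reward(60.0, 6.0, 0.80, 0.915, True)     # blown budgets, thrashing
```

Capping the latency and energy terms at their budgets keeps the reward bounded, which helps DRL training stability on resource-constrained controllers.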

26 pages, 2431 KB  
Article
Multi-Objective Deep Reinforcement Learning for Dynamic Task Scheduling Under Time-of-Use Electricity Price in Cloud Data Centers
by Xiao Liao, Yiqian Li, Luyao Liu, Lihao Deng, Jinlong Hu and Xiaofei Wu
Electronics 2026, 15(1), 232; https://doi.org/10.3390/electronics15010232 - 4 Jan 2026
Abstract
The high energy consumption and substantial electricity costs of cloud data centers pose significant challenges related to carbon emissions and operational expenses for service providers. The temporal variability of electricity pricing in real-world scenarios adds complexity to this problem while simultaneously offering novel opportunities for mitigation. This study addresses the task scheduling optimization problem under time-of-use pricing conditions in cloud computing environments by proposing an innovative task scheduling approach. To balance the three competing objectives of electricity cost, energy consumption, and task delay, we formulate a price-aware, multi-objective task scheduling optimization problem and establish a Markov decision process model. By integrating prioritized experience replay with a multi-objective preference vector selection mechanism, we design a dynamic, multi-objective deep reinforcement learning algorithm named TEPTS. The simulation results demonstrate that TEPTS achieves superior convergence and diversity compared to three other multi-objective optimization methods while exhibiting excellent scalability across varying test durations and system workload intensities. Specifically, under the TOU pricing scenario, the task migration rate during peak periods exceeds 33.90%, achieving a 13.89% to 36.89% reduction in energy consumption and a 14.09% to 45.33% reduction in electricity costs. Full article
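The intuition behind time-of-use-aware scheduling above, migrating load out of peak-price hours, can be shown with a toy cost model; the tariff and workload figures are hypothetical:

```python
# Hypothetical time-of-use tariff ($/kWh) over a 24-hour horizon.
def tou_price(hour):
    if 18 <= hour < 22:      # evening peak
        return 0.30
    if 8 <= hour < 18:       # mid-peak daytime
        return 0.18
    return 0.08              # off-peak night hours

def electricity_cost(schedule):
    # Cost of a schedule given as {hour: energy_kwh}.
    return sum(kwh * tou_price(h) for h, kwh in schedule.items())

# The same 30 kWh workload: one schedule runs through the evening peak,
# the other migrates the delay-tolerant tasks to off-peak slots.
naive = {19: 10, 20: 10, 21: 10}
shifted = {1: 10, 2: 10, 3: 10}
saving = 1 - electricity_cost(shifted) / electricity_cost(naive)
```

A scheduler like TEPTS must of course balance such cost savings against the task-delay objective, which is exactly why the problem is formulated as multi-objective rather than pure cost minimization.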

26 pages, 461 KB  
Systematic Review
A Systematic Review of Federated and Cloud Computing Approaches for Predicting Mental Health Risks
by Iram Fiaz, Nadia Kanwal and Amro Al-Said Ahmad
Sensors 2026, 26(1), 229; https://doi.org/10.3390/s26010229 - 30 Dec 2025
Abstract
Mental health disorders affect large numbers of people worldwide and are a major cause of long-term disability. Digital health technologies such as mobile apps and wearable devices now generate rich behavioural data that could support earlier detection and more personalised care. However, these data are highly sensitive and distributed across devices and platforms, which makes privacy protection and scalable analysis challenging; federated learning offers a way to train models across devices while keeping raw data local. When combined with edge, fog, or cloud computing, federated learning offers a way to support near-real-time mental health analysis while keeping raw data local. This review screened 1104 records, assessed 31 full-text articles using a five-question quality checklist, and retained 17 empirical studies that achieved a score of at least 7/10 for synthesis. The included studies were compared in terms of their FL and edge/cloud architectures, data sources, privacy and security techniques, and evidence for operation in real-world settings. The synthesis highlights innovative but fragmented progress, with limited work on comorbidity modelling, deployment evaluation, and common benchmarks, and identifies priorities for the development of scalable, practical, and ethically robust FL systems for digital mental health. Full article
(This article belongs to the Special Issue Secure AI for Biomedical Sensing and Imaging Applications)
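The core federated learning step the reviewed systems share, aggregating locally trained models without moving raw data, can be sketched as sample-weighted federated averaging (FedAvg); the client weights and sample counts below are hypothetical:

```python
def fed_avg(client_updates):
    # Federated averaging: each client contributes (weights, n_samples);
    # the server returns the sample-weighted mean without ever seeing the
    # clients' raw behavioural or sensor data.
    total = sum(n for _, n in client_updates)
    dim = len(client_updates[0][0])
    return [
        sum(w[i] * n for w, n in client_updates) / total
        for i in range(dim)
    ]

# Three hypothetical devices with locally trained model weights.
updates = [
    ([0.2, 1.0], 100),   # phone A
    ([0.4, 0.8], 300),   # phone B
    ([0.1, 1.2], 100),   # wearable C
]
global_weights = fed_avg(updates)
```

In the surveyed deployments, this aggregation typically runs at an edge, fog, or cloud tier, with the privacy techniques the review compares (e.g. secure aggregation, differential privacy) layered on top of this basic step.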

29 pages, 1050 KB  
Article
A Lightweight Authentication and Key Distribution Protocol for XR Glasses Using PUF and Cloud-Assisted ECC
by Wukjae Cha, Hyang Jin Lee, Sangjin Kook, Keunok Kim and Dongho Won
Sensors 2026, 26(1), 217; https://doi.org/10.3390/s26010217 - 29 Dec 2025
Abstract
The rapid convergence of artificial intelligence (AI), cloud computing, and 5G communication has positioned extended reality (XR) as a core technology bridging the physical and virtual worlds. Encompassing virtual reality (VR), augmented reality (AR), and mixed reality (MR), XR has demonstrated transformative potential across sectors such as healthcare, industry, education, and defense. However, the compact architecture and limited computational capabilities of XR devices render conventional cryptographic authentication schemes inefficient, while the real-time transmission of biometric and positional data introduces significant privacy and security vulnerabilities. To overcome these challenges, this study introduces PXRA (PUF-based XR authentication), a lightweight and secure authentication and key distribution protocol optimized for cloud-assisted XR environments. PXRA utilizes a physically unclonable function (PUF) for device-level hardware authentication and offloads elliptic curve cryptography (ECC) operations to the cloud to enhance computational efficiency. Authenticated encryption with associated data (AEAD) ensures message confidentiality and integrity, while formal verification through ProVerif confirms the protocol’s robustness under the Dolev–Yao adversary model. Experimental results demonstrate that PXRA reduces device-side computational overhead by restricting XR terminals to lightweight PUF and hash functions, achieving an average authentication latency below 15 ms, sufficient for real-time XR performance. Formal analysis verifies PXRA’s resistance to replay, impersonation, and key compromise attacks, while preserving user anonymity and session unlinkability. These findings establish the feasibility of integrating hardware-based PUF authentication with cloud-assisted cryptographic computation to enable secure, scalable, and real-time XR systems.
The proposed framework lays a foundation for future XR applications in telemedicine, remote collaboration, and immersive education, where both performance and privacy preservation are paramount. Our contribution lies in a hybrid PUF–cloud ECC architecture, context-bound AEAD for session-splicing resistance, and a noise-resilient BCH-based fuzzy extractor supporting up to 15% BER. Full article
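The PUF-based challenge-response idea underlying PXRA can be sketched in simplified form: the server stores challenge-response pairs enrolled from the device's PUF, and a fresh nonce binds each authentication to one session. This is a hypothetical simplification for illustration only; the actual protocol additionally involves cloud-offloaded ECC, AEAD, and a BCH-based fuzzy extractor, and a real PUF derives its response from physical variation rather than a stored secret (here an HMAC key stands in for the hardware).

```python
import hashlib, hmac, os

def puf(device_secret, challenge):
    # Stand-in for a hardware PUF: a deterministic, device-unique response.
    # (A real PUF needs a fuzzy extractor to correct noisy responses.)
    return hmac.new(device_secret, challenge, hashlib.sha256).digest()

class Server:
    def __init__(self):
        self.crp_db = {}  # device_id -> (challenge, expected_response)

    def enroll(self, device_id, device_secret):
        # One-time secure enrollment of a challenge-response pair.
        c = os.urandom(16)
        self.crp_db[device_id] = (c, puf(device_secret, c))

    def start_auth(self, device_id):
        # Send the enrolled challenge plus a fresh nonce (replay protection).
        c, _ = self.crp_db[device_id]
        return c, os.urandom(16)

    def verify(self, device_id, nonce, proof):
        c, r = self.crp_db[device_id]
        expected = hmac.new(r, nonce, hashlib.sha256).digest()
        if not hmac.compare_digest(expected, proof):
            return None  # authentication failed
        # Both sides derive the same session key from response + nonce.
        return hashlib.sha256(r + nonce).digest()

def device_respond(device_secret, challenge, nonce):
    # Device side: regenerate the PUF response, prove knowledge of it,
    # and derive the session key without ever transmitting the response.
    r = puf(device_secret, challenge)
    proof = hmac.new(r, nonce, hashlib.sha256).digest()
    return proof, hashlib.sha256(r + nonce).digest()
```

The key point the abstract makes is visible here: the device side needs only hash-type operations, which is what keeps XR terminals lightweight.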
(This article belongs to the Special Issue Feature Papers in the Internet of Things Section 2025)

27 pages, 452 KB  
Article
Evaluation of Digital Technologies in Food Logistics: MCDM Approach from the Perspective of Logistics Providers
by Aleksa Maravić, Vukašin Pajić and Milan Andrejić
Logistics 2026, 10(1), 6; https://doi.org/10.3390/logistics10010006 - 26 Dec 2025
Abstract
Background: In the era of rapid digital transformation, efficient food logistics (FL) is critical for sustainability and competitiveness. Maintaining food quality, minimizing waste, and optimizing costs are complex challenges that advanced digital technologies aim to address, particularly amid growing e-commerce and last-mile delivery demands. This underscores the need for a structured, quantitative evaluation of technological solutions to ensure operational reliability, efficiency, and sustainability. Methods: This study employs a Multi-Criteria Decision Making (MCDM) model combining Criterion Impact LOSS (CILOS) and Multi-Objective Optimization on the basis of Simple Ratio Analysis (MOOSRA) to evaluate key FL technologies: IoT, blockchain, Big Data analytics, automation and robotics, and cloud/edge computing. Nine evaluation criteria relevant to logistics providers were used, covering operational efficiency, flexibility, sustainability, food safety, data reliability, KPI support, scalability, costs, and implementation speed. CILOS determined criteria weights by considering interdependencies, and MOOSRA ranked technologies by benefit-to-cost ratios. Sensitivity analysis validated result robustness. Results: Automation and robotics ranked highest for enhancing efficiency, reducing errors, and improving handling and safety. Blockchain was second, supporting traceability and data security. Big Data analytics was third, enabling demand prediction and inventory optimization. IoT ranked fourth, providing real-time monitoring, while cloud/edge computing ranked fifth due to indirect operational impact. Conclusions: The CILOS–MOOSRA model enables transparent, structured evaluation, integrating quantitative metrics with logistics providers’ priorities. Results highlight technologies that enhance efficiency, reliability, and sustainability while revealing integration challenges, providing a strategic foundation for digital transformation in FL. Full article
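The MOOSRA ranking step the abstract describes (ratio of weighted benefit criteria to weighted cost criteria over a normalized decision matrix) can be sketched as follows. The score matrix, weights, and benefit/cost split below are illustrative assumptions, not the paper's data; in the actual study the weights come from the CILOS interdependency analysis over nine criteria.

```python
import numpy as np

def moosra_rank(scores, weights, benefit_mask):
    # scores: alternatives x criteria decision matrix.
    # Vector-normalize each criterion column, then apply criteria weights.
    norm = scores / np.linalg.norm(scores, axis=0)
    weighted = norm * weights
    # MOOSRA performance score: sum of weighted benefit criteria
    # divided by sum of weighted cost criteria; higher is better.
    benefit = weighted[:, benefit_mask].sum(axis=1)
    cost = weighted[:, ~benefit_mask].sum(axis=1)
    ratio = benefit / cost
    return ratio, np.argsort(-ratio)  # scores and best-to-worst ordering
```

For example, with five technologies scored on three benefit criteria and one cost criterion, `moosra_rank` returns each alternative's benefit-to-cost score and the resulting ranking; a sensitivity analysis would rerun it under perturbed weights to check rank stability.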
