Search Results (1,782)

Search Parameters:
Keywords = IoT cloud

33 pages, 16564 KB  
Article
Design and Implementation of an Off-Grid Smart Street Lighting System Using LoRaWAN and Hybrid Renewable Energy for Energy-Efficient Urban Infrastructure
by Seyfettin Vadi
Sensors 2025, 25(17), 5579; https://doi.org/10.3390/s25175579 - 6 Sep 2025
Abstract
The growing demand for electricity and the urgent need to reduce environmental impact have made sustainable energy utilization a global priority. Street lighting, as a significant consumer of urban electricity, requires innovative solutions to enhance efficiency and reliability. This study presents an off-grid smart street lighting system that combines solar photovoltaic generation with battery storage and Internet of Things (IoT)-based control to ensure continuous and efficient operation. The system integrates Long Range Wide Area Network (LoRaWAN) communication technology for remote monitoring and control without internet connectivity and employs the Perturb and Observe (P&O) maximum power point tracking (MPPT) algorithm to maximize energy extraction from solar sources. Data transmission from the LoRaWAN gateway to the cloud is facilitated through the Message Queuing Telemetry Transport (MQTT) protocol, enabling real-time access and management via a graphical user interface. Experimental results demonstrate that the proposed system achieves a maximum MPPT efficiency of 97.96%, supports reliable communication over distances of up to 10 km, and successfully operates four LED streetlights, each spaced 400 m apart, across an open stretch of approximately 1.2 km, delivering a practical, energy-efficient, and internet-independent solution for smart urban infrastructure.
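
The P&O MPPT loop named in this abstract can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the quadratic power curve is a toy stand-in for a real photovoltaic panel, and on hardware measure_power() would read the panel's voltage and current.

```python
# Minimal Perturb & Observe (P&O) MPPT sketch.
# measure_power() is a toy stand-in for real panel telemetry (assumption).

STEP = 0.01  # duty-cycle perturbation per iteration

def measure_power(duty):
    """Toy P-V curve with its maximum at duty = 0.62 (illustrative)."""
    return 180.0 - 400.0 * (duty - 0.62) ** 2

def po_mppt(duty=0.30, iterations=200):
    direction = 1
    prev_power = measure_power(duty)
    for _ in range(iterations):
        duty = min(max(duty + direction * STEP, 0.0), 1.0)
        power = measure_power(duty)
        if power < prev_power:   # overshot the peak: reverse the perturbation
            direction = -direction
        prev_power = power
    return duty

print(po_mppt())  # settles near 0.62, oscillating within one STEP of the peak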
20 pages, 2480 KB  
Article
Development of Real-Time Water-Level Monitoring System for Agriculture
by Gaukhar Borankulova, Gabit Altybayev, Aigul Tungatarova, Bakhyt Yeraliyeva, Saltanat Dulatbayeva, Aslanbek Murzakhmetov and Samat Bekbolatov
Sensors 2025, 25(17), 5564; https://doi.org/10.3390/s25175564 - 6 Sep 2025
Abstract
Water resource management is critical for sustainable agriculture, especially in regions like Kazakhstan that face significant water scarcity challenges. This paper presents the development of a real-time water-level monitoring system designed to optimize water use in agriculture. The system integrates IoT sensors and cloud technologies, and analyzes data on water levels, temperature, humidity, and other environmental parameters. The architecture comprises a data collection layer with solar-powered sensors, a network layer for data transmission, a storage and integration layer for data management, a data processing layer for analysis and forecasting, and a user interface for visualization and interaction. The system was tested at the Left Bypass Canal in Taraz, Kazakhstan, demonstrating its effectiveness in providing real-time data for informed decision-making. The results indicate that the system significantly improves water use efficiency, reduces non-productive losses, and supports sustainable agricultural practices.
(This article belongs to the Special Issue Recent Advances in Sensor Technology and Robotics Integration)
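
As a rough illustration of the data-collection layer described in this abstract, the sketch below converts an ultrasonic echo time to a canal water level and flags threshold alerts. The mounting height, alert thresholds, and the echo-time input are illustrative assumptions, not values from the paper.

```python
# Sketch: deriving water level from an ultrasonic ranging sensor and
# classifying it against alert thresholds (all figures are assumptions).

SPEED_OF_SOUND = 343.0   # m/s in air at ~20 C
SENSOR_HEIGHT = 4.0      # m above the canal bed (assumed mounting height)
ALERT_LEVELS = {"low": 0.5, "high": 3.2}  # m (assumed thresholds)

def water_level(echo_round_trip_s):
    """Level = mounting height minus sensor-to-surface distance."""
    distance = SPEED_OF_SOUND * echo_round_trip_s / 2.0
    return SENSOR_HEIGHT - distance

def classify(level_m):
    if level_m < ALERT_LEVELS["low"]:
        return "LOW_WATER_ALERT"
    if level_m > ALERT_LEVELS["high"]:
        return "HIGH_WATER_ALERT"
    return "NORMAL"

print(classify(water_level(0.012)))  # 0.012 s echo -> ~1.94 m -> NORMAL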
26 pages, 6191 KB  
Article
A Personalized 3D-Printed Smart Splint with Integrated Sensors and IoT-Based Control: A Proof-of-Concept Study for Distal Radius Fracture Management
by Yufeng Ma, Haoran Tang, Baojian Wang, Jiashuo Luo and Xiliang Liu
Electronics 2025, 14(17), 3542; https://doi.org/10.3390/electronics14173542 - 5 Sep 2025
Abstract
Conventional static fixation for distal radius fractures (DRF) is clinically challenging, with methods often leading to complications such as malunion and pressure-related injuries. These issues stem from uncontrolled pressure and a lack of real-time biomechanical feedback, resulting in suboptimal functional recovery. To overcome these limitations, we engineered an intelligent, adaptive orthopedic device. The system is built on a patient-specific, 3D-printed architecture for a lightweight, personalized fit. It embeds an array of thin-film pressure sensors at critical anatomical sites to continuously quantify biomechanical forces. These data are transmitted via an Internet of Things (IoT) module to a cloud platform, enabling real-time remote monitoring by clinicians. The core innovation is a closed-loop feedback controller governed by a robust Interval Type-2 Fuzzy Logic (IT2-FLC) algorithm. This system autonomously adjusts servo-driven straps to dynamically regulate fixation pressure, adapting to changes in limb swelling. In a preliminary clinical evaluation, the group receiving the integrated treatment protocol, which included the smart splint and Traditional Chinese Medicine (TCM) herbal therapy, demonstrated superior anatomical restoration and functional recovery, evidenced by higher Cooney scores (91.65 vs. 83.15) and lower VAS pain scores. This proof-of-concept study validates a new paradigm for adaptive orthopedic devices, showing high potential for clinical translation.
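
The closed-loop pressure idea can be sketched with a simplified type-1 fuzzy controller (the paper's controller is interval type-2, which adds uncertainty bounds to the membership functions). The target pressure, membership breakpoints, and servo step sizes below are assumptions for illustration only.

```python
# Simplified type-1 fuzzy pressure regulator, sketching the closed loop
# (the paper uses IT2-FLC; all numeric values here are assumptions).

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def strap_adjustment(pressure_kpa, target_kpa=30.0):
    # Clamp the error to the support of the membership functions.
    err = max(min(pressure_kpa - target_kpa, 19.0), -19.0)
    # Fuzzify the pressure error (kPa) into three sets.
    too_loose = tri(err, -20, -10, 0)
    ok        = tri(err, -10,   0, 10)
    too_tight = tri(err,   0,  10, 20)
    # Rule outputs: servo step in degrees (tighten positive, loosen negative),
    # combined by weighted-average defuzzification.
    num = too_loose * (+5.0) + ok * 0.0 + too_tight * (-5.0)
    den = too_loose + ok + too_tight
    return num / den if den else 0.0

print(strap_adjustment(38.0))  # over-pressure -> negative step (loosen strap)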
28 pages, 8417 KB  
Article
Democratizing IoT for Smart Irrigation: A Cost-Effective DIY Solution Proposal Evaluated in an Actinidia Orchard
by David Pascoal, Telmo Adão, Agnieszka Chojka, Nuno Silva, Sandra Rodrigues, Emanuel Peres and Raul Morais
Algorithms 2025, 18(9), 563; https://doi.org/10.3390/a18090563 - 5 Sep 2025
Abstract
Proper management of water resources in agriculture is of utmost importance for sustainable productivity, especially under the current context of climate change. However, many smart agriculture systems, including for managing irrigation, involve costly, complex tools for most farmers, especially small/medium-scale producers, despite the availability of user-friendly and community-accessible tools supported by well-established providers (e.g., Google). Hence, this paper proposes an irrigation management system integrating low-cost Internet of Things (IoT) sensors with community-accessible cloud-based data management tools. Specifically, it resorts to sensors managed by an ESP32 development board to monitor several agroclimatic parameters and employs Google Sheets for data handling, visualization, and decision support, assisting operators in carrying out proper irrigation procedures. To ensure reproducibility for both digital experts but mainly non-technical professionals, a comprehensive set of guidelines is provided for the assembly and configuration of the proposed irrigation management system, aiming to promote a democratized dissemination of key technical knowledge within a do-it-yourself (DIY) paradigm. As part of this contribution, a market survey identified numerous e-commerce platforms that offer the required components at competitive prices, enabling the system to be affordably replicated. Furthermore, an irrigation management prototype was tested in a real production environment, consisting of a 2.4-hectare yellow kiwi orchard managed by an association of producers from July to September 2021. Significant resource reductions were achieved by using low-cost IoT devices for data acquisition and the capabilities of accessible online tools like Google Sheets. Specifically, for this study, irrigation periods were reduced by 62.50% without causing water deficits detrimental to the crops’ development.
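
A common DIY pattern for the sensor-to-Sheets path described here is to POST readings to a Google Apps Script web app whose doPost handler appends a row to the spreadsheet. The sketch below shows that pattern in desktop Python for clarity (the ESP32 itself would use C/MicroPython); the deployment URL and payload fields are hypothetical, not the paper's.

```python
# Sketch: push a sensor reading to a Google Apps Script web app that
# appends rows to a Google Sheet. URL and field names are hypothetical.
import requests, time

WEBAPP_URL = "https://script.google.com/macros/s/EXAMPLE_DEPLOYMENT_ID/exec"

def push_reading(soil_moisture_pct, air_temp_c, air_hum_pct):
    payload = {
        "timestamp": time.strftime("%Y-%m-%d %H:%M:%S"),
        "soil_moisture": soil_moisture_pct,
        "air_temperature": air_temp_c,
        "air_humidity": air_hum_pct,
    }
    # The Apps Script doPost(e) handler would append these values as a row.
    r = requests.post(WEBAPP_URL, json=payload, timeout=10)
    r.raise_for_status()

push_reading(41.2, 24.8, 63.0)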
31 pages, 9235 KB  
Article
Anomaly Detection and Segmentation in Measurement Signals on Edge Devices Using Artificial Neural Networks
by Jerzy Dembski, Bogdan Wiszniewski and Agata Kołakowska
Sensors 2025, 25(17), 5526; https://doi.org/10.3390/s25175526 - 5 Sep 2025
Abstract
In this paper, three alternative solutions to the problem of detecting and cleaning anomalies in soil signal time series, involving the use of artificial neural networks deployed on in situ data measurement end devices, are proposed and investigated. These models are designed to perform calculations on microcontroller units (MCUs), characterized by significantly limited computing capabilities and a limited supply of electrical power. Training of neural network models is carried out based on data from multiple sensors in the supporting computing cloud instance, while detection and removal of anomalies with a trained model takes place on the constrained end devices. With such a distribution of work, it is necessary to achieve a sound compromise between prediction accuracy and the computational complexity of the detection process. In this study, neural-primed heuristic (NPH), autoencoder-based (AEB), and U-Net-based (UNB) approaches were tested, which were found to vary regarding both prediction accuracy and computational complexity. Labeled data were used to train the models, transforming the detection task into an anomaly segmentation task. The obtained results reveal that the UNB approach presents certain advantages; however, it requires a significant volume of training data and has a relatively high time complexity which, in turn, translates into increased power consumption by the end device. For this reason, the other two approaches—NPH and AEB—may be worth considering as reasonable alternatives when developing in situ data cleaning solutions for IoT measurement systems.
(This article belongs to the Special Issue Tiny Machine Learning-Based Time Series Processing)
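
As a rough, self-contained illustration of the AEB idea (not the authors' models), the sketch below trains a tiny dense autoencoder on clean signal windows and marks samples whose reconstruction error exceeds a mean-plus-k-sigma threshold. The window size, layer widths, threshold rule, and toy data are all assumptions.

```python
# Autoencoder-based (AEB) anomaly segmentation sketch on toy data.
import torch
import torch.nn as nn

WIN = 32  # samples per window (assumed)

model = nn.Sequential(            # tiny dense autoencoder, MCU-scale budget
    nn.Linear(WIN, 8), nn.ReLU(), # encoder: compress window to 8 features
    nn.Linear(8, WIN),            # decoder: reconstruct the window
)

def train(windows, epochs=200, lr=1e-3):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(windows), windows)
        loss.backward()
        opt.step()

def flag_anomalies(window, k=3.0):
    """Per-sample error vs. k standard deviations above the mean error."""
    with torch.no_grad():
        err = (model(window) - window).abs()
    thr = err.mean() + k * err.std()
    return err > thr  # boolean mask segments the anomalous samples

clean = torch.sin(torch.linspace(0, 6.28, WIN)).repeat(64, 1)  # toy signal
train(clean)
noisy = clean[0].clone(); noisy[10] += 2.0                      # inject spike
print(flag_anomalies(noisy).nonzero().flatten())                # flags idx 10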
29 pages, 5213 KB  
Article
Design and Implementation of a Novel Intelligent Remote Calibration System Based on Edge Intelligence
by Quan Wang, Jiliang Fu, Xia Han, Xiaodong Yin, Jun Zhang, Xin Qi and Xuerui Zhang
Symmetry 2025, 17(9), 1434; https://doi.org/10.3390/sym17091434 - 3 Sep 2025
Abstract
Calibration of power equipment has become an essential task in modern power systems. This paper proposes a distributed remote calibration prototype based on a cloud–edge–end architecture by integrating intelligent sensing, Internet of Things (IoT) communication, and edge computing technologies. The prototype employs a high-precision frequency-to-voltage conversion module leveraging satellite signals to address traceability and value transmission challenges in remote calibration, thereby ensuring reliability and stability throughout the process. Additionally, an environmental monitoring module tracks parameters such as temperature, humidity, and electromagnetic interference. Combined with video surveillance and optical character recognition (OCR), this enables intelligent, end-to-end recording and automated data extraction during calibration. Furthermore, a cloud-edge task scheduling algorithm is implemented to offload computational tasks to edge nodes, maximizing resource utilization within the cloud–edge collaborative system and enhancing service quality. The proposed prototype extends existing cloud–edge collaboration frameworks by incorporating calibration instruments and sensing devices into the network, thereby improving the intelligence and accuracy of remote calibration across multiple layers. This approach also facilitates synchronized communication and calibration operations across symmetrically deployed remote facilities and reference devices, providing solid technical support to ensure that measurement equipment meets the required precision and performance criteria.
(This article belongs to the Section Computer)
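
The cloud-edge offloading decision at the heart of such scheduling can be illustrated with a minimal placement heuristic. The compute-plus-transfer cost model, node figures, and task sizes below are illustrative assumptions, not the paper's algorithm.

```python
# Sketch: place a calibration task on the node with the lowest estimated
# completion time (compute + transfer). All figures are illustrative.
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    mips: float         # available compute (million instructions/s)
    uplink_mbps: float  # bandwidth from the instrument to this node

def completion_time(task_mi, payload_mb, node):
    return task_mi / node.mips + payload_mb * 8 / node.uplink_mbps

def place(task_mi, payload_mb, nodes):
    return min(nodes, key=lambda n: completion_time(task_mi, payload_mb, n))

nodes = [Node("edge-1", 2_000, 100.0), Node("cloud", 20_000, 10.0)]
# OCR on a 50 MB calibration video, ~40,000 MI of work (assumed figures):
print(place(40_000, 50, nodes).name)  # -> "edge-1": transfer cost dominates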
22 pages, 1688 KB  
Article
LumiCare: A Context-Aware Mobile System for Alzheimer’s Patients Integrating AI Agents and 6G
by Nicola Dall’Ora, Lorenzo Felli, Stefano Aldegheri, Nicola Vicino and Romeo Giuliano
Electronics 2025, 14(17), 3516; https://doi.org/10.3390/electronics14173516 - 2 Sep 2025
Abstract
Alzheimer’s disease is a growing global health concern, demanding innovative solutions for early detection, continuous monitoring, and patient support. This article reviews recent advances in Smart Wearable Medical Devices (SWMDs), Internet of Things (IoT) systems, and mobile applications used to monitor physiological, behavioral, and cognitive changes in Alzheimer’s patients. We highlight the role of wearable sensors in detecting vital signs, falls, and geolocation data, alongside IoT architectures that enable real-time alerts and remote caregiver access. Building on these technologies, we present LumiCare, a conceptual, context-aware mobile system that integrates multimodal sensor data, chatbot-based interaction, and emerging 6G network capabilities. LumiCare uses machine learning for behavioral analysis, delivers personalized cognitive prompts, and enables emergency response through adaptive alerts and caregiver notifications. The system includes the LumiCare Companion, an interactive mobile app designed to support daily routines, cognitive engagement, and safety monitoring. By combining local AI processing with scalable edge-cloud architectures, LumiCare balances latency, privacy, and computational load. While promising, this work remains at the design stage and has not yet undergone clinical validation. Our analysis underscores the potential of wearable, IoT, and mobile technologies to improve the quality of life for Alzheimer’s patients, support caregivers, and reduce healthcare burdens.
(This article belongs to the Special Issue Smart Bioelectronics, Wearable Systems and E-Health)
12 pages, 304 KB  
Article
LoRA-INT8 Whisper: A Low-Cost Cantonese Speech Recognition Framework for Edge Devices
by Lusheng Zhang, Shie Wu and Zhongxun Wang
Sensors 2025, 25(17), 5404; https://doi.org/10.3390/s25175404 - 1 Sep 2025
Abstract
To address the triple bottlenecks of data scarcity, oversized models, and slow inference that hinder Cantonese automatic speech recognition (ASR) in low-resource and edge-deployment settings, this study proposes a cost-effective Cantonese ASR system based on LoRA fine-tuning and INT8 quantization. First, Whisper-tiny is parameter-efficiently fine-tuned on the Common Voice zh-HK training set using LoRA with rank = 8. Only 1.6% of the original weights are updated, reducing the character error rate (CER) from 49.5% to 11.1%, a performance close to full fine-tuning (10.3%), while cutting the training memory footprint and computational cost by approximately one order of magnitude. Next, the fine-tuned model is compressed into a 60 MB INT8 checkpoint via dynamic quantization in ONNX Runtime. On a MacBook Pro M1 Max CPU, the quantized model achieves an RTF = 0.20 (offline inference 5 × real-time) and 43% lower latency than the FP16 baseline; on an NVIDIA A10 GPU, it reaches RTF = 0.06, meeting the requirements of high-concurrency cloud services. Ablation studies confirm that the LoRA-INT8 configuration offers the best trade-off among accuracy, speed, and model size. Limitations include the absence of spontaneous-speech noise data, extreme-hardware validation, and adaptive LoRA structure optimization. Future work will incorporate large-scale self-supervised pre-training, tone-aware loss functions, AdaLoRA architecture search, and INT4/NPU quantization, and will establish an mJ/char energy–accuracy curve. The ultimate goal is to achieve CER ≤ 8%, RTF < 0.1, and mJ/char < 1 for low-power real-time Cantonese ASR in practical IoT scenarios.
(This article belongs to the Section Electronic Sensors)
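
The two compression steps named in the abstract, rank-8 LoRA fine-tuning and INT8 dynamic quantization, can be sketched with the peft and onnxruntime libraries. The target modules, remaining hyperparameters, and file paths are assumptions (the paper's exact configuration may differ), and the training/export step is elided.

```python
# Sketch: (1) rank-8 LoRA adapters on Whisper-tiny, (2) INT8 dynamic
# quantization of the exported ONNX graph. Paths/modules are assumptions.
from transformers import WhisperForConditionalGeneration
from peft import LoraConfig, get_peft_model
from onnxruntime.quantization import quantize_dynamic, QuantType

# (1) Wrap the frozen base model with rank-8 LoRA adapters.
base = WhisperForConditionalGeneration.from_pretrained("openai/whisper-tiny")
lora = LoraConfig(r=8, lora_alpha=16, lora_dropout=0.05,
                  target_modules=["q_proj", "v_proj"])  # assumed projections
model = get_peft_model(base, lora)
model.print_trainable_parameters()  # roughly 1-2% of weights are trainable

# ... fine-tune on Common Voice zh-HK, merge adapters, export to ONNX ...

# (2) Dynamically quantize the exported graph's weights to INT8.
quantize_dynamic("whisper_tiny_zh_hk.onnx",          # assumed export path
                 "whisper_tiny_zh_hk.int8.onnx",
                 weight_type=QuantType.QInt8)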
22 pages, 1672 KB  
Article
Optimizing Robotic Disassembly-Assembly Line Balancing with Directional Switching Time via an Improved Q(λ) Algorithm in IoT-Enabled Smart Manufacturing
by Qi Zhang, Yang Xing, Man Yao, Xiwang Guo, Shujin Qin, Haibin Zhu, Liang Qi and Bin Hu
Electronics 2025, 14(17), 3499; https://doi.org/10.3390/electronics14173499 - 1 Sep 2025
Abstract
With the growing adoption of circular economy principles in manufacturing, efficient disassembly and reassembly of end-of-life (EOL) products has become a key challenge in smart factories. This paper addresses the Disassembly and Assembly Line Balancing Problem (DALBP), which involves scheduling robotic tasks across workstations while minimizing total operation time and accounting for directional switching time between disassembly and assembly phases. To solve this problem, we propose an improved reinforcement learning algorithm, IQ(λ), which extends the classical Q(λ) method by incorporating eligibility trace decay, a dynamic Action Table mechanism to handle non-conflicting parallel tasks, and switching-aware reward shaping to penalize inefficient task transitions. Compared with standard Q(λ), these modifications enhance the algorithm’s global search capability, accelerate convergence, and improve solution quality in complex DALBP scenarios. While the current implementation does not deploy live IoT infrastructure, the architecture is modular and designed to support future extensions involving edge-cloud coordination, trust-aware optimization, and privacy-preserving learning in Industrial Internet of Things (IIoT) environments. Four real-world disassembly-assembly cases (flashlight, copier, battery, and hammer drill) are used to evaluate the algorithm’s effectiveness. Experimental results show that IQ(λ) consistently outperforms traditional Q-learning, Q(λ), and Sarsa in terms of solution quality, convergence speed, and robustness. Furthermore, ablation studies and sensitivity analysis confirm the importance of the algorithm’s core design components. This work provides a scalable and extensible framework for intelligent scheduling in cyber-physical manufacturing systems and lays a foundation for future integration with secure, IoT-connected environments.
(This article belongs to the Section Networks)
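
For reference, the tabular Watkins's Q(λ) mechanism that IQ(λ) builds on is sketched below: eligibility traces decay by γλ after greedy actions and are cut after exploratory ones. The toy chain environment is a stand-in for the DALBP task model, not the paper's setup.

```python
# Tabular Watkins's Q(lambda) sketch with eligibility traces (toy env).
import numpy as np

class ChainEnv:
    """Tiny 6-state chain: action 1 moves right, action 0 moves left;
    reaching the last state pays +1 and ends the episode (toy stand-in)."""
    n_states, n_actions = 6, 2
    def reset(self):
        self.s = 0
        return self.s
    def step(self, a):
        self.s = min(self.s + 1, 5) if a == 1 else max(self.s - 1, 0)
        done = self.s == 5
        return self.s, float(done), done

def q_lambda(env, episodes=200, alpha=0.1, gamma=0.95, lam=0.9, eps=0.1):
    Q = np.zeros((env.n_states, env.n_actions))
    for _ in range(episodes):
        E = np.zeros_like(Q)                     # eligibility traces
        s, done = env.reset(), False
        while not done:
            # Greedy action with random tie-breaking.
            greedy = int(np.random.choice(np.flatnonzero(Q[s] == Q[s].max())))
            a = np.random.randint(env.n_actions) if np.random.rand() < eps else greedy
            s2, r, done = env.step(a)
            delta = r + gamma * Q[s2].max() * (not done) - Q[s, a]
            E[s, a] += 1.0                       # accumulating trace
            Q += alpha * delta * E               # credit all eligible pairs
            if a == greedy:
                E *= gamma * lam                 # decay traces
            else:
                E[:] = 0.0                       # Watkins's cut after exploring
            s = s2
    return Q

print(q_lambda(ChainEnv()).argmax(axis=1))  # learned policy: mostly "right"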
29 pages, 1990 KB  
Review
Real-Time Digital Twins for Intelligent Fault Diagnosis and Condition-Based Monitoring of Electrical Machines
by Shahin Hedayati Kia, Larisa Dunai, José Alfonso Antonino-Daviu and Hubert Razik
Energies 2025, 18(17), 4637; https://doi.org/10.3390/en18174637 - 31 Aug 2025
Abstract
This article presents an overview of selected research focusing on digital real-time simulation (DRTS) in the context of digital twin (DT) realization with the primary aim of enabling the intelligent fault diagnosis (FD) and condition-based monitoring (CBM) of electrical machines. The concept of standalone DTs in conventional multiphysics digital offline simulations (DoSs) is widely utilized during the conceptualization and development phases of electrical machine manufacturing and processing, particularly for virtual testing under both standard and extreme operating conditions, as well as for aging assessments and lifecycle analysis. Recent advancements in data communication and information technologies, including virtual reality, cloud computing, parallel processing, machine learning, big data, and the Internet of Things (IoT), have facilitated the creation of real-time DTs based on physics-based (PHYB), circuit-oriented lumped-parameter (COLP), and data-driven approaches, as well as physics-informed machine learning (PIML), which is a combination of these models. These models are distinguished by their ability to enable real-time bidirectional data exchange with physical electrical machines. This article proposes a predictive-level framework with a particular emphasis on real-time multiphysics modeling to enhance the efficiency of the FD and CBM of electrical machines, which play a crucial role in various industrial applications.
37 pages, 3366 KB  
Article
Golden Seal Project: An IoT-Driven Framework for Marine Litter Monitoring and Public Engagement in Tourist Areas
by Dimitra Tzanetou, Stavros Ponis, Eleni Aretoulaki, George Plakas and Antonios Kitsantas
Appl. Sci. 2025, 15(17), 9564; https://doi.org/10.3390/app15179564 - 30 Aug 2025
Abstract
This paper presents the research outcomes of the Golden Seal project, which addresses the pervasive issue of plastic pollution in coastal areas while enhancing their touristic value through the deployment of Internet of Things (IoT) technologies integrated into a gamified recycling framework. The developed system employs an IoT-enabled Wireless Sensor Network (WSN) to systematically collect, transmit, and analyze environmental data. A centralized, cloud-based platform supports real-time monitoring and data integration from Unmanned Aerial and Surface Vehicles (UAVs and USVs) equipped with sensors and high-resolution cameras. The system also introduces the Beach Cleanliness Index (BCI), a composite indicator that integrates quantitative environmental metrics with user-generated feedback to assess coastal cleanliness in real time. A key innovation of the project’s architecture is the incorporation of a Serious Game (SG), designed to foster public awareness and encourage active participation by local communities and municipal authorities in sustainable waste management practices. Pilot implementations were conducted at selected sites characterized by high tourism activity and accessibility. The results demonstrated the system’s effectiveness in detecting and classifying plastic waste in both coastal and terrestrial settings, while also validating the potential of the Golden Seal initiative to promote sustainable tourism and support marine ecosystem protection.
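
A composite index like the BCI is typically a weighted blend of normalized metrics. The sketch below shows the general shape of such a computation; the metric names, normalization bounds, and weights are illustrative assumptions, not the paper's definition.

```python
# Sketch of a weighted composite cleanliness index (illustrative values).
BOUNDS = {"litter_items_per_100m": (0, 200), "bin_fill_pct": (0, 100)}
WEIGHTS = {"litter_items_per_100m": 0.5, "bin_fill_pct": 0.2, "user_score": 0.3}

def normalize(metric, value):
    lo, hi = BOUNDS[metric]
    return min(max((value - lo) / (hi - lo), 0.0), 1.0)

def bci(litter_per_100m, bin_fill_pct, user_score_0_1):
    # Blend normalized "dirtiness" signals; user feedback enters inverted.
    dirtiness = (WEIGHTS["litter_items_per_100m"]
                 * normalize("litter_items_per_100m", litter_per_100m)
                 + WEIGHTS["bin_fill_pct"] * normalize("bin_fill_pct", bin_fill_pct)
                 + WEIGHTS["user_score"] * (1.0 - user_score_0_1))
    return 100 * (1.0 - dirtiness)  # 0-100 score; higher is cleaner

print(bci(litter_per_100m=30, bin_fill_pct=60, user_score_0_1=0.8))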
40 pages, 696 KB  
Review
Embedded Artificial Intelligence: A Comprehensive Literature Review
by Xiaoyuan Huang, Hongcheng Wang, Shiyin Qin and Su-Kit Tang
Electronics 2025, 14(17), 3468; https://doi.org/10.3390/electronics14173468 - 29 Aug 2025
Abstract
Embedded Artificial Intelligence (EAI) integrates AI technologies with resource-constrained embedded systems, overcoming the limitations of cloud AI in aspects such as latency and energy consumption, thereby empowering edge devices with autonomous decision-making and real-time intelligence. This review provides a comprehensive overview of this rapidly evolving field, systematically covering its definition, hardware platforms, software frameworks and tools, core algorithms (including lightweight models), and detailed deployment processes. It also discusses its widespread applications in key areas like autonomous driving and smart Internet of Things (IoT), as well as emerging directions. By analyzing its core challenges and innovative opportunities in algorithms, hardware, and frameworks, this review aims to provide relevant researchers and developers with a practical guidance framework, promoting technological innovation and adoption.
(This article belongs to the Section Artificial Intelligence)
21 pages, 4084 KB  
Article
Integration of Cloud-Based Central Telemedicine System and IoT Device Networks
by Parin Sornlertlamvanich, Chatdanai Phakaket, Panya Hantula, Sarunya Kanjanawattana, Nuntawut Kaoungku and Komsan Srivisut
Computers 2025, 14(9), 357; https://doi.org/10.3390/computers14090357 - 29 Aug 2025
Abstract
The growing challenges in healthcare services, such as hospital congestion and a persistent shortage of medical personnel, significantly impede effective service delivery, particularly the continuous monitoring of patients with chronic diseases. Internet of Things (IoT)-based telemonitoring systems offer a promising solution to alleviate these challenges. However, transmitting sensitive and confidential patient health data requires a strong focus on end-to-end security. This includes securing sensitive data within the patient’s home network, during internet transmission, and at the endpoint system, as well as managing sensitive data. In this study, we propose a secure and scalable architecture for a remote health monitoring system that integrates telemedicine technology with the IoT. The proposed solution includes a portable remote health monitoring device, an IoT Gateway appliance (IoT GW) in the patient’s home, and an IoT Application Gateway Endpoint (App GW Endpoint) on a cloud infrastructure. A secure communication channel was established by implementing a multi-layered security protocol stack that uses HTTPS over Quick UDP Internet Connections (QUIC), with a focus on optimal security and compatibility, prioritizing cipher suites for data confidentiality and device authentication. The cloud architecture is designed based on the Well-Architected Framework principles to ensure security, high availability, and scalability. Our study shows that patient health information is stored reliably and efficiently. Furthermore, the results for processing and transmission times clearly demonstrate that the additional encryption mechanisms have a negligible effect on data transmission latency, while significantly improving security.
(This article belongs to the Section Cloud Continuum and Enabled Applications)
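
The gateway-to-endpoint hop might look like the sketch below. For brevity it uses ordinary HTTPS over TLS via the requests library rather than the paper's HTTPS-over-QUIC stack; the endpoint URL and payload schema are assumptions.

```python
# Sketch: home IoT gateway forwards a vitals payload to the cloud
# App GW Endpoint over HTTPS with certificate verification.
# URL and payload fields are hypothetical.
import requests, time

ENDPOINT = "https://appgw.example-hospital.org/v1/vitals"  # hypothetical

def forward_vitals(device_id, heart_rate_bpm, spo2_pct, systolic, diastolic):
    payload = {
        "device_id": device_id,
        "ts": int(time.time()),
        "heart_rate": heart_rate_bpm,
        "spo2": spo2_pct,
        "bp": {"sys": systolic, "dia": diastolic},
    }
    # verify=True enforces server certificate validation on the channel.
    r = requests.post(ENDPOINT, json=payload, timeout=5, verify=True)
    r.raise_for_status()

forward_vitals("gw-042", 72, 98, 118, 76)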
25 pages, 5957 KB  
Article
Benchmarking IoT Simulation Frameworks for Edge–Fog–Cloud Architectures: A Comparative and Experimental Study
by Fatima Bendaouch, Hayat Zaydi, Safae Merzouk and Saliha Assoul
Future Internet 2025, 17(9), 382; https://doi.org/10.3390/fi17090382 - 26 Aug 2025
Abstract
Current IoT systems are structured around Edge, Fog, and Cloud layers to manage data and resource constraints more effectively. Although several studies have examined IoT simulators from a functional angle, few have combined technical comparisons with experimental validation under realistic conditions. This lack of integration limits the practical value of prior results and complicates tool selection for distributed architectures. This work introduces a selection and evaluation methodology for simulators that explicitly represent the Edge–Fog–Cloud continuum. Thirteen open-source tools are analyzed based on functional, technical, and operational features. Among them, iFogSim2 and FogNetSim++ are selected for a detailed experimental comparison on their support of mobility, resource allocation, and energy modeling across all layers. A shared hybrid IoT scenario is simulated using eight key metrics: execution time, application loop delay, CPU processing time per tuple, energy consumption, cloud execution cost, network usage, scalability, and robustness. The analysis reveals distinct modeling strategies: FogNetSim++ reduces loop latency by 48% and maintains stable performance at scale but shows high data loss under overload. In contrast, iFogSim2 consumes up to 80% less energy and preserves message continuity in stressful conditions, albeit with longer execution times. These outcomes reflect the trade-offs between modeling granularity, performance stability, and system resilience.
10 pages, 2169 KB  
Proceeding Paper
Comparative Performance Analysis of Data Transmission Protocols for Sensor-to-Cloud Applications: An Experimental Evaluation
by Filip Tsvetanov and Martin Pandurski
Eng. Proc. 2025, 104(1), 35; https://doi.org/10.3390/engproc2025104035 - 25 Aug 2025
Abstract
This paper examines some of the most popular protocols, drawn from the publish/subscribe and request/response IoT models, for transmitting sensor data to cloud platforms. Selecting a highly efficient message transmission protocol is essential and depends on the specific characteristics and purpose of the developed IoT system, including communication requirements, message size and format, energy efficiency, reliability, and cloud specifications. No single standardized protocol covers all application scenarios; for each project, the protocol that best meets its specific requirements must be selected. This work focuses on finding the most appropriate protocol for integrating sensor data into a suitable open-source IoT platform, ThingsBoard. First, we conduct a comparative analysis of the studied protocols. Then, we present an experiment transmitting data from a stationary XBee sensor network to the ThingsBoard cloud via the HTTP, MQTT-SN, and CoAP protocols, measuring each protocol’s packet transmission delay and its load on the CPU and RAM. The experimental results for stationary sensor networks collecting environmental data favor the MQTT-SN protocol, which showed lower delay and lower processor and memory load than the other two, translating into higher energy efficiency and longer sensor and sensor-network lifetime. These results can help users make operational judgments for their IoT applications.
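
For orientation, ThingsBoard's documented MQTT device API accepts JSON telemetry on the v1/devices/me/telemetry topic, authenticated by a device access token. The sketch below shows that leg with paho-mqtt; the host and token are placeholders, and the paper's XBee nodes used MQTT-SN via a gateway rather than plain MQTT.

```python
# Sketch: publish one sensor sample to ThingsBoard over MQTT
# (paho-mqtt 1.x API; host and access token are placeholders).
import json
import paho.mqtt.client as mqtt

HOST, TOKEN = "thingsboard.example.org", "DEVICE_ACCESS_TOKEN"

client = mqtt.Client()
client.username_pw_set(TOKEN)        # ThingsBoard authenticates by token
client.connect(HOST, 1883, keepalive=60)
client.loop_start()                  # background network loop

sample = {"temperature": 22.4, "humidity": 57.1}
info = client.publish("v1/devices/me/telemetry", json.dumps(sample), qos=1)
info.wait_for_publish()              # block until the broker acknowledges

client.loop_stop()
client.disconnect()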