Search Results (753)

Search Parameters:
Keywords = internet speed

22 pages, 4650 KiB  
Article
IoT Monitoring and Evaluating System for the Construction Quality of Foundation Pile
by Kai Wu, Peng Zhang, Jiejun Yuan, Xiaqing Qian and Runen Qi
Buildings 2025, 15(15), 2660; https://doi.org/10.3390/buildings15152660 - 28 Jul 2025
Viewed by 239
Abstract
The quality of foundation piles is strongly influenced by human factors, and quality assessment typically lags the construction process. This paper introduces a new evaluation system based on Internet of Things (IoT) monitoring data from the foundation pile construction process. First, an IoT monitoring system is established to track the key quality-control parameters during pile construction, such as pile length, position, verticality, water–cement ratio, grouting volume, and drilling/lifting speed. Next, the absolute gray relational degree analysis method and a combined analytic hierarchy process (AHP)–entropy weighting method are used to divide the monitoring data into levels and to determine the weight coefficients of the quality indicators. Last, the IoT monitoring and evaluation system is applied to a real engineering project. The results indicate that the monitoring system is convenient and efficient and that the quality evaluation method is reliable. The construction process quality of the cement-mixing piles is rated as excellent; bored piles Z0103 and Z0232 are rated excellent, and pile Z0012 is rated qualified.
(This article belongs to the Section Building Structures)
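The entropy half of the AHP–entropy combination weighting above is mechanical enough to sketch. Below is a minimal entropy-weight implementation in Python; the indicator names and monitoring values are invented for illustration and are not taken from the paper.

```python
import numpy as np

def entropy_weights(X):
    """Entropy-weight method: rows = monitored samples, columns = quality indicators.
    Indicators whose values vary more across samples carry more information
    and therefore receive larger weights."""
    X = np.asarray(X, dtype=float)
    n = X.shape[0]
    P = X / X.sum(axis=0)                      # column-wise proportions
    with np.errstate(divide="ignore", invalid="ignore"):
        plogp = np.where(P > 0, P * np.log(P), 0.0)
    e = -plogp.sum(axis=0) / np.log(n)         # entropy per indicator, in [0, 1]
    d = 1.0 - e                                # degree of divergence
    return d / d.sum()                         # normalized weights

# Toy monitoring data: 4 piles x 3 indicators (verticality ratio,
# grout volume ratio, lifting-speed ratio) -- hypothetical numbers
X = [[0.98, 1.02, 0.95],
     [0.97, 1.30, 0.96],
     [0.99, 0.80, 0.94],
     [0.98, 1.10, 0.95]]
w = entropy_weights(X)
```

The second indicator varies most across the toy samples, so it receives the largest weight; in the paper these entropy weights are further combined with AHP judgments.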

14 pages, 1179 KiB  
Article
Dual-Core Hierarchical Fuzzing Framework for Efficient and Secure Firmware Over-the-Air
by Na-Hyun Kim, Jin-Min Lee and Il-Gu Lee
Electronics 2025, 14(14), 2886; https://doi.org/10.3390/electronics14142886 - 18 Jul 2025
Viewed by 202
Abstract
As the use of Internet of Things (IoT) devices becomes extensive, ensuring their security has become a critical issue for both individuals and organizations, particularly as these devices collect, transmit, and analyze diverse data. The firmware of IoT devices plays a key role in ensuring system security; any vulnerabilities in the firmware can expose the system to threats such as hacking or malware infections. Consequently, fuzzing is used to analyze firmware vulnerabilities during the update process. However, conventional single-core, random-fuzzing-based firmware vulnerability analysis techniques suffer from low efficiency, limited security, and high memory usage. Each time the firmware is updated, the entire file—including previously analyzed code—must be reanalyzed. Moreover, because the firmware is not layered, unaffected code segments are redundantly reanalyzed. To address these limitations, this study proposes a dual-core hierarchical partial fuzzing technique for firmware updates over wireless networks. Experimental results show that the proposed technique detects 11 more unique crashes within 300 s and finds 2435 more total crashes than the conventional scheme, while reducing memory usage by 35 KiB. The proposed technique improves the speed, effectiveness, and reliability of firmware updates and vulnerability detection.
(This article belongs to the Special Issue IoT Security in the Age of AI: Innovative Approaches and Technologies)

21 pages, 2065 KiB  
Article
Enhancing Security in 5G and Future 6G Networks: Machine Learning Approaches for Adaptive Intrusion Detection and Prevention
by Konstantinos Kalodanis, Charalampos Papapavlou and Georgios Feretzakis
Future Internet 2025, 17(7), 312; https://doi.org/10.3390/fi17070312 - 18 Jul 2025
Viewed by 351
Abstract
The evolution from 4G to 5G—and eventually to the forthcoming 6G networks—has revolutionized wireless communications by enabling high-speed, low-latency services that support a wide range of applications, including the Internet of Things (IoT), smart cities, and critical infrastructures. However, the unique characteristics of these networks—extensive connectivity, device heterogeneity, and architectural flexibility—impose significant security challenges. This paper introduces a comprehensive framework for enhancing the security of current and emerging wireless networks by integrating state-of-the-art machine learning (ML) techniques into intrusion detection and prevention systems. It also thoroughly explores the key aspects of wireless network security, including architectural vulnerabilities in both 5G and future 6G networks, novel ML algorithms tailored to address evolving threats, privacy-preserving mechanisms, and regulatory compliance with the EU AI Act. Finally, a Wireless Intrusion Detection Algorithm (WIDA) is proposed, demonstrating promising results in improving wireless network security. Full article
(This article belongs to the Special Issue Advanced 5G and Beyond Networks)

16 pages, 2354 KiB  
Proceeding Paper
Design and Implementation of a Passive Optical Network for a Small Town
by Fatima Sapundzhi, Boyko Zarev, Slavi Georgiev, Snezhinka Zaharieva, Metodi Popstoilov and Meglena Lazarova
Eng. Proc. 2025, 100(1), 40; https://doi.org/10.3390/engproc2025100040 - 15 Jul 2025
Viewed by 236
Abstract
The increasing demand for high-speed internet and advanced digital services necessitates the deployment of robust and scalable broadband infrastructure, particularly in smaller urban and rural areas. This paper presents the design and implementation of a passive optical network (PON) based on a gigabit-capable passive optical network (GPON) standard to deliver fiber-to-the-home (FTTH) services in a small-town setting. The proposed solution prioritizes cost-effectiveness, scalability, and minimal energy consumption by leveraging passive splitters and unpowered network elements. We detail the topology planning, splitter architecture, installation practices, and technical specifications that ensure efficient signal distribution and future network expansion. The results demonstrate the successful implementation of an optical access infrastructure that supports high-speed internet, Internet Protocol television (IPTV), and voice services while maintaining flexibility for diverse urban layouts and housing types. Full article
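A core design check in a GPON/FTTH rollout like the one described is the optical power budget: launch power minus all passive losses must stay above the receiver sensitivity. The sketch below uses typical GPON Class B+ figures as assumptions; none of the numbers come from the paper.

```python
# Typical GPON Class B+ budget figures (assumptions, not from the paper)
TX_POWER_DBM = 3.0                # OLT launch power
RX_SENS_DBM = -28.0               # ONT receiver sensitivity
FIBER_LOSS_DB_PER_KM = 0.35       # single-mode fiber at 1490 nm
SPLITTER_LOSS_DB = {2: 3.7, 4: 7.3, 8: 10.5, 16: 14.1, 32: 17.5}
CONNECTOR_LOSS_DB = 0.5           # per connector/splice point
MARGIN_DB = 3.0                   # safety margin for aging and repairs

def link_ok(distance_km, split_ratio, n_connectors=4):
    """True if the received power stays above sensitivity after all passive losses."""
    loss = (distance_km * FIBER_LOSS_DB_PER_KM
            + SPLITTER_LOSS_DB[split_ratio]
            + n_connectors * CONNECTOR_LOSS_DB)
    return TX_POWER_DBM - loss - MARGIN_DB >= RX_SENS_DBM
```

With these figures, a 1:32 split closes comfortably at small-town distances (a few km) but fails once the feeder run grows to tens of kilometers, which is why split ratio and reach are traded off during topology planning.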

18 pages, 1184 KiB  
Article
A Confidential Transmission Method for High-Speed Power Line Carrier Communications Based on Generalized Two-Dimensional Polynomial Chaotic Mapping
by Zihan Nie, Zhitao Guo and Jinli Yuan
Appl. Sci. 2025, 15(14), 7813; https://doi.org/10.3390/app15147813 - 11 Jul 2025
Viewed by 297
Abstract
The deep integration of smart grid and Internet of Things technologies has made high-speed power line carrier communication a key technology in energy management, industrial monitoring, and smart home applications, owing to its advantages of requiring no additional wiring and offering wide coverage. However, the inherent characteristics of power line channels, such as strong noise, multipath fading, and time-varying properties, pose challenges to traditional encryption algorithms, including low key distribution efficiency and weak anti-interference capability. These issues become particularly pronounced in high-speed transmission scenarios, where the conflict between data security and communication reliability is more acute. To address this problem, a secure transmission method for high-speed power line carrier communication based on generalized two-dimensional polynomial chaotic mapping is proposed. A high-speed power line carrier communication network is established using a power line carrier routing algorithm based on the minimal connected dominating set. An autoregressive moving average model is employed to estimate the transmission fluctuation deviation of the network. Leveraging the complex dynamic behavior and anti-decoding capability of the generalized two-dimensional polynomial chaotic mapping, combined with this deviation, the communication key is generated, yielding ciphertext that resists power line noise interference and signal attenuation and thereby enhances communication confidentiality and stability. By applying reference-modulation differential chaotic shift keying to this ciphertext, a secure transmission scheme for high-speed power line carrier communication is designed. The experimental results demonstrate that this method can effectively establish a high-speed power line carrier communication network and encrypt information. The maximum error rate obtained by this method is 0.051 and the minimum is 0.010, confirming its ability to ensure secure transmission while improving communication confidentiality.
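The keystream idea can be sketched with any two-dimensional polynomial chaotic map. The toy below substitutes the classic Hénon map for the paper's generalized map and XORs the quantized byte stream with the plaintext; the map choice, quantization rule, and parameters are illustrative assumptions, not the paper's construction.

```python
def chaotic_keystream(x0, y0, n, a=1.4, b=0.3):
    """Iterate the Henon map (a simple 2D polynomial chaotic map) and
    quantize each state into one key byte. Illustrative only."""
    x, y = x0, y0
    out = bytearray()
    for _ in range(n):
        x, y = 1 - a * x * x + y, b * x     # polynomial update rule
        out.append(int(abs(x) * 1e6) % 256) # quantize state -> key byte
    return bytes(out)

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR stream cipher: the same call encrypts and decrypts."""
    return bytes(d ^ k for d, k in zip(data, key))

msg = b"meter reading: 42.7 kWh"
ks = chaotic_keystream(0.1, 0.1, len(msg))
ct = xor_cipher(msg, ks)
```

Because the map is chaotic, a tiny change in the initial state (the key material, derived in the paper from the ARMA fluctuation deviation) produces a completely different keystream, which is the sensitivity property the scheme relies on.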

21 pages, 1207 KiB  
Article
Flash-Attention-Enhanced Multi-Agent Deep Deterministic Policy Gradient for Mobile Edge Computing in Digital Twin-Powered Internet of Things
by Yuzhe Gao, Xiaoming Yuan, Songyu Wang, Lixin Chen, Zheng Zhang and Tianran Wang
Mathematics 2025, 13(13), 2164; https://doi.org/10.3390/math13132164 - 2 Jul 2025
Viewed by 321
Abstract
Offloading decisions and resource allocation in mobile edge computing (MEC) are key challenges, as they directly impact system performance and user experience in dynamic, resource-constrained Internet of Things (IoT) environments. This paper constructs a comprehensive, layered digital twin (DT) model for MEC, enabling real-time cooperation with the physical world and intelligent decision making. Within this model, a novel Flash-Attention-enhanced Multi-Agent Deep Deterministic Policy Gradient (FA-MADDPG) algorithm is proposed to tackle these MEC problems. It equips the critic network with an attention mechanism to support high-quality decisions, and it restructures a matrix operation to accelerate training. Experiments performed in the proposed DT environment demonstrate that FA-MADDPG converges well. Compared with other algorithms, it achieves excellent delay and energy-consumption performance under various settings, with high time efficiency.

28 pages, 2850 KiB  
Article
Quantification and Evolution of Online Public Opinion Heat Considering Interactive Behavior and Emotional Conflict
by Zhengyi Sun, Deyao Wang and Zhaohui Li
Entropy 2025, 27(7), 701; https://doi.org/10.3390/e27070701 - 29 Jun 2025
Viewed by 360
Abstract
With the rapid development of the Internet, the speed and scope with which sudden public events disseminate in cyberspace have grown significantly. Current methods of quantifying public opinion heat often neglect emotion-driven factors and user interaction behaviors, making it difficult to accurately capture fluctuations during dissemination. To address these issues, this study first introduced an approach that employs the information gain ratio as a weighting indicator to measure the “interaction heat” contributed by different interaction attributes during event evolution. Second, the study built on SnowNLP with expanded textual features to conduct in-depth sentiment mining of large-scale opinion texts, defining the variance of netizens’ emotional tendencies as an indicator of emotional fluctuation, thereby capturing “emotional heat”. Interactive behavior and emotional conflict were then integrated into a comprehensive heat index for quantifying and analyzing the dynamic evolution of online public opinion heat. Subsequently, the Hodrick–Prescott filter was used to separate long-term trends from short-term fluctuations, six key quantitative features were extracted (number of peaks, time of first peak, maximum amplitude, decay time, peak emotional conflict, and overall duration), and K-means clustering was applied to classify events into three propagation patterns: extreme burst, normal burst, and long-tail. Finally, ablation experiments on critical external intervention nodes quantified the distinct contribution of each intervention to the propagation trend by observing changes in the model’s goodness of fit (R2) after removing different interventions. Through an empirical analysis of six representative public opinion events from 2024, this study verified the effectiveness of the proposed framework and uncovered critical characteristics of opinion dissemination, including explosiveness versus persistence, multi-round dissemination with recurring emotional fluctuations, and the interplay of multiple driving factors.
(This article belongs to the Special Issue Statistical Physics Approaches for Modeling Human Social Systems)
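The Hodrick–Prescott trend/cycle split used in that pipeline can be reproduced in a few lines by solving the HP normal equations directly. The heat series and smoothing parameter below are synthetic, invented only to show the decomposition.

```python
import numpy as np

def hp_filter(y, lam=1600.0):
    """Hodrick-Prescott filter: split a series into trend + cycle.
    Solves (I + lam * D'D) trend = y, where D is the second-difference operator."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    D = np.diff(np.eye(n), n=2, axis=0)   # (n-2) x n second-difference matrix
    trend = np.linalg.solve(np.eye(n) + lam * D.T @ D, y)
    return trend, y - trend               # (long-term trend, short-term cycle)

# Synthetic "opinion heat": a slow rise-and-decay plus bursty wiggles
t = np.arange(100)
heat = np.exp(-((t - 40) / 25.0) ** 2) + 0.05 * np.sin(t)
trend, cycle = hp_filter(heat, lam=100.0)
```

The trend component is what the study's six quantitative features (first-peak time, decay time, etc.) would be read off, while the cycle captures short-term bursts.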

18 pages, 2290 KiB  
Article
Improving MRAM Performance with Sparse Modulation and Hamming Error Correction
by Nam Le, Thien An Nguyen, Jong-Ho Lee and Jaejin Lee
Sensors 2025, 25(13), 4050; https://doi.org/10.3390/s25134050 - 29 Jun 2025
Viewed by 426
Abstract
With the rise of the Internet of Things (IoT), smart sensors are increasingly being deployed as compact edge processing units, necessitating continuously writable memory with low power consumption and fast access times. Magnetic random-access memory (MRAM) has emerged as a promising non-volatile alternative to conventional DRAM and SDRAM, offering advantages such as faster access speeds, reduced power consumption, and enhanced endurance. However, MRAM is subject to challenges including process variations and thermal fluctuations, which can induce random bit errors and result in imbalanced probabilities of 0 and 1 bits. To address these issues, we propose a novel sparse coding scheme characterized by a minimum Hamming distance of three. During the encoding process, three check bits are appended to the user data and processed using a generator matrix. If the resulting codeword fails to satisfy the sparsity constraint, it is inverted to comply with the coding requirement. This method is based on the error characteristics inherent in MRAM to facilitate effective error correction. Furthermore, we introduce a dynamic threshold detection technique that updates bit probability estimates in real time during data transmission. Simulation results demonstrate substantial improvements in both error resilience and decoding accuracy, particularly as MRAM density increases. Full article
(This article belongs to the Section Electronic Sensors)
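The coding scheme described (four data bits, three appended check bits, minimum Hamming distance three, with inversion to satisfy a sparsity constraint) is essentially a Hamming(7,4) code plus codeword complementation. A minimal sketch follows; the particular generator matrix and the weight-3 sparsity rule are illustrative choices, not necessarily those of the paper.

```python
import numpy as np

# Hamming(7,4): codeword = [4 data bits | 3 check bits], minimum distance 3
G = np.array([[1,0,0,0,1,1,0],
              [0,1,0,0,1,0,1],
              [0,0,1,0,0,1,1],
              [0,0,0,1,1,1,1]])
H = np.array([[1,1,0,1,1,0,0],
              [1,0,1,1,0,1,0],
              [0,1,1,1,0,0,1]])

def encode(data4):
    cw = (np.array(data4) @ G) % 2
    if cw.sum() > 3:        # sparsity rule: keep at most three 1s
        cw = 1 - cw         # the all-ones word is itself a codeword,
                            # so the complement is still a valid codeword
    return cw

def correct(recv):
    """Fix any single bit flip: the syndrome equals the column of H
    at the flipped position."""
    syn = (H @ recv) % 2
    if syn.any():
        pos = int(np.where((H.T == syn).all(axis=1))[0][0])
        recv = recv.copy()
        recv[pos] ^= 1
    return recv

cw = encode([1, 0, 1, 1])
noisy = cw.copy()
noisy[2] ^= 1               # simulate one MRAM bit error
```

A real decoder would additionally recover whether inversion was applied, which is where the paper's dynamic threshold detection of bit probabilities comes in.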

25 pages, 7855 KiB  
Article
Latency-Sensitive Wireless Communication in Dynamically Moving Robots for Urban Mobility Applications
by Jakub Krejčí, Marek Babiuch, Jiří Suder, Václav Krys and Zdenko Bobovský
Smart Cities 2025, 8(4), 105; https://doi.org/10.3390/smartcities8040105 - 25 Jun 2025
Viewed by 713
Abstract
Reliable wireless communication is essential for mobile robotic systems operating in dynamic environments, particularly in the context of smart mobility and cloud-integrated urban infrastructures. This article presents an experimental study analyzing the impact of robot motion dynamics on wireless network performance, contributing to the broader discussion on data reliability and communication efficiency in intelligent transportation systems. Measurements were conducted using a quadruped robot equipped with an onboard edge computing device, navigating predefined trajectories in a laboratory setting designed to emulate real-world variability. Key wireless parameters, including signal strength (RSSI), latency, and packet loss, were continuously monitored alongside robot kinematic data such as speed, orientation (roll, pitch, yaw), and movement patterns. The results show a significant correlation between dynamic motion—especially high forward velocities and rotational maneuvers—and degradations in network performance. Increased robot speeds and frequent orientation changes were associated with elevated latency and greater packet loss, while static or low-motion periods exhibited more stable communication. These findings highlight critical challenges for real-time data transmission in mobile IoRT (Internet of Robotic Things) systems, and emphasize the role of network-aware robotic behavior, interoperable communication protocols, and edge-to-cloud data integration in ensuring robust wireless performance within smart city environments. Full article
(This article belongs to the Special Issue Smart Mobility: Linking Research, Regulation, Innovation and Practice)
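The reported speed-versus-latency correlation is the kind of relationship a Pearson coefficient quantifies. The sketch below computes it on invented sample pairs; the numbers are not measurements from the study.

```python
import numpy as np

def pearson_r(a, b):
    """Pearson correlation coefficient between two equal-length samples."""
    a = np.array(a, dtype=float)
    b = np.array(b, dtype=float)
    a -= a.mean()
    b -= b.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

# Hypothetical paired samples: robot forward speed vs. observed latency
speed   = [0.0, 0.2, 0.5, 0.9, 1.4, 1.6]   # m/s
latency = [ 12,  14,  18,  25,  41,  55]   # ms
r = pearson_r(speed, latency)
```

A value of r close to +1, as in this toy data, would indicate that latency rises almost linearly with speed; the study reports exactly this kind of positive association between motion dynamics and network degradation.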

39 pages, 1839 KiB  
Review
The Integration of the Internet of Things (IoT) Applications into 5G Networks: A Review and Analysis
by Aymen I. Zreikat, Zakwan AlArnaout, Ahmad Abadleh, Ersin Elbasi and Nour Mostafa
Computers 2025, 14(7), 250; https://doi.org/10.3390/computers14070250 - 25 Jun 2025
Cited by 1 | Viewed by 1641
Abstract
The incorporation of Internet of Things (IoT) applications into 5G networks marks a significant step towards realizing the full potential of connected systems. 5G networks, with their ultra-low latency, high data speeds, and massive interconnection, provide a perfect foundation for IoT ecosystems to thrive. This connectivity supports a diverse set of applications, including smart cities, self-driving cars, industrial automation, healthcare monitoring, and agricultural solutions. IoT devices can improve their reliability, real-time communication, and scalability by exploiting 5G’s advanced capabilities such as network slicing, edge computing, and enhanced mobile broadband. Furthermore, the convergence of IoT with 5G fosters interoperability, allowing for smooth communication across diverse devices and networks. This study examines the fundamental technical applications, obstacles, and future perspectives for integrating IoT applications with 5G networks, emphasizing the potential benefits while also addressing essential concerns such as security, energy efficiency, and network management. The results of this review and analysis will serve as a valuable resource for researchers, industry experts, and policymakers involved in the progression of 5G technologies and their integration with IoT solutions.

23 pages, 2431 KiB  
Article
SatScope: A Data-Driven Simulator for Low-Earth-Orbit Satellite Internet
by Qichen Wang, Guozheng Yang, Yongyu Liang, Chiyu Chen, Qingsong Zhao and Sugai Chen
Future Internet 2025, 17(7), 278; https://doi.org/10.3390/fi17070278 - 24 Jun 2025
Viewed by 396
Abstract
The rapid development of low-Earth-orbit (LEO) satellite constellations has not only provided global users with low-latency and unrestricted high-speed data services but also presented researchers with the challenge of understanding dynamic changes in global network behavior. Unlike geostationary satellites and terrestrial internet infrastructure, LEO satellites move at a relative velocity of 7.6 km/s, leading to frequent alterations in their connectivity status with ground stations. Given the complexity of the space environment, current research on LEO satellite internet primarily focuses on modeling and simulation. However, existing LEO satellite network simulators often overlook the global network characteristics of these systems. We present SatScope, a data-driven simulator for LEO satellite internet. SatScope consists of three main components, space segment modeling, ground segment modeling, and network simulation configuration, providing researchers with an interface to interact with these models. Utilizing both space and ground segment models, SatScope can configure various network topology models, routing algorithms, and load balancing schemes, thereby enabling the evaluation of optimization algorithms for LEO satellite communication systems. We also compare SatScope’s fidelity, lightweight design, scalability, and openness against other simulators. Based on our simulation results using SatScope, we propose two metrics—ground node IP coverage rate and the number of satellite service IPs—to assess the service performance of single-layer satellite networks. Our findings reveal that during each network handover, on average, 38.94% of nodes and 83.66% of links change. Full article
(This article belongs to the Section Internet of Things)

29 pages, 8644 KiB  
Review
Recent Advances in Resistive Gas Sensors: Fundamentals, Material and Device Design, and Intelligent Applications
by Peiqingfeng Wang, Shusheng Xu, Xuerong Shi, Jiaqing Zhu, Haichao Xiong and Huimin Wen
Chemosensors 2025, 13(7), 224; https://doi.org/10.3390/chemosensors13070224 - 21 Jun 2025
Viewed by 810
Abstract
Resistive gas sensors have attracted significant attention due to their simple architecture, low cost, and ease of integration, with widespread applications in environmental monitoring, industrial safety, and healthcare diagnostics. This review provides a comprehensive overview of recent advances in resistive gas sensors, focusing on their fundamental working mechanisms, sensing material design, device architecture optimization, and intelligent system integration. These sensors primarily operate based on changes in electrical resistance induced by interactions between gas molecules and sensing materials, including physical adsorption, charge transfer, and surface redox reactions. In terms of materials, metal oxide semiconductors, conductive polymers, carbon-based nanomaterials, and their composites have demonstrated enhanced sensitivity and selectivity through strategies such as doping, surface functionalization, and heterojunction engineering, while also enabling reduced operating temperatures. Device-level innovations—such as microheater integration, self-heated nanowires, and multi-sensor arrays—have further improved response speed and energy efficiency. Moreover, the incorporation of artificial intelligence (AI) and Internet of Things (IoT) technologies has significantly advanced signal processing, pattern recognition, and long-term operational stability. Machine learning (ML) algorithms have enabled intelligent design of novel sensing materials, optimized multi-gas identification, and enhanced data reliability in complex environments. These synergistic developments are driving resistive gas sensors toward low-power, highly integrated, and multifunctional platforms, particularly in emerging applications such as wearable electronics, breath diagnostics, and smart city infrastructure. 
This review concludes with a perspective on future research directions, emphasizing the importance of improving material stability, interference resistance, standardized fabrication, and intelligent system integration for large-scale practical deployment. Full article

26 pages, 623 KiB  
Article
Significance of Machine Learning-Driven Algorithms for Effective Discrimination of DDoS Traffic Within IoT Systems
by Mohammed N. Alenezi
Future Internet 2025, 17(6), 266; https://doi.org/10.3390/fi17060266 - 18 Jun 2025
Viewed by 497
Abstract
As digital infrastructure continues to expand, networks, web services, and Internet of Things (IoT) devices become increasingly vulnerable to distributed denial of service (DDoS) attacks. IoT devices in particular have become attractive targets for DDoS attacks due to their widespread deployment and limited security measures. Attackers exploit the growing number of unsecured IoT devices to reflect massive traffic that overwhelms networks and disrupts necessary services, making the protection of IoT devices against DDoS attacks a major concern for organizations and administrators. In this paper, the effectiveness of supervised machine learning (ML) classification and deep learning (DL) algorithms in detecting DDoS attacks on IoT networks was investigated through an extensive analysis of a network traffic dataset containing both legitimate and malicious traffic. Feature selection and data pre-processing approaches improved both data quality and model performance. Five models were evaluated on the Edge-IIoTset dataset: Random Forest (RF), Support Vector Machine (SVM), Long Short-Term Memory (LSTM), K-Nearest Neighbors (KNN) with multiple K values, and Convolutional Neural Network (CNN). Findings revealed that the RF model outperformed the others, delivering optimal detection speed and remarkable performance across all evaluation metrics, while KNN (K = 7) emerged as the most efficient model in terms of training time.
(This article belongs to the Special Issue Cybersecurity in the IoT)
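The KNN branch of that comparison is simple enough to sketch end to end. The toy below classifies synthetic flow-feature vectors with a hand-rolled K = 7 majority vote; the feature names and cluster statistics are invented, and the Edge-IIoTset dataset is not used here.

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=7):
    """k-nearest-neighbours majority vote: is a flow DDoS (1) or benign (0)?"""
    d = np.linalg.norm(X_train - x, axis=1)      # Euclidean distance to each flow
    nearest = y_train[np.argsort(d)[:k]]         # labels of the k closest flows
    return int(nearest.sum() * 2 > k)            # majority vote

# Toy flow features: [packets/s, mean packet size, unique source ports]
rng = np.random.default_rng(0)
benign = rng.normal([50, 800, 5], [10, 100, 2], size=(100, 3))
ddos = rng.normal([5000, 100, 200], [500, 20, 30], size=(100, 3))
X = np.vstack([benign, ddos])
y = np.array([0] * 100 + [1] * 100)
```

In practice the features would come from the paper's selection/pre-processing step, and distances should be computed on scaled features so that no single dimension dominates.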

24 pages, 1082 KiB  
Article
An Explainable Machine Learning Approach for IoT-Supported Shaft Power Estimation and Performance Analysis for Marine Vessels
by Yiannis Kiouvrekis, Katerina Gkirtzou, Sotiris Zikas, Dimitris Kalatzis, Theodor Panagiotakopoulos, Zoran Lajic, Dimitris Papathanasiou and Ioannis Filippopoulos
Future Internet 2025, 17(6), 264; https://doi.org/10.3390/fi17060264 - 17 Jun 2025
Cited by 1 | Viewed by 382
Abstract
In the evolving landscape of green shipping, the accurate estimation of shaft power is critical for reducing fuel consumption and greenhouse gas emissions. This study presents an explainable machine learning framework for shaft power prediction, utilising real-world Internet of Things (IoT) sensor data collected from nine (9) Very Large Crude Carriers (VLCCs) over a 36-month period. A diverse set of models—ranging from traditional algorithms such as Decision Trees and Support Vector Machines to advanced ensemble methods like XGBoost and LightGBM—were developed and evaluated. Model performance was assessed using the coefficient of determination (R2) and RMSE, with XGBoost achieving the highest accuracy (R2=0.9490, RMSE 888) and LightGBM being close behind (R2=0.9474, RMSE 902), with both substantially exceeding the industry baseline model (R2=0.9028, RMSE 1500). Explainability was integrated through SHapley Additive exPlanations (SHAP), offering detailed insights into the influence of each input variable. Features such as draft, GPS speed, and time since last dry dock consistently emerged as key predictors. The results demonstrate the robustness and interpretability of tree-based methods, offering a data-driven alternative to traditional performance estimation techniques and supporting the maritime industry’s transition toward more efficient and sustainable operations. Full article
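The two metrics quoted above (R2 and RMSE) are easy to restate precisely. A minimal sketch with synthetic shaft-power numbers, not the paper's data:

```python
import numpy as np

def r2_score(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

def rmse(y_true, y_pred):
    """Root mean squared error."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

# Synthetic shaft-power readings (kW) and two hypothetical predictors
actual = np.array([12000.0, 15000.0, 18000.0, 21000.0, 24000.0])
good   = actual + np.array([200.0, -150.0, 100.0, -250.0, 180.0])
naive  = np.full_like(actual, actual.mean())
```

A predictor that always outputs the mean scores R2 = 0, so the gap between the paper's XGBoost (R2 = 0.9490) and the industry baseline (R2 = 0.9028) is measured on this same scale.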

26 pages, 9010 KiB  
Article
Micro-Location Temperature Prediction Leveraging Deep Learning Approaches
by Amadej Krepek, Iztok Fister and Iztok Fister
Appl. Sci. 2025, 15(12), 6793; https://doi.org/10.3390/app15126793 - 17 Jun 2025
Viewed by 369
Abstract
Nowadays, technological progress has rapidly promoted the integration of artificial intelligence into modern human lives. On the other hand, extreme weather events in recent years have started to influence human well-being, and these events have increasingly been addressed by artificial intelligence methods. In line with this, the paper focuses on predicting the air temperature at a particular Slovenian micro-location using Maximus, a weather prediction model based on a long short-term memory neural network trained on the long-term, lower-resolution CERRA dataset. In an extensive experimental study, the Maximus prediction model was tested against the ICON-D2 general-purpose weather prediction model and validated with real data from a mobile weather station positioned at the specific micro-location. The weather station employs Internet of Things sensors for measuring temperature, humidity, wind speed and direction, and rain, and is powered by solar cells. The results of comparing the proposed Maximus model with ICON-D2 for micro-location air temperature prediction have encouraged the authors to continue this line of research in the future.
(This article belongs to the Special Issue Deep Learning and Data Mining: Latest Advances and Applications)
