Machine Learning and Internet of Things in Industry 4.0

A special issue of Future Internet (ISSN 1999-5903). This special issue belongs to the section "Internet of Things".

Deadline for manuscript submissions: closed (31 December 2025) | Viewed by 33281

Special Issue Editors


Guest Editor
Faculty of Systems, Electronics and Industrial Engineering, Universidad Tecnica de Ambato, UTA, Ambato 180206, Ecuador; Department of Systems Engineering and Automation, University of the Basque Country, EHU/UPV, 20018 Donostia, Spain
Interests: Internet of Things; wireless sensor networks; access points; communication protocols

Guest Editor
Faculty of Systems, Electronics and Industrial Engineering, Universidad Tecnica de Ambato, UTA, Ambato 180206, Ecuador
Interests: robotics; augmented reality; industrial protocols; neural networks

Special Issue Information

Dear Colleagues,

The Fourth Industrial Revolution, or Industry 4.0, has ushered in a paradigm shift in industrial operations, driven by the convergence of cutting-edge technologies such as machine learning (ML) and the Internet of Things (IoT). This symbiotic relationship between ML and IoT has unlocked unprecedented opportunities for optimization, automation, and efficiency across various industrial domains. This Special Issue aims to delve into the transformative impact of these technologies within the context of Industry 4.0, illuminating their potential to revolutionize industrial processes.

Machine learning algorithms have emerged as indispensable tools for extracting actionable insights from the vast repositories of data generated by IoT devices in industrial settings. From predictive maintenance and quality control to supply chain management and energy optimization, ML-driven approaches are reshaping the landscape of industrial operations, enabling data-driven decision making and driving operational excellence.

Concurrently, the Internet of Things serves as the backbone of interconnected devices, sensors, and actuators, facilitating real-time monitoring, control, and communication across the industrial ecosystem. The proliferation of IoT-enabled devices has allowed industries to collect granular data at unprecedented scales, providing a rich stream of information from which ML algorithms can extract valuable insights.

This Special Issue invites contributions that explore the synergies between machine learning and the Internet of Things in the context of Industry 4.0. We encourage submissions covering a wide range of topics, including, but not limited to, the following:

  • ML-driven predictive maintenance for industrial machinery, leveraging IoT data to anticipate failures and optimize maintenance schedules.
  • IoT-enabled smart manufacturing and process optimization, utilizing ML techniques for real-time monitoring, control, and optimization of production processes.
  • Autonomous robotics and intelligent control systems in industrial environments, integrating ML and IoT for enhanced automation and decision-making capabilities.
  • Data analytics and anomaly detection in IoT-driven industrial systems, employing ML algorithms to identify patterns, anomalies, and deviations in real-time data streams.
  • ML-based quality assurance and defect detection in manufacturing processes, utilizing advanced algorithms for automated inspection and quality control.
  • Supply chain optimization and logistics management using IoT and ML, leveraging real-time data and predictive analytics for efficient supply chain operations.
  • Energy efficiency and sustainability through IoT and machine learning applications, optimizing energy consumption and reducing environmental impact.
  • Security, privacy, and ethical considerations in ML-powered IoT deployments, addressing challenges related to data privacy, system security, and responsible AI deployment.
  • Case studies and real-world implementations showcasing the impact of ML and IoT in Industry 4.0, highlighting successful applications and lessons learned.
  • ML-based predictive control and optimization within IEC 61499 function blocks.
  • IoT-enabled multiagent systems for real-time monitoring and control.

This Special Issue serves as a platform for researchers, practitioners, and industry experts to exchange ideas, share insights, and showcase cutting-edge innovations at the intersection of machine learning and the Internet of Things in the era of Industry 4.0. We invite original research papers, review articles, and case studies that contribute to advancing our understanding and harnessing the potential of ML and IoT technologies for industrial transformation.

We encourage submissions that adhere to rigorous academic standards, presenting novel contributions, robust methodologies, and well-substantiated findings. Submissions should demonstrate a deep understanding of the theoretical underpinnings and practical implications of the proposed solutions, while addressing the challenges and limitations associated with their implementation.

We look forward to your valuable contributions and participation in this exciting endeavor, which promises to shape the future of intelligent and connected industrial systems.

Dr. Marcelo García
Dr. Paulina Ayala
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 250 words) can be sent to the Editorial Office for assessment.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Future Internet is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1800 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • machine learning
  • Internet of Things (IoT)
  • Industry 4.0
  • smart manufacturing
  • predictive maintenance
  • industrial automation

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • Reprint: MDPI Books provides the opportunity to republish successful Special Issues in book format, both online and in print.

Further information on MDPI's Special Issue policies can be found here.


Published Papers (9 papers)


Editorial


4 pages, 1020 KB  
Editorial
Machine Learning and IoT as Enablers of Intelligent Industrial Transformation
by Paulina Ayala and Marcelo V. Garcia
Future Internet 2026, 18(3), 141; https://doi.org/10.3390/fi18030141 - 11 Mar 2026
Viewed by 359
Abstract
The rapid expansion of cyber-physical systems, sensor networks, and data-driven automation platforms has profoundly reconfigured the operational logic of contemporary industrial environments [...] Full article
(This article belongs to the Special Issue Machine Learning and Internet of Things in Industry 4.0)

Research


25 pages, 1245 KB  
Article
Machine Learning-Driven Intrusion Detection for Securing IoT-Based Wireless Sensor Networks
by Yirga Yayeh Munaye, Abebaw Demelash Gebeyehu, Li-Chia Tai, Zemenu Alem Abebe, Aeneas Bekele Workneh, Robel Berie Tarekegn, Yenework Belayneh Chekol and Getaneh Berie Tarekegn
Future Internet 2026, 18(2), 113; https://doi.org/10.3390/fi18020113 - 21 Feb 2026
Cited by 2 | Viewed by 714
Abstract
Wireless sensor networks (WSNs) have become a critical component of modern Internet of Things (IoT) infrastructures; however, their constrained resources and distributed deployment expose them to various cyber threats. In this work, we present a machine learning-driven intrusion detection framework optimized for WSN-based IoT environments. The proposed approach employs the WSN-DS benchmark dataset and integrates adaptive synthetic sampling (ADASYN) to address class imbalance, followed by a hybrid feature selection strategy combining Feature Importance Selection (FIS) and Recursive Feature Elimination (RFE) to reduce dimensionality and improve learning efficiency. An XGBoost classifier is then trained using five-fold cross-validation to ensure robust generalization. The experimental results demonstrate that the proposed framework significantly outperforms baseline methods, achieving an overall accuracy of 99.87%, with substantial gains in terms of F1-score, precision, and recall. Comparative analysis against recent WSN-DS studies confirms the effectiveness of combining imbalance correction, optimized feature selection, and ensemble learning. These findings highlight the potential of the proposed model as a lightweight and highly accurate intrusion detection solution for emerging WSN-IoT deployments. Full article
(This article belongs to the Special Issue Machine Learning and Internet of Things in Industry 4.0)
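The imbalance-correction step summarised in this abstract can be illustrated with a simplified, ADASYN-style interpolation sketch in Python. This is a hypothetical minimal example, not the authors' implementation; real ADASYN additionally weights how many synthetic points each minority sample generates by the density of majority-class neighbours around it:

```python
import random

def oversample_minority(minority, majority, seed=0):
    """Grow the minority class to the size of the majority class by
    interpolating between randomly chosen pairs of minority samples
    (a simplified, ADASYN-style synthetic-sampling step)."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    synthetic = []
    for _ in range(len(majority) - len(minority)):
        a, b = rng.choice(minority), rng.choice(minority)
        gap = rng.random()  # position along the segment a -> b
        synthetic.append(tuple(x + gap * (y - x) for x, y in zip(a, b)))
    return list(minority) + synthetic

# balance a toy 2-feature dataset: 2 minority vs 6 majority samples
minority = [(0.0, 0.0), (1.0, 1.0)]
majority = [(5.0, 5.0)] * 6
balanced = oversample_minority(minority, majority)
print(len(balanced))  # now matches the majority-class count
```

Because synthetic points are interpolations, they stay inside the convex hull of the minority samples, which is what keeps this kind of oversampling from inventing implausible feature combinations.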

22 pages, 1636 KB  
Article
Evaluating Reconstruction-Based and Proximity-Based Methods: A Four-Way Comparison (AE, LSTM-AE, OCSVM, IF) in SCADA Anomaly Detection Under Inverted Imbalance
by Lukasz Pawlik
Future Internet 2026, 18(2), 96; https://doi.org/10.3390/fi18020096 - 11 Feb 2026
Cited by 2 | Viewed by 523
Abstract
This article investigates and compares four unsupervised anomaly detection algorithms: the Autoencoder (AE), LSTM-Autoencoder (LSTM-AE), One-Class SVM (OCSVM), and the Isolation Forest (IF). The analysis focuses on SCADA telemetry data from an urban wind turbine, characterized by a unique case of extreme inverted class imbalance, where operational anomalies constitute 75.7% of the records. The AE model, trained exclusively on the rare normal state, achieved the best overall performance (AUC 0.9667), maintaining balanced and high classification effectiveness for both classes (Recall Normal ≈ 95%, Recall Anomaly ≈ 88.5%; Macro F1-Score 0.8962). In contrast, the IF model, despite a strong discriminative ability (AUC 0.8616), exhibited a complete inability to correctly recognize the normal class (Recall Normal 0.00) when using the optimal F1-score threshold. This performance degradation was a direct consequence of the necessity to apply a classification threshold imposed by the statistical fraction of the anomaly-dominated dataset. These results empirically demonstrate the methodological superiority of the reconstruction-based approach (AE) in constructing a stable decision boundary independent of the statistically dominant class. The study provides quantitative guidelines for the selection and calibration of algorithms in PHM diagnostic systems where states deviating from the operational norm constitute the majority. Full article
(This article belongs to the Special Issue Machine Learning and Internet of Things in Industry 4.0)
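The threshold failure mode this abstract describes can be reproduced in a few lines of Python. The sketch below is purely illustrative (toy scores, not the paper's SCADA data): when anomalies dominate the dataset, a very permissive threshold still achieves full recall on the anomaly class while driving recall on the normal class to zero:

```python
def class_recalls(scores, labels, threshold):
    """Flag a sample as anomalous when its score exceeds `threshold`;
    return (recall on normals, recall on anomalies).  labels: 1 = anomaly."""
    hits_anom = sum(1 for s, y in zip(scores, labels) if y == 1 and s > threshold)
    hits_norm = sum(1 for s, y in zip(scores, labels) if y == 0 and s <= threshold)
    n_anom = sum(labels)
    n_norm = len(labels) - n_anom
    return hits_norm / n_norm, hits_anom / n_anom

# toy inverted-imbalance set: 3 of 4 samples are anomalies
scores = [0.9, 0.8, 0.7, 0.2]   # higher score = more anomalous
labels = [1, 1, 1, 0]

print(class_recalls(scores, labels, 0.5))   # separates both classes: (1.0, 1.0)
print(class_recalls(scores, labels, 0.1))   # flags everything: (0.0, 1.0)
```

With the permissive threshold, per-sample accuracy is still 75% because anomalies are the majority, which is exactly why a threshold chosen by maximising F1 on the dominant class can mask a total failure on the minority (normal) class.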

22 pages, 2267 KB  
Article
Predicting Demand in Supply Chain Management: A Decision Support System Using Graph Convolutional Networks
by Stefani Sifuentes-Domínguez, Jose-Manuel Mejia-Muñoz, Oliverio Cruz-Mejia, Rubén Pizarro-Gurrola, Aracelí-Soledad Domínguez-Flores and Leticia Ortega-Máynez
Future Internet 2026, 18(1), 26; https://doi.org/10.3390/fi18010026 - 2 Jan 2026
Cited by 1 | Viewed by 1774
Abstract
This work addresses the problem of demand forecasting in supply chain management, where the consolidation of scattered and heterogeneous data and the lack of precise forecasting methods generate operational inefficiencies, resulting in increased backorders and high inventory costs. To tackle these challenges, we propose a novel Decision Support System that jointly integrates an intelligent processing engine based on Graph Neural Networks (GNNs) for time series forecasting. Our approach lies in explicitly modeling the demand prediction task as a Multivariate Time Series forecasting problem on a causal dependency graph. Specifically, we use a GCN to process a graph where the nodes represent the target demand and key exogenous variables (Consumer Sentiment Index, Consumer Price Index, Personal Income, and Unemployment Rate), and the edges explicitly encode the interdependencies and causal relationships among these economic factors and demand. Unlike previous applications of GNNs in supply chain management, which typically focus on inventory networks or single-factor interactions, our approach uses GCN to dynamically capture the temporal interactions among multiple macroeconomic and internal series on future demand. We compare our method with other machine learning algorithms for demand forecasting. In the experiments conducted, the proposed GCN approach can accurately predict the abrupt changes that appear in demand behavior over time, whereas the other comparison methods tend to excessively smooth these transitions. Full article
(This article belongs to the Special Issue Machine Learning and Internet of Things in Industry 4.0)
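The core aggregation step of a graph convolution, the mechanism behind the GCN mentioned in this abstract, can be sketched in plain Python. This is a generic single-layer illustration (ReLU applied to a row-normalised adjacency mixing followed by a linear map), not the paper's model, and the two-node graph below is a made-up stand-in for its demand and exogenous-variable nodes:

```python
def gcn_layer(adj, feats, weights):
    """One graph-convolution step: mix each node's features with its
    neighbours' via a row-normalised adjacency matrix, then apply a
    linear map and ReLU.  adj: n x n, feats: n x f_in, weights: f_in x f_out."""
    n = len(adj)
    norm = [[a / max(sum(row), 1e-9) for a in row] for row in adj]
    mixed = [[sum(norm[i][k] * feats[k][j] for k in range(n))
              for j in range(len(feats[0]))] for i in range(n)]
    return [[max(0.0, sum(mixed[i][k] * weights[k][j]
                          for k in range(len(weights))))
             for j in range(len(weights[0]))] for i in range(n)]

# two nodes (say, demand and one exogenous driver), self-loops included:
adj = [[1.0, 1.0],   # node 0 listens to itself and node 1
       [0.0, 1.0]]   # node 1 listens only to itself
feats = [[1.0], [2.0]]
weights = [[1.0]]    # identity map, so only the aggregation is visible
print(gcn_layer(adj, feats, weights))  # [[1.5], [2.0]]
```

Node 0's output is the average of its own and its neighbour's features, which is how edge structure (here, a hand-wired dependency) shapes the representation each node passes to the next layer.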

38 pages, 4889 KB  
Article
Top-K Feature Selection for IoT Intrusion Detection: Contributions of XGBoost, LightGBM, and Random Forest
by Brou Médard Kouassi, Abou Bakary Ballo, Kacoutchy Jean Ayikpa, Diarra Mamadou and Minfonga Zié Jérôme Coulibaly
Future Internet 2025, 17(11), 529; https://doi.org/10.3390/fi17110529 - 19 Nov 2025
Cited by 3 | Viewed by 1553
Abstract
The rapid growth of the Internet of Things (IoT) has created vast networks of interconnected devices that are increasingly exposed to cyberattacks. Ensuring the security of such distributed systems requires efficient and adaptive intrusion detection mechanisms. However, conventional methods face limitations in processing large and complex feature spaces. To address this issue, this study proposes an optimized intrusion detection approach based on Top-K feature selection combined with ensemble learning models, evaluated on the CICIoMT2024 dataset. Three algorithms, XGBoost, LightGBM, and Random Forest, were trained and tested on IoT datasets using three feature configurations: Top-10, Top-15, and the complete feature set. The results show that the Random Forest model provides the best balance between accuracy and computational efficiency, achieving 91.7% accuracy and an F1-score of 93% with the Top-10 subset while reducing processing time by 35%. These findings demonstrate that the Top-K selection strategy enhances the interpretability and performance of IDSs in IoT environments. Future work will extend this framework to real-time adaptive detection and edge computing integration for large-scale IoT deployments. Full article
(This article belongs to the Special Issue Machine Learning and Internet of Things in Industry 4.0)
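The Top-K selection strategy evaluated in this paper reduces to ranking features by an importance score and keeping the k best. A minimal, model-agnostic Python sketch follows; the importance values here are invented for illustration, whereas in the study they would come from the trained XGBoost, LightGBM, or Random Forest models:

```python
def top_k_features(importances, k):
    """Indices of the k highest-importance features, best first
    (ties broken by lower index)."""
    ranked = sorted(range(len(importances)), key=lambda i: (-importances[i], i))
    return ranked[:k]

def project(rows, keep):
    """Restrict every sample to the selected feature columns."""
    return [[row[i] for i in keep] for row in rows]

importances = [0.10, 0.50, 0.30, 0.05]   # hypothetical per-feature scores
keep = top_k_features(importances, 2)
print(keep)                               # [1, 2]
print(project([[7, 8, 9, 10]], keep))     # [[8, 9]]
```

Training on the projected rows rather than the full feature set is what yields the reported reduction in processing time, at the cost of whatever signal the discarded features carried.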

16 pages, 1545 KB  
Article
Digital Twins: Strategic Guide to Utilize Digital Twins to Improve Operational Efficiency in Industry 4.0
by Italo Cesidio Fantozzi, Annalisa Santolamazza, Giancarlo Loy and Massimiliano Maria Schiraldi
Future Internet 2025, 17(1), 41; https://doi.org/10.3390/fi17010041 - 17 Jan 2025
Cited by 27 | Viewed by 8194
Abstract
The Fourth Industrial Revolution, known as Industry 4.0, has transformed the manufacturing landscape by integrating advanced digital technologies, fostering automation, interconnectivity, and data-driven decision-making. Among these innovations, Digital Twins (DTs) have emerged as a pivotal tool, enabling real-time monitoring, simulation, and optimization of production processes. This paper provides a comprehensive exploration of DT technology, offering a strategic framework for its effective implementation within Industry 4.0 environments to enhance operational efficiency. The proposed methodology integrates key enabling technologies such as the Internet of Things (IoT), Artificial Intelligence (AI), and Machine Learning to create accurate digital replicas of manufacturing systems. Through a detailed case study, this work demonstrates how DTs can optimize production processes, reduce downtime, and improve maintenance strategies. The findings highlight DTs’ transformative potential in achieving continuous improvement, competitiveness, and operational excellence. This research aims to provide organizations with actionable insights and a roadmap to leverage DT technology for sustainable industrial innovation. Full article
(This article belongs to the Special Issue Machine Learning and Internet of Things in Industry 4.0)

Review


38 pages, 3566 KB  
Review
Enhancing Industrial Processes Through Augmented Reality: A Scoping Review
by Alba Miranda, Aracely M. Vallejo, Paulina Ayala, Marcelo V. Garcia and Jose E. Naranjo
Future Internet 2025, 17(8), 358; https://doi.org/10.3390/fi17080358 - 7 Aug 2025
Cited by 2 | Viewed by 2735
Abstract
Augmented reality (AR) in industry improves training and technical assistance by overlaying digital information on real environments, facilitating the visualisation and understanding of complex processes. It also enables more effective remote collaboration, optimising problem solving and decision making in real time. This paper proposes a scoping review, using PRISMA guidelines, on the optimisation of industrial processes through the application of AR. The objectives of this study included characterising successful implementations of AR in various industrial processes, comparing different hardware, graphics engines, associated costs, and determining the percentage of optimisation achieved through AR. The databases included were Scopus, SpringerLink, IEEExplore, and MDPI. Eligibility criteria were defined as English-language articles published between 2019 and 2024 that provide significant contributions to AR applications in engineering. The Cochrane method was used to assess bias. The rigorous selection process resulted in the inclusion of 38 articles. Key findings indicate that AR reduces errors and execution times, improves efficiency and productivity, and optimises training and maintenance processes, leading to cost savings and quality improvement. Unity 3D is the most widely used graphics engine for AR applications. The main applications of AR are in maintenance, assembly, training and inspection, with maintenance being the most researched area. Challenges include the learning curve, high initial costs, and hardware limitations. Full article
(This article belongs to the Special Issue Machine Learning and Internet of Things in Industry 4.0)

42 pages, 9475 KB  
Review
Machine Learning and IoT-Based Solutions in Industrial Applications for Smart Manufacturing: A Critical Review
by Paolo Visconti, Giuseppe Rausa, Carolina Del-Valle-Soto, Ramiro Velázquez, Donato Cafagna and Roberto De Fazio
Future Internet 2024, 16(11), 394; https://doi.org/10.3390/fi16110394 - 26 Oct 2024
Cited by 30 | Viewed by 14427
Abstract
The Internet of Things (IoT) has radically changed the industrial world, enabling the integration of numerous systems and devices into the industrial ecosystem. There are many areas of the manufacturing industry in which IoT has contributed, including plants’ remote monitoring and control, energy efficiency, more efficient resources management, and cost reduction, paving the way for smart manufacturing in the framework of Industry 4.0. This review article provides an up-to-date overview of IoT systems and machine learning (ML) algorithms applied to smart manufacturing (SM), analyzing four main application fields: security, predictive maintenance, process control, and additive manufacturing. In addition, the paper presents a descriptive and comparative overview of ML algorithms mainly used in smart manufacturing. Furthermore, for each discussed topic, a deep comparative analysis of the recent IoT solutions reported in the scientific literature is introduced, dwelling on the architectural aspects, sensing solutions, implemented data analysis strategies, communication tools, performance, and other characteristic parameters. This comparison highlights the strengths and weaknesses of each discussed solution. Finally, the presented work outlines the features and functionalities of future IoT-based systems for smart industry applications. Full article
(This article belongs to the Special Issue Machine Learning and Internet of Things in Industry 4.0)

Other

22 pages, 3280 KB  
Systematic Review
From IoT to AIoT: Evolving Agricultural Systems Through Intelligent Connectivity in Low-Income Countries
by Selain K. Kasereka, Alidor M. Mbayandjambe, Ibsen G. Bazie, Heriol F. Zeufack, Okurwoth V. Ocama, Esteve Hassan, Kyandoghere Kyamakya and Tasho Tashev
Future Internet 2026, 18(2), 82; https://doi.org/10.3390/fi18020082 - 3 Feb 2026
Cited by 2 | Viewed by 1033
Abstract
The convergence of Artificial Intelligence and the Internet of Things has given rise to the Artificial Intelligence of Things (AIoT), which enables connected systems to operate with greater autonomy, adaptability, and contextual awareness. In agriculture, this evolution supports precision farming, improves resource allocation, and strengthens climate resilience by enhancing the capacity of farming systems to anticipate, absorb, and recover from environmental shocks. This review provides a structured synthesis of the transition from IoT-based monitoring to AIoT-driven intelligent agriculture and examines key applications such as smart irrigation, pest and disease detection, soil and crop health assessment, yield prediction, and livestock management. To ensure methodological rigor and transparency, this study follows the PRISMA 2020 guidelines for systematic literature reviews. A comprehensive search and multi-stage screening procedure was conducted across major scholarly repositories, resulting in a curated selection of studies published between 2018 and 2025. These sources were analyzed thematically to identify technological enablers, implementation barriers, and contextual factors affecting adoption particularly within low-income countries where infrastructural constraints, limited digital capacity, and economic disparities shape AIoT deployment. Building on these insights, the article proposes an AIoT architecture tailored to resource-constrained agricultural environments. The architecture integrates sensing technologies, connectivity layers, edge intelligence, data processing pipelines, and decision-support mechanisms, and is supported by governance, data stewardship, and capacity-building frameworks. By combining systematic evidence with conceptual analysis, this review offers a comprehensive perspective on the transformative potential of AIoT in advancing sustainable, inclusive, and intelligent food production systems. Full article
(This article belongs to the Special Issue Machine Learning and Internet of Things in Industry 4.0)
