Review

Enabling Intelligent Industrial Automation: A Review of Machine Learning Applications with Digital Twin and Edge AI Integration

School of Engineering & Engineering Technology, University of Arkansas at Little Rock, 2801 S University Ave, Little Rock, AR 72204, USA
* Author to whom correspondence should be addressed.
Automation 2025, 6(3), 37; https://doi.org/10.3390/automation6030037
Submission received: 21 June 2025 / Revised: 26 July 2025 / Accepted: 1 August 2025 / Published: 5 August 2025
(This article belongs to the Section Industrial Automation and Process Control)

Abstract

The integration of machine learning (ML) into industrial automation is fundamentally reshaping how manufacturing systems are monitored, inspected, and optimized. By applying machine learning to real-time sensor data and operational histories, advanced models enable proactive fault prediction, intelligent inspection, and dynamic process control—directly enhancing system reliability, product quality, and efficiency. This review explores the transformative role of ML across three key domains: Predictive Maintenance (PdM), Quality Control (QC), and Process Optimization (PO). It also analyzes how Digital Twin (DT) and Edge AI technologies are expanding the practical impact of ML in these areas. Our analysis reveals a marked rise in deep learning, especially convolutional and recurrent architectures, with a growing shift toward real-time, edge-based deployment. The paper also catalogs the datasets used, the tools and sensors employed for data collection, and the industrial software platforms supporting ML deployment in practice. This review not only maps the current research terrain but also highlights emerging opportunities in self-learning systems, federated architectures, explainable AI, and themes such as self-adaptive control, collaborative intelligence, and autonomous defect diagnosis—indicating that ML is poised to become deeply embedded across the full spectrum of industrial operations in the coming years.

1. Introduction

1.1. Overview

Industrial automation is undergoing a significant transformation driven by the integration of advanced technologies, with machine learning (ML) at the forefront. Traditional automation systems, based on rigid rule sets and fixed logic, often struggled to adapt to the variability and complexity inherent in modern manufacturing environments. Over the past decade, ML-driven intelligent automation has emerged as a game-changer—offering systems the ability to learn from historical and real-time data, recognize patterns, make predictions, and continuously optimize operations without explicit reprogramming [1,2,3]. This shift has profoundly impacted several core areas of manufacturing. In Predictive Maintenance, ML models analyze sensor data to predict equipment failures before they occur, thereby reducing downtime and extending asset lifespan. In Quality Control, image-based deep learning and signal analysis enable real-time defect detection with higher accuracy than human inspection. In Process Optimization, ML algorithms adaptively fine-tune operational parameters to increase yield, minimize waste, and reduce energy consumption [4,5]. The evolution of Industry 4.0 has further accelerated this transformation, introducing a cyber–physical framework characterized by interconnectivity, decentralized decision-making, and real-time data analytics. This revolution is powered by the convergence of Industrial Internet of Things (IIoT), cloud computing, sensor networks, and advanced control systems, producing massive volumes of heterogeneous data [6]. However, generating data alone is not sufficient; it must be meaningfully processed and applied. Here, ML and AI emerge as critical tools for extracting actionable insights from complex datasets [7].
As digital transformation advances, attention is increasingly shifting toward the principles of Industry 5.0, which emphasizes human-centricity, resilience, and sustainability. Industry 5.0 advocates for the integration of ethical artificial intelligence, socially responsible innovation, and environmentally conscious manufacturing practices [8]. Within this paradigm, advanced technologies—such as Edge AI and Digital Twins (DTs)—serve as key enablers in enhancing system responsiveness, interpretability, and operational resilience while fostering greater collaboration between humans and intelligent systems [9]. Edge AI refers to deploying ML models directly on edge devices—such as embedded systems and industrial controllers—allowing for low-latency, real-time decision-making without relying on cloud infrastructure [10]. This is particularly valuable for time-sensitive tasks like defect detection and anomaly response on the production line. Digital Twins, on the other hand, are virtual replicas of physical assets or systems that continuously synchronize with real-time operational data. They serve as simulation and decision-support tools, enabling predictive analytics, what-if testing, and system optimization without interfering with live operations. When combined with ML, Edge AI and Digital Twins become intelligent, adaptive models capable of autonomous decision support and system improvement [11,12]. Together, these technologies underpin the four core design principles of Industry 4.0: interconnection, information transparency, technical assistance, and decentralized decision-making [13]. In this paper, we explore how AI technologies are advancing PdM, QC, and PO, and how the integration of Digital Twins and Edge AI is enabling scalable, responsive, and intelligent automation across industrial settings.
To ground this review in current research, papers were systematically collected from leading academic and industrial databases including Web of Science, Scopus, and IEEE Xplore. Priority was given to works demonstrating practical deployments, novel ML architectures, or integration with enabling technologies such as Digital Twins and Edge AI. This paper is structured to answer four key research questions.
RQ1: What are the emerging trends in ML models and enabling technologies (e.g., Digital Twins, Edge AI) across the domains of Predictive Maintenance, Quality Control, and Process Optimization?
RQ2: What types of datasets, sensor modalities, and input–output configurations are used in ML applications for PdM, QC, and PO?
RQ3: What are the major technical and deployment challenges faced by ML-based solutions in real-world industrial environments?
RQ4: How do enabling paradigms such as Digital Twins and Edge AI contribute to scalable, adaptive automation in industrial settings, and what are the remaining research gaps?
Together, these questions guide a comprehensive review of the role ML plays in advancing intelligent, resilient, and data-driven manufacturing systems, with particular emphasis on how Digital Twin and Edge AI technologies are enabling this transformation.

1.2. Related Works

Numerous review studies have investigated the role of machine learning (ML) in industrial automation, with varying degrees of breadth and specificity. Several works have focused on targeted ML application domains—such as Predictive Maintenance [14], fault diagnosis [15], or quality assurance [16]—while others provide broader overviews of AI and ML integration across smart manufacturing processes [17,18].
For instance, the review by Plathottam et al. categorizes ML applications across five key manufacturing domains—production, quality, maintenance, logistics, and safety—while also detailing challenges related to data availability and model explainability [19]. Similarly, Bertolini et al. explore ML challenges and trends across the manufacturing pipeline, with a strong focus on data acquisition, modeling complexity, and model interpretability [20]. Kim and Kong provide a structured classification of ML applications by task type (e.g., classification, regression, clustering), offering valuable insight into algorithm-task alignment [21].
In contrast, our work contributes a more up-to-date perspective by systematically reviewing recent ML applications with a specific focus on three high-impact domains: Predictive Maintenance, Quality Control, and Process Optimization. Unlike previous works, it integrates two advanced paradigms—Edge AI and Digital Twins—not only as enabling technologies but also as convergence points for intelligent decision-making. Our review uniquely focuses on how ML integrates with these paradigms to enable real-time, adaptive, and scalable automation solutions.
To contextualize the evolution of this field, Figure 1 illustrates a bibliographic trend analysis capturing the yearly growth of publications across PdM, QC, PO, DT, and Edge AI. The chart reveals a sharp rise in QC and PO studies from 2021 onward, while DTs and Edge AI show increasing momentum in recent years. This trajectory not only underscores the timeliness of our review but also highlights the emerging significance of distributed intelligence in industrial settings.
Furthermore, while the aforementioned papers focus either on task categorization or domain-specific use cases, our study explicitly analyzes key aspects such as data source characteristics, input–output variable design, ML model types, and deployment environments. The integration of Digital Twin frameworks and the detailed discussion on Edge AI deployment trends—largely underrepresented in earlier reviews—also set this work apart.
Thus, this paper complements and advances existing reviews by offering a focused yet integrative analysis of ML-driven industrial automation, grounded in a synthesis of PdM, QC, and PO applications, while pointing toward future research directions on federated learning, self-learning Digital Twins, and adaptive Edge AI for sustainable industry ecosystems.
The remainder of the paper is organized as follows: Section 2 introduces core machine learning algorithms and reviews recent ML applications in PdM, QC, and PO; Section 3 highlights the enabling roles of Digital Twin and Edge AI technologies; Section 4 surveys datasets, data acquisition tools, and industrial platforms; Section 5 outlines the limitations of current practices and proposes future research directions, while Section 6 concludes with a summary of key insights and contributions of this study.

2. Machine Learning for Predictive Maintenance, Quality Control, and Process Optimization in Industrial Automation

2.1. Machine Learning Algorithms

Machine learning is broadly categorized into supervised learning (SL), unsupervised learning (UL), and reinforcement learning (RL). SL uses labeled data to train models for classification (categorical outputs) and regression (continuous outputs) using algorithms like neural networks (NNs), Support Vector Machine (SVM), Decision Trees (DTrees), and Logistic Regression (LR). UL works with unlabeled data to discover patterns through clustering, dimensionality reduction (e.g., PCA, Autoencoders), and density estimation. RL involves agents learning by interacting with environments to maximize rewards over time, often using Q-learning or deep reinforcement learning, with applications in Process Optimization and supply chains. Figure 2 presents a structured overview of machine learning algorithm classifications.
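To make these categories concrete, the brief sketch below contrasts the supervised and unsupervised paradigms on synthetic data standing in for sensor-derived features; the dataset, labels, and model choices are purely illustrative.

# Minimal illustration of supervised vs. unsupervised learning on synthetic
# "sensor feature" data; all values here are randomly generated for demonstration.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))            # 200 samples, 4 sensor-derived features
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # synthetic binary label (e.g., fault / no fault)

# Supervised learning: learn a mapping from labeled features to a class.
clf = DecisionTreeClassifier(max_depth=3).fit(X[:150], y[:150])
print("held-out accuracy:", clf.score(X[150:], y[150:]))

# Unsupervised learning: group the same features without using labels.
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print("cluster sizes:", np.bincount(clusters))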

2.2. ML Integration in Predictive Maintenance, Quality Control, and Process Optimization

In this study, we examine the application of machine learning (ML) techniques across three key domains of industrial automation:
  • Predictive Maintenance;
  • Quality Control;
  • Process Optimization.
These domains were selected due to their critical roles in enhancing operational reliability, reducing production costs, and improving product consistency. Each represents a distinct yet complementary area where ML has demonstrated significant potential in augmenting traditional industrial practices. The following sections provide a structured analysis of recent advancements in ML-based approaches within these domains, highlighting the algorithms, methodologies, and deployment strategies that are shaping the future of intelligent manufacturing systems.

2.2.1. Predictive Maintenance

Predictive Maintenance is one of the most vibrant and rapidly evolving application areas in industrial automation and utilizes sensor technologies, data analytics, and ML algorithms to anticipate machine failures before they occur. The growing interest is driven by its direct impact on reducing unplanned downtimes, optimizing maintenance cycles, and ultimately improving system reliability and cost efficiency [22]. Recent contributions highlight notable advances in automation, interpretability, and adaptability of PdM systems, pushing the boundaries of traditional maintenance paradigms. Table 1 shows the applications of ML in PdM.
A particularly compelling development is the integration of AutoML frameworks in PdM pipelines. Tools like PyCaret and AutoKeras can significantly streamline the process of selecting and tuning machine learning models for fault classification in ball bearings. The unique contribution of this work is the elimination of intensive manual feature engineering, making it especially appealing for small to medium enterprises (SMEs) seeking low-code predictive capabilities [23]. A recent case study deployed an interpretable AutoML framework on the ball bearing benchmark dataset, demonstrating how automated feature selection and model tuning can enhance PdM accuracy while maintaining model transparency—an essential factor for industrial adoption [24]. Image-based deep learning has also been applied to time-series data to improve fault detection accuracy: multivariate sensor data is converted into time-series images, and convolutional neural networks (CNNs) are then used to detect faults in conveyor belt motors. This approach leveraged spatial representation for enhanced accuracy and addressed the challenge of handling high-dimensional, asynchronous signals in industrial settings [25].
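As an illustration of such a low-code workflow, the following sketch outlines how PyCaret's classification API could be applied to a tabular bearing dataset; the file name, label column, and preprocessing defaults are assumptions rather than details taken from [23,24].

# Hypothetical low-code AutoML workflow using PyCaret; "bearing_features.csv"
# and the "fault_class" label column are placeholders.
import pandas as pd
from pycaret.classification import setup, compare_models, predict_model

df = pd.read_csv("bearing_features.csv")   # vibration features + fault labels (assumed)

# setup() handles the train/test split, encoding, and preprocessing automatically.
setup(data=df, target="fault_class", session_id=42)

# compare_models() trains and cross-validates a library of candidate classifiers
# and returns the best-performing one, replacing manual model selection and tuning.
best_model = compare_models()

# Score the held-out split with the selected model.
print(predict_model(best_model).head())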
The evolution of PdM is marked by a shift from traditional signal analysis to advanced LSTM-GAN models for accurate remaining useful life (RUL) prediction in bearing systems. By combining LSTM (as a generative model) and autoencoders (as discriminators), the method tackles the common issue of error accumulation in long-term degradation forecasts, providing more stable and accurate predictions [26]. Domain-specific optimization in PdM is gaining traction, as seen in the use of SVR models to predict lubricant wear from robot sensor data, reducing reliance on manual testing [27]. This signals a broader movement toward context-aware PdM, where models are tailored not just to generalize, but to extract meaningful insights in operationally relevant terms. Cost-sensitive maintenance decision-making continues to be a priority. Unsupervised anomaly detection (e.g., Isolation Forest) has been combined with reliability models like Generalized Fault Trees to develop robust hybrid systems for tools such as injection molds. These approaches do not just detect faults—they quantify the economic trade-offs between preventive and corrective maintenance, making them invaluable for strategic planning [28].
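The sketch below illustrates the general pattern of unsupervised anomaly detection feeding a cost-aware maintenance decision, in the spirit of [28]; the synthetic data, contamination setting, and cost figures are placeholders, not values from the cited study.

# Illustrative unsupervised anomaly detection: an Isolation Forest flags unusual
# sensor readings; the cost comparison between preventive and corrective
# maintenance is a simplified placeholder.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)
normal = rng.normal(0.0, 1.0, size=(500, 3))      # healthy-operation sensor features
drifted = rng.normal(3.0, 1.0, size=(20, 3))      # degraded-tool readings (synthetic)
X_train, X_new = normal, np.vstack([normal[:50], drifted])

iso = IsolationForest(contamination=0.05, random_state=0).fit(X_train)
flags = iso.predict(X_new)                        # -1 = anomaly, +1 = normal
anomaly_rate = np.mean(flags == -1)

# Toy economic trade-off: schedule preventive work only if the expected
# corrective cost (anomaly rate x repair cost) exceeds the preventive cost.
preventive_cost, corrective_cost = 1_000, 12_000
if anomaly_rate * corrective_cost > preventive_cost:
    print(f"anomaly rate {anomaly_rate:.2%}: schedule preventive maintenance")
else:
    print(f"anomaly rate {anomaly_rate:.2%}: continue normal operation")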
Corrective maintenance can lead to substantial operational expenses, which ML techniques can help avoid. Artificial neural networks (ANNs) can be used to predict the future condition of a motor and estimate its time of failure [29]. Line-start permanent magnet synchronous motors (LS-PMSMs) are widely used in industry, but broken rotor bars are one of their most significant faults, leading to secondary failures and eventual breakdown. Random forest algorithms can diagnose this kind of fault in LS-PMSMs by extracting features from the startup transient current signal [30]. General Electric (GE) has used Predictive Maintenance in its manufacturing facilities: by analyzing sensor data from industrial equipment, it can predict potential issues and schedule maintenance before a breakdown occurs [31]. These studies collectively highlight how data-driven Predictive Maintenance strategies can transition industries from reactive to proactive maintenance paradigms.
Table 1. ML application in Predictive Maintenance.
Sub-Area | Year and Reference | Algorithm | Task, Methodology, and Outcome
Fault Prediction | 2023, [32] | LSTM, KNN, KG | Task: Robot state prediction and PdM strategy generation.
Methodology: LSTM for state detection, KNN for fault prediction, and KG for decision support.
Outcome: Closed-loop PdM system for welding robots.
Fault Prediction | 2023, [33] | RF, GB, DL | Task: Predict failures in a manufacturing plant.
Methodology: ML models trained on factory equipment data.
Outcome: Improved failure prediction, reduced downtime.
Fault Prediction | 2022, [34] | SVM, BNN, RF | Task: Fault detection and classification in low voltage motors.
Methodology: Two-phase ML approach (abnormal behavior detection & fault type prediction).
Outcome: Reduced detection time, accurate fault diagnosis.
Fault Prediction | 2018, [35] | LSTM | Task: Build a smart predictive maintenance system for early fault detection and technician support.
Methodology: Used IoT sensors for data collection, LSTM/GRU for failure prediction, and AR tools (HoloLens) to guide maintenance actions.
Outcome: Improved fault prediction and reduced downtime. AR support made maintenance faster and easier for operators.
Fault Prediction | 2018, [36] | BN | Task: Develop a fault modeling and diagnosis system.
Methodology: A Bayesian Network (BN) framework was used to represent causal relationships between process parameters and faults. A hybrid learning system was created to improve fault prediction and root cause analysis.
Outcome: The system demonstrated improved fault modeling and interpretability.
Fault Prediction | 2018, [37] | RF | Task: Develop a real-time fault detection and diagnosis system in smart factory environments.
Methodology: Employed a big data pipeline integrating data acquisition, storage, preprocessing, and analytics.
Outcome: Achieved over 90% accuracy in fault classification across multiple use cases.
Fault Prediction | 2017, [38] | ANN | Task: Enable predictive maintenance in machine centers.
Methodology: Proposed a five-step framework integrating sensors, AI, CPS, and ANN for fault diagnosis and prognosis.
Outcome: Successfully predicted faults weeks in advance, enabling proactive maintenance.
Fault Prediction | 2023, [39] | CF | Task: Cooling system monitoring.
Methodology: Open-source R-based DSS with data preprocessing and predictive models.
Outcome: Cost-effective PdM for SMEs.
Condition Monitoring | 2021, [40] | ET | Task: Develop scalable PdM framework.
Methodology: Modular edge-cloud architecture with plug-and-play sensor integration and time-series ML.
Outcome: Demonstrated early condition degradation in HPC components.
Condition Monitoring | 2019, [41] | PCA, DTree, RF, KNN, SVM | Task: Predict tool wear in CNC end-milling operations using multi-sensor data.
Methodology: Time and frequency domain features were extracted and fused.
Outcome: RF achieved the best performance. Sensor fusion enhanced prediction accuracy over individual sensors.
Condition Monitoring | 2018, [42] | LDA, Clustering | Task: Improve fault diagnosis in Fused Deposition Modeling (FDM) using acoustic emission data to monitor extruder health.
Methodology: Extracted time/frequency domain features were reduced via LDA. Unsupervised clustering (CFSFDP) was used to identify states without prior labels.
Outcome: Achieved 90.2% classification accuracy across five states using 2D feature space.
Lifetime Prediction | 2023, [43] | RF, XGBoost, MLP, SVR | Task: Remaining useful life estimation.
Methodology: Comparative ML modeling with filtering, clustering, and feature engineering.
Outcome: RF achieved best results; prevented 42% of failures.
Lifetime Prediction | 2021, [44] | SL | Task: RUL prediction for robot reducer.
Methodology: Use motor current signature analysis (MCSA) features in ML model.
Outcome: Effective health state classification.
Cost Minimization | 2022, [45] | SL | Task: Develop PdM for wiring firms.
Methodology: Expert system using ML to reduce downtime.
Outcome: Identifies AI as cost-effective alternative to PdM.
Cost Minimization | 2019, [46] | SL | Task: Optimize maintenance timing in parallel production lines.
Methodology: Used multi-agent PPO-based reinforcement learning in a simulated environment.
Outcome: Reduced breakdowns by 80%, and cut maintenance costs by 19%.

2.2.2. Quality Control

Quality Control is a systematic process employed in manufacturing and other industries to ensure that products or services meet defined quality standards and specifications. It involves the inspection and testing of products to identify defects or deviations from desired quality levels, thereby ensuring consistency and reliability in output [20]. QC represents a rapidly evolving domain within industrial automation where ML techniques, as shown in Table 2, are delivering measurable benefits in defect detection, real-time monitoring, and predictive modeling. Recent advancements are characterized by the integration of deep learning architectures, the emergence of soft sensor-based inspections, and the fusion of visual and contextual data streams for high-precision assessments.
In the printing industry, deep convolutional networks have been used for real-time industrial inspection. A deep learning-based soft sensor with a high-resolution optical camera was integrated into the automation line for gravure cylinder surface inspection [47]. This system not only improved defect detection accuracy but also significantly reduced manual inspection time, highlighting the potential of computer vision in quality-intensive sectors. Similarly, in the food packaging industry, pretrained convolutional networks such as DenseNet161 and ResNet50 were employed to automate defect detection in thermoformed trays. The model, trained on domain-specific image datasets, showed high reliability in identifying sealing and closure anomalies—a task traditionally reliant on human inspection and prone to inconsistency [48].
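A minimal transfer-learning sketch of this kind of inspection setup is given below, fine-tuning a pretrained ResNet-50 on a two-class (good/defective) image folder; the directory layout, class count, and training budget are illustrative assumptions rather than the configuration used in [48].

# Hedged transfer-learning sketch: fine-tune a pretrained ResNet-50 to separate
# "good" vs. "defective" tray images. Folder layout and training budget are assumed.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

tfm = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])
train_ds = datasets.ImageFolder("trays/train", transform=tfm)   # good/ and defect/ subfolders (assumed)
loader = DataLoader(train_ds, batch_size=16, shuffle=True)

model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)                   # replace the head for 2 classes

opt = torch.optim.Adam(model.fc.parameters(), lr=1e-3)          # update only the new head
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(3):                                          # short illustrative budget
    for images, labels in loader:
        opt.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        opt.step()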
Another compelling approach involves time-series-based predictive quality modeling in automotive manufacturing. A combination of supervised models (LSTM, RF, NNs) has been used to predict hole positioning in bumper beams during milling operations. By forecasting potential deviations early in the process, their system facilitated in-process adjustments, reducing tolerance violations and production scrap [49]. Emerging work on online quality assessment systems introduces lightweight models suitable for deployment in real-time production environments. WDCNN with Follow-the-Regularized-Leader, an online learning algorithm, is used for condition assessment in cars and general manufacturing. These methods adapt to evolving process data, making them especially relevant for high-mix, low-volume production contexts [50].
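The following sketch shows what such an incremental (online) learning loop can look like with the river library referenced in [50]; a plain streaming logistic regression and synthetic features stand in for the WDCNN and FTRL pipeline of the cited work.

# Minimal streaming (online) learning sketch using the river library; the feature
# stream is synthetic and a simple incremental logistic regression is used.
import random
from river import linear_model, preprocessing, metrics

model = preprocessing.StandardScaler() | linear_model.LogisticRegression()
metric = metrics.Accuracy()

random.seed(0)
for t in range(1000):
    x = {"vibration": random.gauss(0, 1), "temperature": random.gauss(60, 5)}
    y = int(x["vibration"] + 0.1 * (x["temperature"] - 60) > 0)   # synthetic quality label

    y_pred = model.predict_one(x)          # predict before seeing the label (prequential)
    metric.update(y, y_pred)
    model.learn_one(x, y)                  # incremental update, one sample at a time

print("prequential accuracy:", metric.get())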
Across recent contributions to Quality Control research, three key patterns emerge. First, visual inspection using deep learning is becoming standard for complex surface and packaging tasks, where high-resolution, image-based data is readily available. Second, predictive modeling with time-series data facilitates early detection of dimensional or positional deviations, making it well suited for continuous manufacturing environments. Finally, online and adaptive learning approaches are being adopted to enhance model robustness in dynamic industrial settings, supporting the development of real-time QC systems for both small and large-scale operations. The Quality Control landscape is marked by a shift from retrospective quality checks to real-time, predictive, and autonomous decision-making systems.
Table 2. ML application in Quality Control.
Sub-Area | Year and Reference | Algorithm | Task, Methodology, and Outcome
Defect Detection | 2024, [51] | YOLOv5, OCR, CNN | Task: Real-time defect detection in tuna cans.
Methodology: Used YOLOv5 for can inspection, OCR for label detection, integrated with IoT stack (Node-RED, Grafana).
Outcome: High-speed classification, automated sorting via robotic arm.
Defect Detection | 2023, [52] | LSTM, RF, NN | Task: Predict hole locations in bumper beams to preempt quality issues.
Methodology: Trained time-series models using previous beam measurements.
Outcome: Improved early detection of tolerance violations, enhancing QC and reducing scrap.
Defect Detection | 2023, [53] | Custom CNN | Task: Visual defect detection in casting.
Methodology: Developed custom CNN and deployed on shop floor via user-friendly app.
Outcome: Achieved better accuracy in image-based inspection for castings.
Defect Detection | 2022, [54] | CNN | Task: Visual flaw detection with explainability.
Methodology: Combined CNN for image analysis with ILP for rule-based reasoning, integrated human-in-the-loop feedback.
Outcome: Created a system offering human-verifiable justifications.
Defect Detection | 2019, [55] | CNN | Task: On-line defect recognition in Selective Laser Melting (SLM) during additive manufacturing.
Methodology: Developed a bi-stream Deep Convolutional Neural Network (DCNN) to analyze layer-wise in-process images and detect defects caused by improper SLM parameters.
Outcome: Achieved 99.4% defect classification accuracy; supports adaptive SLM process control and real-time quality assurance.
Defect Detection | 2018, [56] | DTree | Task: Detect keyholing porosity and balling instabilities in Laser Powder Bed Fusion (LPBF).
Methodology: Applied SIFT to extract melt pool features, encoded using Bag-of-Words representation, followed by classification with SVM.
Outcome: Enabled accurate identification of melt pool defects, supporting Quality Control in LPBF processes.
Image Recognition | 2019, [49] | SIFT, SVM | Task: Monitor and predict tool wear conditions in milling operations.
Methodology: Tool condition classification was performed using a SVM. A cloud dashboard was used for monitoring and visualization.
Outcome: Enabled efficient and scalable monitoring of tool conditions, supporting timely maintenance decisions.
Image Recognition | 2018, [57] | SVM | Task: Detect anomalies and failures in industrial manufacturing processes.
Methodology: Employed an intelligent agent with a threshold-based decision algorithm and trained it using operational data.
Outcome: Enabled proactive fault detection and efficient process management, reducing unexpected downtimes.
Image Recognition | 2018, [58] | CNN | Task: Predict track width and continuity in LPBF using video analysis.
Methodology: Trained a CNN using supervised learning on 10 ms in situ video clips of the LPBF process.
Outcome: Enabled accurate prediction of track features from video, supporting real-time quality monitoring.
Online Quality Control | 2023, [50] | WDCNN, FTRL | Task: Real-time quality assessment of cars and bearings.
Methodology: Applied online learning (incremental updates) with identity parsing on streaming data using river in Python 3.9 programming environment.
Outcome: Achieved real-time classification with stable accuracy.
Online Quality Control | 2021, [48] | CNN | Task: Detect sealing and closure defects in food trays inline.
Methodology: Built a modular system using CNNs trained on domain-specific image datasets.
Outcome: Achieved near 100% defect detection rate inline, with <0.3% false positives.
Online Quality Control | 2019, [59] | SVM | Task: Enable cost-efficient real-time QC in automotive manufacturing.
Methodology: Applied an SVM considering inspection costs and error types; performance assessed via Design of Experiments.
Outcome: Effective QC with improved cost sensitivity and error handling.
Online Quality Control | 2019, [60] | SDAE | Task: Perform robust pattern recognition from noisy signals.
Methodology: Used SDAE for unsupervised feature extraction and supervised regression fine-tuning.
Outcome: Improved generalization and feature robustness for classification tasks.

2.2.3. Process Optimization

Process Optimization represents a critical domain in industrial automation where ML is increasingly leveraged to improve efficiency, adapt to dynamic conditions, and minimize resource usage. As shown in Table 3, recent contributions demonstrate an encouraging trend toward the integration of ML with physical modeling, virtual environments, and human–machine interactions—highlighting a shift from static parameter tuning to adaptive, intelligent process control.
A compelling development is seen in the context of thermoplastic composites manufacturing, where a hybrid ML framework was proposed to optimize the Automated Fiber Placement (AFP) process. This approach integrates ANNs, finite element analysis (FEA), and virtual sample generation (VSG) to overcome the small-data bottleneck common in advanced manufacturing. The study demonstrated how ML models trained on both synthetic and experimental data could significantly reduce defect rates while optimizing key parameters like speed, heat, and compaction pressure [25]. The concept of an “Artificial Neural Twin” was introduced for the plastic recycling industry. This virtual twin environment combined differentiable data fusion and model predictive control (MPC) with supervised and unsupervised learning to optimize decentralized process chains. Notably, the use of simulation-based training within a Unity-developed environment allowed for safe, iterative learning—especially useful for distributed manufacturing systems where real-time experimentation may be costly or risky [61].
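A simplified surrogate-modeling sketch in this spirit is shown below: a small neural network is trained as a cheap stand-in for simulation or experiment, and a coarse grid search over process parameters is used to propose low-defect settings. Parameter names, ranges, and the synthetic response are assumptions for illustration only.

# Illustrative surrogate-model workflow: learn defect rate as a function of AFP-style
# process parameters from synthetic data, then search the surrogate for good settings.
import numpy as np
from itertools import product
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# Columns: placement speed (mm/s), heat-source temperature (C), compaction pressure (bar)
X = rng.uniform([50, 300, 1], [300, 500, 6], size=(400, 3))
defect_rate = (0.002 * np.abs(X[:, 0] - 150)          # too fast or too slow
               + 0.001 * np.abs(X[:, 1] - 420)        # off-optimum temperature
               + 0.05 * np.abs(X[:, 2] - 3.5)         # off-optimum pressure
               + rng.normal(0, 0.05, 400))            # measurement noise

surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000,
                         random_state=0).fit(X, defect_rate)

# Exhaustively score a coarse parameter grid with the cheap surrogate.
grid = list(product(np.linspace(50, 300, 11),
                    np.linspace(300, 500, 11),
                    np.linspace(1, 6, 11)))
pred = surrogate.predict(np.array(grid))
best = grid[int(np.argmin(pred))]
print("suggested (speed, temperature, pressure):", best)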
Human-centered optimization has also gained traction. A data-driven adaptation model is utilized for industrial human–machine interfaces (HMIs), mining 151 days of interaction logs to identify user patterns and generate adaptive interface rules. The goal was to improve efficiency and decision-making in repetitive operator tasks, pointing toward a broader trend of integrating behavioral analytics and rule learning into interface design [62]. In more traditional sectors, such as water treatment and desalination, ML models such as ANFIS, BPNN, SVR, and RBF networks were applied to optimize membrane-based processes like electrodialysis and reverse osmosis. These models helped predict pollutant removal efficiency and system throughput under variable conditions, demonstrating ML’s capability to capture nonlinear, multi-parameter dependencies in physical systems [63].
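As a minimal example of this kind of regression task, the sketch below fits a support vector regressor to synthetic operating conditions and predicts removal efficiency; the features, ranges, and response surface are placeholders rather than data from [63].

# Hedged sketch of a membrane-process regression model: SVR maps operating
# conditions to a (synthetic) removal efficiency.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(3)
# Columns: feed concentration (mg/L), applied pressure (bar), temperature (C)
X = rng.uniform([100, 5, 15], [1000, 60, 40], size=(300, 3))
removal = 95 - 0.01 * X[:, 0] + 0.2 * X[:, 1] + 0.1 * X[:, 2] + rng.normal(0, 1, 300)

model = make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.5)).fit(X[:250], removal[:250])
print("R^2 on held-out conditions:", model.score(X[250:], removal[250:]))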
Across these works, several recurring themes highlight the evolving landscape of process optimization. Hybrid modeling is increasingly adopted to combine machine learning with domain-specific physics, particularly in scenarios where experimental data is limited or costly [64]. Collectively, these trends mark a shift from static parameter tuning toward self-adaptive, learning-based systems that evolve with the production environment [65].
Table 3. ML application in Process Optimization.
Sub-Area | Year and Reference | Algorithm | Task, Methodology, and Outcome
Performance Prediction | 2024, [61] | MPC | Task: Optimize process chains via decentralized learning.
Methodology: Uses a quasi-neural network model with gradient-based continual learning across distributed nodes.
Outcome: Enables continual optimization without compromising data sovereignty.
Performance Prediction | 2022, [66] | BOPC | Task: Improve solar cell efficiency using data-efficient optimization.
Methodology: BO with human-in-the-loop feedback and prior knowledge constraints.
Outcome: Achieved 18.5% PCE with only 100 tests—faster than conventional methods.
Performance Prediction | 2019, [63] | ANN, GA, RBF, BPNN, ANFIS, SVR | Task: Optimize and model desalination and treatment processes.
Methodology: Benchmarked ANN/GA vs. classical models for ion rejection, flux prediction, pollutant removal, etc.
Outcome: ANN-based tools achieved superior prediction accuracy and process adaptability.
Performance Prediction | 2019, [67] | NN | Task: Predict temperature and density evolution from laser trajectories.
Methodology: Used three neural networks with a localized trajectory decomposition technique.
Outcome: Enabled spatially-aware predictions for process monitoring.
Performance Prediction | 2018, [68] | CNN | Task: Identify geometries that are hard to manufacture.
Methodology: Applied a 3D CNN with a secondary interpretability method to analyze feature contribution.
Outcome: Accurately predicted and explained manufacturability issues.
Performance Prediction | 2018, [69] | RF, SVM | Task: Predict lead time in variable-demand flow shops.
Methodology: Employed a twin model with frequent retraining and online learning.
Outcome: Achieved adaptive and accurate lead time forecasts.
Process Control | 2025, [70] | RSM-GA, ANN-GA, ANFIS-GA | Task: Maximize tensile, flexural, and compressive strengths in FDM parts.
Methodology: Used hybrid optimization combining RSM and AI methods on experimental design.
Outcome: Hybrid models improved strength by up to 8.86% across mechanical tests.
Process Control | 2024, [71] | TD3, PPO | Task: Develop autonomous process control in injection molding.
Methodology: Combines supervised learning and DRL in a Digital Twin framework.
Outcome: Real-time optimization with reduced human involvement and improved quality/cost–efficiency balance.
Process Control | 2022, [64] | ANN | Task: Optimize AFP process to reduce defects and improve ILSS.
Methodology: Combined ANN with photonic sensors, VSG, and FEA simulations.
Outcome: Developed a decision-support tool to automate parameter tuning and defect minimization.
Process Control | 2020, [72] | QLrn | Task: Optimize control in nonlinear, uncertain manufacturing processes.
Methodology: Applied Q-learning for independent decision-making under partial observability.
Outcome: Achieved adaptive control despite randomness and incomplete information.
Process Control | 2019, [73] | SVM | Task: Improve grinding parameters for helical flutes.
Methodology: Combined simulation, SVM prediction, and simulated annealing to optimize feed rate and grinder speed.
Outcome: Enhanced surface quality and process efficiency.
Scheduling | 2019, [74] | QLrn | Task: Minimize makespan in robotic assembly lines.
Methodology: Used multi-agent reinforcement learning for dynamic planning and task scheduling.
Outcome: Improved scheduling efficiency in multi-robot systems.
Scheduling | 2018, [75] | Bagging, Boosting | Task: Optimize job shop scheduling via dispatching rule selection.
Methodology: Evaluated bagging, boosting, and stacking for rule selection.
Outcome: Reduced mean tardiness and flow time.

3. Machine Learning-Driven Digital Twins and Edge AI for Industrial Automation

Digital Twins (DTs) and Edge AI are key technological enablers in advancing intelligent industrial systems. DTs facilitate the creation of virtual replicas of physical assets, enabling real-time monitoring, simulation, and predictive decision-making. Meanwhile, Edge AI enables localized data processing at the source, minimizing latency and bandwidth consumption while supporting fast, autonomous responses. Together, these technologies form a powerful synergy for optimizing operations, reducing downtime, and enhancing productivity across industrial domains. This section presents a comprehensive overview of how DTs and Edge AI are being applied across various functions—ranging from predictive maintenance to adaptive quality control—highlighting their integration, implementation strategies, and industrial impact.

3.1. ML-Driven Digital Twin Applications for Predictive Maintenance, Quality Control, and Process Optimization

Digital Twin technology has emerged as one of the most transformative innovations in industrial automation. A DT is a dynamic virtual representation of a physical system, process, or asset—ranging from individual machines to entire facilities—that enables bidirectional communication between the physical and virtual domains. This closed-loop connection, facilitated by real-time data from sensors and industrial IoT devices, enhances visibility, predictability, and operational control.
What distinguishes a full-fledged Digital Twin from basic simulations or static models is its virtual-to-physical feedback loop. Insights generated in the digital environment—through simulation, prediction, or learning—are transmitted back to the physical system for action [76]. For instance, in predictive maintenance applications, a DT can identify early signs of equipment degradation and proactively trigger maintenance operations, thereby improving system uptime and reliability [77].
The convergence of DTs with ML has further elevated their utility. ML-driven DTs transform from passive mirrors into active agents capable of predictive insight and adaptive decision-making. These systems can learn from both real-time and historical data streams, enabling detection of anomalies, estimation of remaining useful life (RUL), and optimization of process parameters [38,78,79,80]. Figure 3 illustrates a schematic representation of a DT integrated with AI components.
Digital Twins vary in complexity and intelligence, ranging from descriptive (replicating current states) and diagnostic (identifying root causes) to predictive, prescriptive, and autonomous types that leverage AI for self-adaptive decision-making. These types reflect increasing levels of functional integration with analytics and control systems [82]. A comprehensive Digital Twin (DT) is structured across five core dimensions, as illustrated in Figure 4: (a) the Physical Entity being mirrored, (b) its Virtual Entity capturing behavior and structure, (c) data enabling real-time bidirectional flow, (d) services providing analytics and ML-driven intelligence, and (e) connections integrating these components across the system lifecycle [83]. This multidimensional architecture forms a closed-loop framework for synchronized, intelligent industrial operations. The impact of this synergy is particularly evident in Predictive Maintenance, Quality Control, and Process Optimization, where the fusion of DTs and ML is driving measurable improvements in efficiency, reliability, and autonomy, as shown in Table 4.
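The structural idea behind these five dimensions can be sketched in a few lines of code, as below; the class and method names are illustrative and do not correspond to any standard DT implementation.

# Minimal structural sketch of the five-dimension Digital Twin: physical entity,
# virtual entity, data, services, and connections. All names are illustrative.
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class VirtualEntity:
    """Simplified behavioral model of the physical asset."""
    nominal_temp_c: float = 70.0
    def expected_temperature(self, load: float) -> float:
        return self.nominal_temp_c + 15.0 * load          # toy physics model

@dataclass
class DigitalTwin:
    asset_id: str                                          # physical entity being mirrored
    virtual: VirtualEntity                                 # virtual entity
    data: List[Dict] = field(default_factory=list)         # bidirectional data stream
    services: Dict[str, Callable] = field(default_factory=dict)  # analytics / ML services

    def ingest(self, sample: Dict) -> None:
        """Connection: physical-to-virtual data flow."""
        self.data.append(sample)

    def check_anomaly(self, sample: Dict) -> bool:
        """Service: compare measured vs. simulated behavior and return an alert flag."""
        expected = self.virtual.expected_temperature(sample["load"])
        return abs(sample["temp_c"] - expected) > 10.0     # virtual-to-physical feedback trigger

twin = DigitalTwin("pump-07", VirtualEntity())
twin.ingest({"load": 0.8, "temp_c": 95.0})
print("maintenance alert:", twin.check_anomaly(twin.data[-1]))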
In Predictive Maintenance, ML-empowered DTs enable real-time diagnostics and long-term health-state monitoring. For instance, in photovoltaic (PV) systems, a DT integrated with ensemble ML models and neural networks achieved high-accuracy fault diagnostics by comparing digital outputs with field data. A cloud-based DT used XGBoost to estimate PV system health from sensor telemetry, enabling timely interventions [84,85]. In smart forging, a PPO-based DRL agent was trained within a DT environment to dynamically adjust induction heating coil power, reducing temperature variation and improving product quality [86].
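A hedged sketch of the telemetry-to-health-index idea is given below using an XGBoost regressor on synthetic PV telemetry; the feature set, health target, and alert threshold are assumptions rather than details of the cited systems.

# Illustrative gradient-boosted health-index model on synthetic PV telemetry.
import numpy as np
from xgboost import XGBRegressor

rng = np.random.default_rng(7)
# Telemetry: irradiance (W/m^2), module temperature (C), DC power (kW)
X = rng.uniform([200, 20, 0.5], [1000, 65, 5.0], size=(500, 3))
health = (1.0 - 0.0002 * (X[:, 1] - 25) ** 2
          - 0.02 * np.abs(X[:, 2] - 0.004 * X[:, 0])
          + rng.normal(0, 0.01, 500))                      # synthetic health index

model = XGBRegressor(n_estimators=200, max_depth=4, learning_rate=0.1).fit(X[:400], health[:400])

latest = X[400:]
predicted_health = model.predict(latest)
alerts = predicted_health < 0.8                            # flag samples needing intervention
print(f"{alerts.sum()} of {len(alerts)} recent samples flagged for inspection")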
In Quality Control, novel frameworks like QUILT have demonstrated the integration of DTs with ML using side-channel signals (acoustic, magnetic, vibration) to detect print anomalies in legacy 3D printers without hardware modification [87]. Another multi-domain DT system provided real-time detection of fatigue, abnormal behavior, and energy anomalies in textile manufacturing environments [88].
In Process Optimization, DTs serve as simulation-enhanced environments to train and validate AI models. The Artificial Neural Twin combined supervised and unsupervised learning with MPC to optimize decentralized manufacturing lines via Unity-based simulations [61]. A DT-driven CatBoost model optimized energy use in an industrial drying process, resulting in annual energy savings of 3.7 MWh [89]. Ant Colony Optimization integrated within a robotic DT facilitated collision-free path planning for industrial arms, demonstrating the potential of heuristic RL agents in real-time control [90].
A broader view reveals that DTs are now integral to Sim-to-Real transfer workflows, where agents trained virtually are deployed in physical systems with minimal retraining. DRL agents trained in a DT environment for robotic grasping achieved high transfer success rates, aided by integrated safety modules [91]. Furthermore, DTs are enabling cross-domain orchestration, as seen in the SMS-DT framework, which leveraged supervised and unsupervised ML models across edge, platform, and enterprise layers to manage cyber–physical security and operational resilience in smart factories [88].
Several unifying themes emerge from these applications. First, simulation-enhanced learning supports AI model development in data-scarce environments using synthetic data generated from the DT [86,92]. Second, real-time adaptability is achieved through continuous live data streaming, enabling the models to evolve alongside the system. Third, in safety-critical environments, DTs combined with physics-informed models and explainable AI (XAI) improve transparency and decision trustworthiness [93]. Lastly, the shift toward edge-compatible DTs supports decentralized intelligence in resource-constrained settings.
Table 4. AI-driven Digital Twins in industrial automation.
Area | Sub-Area | Publication Year and Reference | Algorithm | Task, Methodology, and Outcome
Predictive Maintenance | Fault Prediction | 2024, [94] | LSTM, CNN | Task: Predict early failure of SiC/GaN semiconductors.
Methodology: Built a DT for thermal monitoring and used ML for degradation prediction.
Outcome: Enabled early detection and extended device lifespan.
Predictive Maintenance | Fault Prediction | 2022, [95] | BPNN | Task: Improve fault prediction and diagnosis for large-diameter auger rigs in coal mining.
Methodology: Developed a Digital Twin model with geometric, physical, and behavioral layers using Unity3D and ANSYS. Trained a BP neural network on fault data (4 fault types) with expert-assisted feedback correction.
Outcome: Model showed strong performance in identifying drill pipe bend/fracture, bearing fault, and overpressure events.
Predictive Maintenance | Fault Prediction | 2021, [96] | IRF, HC, TL | Task: Improve fault detection and classification on intelligent production lines.
Methodology: Proposed IRF by filtering RF trees via hierarchical clustering, then applied transfer learning to fine-tune with physical data.
Outcome: Achieved better accuracy; outperformed KNN, ANN, LSTM, SVM; effective in diagnosing conveyor, tightening, and alignment faults with low-latency online analysis.
Predictive Maintenance | Fault Prediction | 2021, [97] | SL | Task: Predict surface defects in HPDC castings.
Methodology: Converted HPDC process images into pixel-based tabular data; applied SVD and edge detection for dimensionality reduction.
Outcome: Better accuracy; crack location precisely identified in test images. The model enabled lightweight, distributed, low-latency defect prediction without large-scale computation.
Predictive Maintenance | Fault Prediction | 2020, [98] | NN, RF, RR | Task: Predict generator oil temperature and detect early anomalies to prevent aircraft No-Go events.
Methodology: Segmented time-series data from 606 anomaly-free flights and applied Fourier/Haar basis expansion. NN chosen for best generalization. Anomalies detected by monitoring divergence from reference MSE over consecutive flights.
Outcome: Detected failures 5 to 9 flights before actual events; NN-Fourier DT showed good anomaly sensitivity with minimal false positives.
Predictive Maintenance | Fault Prediction | 2019, [99] | DNN | Task: Perform real-time fault diagnosis under data-scarce and distribution-shifting conditions in smart manufacturing.
Methodology: The proposed DFDD framework trains a DNN, leveraging a Process Visibility System (PVS) to extract shop-floor data without additional sensors.
Outcome: DFDD achieved better accuracy, on virtual or physical data. Robust against imbalanced and distribution-shifted test sets.
Predictive Maintenance | Lifetime Prediction | 2022, [100] | LASSO, SVR, XGBoost | Task: Achieve full-lifecycle monitoring and predictive maintenance for locomotives.
Methodology: Proposed a 3-layer ML-integrated DT architecture to forecast axle temperature trends.
Outcome: Detected locomotive bearing faults approximately 1 week in advance. Enabled proactive fault alerts and life-cycle optimization.
Predictive Maintenance | Lifetime Prediction | 2021, [101] | LSTM | Task: Enhance predictive maintenance of aero-engines through data-driven Digital Twin modeling.
Methodology: Developed an implicit Digital Twin (IDT) using sensor data and historical operation data, integrated with LSTM for RUL prediction.
Outcome: Achieved RMSE of 13.12 for RUL prediction, outperforming other methods; optimal performance at 80% training data.
Predictive Maintenance | Health Monitoring | 2024, [94] | DNN | Task: Monitor WBG semiconductor health using a Digital Twin.
Methodology: Combined thermal–electrical simulation and ML models to predict degradation.
Outcome: Enabled accurate lifetime estimation and failure prediction using hybrid DT-MML approach.
Quality Control | Defect Detection | 2022, [102] | SVR, GPR | Task: Identify bearing crack type and size under variable speed.
Methodology: Modeled AE signals using autoregression, SVR, and GPR combined with Laguerre filters. Estimated unknown signals using a strict-feedback backstepping DT with fuzzy logic.
Outcome: Achieved 97.13% accuracy in crack type diagnosis and 96.9% in crack size classification across eight bearing conditions and multiple speeds.
Quality Control | Defect Detection | 2022, [103] | LR, K-means Clustering | Task: Detect anomalies in a pasteurization system at a food plant using ML-enhanced Digital Twin.
Methodology: Built a LabVIEW-Python based DT of a pilot pasteurizer using real-time pressure and flow data.
Outcome: MLP reached 96–99% accuracy across fluids; DT enabled remote monitoring and decision support.
Quality Control | Image Recognition | 2022, [104] | CNN | Task: Monitor and classify the quality of banana fruit.
Methodology: Developed a Digital Twin system using thermal images (FLIR One camera) labeled into four classes.
Outcome: Enabled real-time classification and inventory decision-making.
Quality Control | Image Recognition | 2020, [105] | Inception-v3 CNN with TL | Task: Classify orientation (“up” or “down”) of 3D-printed parts in robotic pick-and-place system.
Methodology: Synthetic images generated using DT simulations in Blender. Labeled with Python script. Inception-v3 CNN retrained using TensorFlow.
Outcome: Achieved 100% accuracy on real-world images; validated DT-generated data for robust model training.
Quality Control | Image Recognition | 2020, [106] | CNN | Task: Monitor and control weld joint growth and penetration.
Methodology: Built DT using weld images processed by CNN for BSBW and image processing for TSBW. Unity GUI for visualization.
Outcome: Real-time monitoring via visualization.
Quality Control | Image Recognition | 2020, [107] | MobileNet, UNet, TL | Task: Enable low-cost, high-precision plant disease/nutrient deficiency detection.
Methodology: LoRaWAN WSN collected sensor data; used MobileNet and UNet on PlantVillage dataset. Simulated WSN in OMNeT++ and FLoRa; image downsampling for efficiency.
Outcome: 95.67% validation accuracy; enabled rural deployment via energy-efficient LoRa-based WSN.
Quality Control | Online Quality Control | 2022, [108] | PointNet | Task: Real-time object detection and pose estimation in robotic DT system.
Methodology: Built DT with ROS and Unity for ABB IRB 120. Used LineMod and PointNet for object recognition/pose estimation. Collected data with Blensor and RealSense D435i.
Outcome: 100% classification accuracy, approximately 3° pose error; real-time DT sync with <0.1 ms delay.
Quality Control | Online Quality Control | 2022, [109] | YOLOv4-M2 | Task: Improve small object detection in complex smart manufacturing.
Methodology: Designed a hybrid model using MobileNetv2 & YOLOv4 for object detection and OpenPose for long-range human posture detection.
Outcome: Achieved better accuracy and precision.
Quality Control | Online Quality Control | 2021, [110] | FFT, PCA, SVM | Task: Enhance welder training and performance using VR-based DT.
Methodology: Captured motion via VR, transmitted to UR5 robot. Used FFT-PCA-SVM to classify welding skill.
Outcome: 94.44% classification accuracy; enabled immersive feedback and performance monitoring.
Process Optimization | Performance Prediction | 2023, [111] | ANN, k-NN, Symbolic Regression | Task: Predict and optimize workstation productivity using DT.
Methodology: Combined Production Planning and Control (PPC) and ML to forecast throughput from failure/downtime data.
Outcome: Achieved adaptive PPC decisions.
Process Optimization | Performance Prediction | 2022, [112] | CNN, Spatio-Temporal GCN | Task: Predict road behavior and secure data transfer in autonomous cars.
Methodology: Combined CNN and DT with spatio-temporal GCN and load balancing.
Outcome: 92.7% prediction accuracy, 80% delivery rate, low delay and leakage.
Process Optimization | Performance Prediction | 2022, [113] | BCDDPG, LSTM | Task: Enable robust and energy-efficient flocking of UAV swarms.
Methodology: Developed DT-enabled framework using BCDDPG and LSTM for dynamic feature learning. Trained in simulation and deployed to UAVs.
Outcome: Outperformed baselines in 8 metrics including arrival rate >80% and energy efficiency.
Process Optimization | Task Modelling | 2022, [114] | DDQN | Task: Minimize energy in UAV-based mobile edge computing.
Methodology: DT-based offloading with DDQN, closed-form power solutions, and iterative CPU allocation.
Outcome: Reduced energy and delay vs. baselines; scalable under dynamic loads.
Process Optimization | Process Control | 2024, [115] | CNN, YOLOv3 | Task: Object detection in factories.
Methodology: Trained YOLOv3 on synthetic data from factory DT.
Outcome: Enabled robust object recognition without real datasets.
Process Optimization | Process Control | 2022, [116] | VGG-16 | Task: Enable intuitive robot programming.
Methodology: DT system with Hololens MR, Unity simulation, and CNN for object pose estimation.
Outcome: Real-time gesture control with ±1–2 cm error.
Process Optimization | Process Control | 2022, [117] | PDQN, DQN | Task: Optimize smart conveyor control.
Methodology: Built DT-ACS and introduced PDQN to improve control performance.
Outcome: Faster convergence, better robustness, reduced cost under dynamic loads.
Process Optimization | Process Control | 2021, [118] | K-Means, KNN | Task: Improve monitoring and prediction in chemical plants.
Methodology: Preprocessed data (IQR, normalization), clustered via K-Means, and built KNN models. Deployed model to cloud with WebSocket interface.
Outcome: 16.6% data reduction, 99.74% classification accuracy, R2 = 0.96 for regression.
Process Optimization | Process Control | 2019, [119] | XGBoost, RF | Task: Optimize yield in catalytic cracking units.
Methodology: 5-step DT framework using IoT and ML; trained 4 models with ensemble methods.
Outcome: Real-world deployment increased light oil yield by 0.5%.
Process Optimization | Scheduling | 2022, [120] | Q-Learning, SARSA, DNN | Task: Improve shipyard scheduling and Quality of Service (QoS) management.
Methodology: Built 3-layer DTN; trained DNN for latency prediction; tested RL variants.
Outcome: Parallel RL had best performance; DT enabled real-time decisions and resource efficiency.
Process Optimization | Scheduling | 2021, [121] | ANN | Task: Enhance planning in fast fashion lines.
Methodology: DT system with ANN for demand forecast, Discrete Event Simulation (DES) for simulating operations, and dashboard visualization.
Outcome: Lead time reduced by 28%, operator use up 37%, staffing optimized.
Process Optimization | Scheduling | 2020, [122] | RL | Task: Optimize scheduling in manual assembly.
Methodology: Built Python-based adaptive simulation and used RL for recommendation refinement.
Outcome: Identified bottlenecks and improved efficiency.

3.2. Edge AI in Predictive Maintenance, Quality Control, and Process Optimization

Industrial automation environments generate vast volumes of real-time data, traditionally processed in centralized cloud infrastructures. However, dependence on cloud-based computing introduces significant challenges, including high latency, bandwidth constraints, security vulnerabilities, and increased energy consumption. As a response, edge computing has emerged as a critical architectural paradigm that relocates computation closer to the data source. By enabling localized processing on edge devices—such as industrial controllers, microcontrollers, or embedded platforms—edge computing minimizes cloud dependency and enhances responsiveness, privacy, and robustness [123,124].
When artificial intelligence (AI) models are deployed directly on these edge devices, the resulting paradigm is known as Edge AI [125]. This integration enables on-device intelligence for real-time inference, decision-making, and control. Edge AI eliminates the need for constant cloud connectivity, offering benefits such as ultra-low latency, reduced data transmission, enhanced data privacy, and significantly lower power consumption [126,127]. Devices such as the NVIDIA Jetson Nano, Raspberry Pi 5, and Texas Instruments F28P55x have emerged as cost-effective platforms capable of executing ML models at the edge, making advanced industrial AI both accessible and scalable [4,128,129]. Edge AI has been recognized by leading industry analysts such as Gartner and Deloitte as a transformative force in the evolution of Industry 4.0, offering unprecedented capabilities in decentralized, intelligent automation [130,131]. Table 5 demonstrates how Edge AI is being operationalized across multiple domains—most notably Predictive Maintenance, Quality Control, and Process Optimization.
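The following minimal sketch shows what on-device inference can look like on such hardware: a pre-converted TensorFlow Lite model is loaded and executed locally on one window of sensor data, with no cloud round-trip. The model file and input shape are placeholders, and on constrained boards the lighter tflite_runtime package can stand in for the full TensorFlow import.

# Minimal on-device inference sketch for an edge deployment (e.g., Raspberry Pi).
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="fault_classifier.tflite")  # assumed artifact
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# One window of locally buffered sensor readings (synthetic stand-in).
window = np.random.rand(*input_details[0]["shape"]).astype(np.float32)

interpreter.set_tensor(input_details[0]["index"], window)
interpreter.invoke()                                     # inference happens on the device
scores = interpreter.get_tensor(output_details[0]["index"])
print("predicted class:", int(np.argmax(scores)))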
In the domain of Predictive Maintenance, one of the most impactful applications of Edge AI is remaining useful life estimation for machinery components. Using the NASA C-MAPSS dataset, LSTM, CNN-RNN, and hybrid models were implemented on edge hardware (e.g., Raspberry Pi and NVIDIA Jetson) to predict degradation states of aircraft engines. The study demonstrated that inference times remained within practical limits even on constrained hardware, showing how deep learning models can be adapted for real-time RUL assessment in industrial settings [132]. Another application focuses on thermal error prediction in Computer Numerical Control (CNC) machines, where LSTM networks and hybrid models (e.g., improved Grey Relational Analysis) were deployed on FPGA-based edge systems [5]. In [133], a genetic algorithm applied to IIoT sensor data enabled predictive maintenance and energy scheduling, reducing energy use by 28.1%.
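To illustrate the overall workflow, the sketch below trains a small LSTM regressor on synthetic windowed sensor data standing in for C-MAPSS and converts it to a TensorFlow Lite artifact for edge deployment; the window length, channel count, and training budget are illustrative choices, not the configuration of [132].

# Hedged RUL workflow sketch: train a compact LSTM regressor on windowed sensor
# data (synthetic placeholder) and export it for edge inference.
import numpy as np
import tensorflow as tf

window, channels = 30, 14                          # 30 time steps, 14 sensor channels (assumed)
X = np.random.rand(1000, window, channels).astype(np.float32)
rul = np.random.uniform(0, 125, size=(1000, 1)).astype(np.float32)   # placeholder RUL targets

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(window, channels)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),                      # remaining useful life (cycles)
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, rul, epochs=2, batch_size=64, verbose=0)

# Convert to a compact TFLite artifact suitable for Jetson/Raspberry Pi class devices.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS,
                                       tf.lite.OpsSet.SELECT_TF_OPS]  # fallback for RNN ops
open("rul_lstm.tflite", "wb").write(converter.convert())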
In Quality Control, Edge AI has proven instrumental in acoustic defect detection. A real-world deployment in a beverage factory used LSTM and SVM-based models running on edge servers to classify acoustic signatures of glass bottles on a high-speed conveyor. The system achieved high accuracy in identifying defective units and replaced human-based inspection with an autonomous, high-throughput solution—illustrating the scalability of Edge AI in visually and acoustically intensive QC tasks [134]. Similarly, in railway bearing monitoring, fuzzy logic-based decision systems combined with FFT processing were executed on fog/edge computing platforms to detect overheating in real time. By processing thermal and vibration signals locally, the system enabled timely fault classification and reduced the reliance on centralized diagnostic hubs [135].
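A compact illustration of this acoustic pipeline is given below: coarse FFT band energies are extracted from short frames and classified with an SVM. The signals are synthetic, and the sampling rate and tone frequencies are assumptions, not parameters from [134].

# Illustrative acoustic QC pipeline: FFT band energies per frame, then an SVM classifier.
import numpy as np
from sklearn.svm import SVC

fs = 16_000                                     # assumed sampling rate (Hz)

def band_energies(frame, n_bands=8):
    """Split the magnitude spectrum of one frame into coarse band energies."""
    spectrum = np.abs(np.fft.rfft(frame))
    bands = np.array_split(spectrum, n_bands)
    return np.array([b.sum() for b in bands])

rng = np.random.default_rng(5)
def synth_frame(defective):
    t = np.arange(fs // 10) / fs               # 100 ms frame
    tone = np.sin(2 * np.pi * (5200 if defective else 3100) * t)   # shifted resonance
    return tone + 0.3 * rng.normal(size=t.size)

X = np.array([band_energies(synth_frame(d)) for d in ([0] * 200 + [1] * 200)])
y = np.array([0] * 200 + [1] * 200)

idx = rng.permutation(len(y))
X, y = X[idx], y[idx]
clf = SVC(kernel="rbf", gamma="scale").fit(X[:300], y[:300])
print("held-out accuracy:", clf.score(X[300:], y[300:]))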
From a Process Optimization perspective, Edge AI supports real-time feedback and dynamic decision-making. The iRobot-Factory study showcased an intelligent robotic system that integrates ML and DL models at the fog layer to manage distributed robotic arms and manufacturing cells. Tactile and audiovisual inputs were processed at the edge to determine operator emotions and optimize task assignments—highlighting Edge AI’s role not just in performance optimization but also in human–machine interaction enhancement [136]. Several key trends highlight the transformative role of Edge AI in smart manufacturing. Low-latency learning enables time-sensitive predictions and decisions to be made directly at the data source, minimizing reliance on centralized processing [4,134,135]. Hardware-aware model design is gaining traction, with compact architectures and deployment strategies optimized for FPGA and ARM-based edge devices to meet stringent computational and energy constraints [4,5,136]. Multimodal sensor integration is becoming standard, allowing edge systems to process diverse input streams—such as thermal, audio, and vibration data—for richer machine learning inference. Additionally, autonomous operation is increasingly common, with many systems functioning independently of cloud connectivity, enhancing resilience and preserving data privacy.
Table 5. Edge AI in industrial automation.
Area | Sub-Area | Year and Reference | Algorithm | Task, Methodology, and Outcome
Predictive Maintenance | Fault Prediction | 2024, [128] | SVM, RF, KNN, CNN | Task: Detect tool wear in milling.
Methodology: Developed an Edge AI system running 5 SL models on low-cost hardware.
Outcome: CNN outperformed others in wear classification, enabling efficient on-device inference.
Predictive Maintenance | Fault Prediction | 2020, [137] | DNN | Task: Accurately detect faults in IIoT manufacturing facilities using edge AI with minimal latency.
Methodology: Transforms fault detection into a classification task using a multi-block Gaussian–Bernoulli Restricted Boltzmann Machine (GBRBM) for feature extraction and deep autoencoder for training. The architecture enables low-latency classification directly at the edge.
Outcome: Achieved 88.39% accuracy; significantly outperformed SVM, LDA, LR, QDA, and FNN baselines.
Predictive Maintenance | Fault Prediction | 2020, [138] | 1D-CNN | Task: Accurately detect gear and bearing faults in gearboxes under multiple operating conditions using deep learning on edge equipment.
Methodology: Proposed a multi-task 1D-CNN model trained with shared and task-specific layers. Model deployed on edge devices for low-latency real-time diagnosis.
Outcome: Achieved 95.76% joint accuracy; after applying triplet loss, test accuracy reached 90.13% even with speed data missing.
Predictive Maintenance | Anomaly Detection | 2020, [139] | CNN-VAE, SCVAE | Task: Perform unsupervised anomaly detection on time-series manufacturing sensor data.
Methodology: Proposes SCVAE (compressed CNN-VAE using Fire Modules) trained on labeled UCI datasets and unlabeled CNC machine data.
Outcome: SCVAE achieved high anomaly detection accuracy while reducing model size and inference time significantly, making it suitable for edge deployment.
Quality Control | Defect Detection | 2020, [140] | R-CNN, ResNet101 | Task: Detect surface defects on complex-shaped manufactured parts (turbo blades).
Methodology: Faster R-CNN is deployed at edge nodes for low-latency detection, while cloud servers support training and updates. The smart system integrates cloud-edge collaboration for continuous model evolution.
Outcome: Achieved 81% precision and 72% recall on test set; edge computing improved speed over cloud or embedded-only setups.
Quality Control | Defect Detection | 2021, [141] | CNN | Task: Automate visual defect detection in injection-molded tampon applicators using deep learning and edge computing.
Methodology: A CNN model processes grayscale images acquired from vision sensors mounted on rotating rails. The system performs real-time defect classification on edge boxes connected to PLCs for automated sorting.
Outcome: Achieved 92.62% accuracy with fast inference, validating industrial applicability.
Quality Control | Defect Detection | 2020, [142] | K-means Clustering | Task: Develop a real-time, low-latency fabric defect detection system.
Methodology: Modified DenseNet is optimized with a custom loss function, data augmentation (6 strategies), and pruning for edge deployment. Trained and deployed on Cambricon 1H8 edge device with factory data.
Outcome: Achieved 18% AUC gain, 50% reduction in data transmission, and 32% lower latency vs. cloud, validating robust, real-time performance for 11 defect classes.
Quality Control | Image Recognition | 2023, [143] | TADS | Task: Optimize execution time of DNN-based quality inspection tasks in smart manufacturing.
Methodology: Proposes TADS, a scheme that selects optimal DNN layer split points based on task number, type, inter-arrival time, and bandwidth.
Outcome: Achieved up to 97% task time reduction vs. baseline schemes; validated through both simulations and real-world deployment.
Quality Control | Image Recognition | 2021, [144] | MobileNetV1, ResNet | Task: Improve operator safety and operational tracking in a shipyard workshop.
Methodology: A mist computing architecture using smart IIoT cameras performs real-time human detection and machinery tracking locally without uploading image data to the cloud.
Outcome: Demonstrated extremely low yearly energy consumption (0.35–0.36 kWh/device) and scalable carbon footprint analysis across regions using different energy sources.
Quality Control | Image Recognition | 2020, [145] | SVM | Task: Automate detection of edge and surface defects in logistics packaging boxes.
Methodology: Images are preprocessed with grayscale, denoising, and morphological operations. Features are extracted using SIFT and classified using SVM (RBF kernel).
Outcome: Achieved 91% accuracy in classifying edge and surface defects, outperforming CNN in both accuracy and speed under edge computing conditions.
Quality Control | Online Quality Control | 2020, [146] | GBT, SVM, DT, NB, LR | Task: Replace traditional X-ray inspections in PCB manufacturing.
Methodology: Historical SPI data were used to train supervised models (GBT selected). Prediction occurs on solder-joint level; deployment strategy filters X-ray usage based on predicted FOV defect status.
Outcome: X-ray inspection volume was reduced by 29% on average without sacrificing defect detection accuracy.
Process Optimization | Process Control | 2020, [147] | ResNet34, RFBNet | Task: Estimate and calibrate the 3D pose of robotic arms with five key points (base, shoulder, elbow, wrist, end).
Methodology: Two-stage pipeline—robot arm detection with RFBNet and key point regression using a lightweight CNN. Trained on RGB-D data from Webots simulator, deployed on NVIDIA Jetson AGX.
Outcome: Achieved 1.28 cm joint error, 0.70 cm base error; 14 FPS on edge device with low GPU memory.
Process Optimization | Scheduling | 2020, [148] | LSTM, FCM clustering | Task: Detect anomalies in discrete manufacturing processes and perform energy-aware production rescheduling.
Methodology: Energy data is collected from CNC tools. An LSTM model predicts tool wear and machine degradation. If an anomaly occurs, an edge-triggered rescheduling mechanism (RSR/TR) is initiated.
Outcome: 3.5% detection error; energy and production efficiency improved by 21.3% and 13.7%, respectively.

4. Dataset, Data Acquisition Tools, and Industrial Platforms

The development and deployment of effective AI models in industrial automation critically depend on the quality and characteristics of the datasets used. This section discusses publicly available and proprietary datasets, data acquisition devices, and the input–output variables employed across AI applications in predictive maintenance, quality control, and process optimization. To offer a clearer perspective on the current data landscape, we analyze dataset sources, feature types, and labeling methods, providing researchers and practitioners with insights into commonly used data characteristics while also illustrating the complexity and variety of problems addressed in data-driven industrial automation research.

4.1. Dataset

A critical dimension of AI development in industrial automation is the nature and richness of the data used to train, validate, and deploy models. Table 6 provides a comprehensive summary of datasets utilized in recent research studies. In Predictive Maintenance, the most prevalent data type is time-series sensor data, which captures the temporal evolution of equipment behavior. This includes vibration signals [149,150], thermal readings [5], and acoustic emissions [135].
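Time-series PdM datasets of this kind are typically segmented into fixed-length, possibly overlapping windows before feature extraction or model training. The sketch below shows one common way to do this; the window length and stride are arbitrary illustrative choices.

```python
# Sketch: segment a long vibration record into overlapping windows for ML training.
# Window length and stride are arbitrary illustrative choices.
import numpy as np

def make_windows(signal, window_len=1024, stride=512):
    """Return an array of shape (n_windows, window_len) from a 1-D signal."""
    starts = range(0, len(signal) - window_len + 1, stride)
    return np.stack([signal[s:s + window_len] for s in starts])

vibration = np.random.randn(100_000)      # placeholder accelerometer record
windows = make_windows(vibration)
print(windows.shape)                      # (194, 1024) windows ready for feature extraction
```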
In Quality Control, the dominant data type is visual imagery, derived from high-resolution cameras or industrial scanners. Visual datasets are used to detect surface defects, cracks, or scratches in manufactured parts and to classify product orientation, packaging compliance, or misassembly [4]. Some applications enhance visual inspection by integrating thermal imaging [151] or acoustic signals [134] to detect internal or non-visible defects—highlighting the move toward multimodal quality sensing.
In Process Optimization, datasets are more varied and often involve multivariate process parameters, such as temperature, pressure, feed rate, or cutting speed, collected during operations [5,63]. The use of robot kinematic and trajectory data [90] for motion planning, energy optimization, and collision avoidance in industrial robotics has also been found in some applications. Synthetic or simulation-based datasets are used to supplement limited real-world data, particularly when experiments are costly or disruptive to production [61,64,152,153,154].
Digital Twin systems, serving as dynamic virtual replicas of physical assets, utilize a blend of real sensor data and synthetic inputs to enable predictive modeling and optimization. For instance, synthetic CAD-generated image datasets are used to pretrain CNN classifiers for part orientation, requiring only a small number of real samples for fine-tuning—an approach that highlights the data efficiency of simulation-augmented learning [105]. On the other hand, Edge AI systems focus on low-latency and on-device inference and operate under hardware constraints that shape their data needs [155]. Designed for autonomous operation, Edge AI datasets are structured to support intermittent connectivity, and in some cases, local or federated learning updates occur without central aggregation [156].
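The simulation-augmented workflow described above, pretraining on synthetic renderings and then fine-tuning on a handful of real images, follows a standard transfer-learning recipe. The sketch below uses an ImageNet-pretrained backbone as a stand-in for synthetic-data pretraining; the backbone choice, class count, and batch are assumptions, not the setup of [105].

```python
# Sketch of transfer learning: freeze a pretrained CNN backbone and fine-tune a small
# classification head on a few real images. All data here are random placeholders.
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for p in model.parameters():
    p.requires_grad = False                        # freeze pretrained backbone

num_orientations = 4                               # hypothetical part-orientation classes
model.fc = nn.Linear(model.fc.in_features, num_orientations)   # new trainable head

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
images = torch.randn(8, 3, 224, 224)               # small batch standing in for real photos
labels = torch.randint(0, num_orientations, (8,))

loss = nn.functional.cross_entropy(model(images), labels)
loss.backward()
optimizer.step()
print("fine-tuning loss:", float(loss))
```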
Table 6. Overview of dataset characteristics.
Area | Reference | Dataset Used | Devices Used | Input Variables | Output Variables | Number of Samples
Predictive Maintenance | [100] | Real-world axle temperature data from CDD5B1 locomotives | Onboard sensors | Axle temperature, ambient temp, GPS speed, generator temp | Predicted axle temp, residual error, failure alert | 10,000
Predictive Maintenance | [125] | Custom dataset (6 sensors, 6 units) | Four low-power embedded edge devices | Accelerometer, gyro, magnetometer, mic | Aging classification | 939
Predictive Maintenance | [138] | Custom DDS vibration data (gear and bearing) | Edge-ready hardware (lightweight CNNs), DDS simulator, 1D sensors, FFT preprocessor | Time-series vibration signals (gear, bearing) | Fault category of gear and bearing (multi-label output) | 192,000
Predictive Maintenance | [157] | Time-series current signals from solar panel systems | TIDA-010955 AFE board with C2000 control card, current transformers | ADC samples, FFT features | Binary classification: Arc (1) or Normal (0) | Not specified
Predictive Maintenance | [158] | Vibration data (3-axis) collected from motors under various fault conditions | Vibration sensors, motor controller, dual GaN inverters, and EMJ04-APB22 PMSM motors | Time-series vibration data, FFT or raw signals | Fault types (e.g., normal, flaking, erosion, localized damage) | Not specified
Quality Control | [141] | Real factory image dataset from SMEs | GigE Vision cameras, Edge Box (NVIDIA GTX 1080 Ti), PLC, rotating rail | Grayscale product images (300×300 px) | Binary defect classification (OK/Defective) | 3428
Quality Control | [142] | Alibaba Tianchi fabric dataset (real industrial images) | Intelligent edge camera (Cambricon 1H8), ARM Cortex A7 | High-resolution fabric images | Defect classification | 2022
Quality Control | [145] | Custom dataset from logistics warehouse | TXG12 industrial camera, LED lights, conveyor with PLC | Grayscale carton images (500×653 px) | Classification (OK, Edge Defect, Surface Defect) | 3000
Quality Control | [159] | Custom image dataset (12 defect categories) | Sensors, fog nodes, cameras | Image features from product sensors | Binary/multiclass defect classification | 2400
Process Optimization | [4] | Custom manufacturing images | NVIDIA Jetson Nano | Product images, object categories | Defect detection, inventory state | Not specified
Process Optimization | [118] | 64,789 records of process data | IoT devices | Process temps, fan pressure/speed, raw material consumption | Operating mode, fault diagnosis, predicted material consumption | 61,753
Process Optimization | [148] | Milling shop energy logs | Electric meters, edge server, PLCs, CNC lathes, milling machines | Energy consumption metrics | Anomaly class (normal, tool wear, degradation), reschedule strategy | 1000
Process Optimization | [160] | Real CNC motion data | Fagor 8070 CNC controller | Control loop parameters, speed, load torque, backlash, friction factors | Position error, control effort, peak error | Not specified

4.2. Industrial Platforms and Software

AI-driven solutions in industrial automation depend on robust software tools and platforms that support data acquisition, model development, real-time inference, and system optimization. In the context of smart manufacturing, a wide array of commercial, open-source, and hardware-integrated tools are used across three core domains: Predictive Maintenance, Quality Control, and Process Optimization. These tools facilitate the transition from traditional rule-based systems to intelligent, adaptive, and data-driven manufacturing ecosystems.
In Predictive Maintenance, software platforms are designed to monitor equipment health, detect anomalies, and predict failures based on sensor data and historical logs. Industry-leading platforms such as ABB Ability™ Condition Monitoring for Motors, ABB Ability™ Predictive Maintenance, IBM Maximo Application Suite, and PTC ThingWorx enable integration of IoT data streams with AI-powered diagnostic models [161,162,163,164]. Azure IoT Suite and Uptake Fusion further enhance PdM capabilities by combining real-time telemetry with cloud analytics [165,166]. For algorithm development and simulation, tools like MATLAB’s Predictive Maintenance Toolbox allow engineers to extract condition indicators, perform signal analysis, and estimate remaining useful life [167]. These systems often operate in conjunction with edge devices (e.g., NVIDIA Jetson, Raspberry Pi, Siemens IoT2040) to support low-latency, on-device inference for remote or latency-sensitive environments.
In the domain of Quality Control, computer vision and signal processing software are widely used for high-precision inspection tasks. Tools such as Cognex VisionPro [168], Keyence Vision Systems [169], and NI LabVIEW with the Vision Development Module [170] provide industrial-grade image acquisition and analysis capabilities. Matrox Imaging Library (MIL) is also commonly employed for 2D/3D defect detection [171]. In addition, platforms like ZEISS PiWeb enable statistical process control (SPC) using dimensional measurement data [172]. For AI-based inspection, deep learning models such as YOLO [173], ResNet [174], and EfficientNet [175] are trained using frameworks like OpenCV and deployed on Edge AI platforms such as Edge Impulse [176], AWS Panorama [177], or Google Coral [178], enabling real-time classification of defects at the production line.
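Classical OpenCV operations of the kind used in several of the inspection pipelines above (grayscale conversion, denoising, thresholding, morphology) are often applied before a learned or rule-based classifier. The sketch below runs such a chain on a synthetic image standing in for a real inspection photo; all parameter values are illustrative.

```python
# Sketch of a classical OpenCV preprocessing chain ahead of defect classification:
# denoise, threshold, morphology, connected components. A synthetic grayscale image
# with one dark blob stands in for a real inspection photo.
import cv2
import numpy as np

image = np.full((240, 320), 200, dtype=np.uint8)          # bright background
cv2.circle(image, (160, 120), 12, 40, -1)                  # dark blob as a mock "defect"

denoised = cv2.GaussianBlur(image, (5, 5), 0)
_, binary = cv2.threshold(denoised, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
kernel = np.ones((3, 3), np.uint8)
cleaned = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)

# Candidate defect regions = connected components above a minimum area.
n_labels, _, stats, _ = cv2.connectedComponentsWithStats(cleaned)
defects = [i for i in range(1, n_labels) if stats[i, cv2.CC_STAT_AREA] > 50]
print("candidate defect regions:", len(defects))            # expected: 1
```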
Process Optimization leverages simulation-based and algorithmic tools for optimizing process flows, resource allocation, and control strategies. Platforms like Siemens Tecnomatix Plant Simulation, Rockwell Automation Arena, and AnyLogic provide discrete event and agent-based modeling environments for simulating factory operations [179,180,181]. In the chemical and energy sectors, AspenTech’s Aspen Plus and HYSYS are widely adopted for modeling and optimizing thermodynamic and kinetic processes [182,183]. Furthermore, Ansys Twin Builder offers integrated simulation and Digital Twin environments for real-time feedback and optimization [184]. On the AI side, libraries such as Scikit-learn, Pyomo [185], Optuna [186], and SimOpt [187] support advanced data-driven optimization tasks, including hyperparameter tuning, constraint satisfaction, and decision modeling. Reinforcement learning frameworks like Stable-Baselines3 and Ray RLlib are increasingly being applied to dynamic PO problems where adaptive control policies are required [188,189].
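As a small example of the data-driven optimization tooling cited above, the sketch below uses Optuna to search over two hypothetical process parameters against a placeholder objective. In practice the objective would query a process model, simulator, or experiment rather than the analytic stand-in used here.

```python
# Sketch: Optuna search over two hypothetical process parameters (temperature, feed rate).
# The quadratic objective is a placeholder for measured yield or a surrogate model.
import optuna

def objective(trial):
    temperature = trial.suggest_float("temperature_C", 150.0, 250.0)
    feed_rate = trial.suggest_float("feed_rate_mm_s", 5.0, 50.0)
    # Placeholder cost standing in for a real process-quality metric.
    return (temperature - 200.0) ** 2 + (feed_rate - 20.0) ** 2

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=50)
print("best parameters:", study.best_params)
```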
Several tools operate across these domains, reflecting the convergence of AI, IoT, and simulation technologies. Digital Twin platforms like GE Predix [190], Siemens Insights Hub [191], and ANSYS Twin Builder [184] integrate real-time sensor data with virtual system models to support diagnostics, forecasting, and autonomous decision-making. Edge AI runtimes, including OpenVINO [192], TensorRT [193], and NVIDIA Dynamo [194], allow high-performance inference on low-power devices. Additionally, cloud-integrated IIoT services such as AWS IoT Greengrass [195] and Microsoft Azure Digital Twins [196] support secure data flow, remote monitoring, and coordinated control across decentralized production environments.
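A common bridge to runtimes such as OpenVINO and TensorRT is exporting a trained PyTorch model to ONNX, which those runtimes can then compile for the target device. The sketch below shows only the export step with a toy model; file names and shapes are placeholders, and the runtime-specific compilation is omitted.

```python
# Sketch: export a small PyTorch model to ONNX so edge inference runtimes
# (e.g., OpenVINO or TensorRT) can consume it. Model and shapes are toy placeholders.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 3)).eval()
dummy_input = torch.randn(1, 16)                   # one window of 16 engineered features

torch.onnx.export(
    model, dummy_input, "fault_classifier.onnx",
    input_names=["features"], output_names=["class_scores"],
    dynamic_axes={"features": {0: "batch"}},       # allow variable batch size at inference
)
print("exported fault_classifier.onnx")
```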

5. Discussion and Future Recommendations

5.1. Answers to Research Questions

This section presents the synthesized findings from our review in response to the research questions (RQs) posed in Section 1.1.
  • RQ1: What are the emerging trends in ML models and enabling technologies (e.g., Digital Twins, Edge AI) across the domains of Predictive Maintenance, Quality Control, and Process Optimization?
    Our review identifies a growing adoption of deep learning models, particularly CNNs and RNNs, for tasks such as fault prediction, defect detection, and real-time control. Reinforcement learning is emerging for dynamic optimization in PO tasks. The practical integration of enabling technologies like Edge AI and Digital Twins is expanding, with Edge AI supporting real-time inference at the device level and Digital Twins providing predictive simulation capabilities in PdM and PO applications.
  • RQ2: What types of datasets, sensor modalities, and input–output configurations are used in ML applications for PdM, QC, and PO?
    The dataset landscape includes both public and proprietary sources. PdM primarily uses time-series data (e.g., vibration, current, temperature) captured via accelerometers and industrial sensors. QC focuses on image-based datasets from cameras or scanners. PO applications leverage multi-modal data like pressure, flow rate, and control logs. Input–output structures typically map sensor data to predictions such as remaining useful life, defect classification, or control adjustments.
  • RQ3: What are the major technical and deployment challenges faced by ML-based solutions in real-world industrial environments?
    Challenges include poor model generalizability, limited explainability in safety-critical systems, and incompatibility with legacy systems. Edge AI deployments face constraints in computational capacity and thermal stability. Data imbalance, noise, and lack of labeled datasets further hinder practical adoption.
  • RQ4: How do enabling paradigms such as Digital Twins and Edge AI contribute to scalable, adaptive automation in industrial settings, and what are the remaining research gaps?
    Digital Twins provide synchronized, real-time replicas for predictive diagnostics and closed-loop control. Edge AI enables low-latency, distributed intelligence critical for autonomous systems. However, full automation is hindered by challenges such as the absence of modular DT frameworks, incomplete data synchronization, and the need for federated learning for distributed optimization.

5.2. Limitations in Current Practice

Most existing implementations in Predictive Maintenance, Quality Control, and Process Optimization are developed in siloed contexts, often optimized for narrow problem domains. In PdM, models frequently target individual components—such as motors or bearings—without accounting for interdependent failure modes or system-level health indicators. QC approaches remain heavily reliant on visual inspection techniques using CNNs, overlooking the potential benefits of multi-sensor fusion and neglecting model interpretability, which is critical in high-assurance environments. Meanwhile, PO methods often rely on offline optimization models or supervised learning pipelines that lack responsiveness to real-time disturbances and variability.
Interoperability remains another critical challenge in the deployment of ML solutions. Many industrial ML and Digital Twin applications lack standardized integration pathways across heterogeneous platforms, devices, and control layers. Where common communication protocols and data exchange standards such as OPC-UA and MQTT are not consistently adopted, seamless interaction between physical systems and digital analytics layers is constrained. This fragmented ecosystem hinders the scalability and reliability of AI deployments in complex manufacturing settings.
From a deployment perspective, significant challenges persist. Many industrial environments, particularly SMEs, operate legacy systems with limited computational infrastructure, making integration of Edge AI and DT architectures difficult. Moreover, Edge AI implementations face constraints in latency, energy consumption, and thermal reliability, particularly when deployed in tight control loops. DTs, although conceptually powerful, frequently suffer from incomplete physical–digital synchronization, limited scalability, and the absence of open standards for modular development.

5.3. Future Research Directions

To enable robust and scalable adoption of intelligent industrial systems, future research should prioritize the following directions:
  • Generalizable and Explainable ML Architectures: Development of ML models that can transfer across different tasks and domains is vital. Emphasis should be placed on integrating explainable AI (XAI) methods, especially for QC and PdM, to enhance interpretability and foster user trust in automated decision-making.
  • Lightweight and Adaptive Edge Intelligence: There is a critical need for computationally efficient, low-latency models tailored for edge deployment. Research in model compression, neural architecture search (NAS), and adaptive learning under resource constraints will facilitate broader use of Edge AI, even in cost- or power-sensitive settings; a minimal model-compression sketch follows this list.
  • Autonomous and Self-Evolving Digital Twins: Future DT systems should evolve from passive simulators to active, learning agents. Integrating reinforcement learning and unsupervised learning will enable real-time system adaptation and closed-loop control—essential for dynamic environments and process resilience.
  • Federated and Privacy-Preserving Learning: In data-sensitive or distributed industrial settings, federated learning and homomorphic encryption offer viable paths for collaborative intelligence without exposing proprietary data. These methods are especially promising for industries constrained by data governance and compliance.
  • Human-Centered and Safe Learning Frameworks: As intelligent control systems become more autonomous, research must also consider safety and human-in-the-loop integration. Safe RL and human-aware ML models will be critical for ensuring that automation decisions align with operational constraints and ethical standards.
  • Standardized and Interoperable AI Frameworks: Future research should address the development of unified reference architectures and open integration standards to facilitate scalable, cross-platform ML deployment. This includes designing middleware solutions and protocol-agnostic ML pipelines that support seamless interoperability across heterogeneous industrial environments.
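Following up on the lightweight edge intelligence item above, the sketch below applies post-training dynamic quantization, one of the simplest model-compression steps commonly used before edge deployment. The model is a toy stand-in; a real study would quantize a trained network and verify accuracy on held-out data.

```python
# Sketch: post-training dynamic quantization of a PyTorch model for edge deployment.
# The tiny model here is a placeholder for a trained network.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 4)).eval()

quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8     # quantize Linear layer weights to int8
)

x = torch.randn(1, 64)
print("float32 output:", model(x).detach().numpy().round(3))
print("int8-weight output:", quantized(x).detach().numpy().round(3))
```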
In summary, while notable progress has been achieved, addressing persistent limitations in model generalization, real-time deployment, and safe learning remains critical. Continued research in these areas will enable the development of more robust, scalable, and intelligent automation solutions across diverse industrial settings.

6. Conclusions

The integration of ML into industrial automation is no longer a theoretical promise but a growing reality, driving measurable improvements across Predictive Maintenance, Quality Control, and Process Optimization. This review has demonstrated that ML techniques—spanning supervised, unsupervised, and reinforcement learning—enable more intelligent, adaptive, and responsive manufacturing systems. From image-based defect detection to time-series-based predictive modeling and self-optimizing control systems, ML is reshaping the landscape of industrial decision-making.
One of the key findings is the rising dominance of deep learning architectures, particularly CNNs in image-based defect detection and RNNs in time-series-based fault prediction and control. In PdM, ML is enabling early failure prediction through vibration, acoustic, and thermal data analytics, while in QC, CNNs and hybrid models support high-accuracy anomaly detection. In PO, reinforcement learning and adaptive supervised learning algorithms are increasingly employed for real-time parameter tuning and process adaptation. The deployment landscape is also shifting toward edge-based architectures for latency-sensitive applications, enabled by advances in Edge AI. Concurrently, Digital Twin frameworks are gaining traction for their ability to mirror physical systems with high fidelity—allowing for predictive simulations, closed-loop feedback, and continuous optimization.
Beyond algorithmic performance, this paper also highlights the role of datasets, data acquisition tools, and industrial software platforms that form the operational backbone of AI-driven automation. Understanding the nature of the data used, the devices involved in its collection, and the tools available for implementation is essential for the successful deployment of ML in manufacturing settings. The review also reveals a fragmentation in dataset usage, with few publicly shared benchmarks, indicating a need for standardized datasets to enable cross-comparison and reproducibility. Moreover, platforms such as MATLAB, Python-based toolkits, and cloud–edge hybrid solutions are increasingly favored for practical deployment, reflecting a shift toward more modular and integrable ML infrastructures.
Explainability remains a critical concern, especially in high-stakes domains like defect detection and predictive failure analysis, where decision transparency is essential. Future research should prioritize the development of lightweight, adaptable models for edge environments, integration of explainable AI frameworks within Digital Twin systems, and implementation of federated learning to facilitate collaborative intelligence without compromising data privacy. Additionally, advancing toward self-learning and continuously updating ML systems will be vital for supporting dynamic, resource-constrained, and human-in-the-loop industrial settings.
Overall, this review underscores that ML, Digital Twin, and Edge AI are not merely complementary technologies but interdependent enablers of a smarter, more resilient, and adaptive industrial ecosystem. Their convergence signals a paradigm shift toward self-organizing, data-driven manufacturing systems aligned not only with the core vision of Industry 4.0 but also with the emerging priorities of Industry 5.0. Furthermore, this review serves as a valuable resource for researchers, automation engineers, and industrial decision-makers seeking to understand and implement scalable, intelligent ML-driven solutions across PdM, QC, and PO applications.

Author Contributions

Conceptualization, M.A.R. and K.I.; methodology, M.A.R.; validation, K.I. and A.A.A.; formal analysis, M.A.R.; investigation, M.A.R. and M.F.S.; resources, M.A.R. and M.F.S.; data curation, M.A.R. and M.F.S.; writing—original draft preparation, M.A.R.; writing—review and editing, M.A.R. and M.F.S.; visualization, M.A.R.; supervision, K.I. and A.A.A. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
AE: Autoencoder
AI: Artificial Intelligence
AR: Augmented Reality
ANFIS: Adaptive Neuro-Fuzzy Inference System
ANFIS-GA: ANFIS with Genetic Algorithm
ANN: Artificial Neural Network
ANN-GA: Artificial Neural Network with Genetic Algorithm
BN: Batch Normalization
BO: Bayesian Optimization
BNN: Bayesian Neural Network
BSBW: Back Side Bead Width
BCDDPG: Behavior-Coupling Deep Deterministic Policy Gradient
BPNN: Backpropagation Neural Network
CF: Collaborative Filtering
CNN: Convolutional Neural Network
CART: Classification and Regression Trees
CFSFDP: Clustering by Fast Search and Find of Density Peaks
DDQN: Double Deep Q-Network
DL: Deep Learning
DT: Digital Twin
Dtree: Decision Tree
DNN: Deep Neural Network
DQN: Deep Q-Network
DFDD: Data-Driven Fault Detection and Diagnosis
EC: Edge Computing
ET: Extra Trees
FEA: Feature Extraction and Analysis
FFT: Fast Fourier Transform
FTRL: Follow-The-Regularized-Leader
GA: Genetic Algorithm
GB: Gradient Boosting
GPR: Gaussian Process Regression
GRU: Gated Recurrent Unit
HC: Hierarchical Clustering
HPDC: High Performance Distributed Computing
IoT: Internet of Things
IRF: Iterative Random Forest
ILSS: Interlaminar Shear Strength
k-NN: k-Nearest Neighbors
LR: Logistic Regression/Linear Regression
LASSO: Least Absolute Shrinkage and Selection Operator
LDA: Linear Discriminant Analysis
LSTM: Long Short-Term Memory
MEC: Mobile Edge Computing
ML: Machine Learning
MLP: Multi-Layer Perceptron
MPC: Model Predictive Control
MSE: Mean Squared Error
MQTT: Message Queuing Telemetry Transport
NN: Neural Network
OCR: Optical Character Recognition
OPC-UA: Open Platform Communications–Unified Architecture
PO: Process Optimization
PCA: Principal Component Analysis
PdM: Predictive Maintenance
PPC: Predictive Process Control
PPO: Proximal Policy Optimization
PDQN: Profit-Sharing Deep Q-Network
QC: Quality Control
QLrn: Q-Learning
QoS: Quality of Service
QUILT: Quantized Unsupervised Incremental Learning Tree
RF: Random Forest
RL: Reinforcement Learning
RR: Ridge Regression
RMS: Root Mean Square
RSM-GA: Response Surface Methodology with Genetic Algorithm
SVD: Singular Value Decomposition
SVR: Support Vector Regression
SVM: Support Vector Machine
SARSA: State-Action-Reward-State-Action
SCADA: Supervisory Control and Data Acquisition
SDAE: Stacked Denoising Autoencoder
SIFT: Scale-Invariant Feature Transform
ST-GCN: Spatio-Temporal Graph Convolutional Network
TL: Transfer Learning
TD3: Twin Delayed Deep Deterministic Policy Gradient
TSBW: Top-Side Bead Width
UNet: U-Shaped Convolutional Neural Network
VCG: Variational Cooperative Game
WBG: Wide Band Gap
WD-CNN: Wide Deep Convolutional Neural Network
XGBOOST: Extreme Gradient Boosting
YOLO: You Only Look Once

References

  1. Lee, J.; Bagheri, B.; Kao, H.-A. A Cyber-Physical Systems Architecture for Industry 4.0-Based Manufacturing Systems. Manuf. Lett. 2015, 3, 18–23. [Google Scholar] [CrossRef]
  2. Wuest, T.; Weimer, D.; Irgens, C.; Thoben, K.-D. Machine Learning in Manufacturing: Advantages, Challenges, and Applications. Prod. Manuf. Res. 2016, 4, 23–45. [Google Scholar] [CrossRef]
  3. Rahman, M.A.; Abushaiba, A.A.; Elrajoubi, A.M. Integration of C2000 Microcontrollers with MATLAB Simulink Embedded Coder: A Real-Time Control Application. In Proceedings of the 2024 7th International Conference on Electrical Engineering and Green Energy (CEEGE), Los Angeles, CA, USA, 28 June–1 July 2024; IEEE: Piscataway, NJ, USA, 2024; pp. 131–136. [Google Scholar] [CrossRef]
  4. Gawate, E.; Rane, P. Empowering Intelligent Manufacturing with the Potential of Edge Computing with NVIDIA’s Jetson Nano. In Proceedings of the 2023 International Conference on Computing, Communication, and Intelligent Systems (ICCCIS), Greater Noida, India, 3–4 November 2023; IEEE: Piscataway, NJ, USA, 2023; pp. 375–380. [Google Scholar] [CrossRef]
  5. Liang, Y.C.; Li, W.D.; Lou, P.; Hu, J.M. Thermal Error Prediction for Heavy-Duty CNC Machines Enabled by Long Short-Term Memory Networks and Fog-Cloud Architecture. J. Manuf. Syst. 2022, 62, 950–963. [Google Scholar] [CrossRef]
  6. Qin, J.; Liu, Y.; Grosvenor, R. A Categorical Framework of Manufacturing for Industry 4.0 and Beyond. Procedia CIRP 2016, 52, 173–178. [Google Scholar] [CrossRef]
  7. Wang, J.; Ma, Y.; Zhang, L.; Gao, R.X.; Wu, D. Deep Learning for Smart Manufacturing: Methods and Applications. J. Manuf. Syst. 2018, 48, 144–156. [Google Scholar] [CrossRef]
  8. Nahavandi, S. Industry 5.0—A Human-Centric Solution. Sustainability 2019, 11, 4371. [Google Scholar] [CrossRef]
  9. Ghobakhloo, M.; Fathi, M. Industry 4.0 and Opportunities for Energy Sustainability. J. Clean. Prod. 2021, 295, 126427. [Google Scholar] [CrossRef]
  10. Shi, W.; Cao, J.; Zhang, Q.; Li, Y.; Xu, L. Edge Computing: Vision and Challenges. IEEE Internet Things J. 2016, 3, 637–646. [Google Scholar] [CrossRef]
  11. Tao, F.; Zhang, H.; Liu, A.; Nee, A.Y.C. Digital Twin in Industry: State-of-the-Art. IEEE Trans. Ind. Inform. 2019, 15, 2405–2415. [Google Scholar] [CrossRef]
  12. Transforming Manufacturing with Digital Twins | McKinsey. Available online: https://www.mckinsey.com/capabilities/operations/our-insights/digital-twins-the-next-frontier-of-factory-optimization (accessed on 19 May 2025).
  13. Hermann, M.; Pentek, T.; Otto, B. Design Principles for Industrie 4.0 Scenarios. In Proceedings of the 2016 49th Hawaii International Conference on System Sciences (HICSS), Koloa, HI, USA, 5–8 January 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 3928–3937. [Google Scholar] [CrossRef]
  14. Carvalho, T.P.; Soares, F.A.A.M.N.; Vita, R.; Francisco, R.P.; Basto, J.P.; Alcalá, S.G.S. A Systematic Literature Review of Machine Learning Methods Applied to Predictive Maintenance. Comput. Ind. Eng. 2019, 137, 106024. [Google Scholar] [CrossRef]
  15. Zhu, Z.; Lei, Y.; Qi, G.; Chai, Y.; Mazur, N.; An, Y.; Huang, X. A Review of the Application of Deep Learning in Intelligent Fault Diagnosis of Rotating Machinery. Measurement 2023, 206, 112346. [Google Scholar] [CrossRef]
  16. Kausik, A.K.; Rashid, A.B.; Baki, R.F.; Maktum, M.M.J. Machine Learning Algorithms for Manufacturing Quality Assurance: A Systematic Review of Performance Metrics and Applications. Array 2025, 26, 100393. [Google Scholar] [CrossRef]
  17. Mazzei, D.; Ramjattan, R. Machine Learning for Industry 4.0: A Systematic Review Using Deep Learning-Based Topic Modelling. Sensors 2022, 22, 8641. [Google Scholar] [CrossRef]
  18. Fahle, S.; Prinz, C.; Kuhlenkötter, B. Systematic Review on Machine Learning (ML) Methods for Manufacturing Processes—Identifying Artificial Intelligence (AI) Methods for Field Application. Procedia CIRP 2020, 93, 413–418. [Google Scholar] [CrossRef]
  19. Plathottam, S.J.; Rzonca, A.; Lakhnori, R.; Iloeje, C. A review of artificial intelligence applications in manufacturing operations. J. Adv. Manuf. Process. 2023, 5, e10159. [Google Scholar] [CrossRef]
  20. Bertolini, M.; Mezzogori, D.; Neroni, M.; Zammori, F. Machine Learning for Industrial Applications: A Comprehensive Literature Review. Expert Syst. Appl. 2021, 175, 114820. [Google Scholar] [CrossRef]
  21. Kim, S.W.; Kong, J.H.; Lee, S.W.; Lee, S. Recent Advances of Artificial Intelligence in Manufacturing Industrial Sectors: A Review. Int. J. Precis. Eng. Manuf. 2022, 23, 111–129. [Google Scholar] [CrossRef]
  22. Meddaoui, A.; Hain, M.; Hachmoud, A. The Benefits of Predictive Maintenance in Manufacturing Excellence: A Case Study to Establish Reliable Methods for Predicting Failures. Int. J. Adv. Manuf. Technol. 2023, 128, 3685–3690. [Google Scholar] [CrossRef]
  23. Hadi, R.H.; Hady, H.N.; Hasan, A.M.; Al-Jodah, A.; Humaidi, A.J. Improved Fault Classification for Predictive Maintenance in Industrial IoT Based on AutoML: A Case Study of Ball-Bearing Faults. Processes 2023, 11, 1507. [Google Scholar] [CrossRef]
  24. Fordal, J.M.; Schjølberg, P.; Helgetun, H.; Skjermo, T.Ø.; Wang, Y.; Wang, C. Application of sensor data based predictive maintenance and artificial neural networks to enable Industry 4.0. Adv. Manuf. 2023, 11, 248–263. [Google Scholar] [CrossRef]
  25. Kiangala, K.S.; Wang, Z. An Effective Predictive Maintenance Framework for Conveyor Motors Using Dual Time-Series Imaging and Convolutional Neural Network in an Industry 4.0 Environment. IEEE Access 2020, 8, 121033–121049. [Google Scholar] [CrossRef]
  26. Lu, B.-L.; Liu, Z.-H.; Wei, H.-L.; Chen, L.; Zhang, H.; Li, X.-H. A Deep Adversarial Learning Prognostics Model for Remaining Useful Life Prediction of Rolling Bearing. IEEE Trans. Artif. Intell. 2021, 2, 329–340. [Google Scholar] [CrossRef]
  27. Guo, D.; Chen, X.; Ma, H.; Sun, Z.; Jiang, Z. State Evaluation Method of Robot Lubricating Oil Based on Support Vector Regression. Comput. Intell. Neurosci. 2021, 2021, 9441649. [Google Scholar] [CrossRef]
  28. Nunes, P.; Rocha, E.; Santos, J.; Antunes, R. Predictive Maintenance on Injection Molds by Generalized Fault Trees and Anomaly Detection. Procedia Comput. Sci. 2023, 217, 1038–1047. [Google Scholar] [CrossRef]
  29. Scalabrini Sampaio, G.; Vallim Filho, A.R.D.A.; Santos Da Silva, L.; Augusto Da Silva, L. Prediction of Motor Failure Time Using an Artificial Neural Network. Sensors 2019, 19, 4342. [Google Scholar] [CrossRef]
  30. Quiroz, J.C.; Mariun, N.; Mehrjou, M.R.; Izadi, M.; Misron, N.; Mohd Radzi, M.A. Fault Detection of Broken Rotor Bar in LS-PMSM Using Random Forests. Measurement 2018, 116, 273–280. [Google Scholar] [CrossRef]
  31. Predictive Maintenance | GE Research. Available online: https://www.ge.com/research/project/predictive-maintenance (accessed on 26 February 2024).
  32. Wang, X.; Liu, M.; Liu, C.; Ling, L.; Zhang, X. Data-Driven and Knowledge-Based Predictive Maintenance Method for Industrial Robots for the Production Stability of Intelligent Manufacturing. Expert Syst. Appl. 2023, 234, 121136. [Google Scholar] [CrossRef]
  33. Satwaliya, D.S.; Thethi, H.P.; Dhyani, A.; Kiran, G.R.; Al-Taee, M.; Alazzam, M.B. Predictive Maintenance Using Machine Learning: A Case Study in Manufacturing Management. In Proceedings of the 2023 3rd International Conference on Advance Computing and Innovative Technologies in Engineering (ICACITE), Greater Noida, India, 12–13 May 2023; IEEE: Piscataway, NJ, USA, 2023; pp. 872–876. [Google Scholar]
  34. Nikfar, M.; Bitencourt, J.; Mykoniatis, K. A Two-Phase Machine Learning Approach for Predictive Maintenance of Low Voltage Industrial Motors. Procedia Comput. Sci. 2022, 200, 111–120. [Google Scholar] [CrossRef]
  35. Cachada, A.; Moreira, P.M.; Romero, L.; Barbosa, J.; Leitão, P.; Geraldes, C.A.S.; Deusdado, L.; Costa, J.; Teixeira, C.; Teixeira, J.; et al. Maintenance 4.0: Intelligent and Predictive Maintenance System Architecture. In Proceedings of the 2018 IEEE 23rd International Conference on Emerging Technologies and Factory Automation (ETFA), Turin, Italy, 4–7 September 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 139–146. [Google Scholar] [CrossRef]
  36. Carbery, C.M.; Woods, R.; Marshall, A.H. A Bayesian Network Based Learning System for Modelling Faults in Large-Scale Manufacturing. In Proceedings of the 2018 IEEE International Conference on Industrial Technology (ICIT), Lyon, France, 19–22 February 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 1357–1362. [Google Scholar] [CrossRef]
  37. Syafrudin, M.; Alfian, G.; Fitriyani, N.; Rhee, J. Performance Analysis of IoT-Based Sensor, Big Data Processing, and Machine Learning Model for Real-Time Monitoring System in Automotive Manufacturing. Sensors 2018, 18, 2946. [Google Scholar] [CrossRef]
  38. Li, Z.; Wang, Y.; Wang, K.-S. Intelligent Predictive Maintenance for Fault Diagnosis and Prognosis in Machine Centers: Industry 4.0 Scenario. Adv. Manuf. 2017, 5, 377–387. [Google Scholar] [CrossRef]
  39. Pejić Bach, M.; Topalović, A.; Krstić, I.; Ivec, A. Predictive Maintenance in Industry 4.0 for the SMEs: A Decision Support System Case Study Using Open-Source Software. Designs 2023, 7, 98. [Google Scholar] [CrossRef]
  40. Borghesi, A.; Burrello, A.; Bartolini, A. ExaMon-X: A Predictive Maintenance Framework for Automatic Monitoring in Industrial IoT Systems. IEEE Internet Things J. 2023, 10, 2995–3005. [Google Scholar] [CrossRef]
  41. Li, Z.; Liu, R.; Wu, D. Data-Driven Smart Manufacturing: Tool Wear Monitoring with Audio Signals and Machine Learning. J. Manuf. Processes 2019, 48, 66–76. [Google Scholar] [CrossRef]
  42. Liu, J.; Hu, Y.; Wu, B.; Wang, Y. An Improved Fault Diagnosis Approach for FDM Process with Acoustic Emission. J. Manuf. Processes 2018, 35, 570–579. [Google Scholar] [CrossRef]
  43. Taşcı, B.; Omar, A.; Ayvaz, S. Remaining Useful Lifetime Prediction for Predictive Maintenance in Manufacturing. Comput. Ind. Eng. 2023, 184, 109566. [Google Scholar] [CrossRef]
  44. Lulu, J.; Yourui, T.; Jia, W. Remaining Useful Life Prediction for Reducer of Industrial Robots Based on MCSA. In Proceedings of the 2021 Global Reliability and Prognostics and Health Management (PHM-Nanjing), Nanjing, China, 15–17 October 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 1–7. [Google Scholar] [CrossRef]
  45. Eddarhri, M.; Adib, J.; Hain, M.; Marzak, A. Towards Predictive Maintenance: The Case of the Aeronautical Industry. Procedia Comput. Sci. 2022, 203, 769–774. [Google Scholar] [CrossRef]
  46. Kuhnle, A.; Jakubik, J.; Lanza, G. Reinforcement Learning for Opportunistic Maintenance Optimization. Prod. Eng. 2019, 13, 33–41. [Google Scholar] [CrossRef]
  47. Villalba-Diez, J.; Schmidt, D.; Gevers, R.; Ordieres-Meré, J.; Buchwitz, M.; Wellbrock, W. Deep Learning for Industrial Computer Vision Quality Control in the Printing Industry 4.0. Sensors 2019, 19, 3987. [Google Scholar] [CrossRef]
  48. Banús, N.; Boada, I.; Xiberta, P.; Toldrà, P.; Bustins, N. Deep Learning for the Quality Control of Thermoforming Food Packages. Sci. Rep. 2021, 11, 21887. [Google Scholar] [CrossRef]
  49. Scime, L.; Beuth, J. Using Machine Learning to Identify In-Situ Melt Pool Signatures Indicative of Flaw Formation in a Laser Powder Bed Fusion Additive Manufacturing Process. Addit. Manuf. 2019, 25, 151–165. [Google Scholar] [CrossRef]
  50. Yin, Y.; Wan, M.; Xu, P.; Zhang, R.; Liu, Y.; Song, Y. Industrial Product Quality Analysis Based on Online Machine Learning. Sensors 2023, 23, 8167. [Google Scholar] [CrossRef]
  51. González, S.V.; Jimenez, L.C.; Given, W.G.G.; Noboa, B.V.; Enderica, C.S. Automated Quality Control System for Canned Tuna Production Using Artificial Vision. In Proceedings of the 2024 3rd International Conference on Artificial Intelligence for Internet of Things (AIIoT), Vellore, India, 3–4 May 2024; IEEE: Piscataway, NJ, USA, 2024; pp. 1–6. [Google Scholar] [CrossRef]
  52. Msakni, M.K.; Risan, A.; Schütz, P. Using Machine Learning Prediction Models for Quality Control: A Case Study from the Automotive Industry. Comput. Manag. Sci. 2023, 20, 14. [Google Scholar] [CrossRef]
  53. Sundaram, S.; Zeid, A. Artificial Intelligence-Based Smart Quality Inspection for Manufacturing. Micromachines 2023, 14, 570. [Google Scholar] [CrossRef] [PubMed]
  54. Müller, D.; März, M.; Scheele, S.; Schmid, U. An Interactive Explanatory AI System for Industrial Quality Control. arXiv 2022. [Google Scholar] [CrossRef]
  55. Caggiano, A.; Zhang, J.; Alfieri, V.; Caiazzo, F.; Gao, R.; Teti, R. Machine learning-based image processing for on-line defect recognition in additive manufacturing. CIRP Ann. 2019, 68, 451–454. [Google Scholar] [CrossRef]
  56. Kim, A.; Oh, K.; Jung, J.-Y.; Kim, B. Imbalanced classification of manufacturing quality conditions using cost-sensitive decision tree ensembles. Int. J. Comput. Integr. Manuf. 2018, 31, 701–717. [Google Scholar] [CrossRef]
  57. Gobert, C.; Reutzel, E.W.; Petrich, J.; Nassar, A.R.; Phoha, S. Application of supervised machine learning for defect detection during metallic powder bed fusion additive manufacturing using high resolution imaging. Addit. Manuf. 2018, 21, 517–528. [Google Scholar] [CrossRef]
  58. Yuan, B.; Guss, G.M.; Wilson, A.C.; Hau-Riege, S.P.; DePond, P.J.; McMains, S.; Matthews, M.J.; Giera, B. Machine-Learning-Based Monitoring of Laser Powder Bed Fusion. Adv. Mater. Technol. 2018, 3, 1800136. [Google Scholar] [CrossRef]
  59. Oh, Y.; Busogi, M.; Ransikarbum, K.; Shin, D.; Kwon, D.; Kim, N. Real-time quality monitoring and control system using an integrated cost effective support vector machine. J. Mech. Sci. Technol. 2019, 33, 6009–6020. [Google Scholar] [CrossRef]
  60. Yu, J.; Zheng, X.; Wang, S. A deep autoencoder feature learning method for process pattern recognition. J. Process Control 2019, 79, 1–15. [Google Scholar] [CrossRef]
  61. Emmert, J.; Mendez, R.; Dastjerdi, H.M.; Syben, C.; Maier, A. The Artificial Neural Twin—Process optimization and continual learning in distributed process chains. Neural Netw. 2024, 180, 106647. [Google Scholar] [CrossRef]
  62. Reguera-Bakhache, D.; Garitano, I.; Uribeetxeberria, R.; Cernuda, C.; Zurutuza, U. Data-Driven Industrial Human-Machine Interface Temporal Adaptation for Process Optimization. In Proceedings of the 2020 25th IEEE International Conference on Emerging Technologies and Factory Automation (ETFA), Vienna, Austria, 8–11 September 2020; pp. 518–525. [Google Scholar] [CrossRef]
  63. Al Aani, S.; Bonny, T.; Hasan, S.W.; Hilal, N. Can machine language and artificial intelligence revolutionize process automation for water treatment and desalination? Desalination 2019, 458, 84–96. [Google Scholar] [CrossRef]
  64. Islam, F.; Wanigasekara, C.; Rajan, G.; Swain, A.; Prusty, B.G. An approach for process optimisation of the Automated Fibre Placement (AFP) based thermoplastic composites manufacturing using Machine Learning, photonic sensing and thermo-mechanics modelling. Manuf. Lett. 2022, 32, 10–14. [Google Scholar] [CrossRef]
  65. Farooq, A.; Iqbal, K. A Survey of Reinforcement Learning for Optimization in Automation. In Proceedings of the 2024 IEEE 20th International Conference on Automation Science and Engineering (CASE), Bari, Italy, 28 August–1 September 2024; pp. 2487–2494. [Google Scholar] [CrossRef]
  66. Liu, Z.; Rolston, N.; Flick, A.C.; Colburn, T.W.; Ren, Z.; Dauskardt, R.H.; Buonassisi, T. Machine Learning with Knowledge Constraints for Process Optimization of Open-Air Perovskite Solar Cell Manufacturing. arXiv 2021. [Google Scholar] [CrossRef]
  67. Stathatos, E.; Vosniakos, G.-C. Real-time simulation for long paths in laser-based additive manufacturing: A machine learning approach. Int. J. Adv. Manuf. Technol. 2019, 104, 1967–1984. [Google Scholar] [CrossRef]
  68. Ghadai, S.; Balu, A.; Sarkar, S.; Krishnamurthy, A. Learning localized features in 3D CAD models for manufacturability analysis of drilled holes. Comput. Aided Geom. Des. 2018, 62, 263–275. [Google Scholar] [CrossRef]
  69. Gyulai, D.; Pfeiffer, A.; Nick, G.; Gallina, V.; Sihn, W.; Monostori, L. Lead time prediction in a flow-shop environment with analytical and machine learning approaches. IFAC-Pap. 2018, 51, 1029–1034. [Google Scholar] [CrossRef]
  70. Dev, S.; Srivastava, R. Experimental investigation and optimization of the additive manufacturing process through AI-based hybrid statistical approaches. Prog. Addit. Manuf. 2025, 10, 107–126. [Google Scholar] [CrossRef]
  71. Khdoudi, A.; Masrour, T.; El Hassani, I.; El Mazgualdi, C. A Deep-Reinforcement-Learning-Based Digital Twin for Manufacturing Process Optimization. Systems 2024, 12, 38. [Google Scholar] [CrossRef]
  72. Dornheim, J.; Link, N.; Gumbsch, P. Model-free Adaptive Optimal Control of Episodic Fixed-horizon Manufacturing Processes Using Reinforcement Learning. Int. J. Control Autom. Syst. 2020, 18, 1593–1604. [Google Scholar] [CrossRef]
  73. Denkena, B.; Dittrich, M.-A.; Böß, V.; Wichmann, M.; Friebe, S. Self-optimizing process planning for helical flute grinding. Prod. Eng. 2019, 13, 599–606. [Google Scholar] [CrossRef]
  74. Tan, Q.; Tong, Y.; Wu, S.; Li, D. Modeling, planning, and scheduling of shopfloor assembly process with dynamic cyber-physical interactions: A case study for CPS-based smart industrial robot production. Int. J. Adv. Manuf. Technol. 2019, 105, 3979–3989. [Google Scholar] [CrossRef]
  75. Priore, P.; Ponte, B.; Puente, J.; Gómez, A. Learning-based scheduling of flexible manufacturing systems using ensemble methods. Comput. Ind. Eng. 2018, 126, 282–291. [Google Scholar] [CrossRef]
  76. Aivaliotis, P.; Georgoulias, K.; Chryssolouris, G. The use of Digital Twin for predictive maintenance in manufacturing. Int. J. Comput. Integr. Manuf. 2019, 32, 1067–1080. [Google Scholar] [CrossRef]
  77. Pan, Y.; Kang, S.; Kong, L.; Wu, J.; Yang, Y.; Zuo, H. Remaining useful life prediction methods of equipment components based on deep learning for sustainable manufacturing: A literature review. Artif. Intell. Eng. Des. Anal. Manuf. 2025, 39, 4. [Google Scholar] [CrossRef]
  78. Xu, Q.; Ali, S.; Yue, T. Digital Twin-based Anomaly Detection with Curriculum Learning in Cyber-physical Systems. ACM Trans. Softw. Eng. Methodol. 2023, 32, 1–32. [Google Scholar] [CrossRef]
  79. Schena, L.; Marques, P.A.; Poletti, R.; Ahizi, S.; Van Den Berghe, J.; Mendez, M.A. Reinforcement Twinning: From digital twins to model-based reinforcement learning. J. Comput. Sci. 2024, 82, 102421. [Google Scholar] [CrossRef]
  80. Kreuzer, T.; Papapetrou, P.; Zdravkovic, J. Artificial intelligence in digital twins—A systematic literature review. Data Knowl. Eng. 2024, 151, 102304. [Google Scholar] [CrossRef]
  81. Jones, D.; Snider, C.; Nassehi, A.; Yon, J.; Hicks, B. Characterising the Digital Twin: A systematic literature review. CIRP J. Manuf. Sci. Technol. 2020, 29, 36–52. [Google Scholar] [CrossRef]
  82. Grieves, M.; Vickers, J. Digital Twin: Mitigating unpredictable, undesirable emergent behavior in complex systems. In Transdisciplinary Perspectives on Complex Systems; Kahlen, J., Flumerfelt, S., Alves, A., Eds.; Springer: Cham, Switzerland, 2017; pp. 85–113. [Google Scholar] [CrossRef]
  83. Zhang, M.; Tao, F.; Huang, B.; Liu, A.; Wang, L.; Anwer, N.; Nee, A.Y.C. Digital twin data: Methods and key technologies [version 2; peer review: 4 approved]. Digit. Twin 2022, 1, 2. [Google Scholar] [CrossRef]
  84. Liu, J.; Lu, X.; Zhou, Y.; Cui, J.; Wang, S.; Zhao, Z. Design of Photovoltaic Power Station Intelligent Operation and Maintenance System Based on Digital Twin. In Proceedings of the 2021 6th International Conference on Robotics and Automation Engineering (ICRAE), Guangzhou, China, 20–22 November 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 206–211. [Google Scholar] [CrossRef]
  85. Livera, A.; Paphitis, G.; Pikolos, L.; Papadopoulos, I.; Montes-Romero, J.; Lopez-Lorente, J.; Makrides, G.; Sutterlueti, J.; Georghiou, G.E. Intelligent Cloud-Based Monitoring and Control Digital Twin for Photovoltaic Power Plants. In Proceedings of the 2022 IEEE 49th Photovoltaics Specialists Conference (PVSC), Philadelphia, PA, USA, 5–10 June 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 267–274. [Google Scholar] [CrossRef]
  86. Ma, Y.; Kassler, A.; Ahmed, B.S.; Krakhmalev, P.; Thore, A.; Toyser, A.; Lindbäck, H. Using Deep Reinforcement Learning for Zero Defect Smart Forging. In Advances in Transdisciplinary Engineering; Ng, A.H.C., Syberfeldt, A., Högberg, D., Holm, M., Eds.; IOS Press: Amsterdam, The Netherlands, 2022. [Google Scholar] [CrossRef]
  87. Chhetri, S.R.; Faezi, S.; Canedo, A.; Faruque, M.A.A. QUILT: Quality inference from living digital twins in IoT-enabled manufacturing systems. In Proceedings of the International Conference on Internet of Things Design and Implementation, Montreal, QC, Canada, 15–18 April 2019; ACM: New York, NY, USA, 2019; pp. 237–248. [Google Scholar] [CrossRef]
  88. Maia, E.; Wannous, S.; Dias, T.; Praça, I.; Faria, A. Holistic Security and Safety for Factories of the Future. Sensors 2022, 22, 9915. [Google Scholar] [CrossRef]
  89. Barriga, R.; Romero, M.; Nettleton, D.; Hassan, H. Advanced data modeling for industrial drying machine energy optimization. J. Supercomput. 2022, 78, 16820–16840. [Google Scholar] [CrossRef]
  90. Bansal, R.; Khanesar, M.A.; Branson, D. Ant Colony Optimization Algorithm for Industrial Robot Programming in a Digital Twin. In Proceedings of the 2019 25th International Conference on Automation and Computing (ICAC), Lancaster, UK, 5–7 September 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 1–5. [Google Scholar] [CrossRef]
  91. Liu, Y.; Xu, H.; Liu, D.; Wang, L. A digital twin-based sim-to-real transfer for deep reinforcement learning-enabled industrial robot grasping. Robot. Comput. Integr. Manuf. 2022, 78, 102365. [Google Scholar] [CrossRef]
  92. Li, J.; Pang, D.; Zheng, Y.; Guan, X.; Le, X. A flexible manufacturing assembly system with deep reinforcement learning. Control Eng. Pract. 2022, 118, 104957. [Google Scholar] [CrossRef]
  93. Kobayashi, K.; Alam, S.B. Explainable, Interpretable & Trustworthy AI for Intelligent Digital Twin: Case Study on Remaining Useful Life. arXiv 2023. [Google Scholar] [CrossRef]
  94. Mehrabi, A.; Yari, K.; Van Driel, W.D.; Poelma, R.H. AI-Driven Digital Twin for Health Monitoring of Wide Band Gap Power Semiconductors. In Proceedings of the 2024 IEEE 10th Electronics System-Integration Technology Conference (ESTC), Berlin, Germany, 11–13 September 2024; IEEE: Piscataway, NJ, USA, 2024; pp. 1–8. [Google Scholar] [CrossRef]
  95. Li, Y. Fault Prediction and Diagnosis System for Large-diameter Auger Rigs Based on Digital Twin and BP Neural Network. In Proceedings of the 2022 IEEE 6th Advanced Information Technology, Electronic and Automation Control Conference (IAEAC), Beijing, China, 7–9 January 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 523–527. [Google Scholar] [CrossRef]
  96. Guo, K.; Wan, X.; Liu, L.; Gao, Z.; Yang, M. Fault Diagnosis of Intelligent Production Line Based on Digital Twin and Improved Random Forest. Appl. Sci. 2021, 11, 7733. [Google Scholar] [CrossRef]
  97. Chakrabarti, A.; Sukumar, R.P.; Jarke, M.; Rudack, M.; Buske, P.; Holly, C. Efficient Modeling of Digital Shadows for Production Processes: A Case Study for Quality Prediction in High Pressure Die Casting Processes. In Proceedings of the 2021 IEEE 8th International Conference on Data Science and Advanced Analytics (DSAA), Porto, Portugal, 6–9 October 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 1–9. [Google Scholar] [CrossRef]
  98. Boulfani, F.; Gendre, X.; Ruiz-Gazen, A.; Salvignol, M. Anomaly detection for aircraft electrical generator using machine learning in a functional data framework. In Proceedings of the 2020 Global Congress on Electrical Engineering (GC-ElecEng), Valencia, Spain, 4–6 November 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 27–32. [Google Scholar] [CrossRef]
  99. Xu, Y.; Sun, Y.; Liu, X.; Zheng, Y. A Digital-Twin-Assisted Fault Diagnosis Using Deep Transfer Learning. IEEE Access 2019, 7, 19990–19999. [Google Scholar] [CrossRef]
  100. Ren, Z.; Wan, J.; Deng, P. Machine-Learning-Driven Digital Twin for Lifecycle Management of Complex Equipment. IEEE Trans. Emerg. Top. Comput. 2022, 10, 9–22. [Google Scholar] [CrossRef]
  101. Xiong, M.; Wang, H.; Fu, Q.; Xu, Y. Digital twin–driven aero-engine intelligent predictive maintenance. Int. J. Adv. Manuf. Technol. 2021, 114, 3751–3761. [Google Scholar] [CrossRef]
  102. Piltan, F.; Toma, R.N.; Shon, D.; Im, K.; Choi, H.-K.; Yoo, D.-S.; Kim, J.-M. Strict-Feedback Backstepping Digital Twin and Machine Learning Solution in AE Signals for Bearing Crack Identification. Sensors 2022, 22, 539. [Google Scholar] [CrossRef]
  103. Tancredi, G.P.; Vignali, G.; Bottani, E. Integration of Digital Twin, Machine-Learning and Industry 4.0 Tools for Anomaly Detection: An Application to a Food Plant. Sensors 2022, 22, 4143. [Google Scholar] [CrossRef]
  104. Melesse, T.Y.; Bollo, M.; Pasquale, V.D.; Centro, F.; Riemma, S. Machine Learning-Based Digital Twin for Monitoring Fruit Quality Evolution. Procedia Comput. Sci. 2022, 200, 13–20. [Google Scholar] [CrossRef]
  105. Alexopoulos, K.; Nikolakis, N.; Chryssolouris, G. Digital twin-driven supervised machine learning for the development of artificial intelligence applications in manufacturing. Int. J. Comput. Integr. Manuf. 2020, 33, 429–439. [Google Scholar] [CrossRef]
  106. Wang, Q.; Jiao, W.; Zhang, Y. Deep learning-empowered digital twin for visualized weld joint growth monitoring and penetration control. J. Manuf. Syst. 2020, 57, 429–439. [Google Scholar] [CrossRef]
  107. Angin, P.; Anisi, M.H.; Göksel, F.; Gürsoy, C.; Büyükgülcü, A. AgriLoRa: A Digital Twin Framework for Smart Agriculture. J. Wirel. Mob. Netw. Ubiquitous Comput. Dependable Appl. 2020, 11, 77–96. [Google Scholar] [CrossRef]
  108. Zhang, Q.; Li, Y.; Lim, E.; Sun, J. Real Time Object Detection in Digital Twin with Point-Cloud Perception for a Robotic Manufacturing Station. In Proceedings of the 2022 27th International Conference on Automation and Computing (ICAC), Bristol, UK, 1–3 September 2022; pp. 1–6. [Google Scholar] [CrossRef]
  109. Zhou, X.; Xu, X.; Liang, W.; Zeng, Z.; Shimizu, S.; Yang, L.T.; Jin, Q. Intelligent Small Object Detection for Digital Twin in Smart Manufacturing With Industrial cyber–physical Systems. IEEE Trans. Ind. Inform. 2022, 18, 1377–1386. [Google Scholar] [CrossRef]
  110. Wang, Q.; Jiao, W.; Wang, P.; Zhang, Y. Digital Twin for Human-Robot Interactive Welding and Welder Behavior Analysis. IEEE/CAA J. Autom. Sin. 2021, 8, 334–343. [Google Scholar] [CrossRef]
  111. Chiurco, A.; Elbasheer, M.; Longo, F.; Nicoletti, L.; Solina, V. Data Modeling and ML Practice for Enabling Intelligent Digital Twins in Adaptive Production Planning and Control. Procedia Comput. Sci. 2023, 217, 1908–1917. [Google Scholar] [CrossRef]
  112. Chen, D.; Lv, Z. Artificial Intelligence Enabled Digital Twins for Training Autonomous Cars. Internet Things Cyber-Phys. Syst. 2022, 2, 31–41. [Google Scholar] [CrossRef]
  113. Shen, G.; Lei, L.; Li, Z.; Cai, S.; Zhang, L.; Cao, P.; Liu, X. Deep Reinforcement Learning for Flocking Motion of Multi-UAV Systems: Learn From a Digital Twin. IEEE Internet Things J. 2022, 9, 11141–11153. [Google Scholar] [CrossRef]
  114. Li, B.; Liu, Y.; Tan, L.; Pan, H.; Zhang, Y. Digital Twin Assisted Task Offloading for Aerial Edge Computing and Networks. IEEE Trans. Veh. Technol. 2022, 71, 10863–10877. [Google Scholar] [CrossRef]
  115. Urgo, M.; Terkaj, W.; Simonetti, G. Monitoring Manufacturing Systems Using AI: A Method Based on a Digital Factory Twin to Train CNNs on Synthetic Data. CIRP J. Manuf. Sci. Technol. 2024, 50, 249–268. [Google Scholar] [CrossRef]
  116. Gallala, A.; Kumar, A.A.; Hichri, B.; Plapper, P. Digital Twin for Human–Robot Interactions by Means of Industry 4.0 Enabling Technologies. Sensors 2022, 22, 4950. [Google Scholar] [CrossRef] [PubMed]
  117. Wang, T.; Cheng, J.; Yang, Y.; Esposito, C.; Snoussi, H.; Tao, F. Adaptive Optimization Method in Digital Twin Conveyor Systems via Range-Inspection Control. IEEE Trans. Autom. Sci. Eng. 2022, 19, 1296–1304. [Google Scholar] [CrossRef]
  118. Li, H.; Liu, Z.; Yuan, W.; Chen, G.; Chen, X.; Yang, Y.; Peng, J. The Digital Twin Model of Chemical Production Systems in Smart Factories: A Case Study. In Proceedings of the 2021 IEEE 23rd International Conference on High Performance Computing & Communications (HPCC); 7th International Conference on Data Science & Systems; 19th International Conference on Smart City; 7th International Conference on Dependability in Sensor, Cloud & Big Data Systems & Application (DSS/SmartCity/DependSys), Haikou, China, 20–22 December 2021; pp. 1035–1041. [Google Scholar] [CrossRef]
  119. Min, Q.; Lu, Y.; Liu, Z.; Su, C.; Wang, B. Machine Learning Based Digital Twin Framework for Production Optimization in Petrochemical Industry. Int. J. Inf. Manag. 2019, 49, 502–519. [Google Scholar] [CrossRef]
  120. Huang, B.; Wang, D.; Li, H.; Zhao, C. Network Selection and QoS Management Algorithm for 5G Converged Shipbuilding Network Based on Digital Twin. In Proceedings of the 2022 10th International Conference on Information and Education Technology (ICIET), Matsue, Japan, 9–11 April 2022; pp. 403–408. [Google Scholar] [CrossRef]
  121. Dos Santos, C.H.; Gabriel, G.T.; Do Amaral, J.V.S.; Montevechi, J.A.B.; De Queiroz, J.A. Decision-Making in a Fast Fashion Company in the Industry 4.0 Era: A Digital Twin Proposal to Support Operational Planning. Int. J. Adv. Manuf. Technol. 2021, 116, 1653–1666. [Google Scholar] [CrossRef]
  122. Latif, H.; Shao, G.; Starly, B. A Case Study of Digital Twin for a Manufacturing Process Involving Human Interactions. In Proceedings of the 2020 Winter Simulation Conference (WSC), Orlando, FL, USA, 14–18 December 2020; pp. 2659–2670. [Google Scholar] [CrossRef]
  123. Qiu, T.; Chi, J.; Zhou, X.; Ning, Z.; Atiquzzaman, M.; Wu, D.O. Edge Computing in Industrial Internet of Things: Architecture, Advances and Challenges. IEEE Commun. Surv. Tutor. 2020, 22, 2462–2488. [Google Scholar] [CrossRef]
  124. Nain, G.; Pattanaik, K.K.; Sharma, G.K. Towards Edge Computing in Intelligent Manufacturing: Past, Present and Future. J. Manuf. Syst. 2022, 62, 588–611. [Google Scholar] [CrossRef]
  125. Montes-Sánchez, J.M.; Uwate, Y.; Nishio, Y.; Vicente-Díaz, S.; Jiménez-Fernández, A. Predictive Maintenance Edge Artificial Intelligence Application Study Using Recurrent Neural Networks for Early Aging Detection in Peristaltic Pumps. IEEE Trans. Reliab. 2024, 1–15. [Google Scholar] [CrossRef]
  126. Parikh, S.; Dave, D.; Patel, R.; Doshi, N. Security and Privacy Issues in Cloud, Fog and Edge Computing. Procedia Comput. Sci. 2019, 160, 734–739. [Google Scholar] [CrossRef]
  127. Maciel, P.; Dantas, J.; Melo, C.; Pereira, P.; Oliveira, F.; Araujo, J.; Matos, R. A Survey on Reliability and Availability Modeling of Edge, Fog, and Cloud Computing. J. Reliab. Intell. Environ. 2022, 8, 227–245. [Google Scholar] [CrossRef]
  128. Artiushenko, V.; Lang, S.; Lerez, C.; Reggelin, T.; Hackert-Oschätzchen, M. Resource-Efficient Edge AI Solution for Predictive Maintenance. Procedia Comput. Sci. 2024, 232, 348–357. [Google Scholar] [CrossRef]
  129. Texas Instruments. TMS320F28P55x Real-Time Microcontrollers Datasheet (Rev. B). 2024. Available online: https://www.ti.com/lit/ds/symlink/tms320f28p550sj.pdf (accessed on 6 May 2025).
  130. What’s New in Artificial Intelligence from the 2023 Gartner Hype Cycle™. Available online: https://www.gartner.com/en/articles/what-s-new-in-artificial-intelligence-from-the-2023-gartner-hype-cycle (accessed on 6 May 2025).
  131. Bringing AI to the Device: Edge AI Chips Come Into Their Own. Available online: https://www2.deloitte.com/us/en/insights/industry/technology/technology-media-and-telecom-predictions/2020/ai-chips.html (accessed on 6 May 2025).
  132. Hsu, H.-Y.; Srivastava, G.; Wu, H.-T.; Chen, M.-Y. Remaining Useful Life Prediction Based on State Assessment Using Edge Computing on Deep Learning. Comput. Commun. 2020, 160, 91–100. [Google Scholar] [CrossRef]
  133. Teoh, Y.K.; Gill, S.S.; Parlikad, A.K. IoT and Fog-Computing-Based Predictive Maintenance Model for Effective Asset Management in Industry 4.0 Using Machine Learning. IEEE Internet Things J. 2023, 10, 2087–2094. [Google Scholar] [CrossRef]
  134. Zhang, T.; Ding, B.; Zhao, X.; Liu, G.; Pang, Z. LearningADD: Machine Learning Based Acoustic Defect Detection in Factory Automation. J. Manuf. Syst. 2021, 60, 48–58. [Google Scholar] [CrossRef]
  135. Milić, S.D.; Miladinović, N.M.; Rakić, A. A Wayside Hotbox System with Fuzzy and Fault Detection Algorithms in IIoT Environment. Control Eng. Pract. 2020, 104, 104624. [Google Scholar] [CrossRef]
  136. Hu, L.; Miao, Y.; Wu, G.; Hassan, M.M.; Humar, I. iRobot-Factory: An Intelligent Robot Factory Based on Cognitive Manufacturing and Edge Computing. Future Gener. Comput. Syst. 2019, 90, 569–577. [Google Scholar] [CrossRef]
  137. Huang, H.; Ding, S.; Zhao, L.; Huang, H.; Chen, L.; Gao, H.; Ahmed, S.H. Real-Time Fault Detection for IIoT Facilities Using GBRBM-Based DNN. IEEE Internet Things J. 2020, 7, 5713–5722. [Google Scholar] [CrossRef]
  138. Zhao, X.; Lv, K.; Zhang, Z.; Zhang, Y.; Wang, Y. A Multi-Fault Diagnosis Method of Gear-Box Running on Edge Equipment. J. Cloud Comput. 2020, 9, 58. [Google Scholar] [CrossRef]
  139. Kim, D.; Yang, H.; Chung, M.; Cho, S.; Kim, H.; Kim, M.; Kim, K.; Kim, E. Squeezed Convolutional Variational AutoEncoder for Unsupervised Anomaly Detection in Edge Device Industrial Internet of Things. In Proceedings of the 2018 International Conference on Information and Communication Technologies (ICICT), DeKalb, IL, USA, 23–25 March 2018; pp. 67–71. [Google Scholar] [CrossRef]
  140. Wang, Y.; Liu, M.; Zheng, P.; Yang, H.; Zou, J. A Smart Surface Inspection System Using Faster R-CNN in Cloud-Edge Computing Environment. Adv. Eng. Inform. 2020, 43, 101037. [Google Scholar] [CrossRef]
  141. Ha, H.; Jeong, J. CNN-Based Defect Inspection for Injection Molding Using Edge Computing and Industrial IoT Systems. Appl. Sci. 2021, 11, 6378. [Google Scholar] [CrossRef]
  142. Zhu, Z.; Han, G.; Jia, G.; Shu, L. Modified DenseNet for Automatic Fabric Defect Detection With Edge Computing for Minimizing Latency. IEEE Internet Things J. 2020, 7, 9623–9636. [Google Scholar] [CrossRef]
  143. Gauttam, H.; Pattanaik, K.K.; Bhadauria, S.; Nain, G.; Prakash, P.B. An Efficient DNN Splitting Scheme for Edge-AI Enabled Smart Manufacturing. J. Ind. Inf. Integr. 2023, 34, 100481. [Google Scholar] [CrossRef]
  144. Fraga-Lamas, P.; Lopes, S.I.; Fernández-Caramés, T.M. Green IoT and Edge AI as Key Technological Enablers for a Sustainable Digital Transition towards a Smart Circular Economy: An Industry 5.0 Use Case. Sensors 2021, 21, 5745. [Google Scholar] [CrossRef]
  145. Yang, X.; Han, M.; Tang, H.; Li, Q.; Luo, X. Detecting Defects With Support Vector Machine in Logistics Packaging Boxes for Edge Computing. IEEE Access 2020, 8, 64002–64010. [Google Scholar] [CrossRef]
  146. Schmitt, J.; Bönig, J.; Borggräfe, T.; Beitinger, G.; Deuse, J. Predictive Model-Based Quality Inspection Using Machine Learning and Edge Cloud Computing. Adv. Eng. Inform. 2020, 45, 101101. [Google Scholar] [CrossRef]
  147. Ma, Q.; Niu, J.; Ouyang, Z.; Li, M.; Ren, T.; Li, Q. Edge Computing-Based 3D Pose Estimation and Calibration for Robot Arms. In Proceedings of the 2020 7th IEEE International Conference on Cyber Security and Cloud Computing (CSCloud)/6th IEEE International Conference on Edge Computing and Scalable Cloud (EdgeCom), New York, NY, USA, 22–24 August 2020; pp. 246–251. [Google Scholar] [CrossRef]
  148. Zhang, C.; Ji, W. Edge Computing Enabled Production Anomalies Detection and Energy-Efficient Production Decision Approach for Discrete Manufacturing Workshops. IEEE Access 2020, 8, 158197–158207. [Google Scholar] [CrossRef]
  149. Du, J.; Li, X.; Gao, Y.; Gao, L. Integrated Gradient-Based Continuous Wavelet Transform for Bearing Fault Diagnosis. Sensors 2022, 22, 8760. [Google Scholar] [CrossRef] [PubMed]
  150. Yang, D.; Karimi, H.R.; Gelman, L. A Fuzzy Fusion Rotating Machinery Fault Diagnosis Framework Based on the Enhancement Deep Convolutional Neural Networks. Sensors 2022, 22, 671. [Google Scholar] [CrossRef]
  151. Janssens, O.; Loccufier, M.; Van Hoecke, S. Thermal Imaging and Vibration-Based Multisensor Fault Detection for Rotating Machinery. IEEE Trans. Ind. Inform. 2019, 15, 434–444. [Google Scholar] [CrossRef]
  152. Duquesnoy, M.; Liu, C.; Dominguez, D.Z.; Kumar, V.; Ayerbe, E.; Franco, A.A. Machine Learning-Assisted Multi-Objective Optimization of Battery Manufacturing from Synthetic Data Generated by Physics-Based Simulations. arXiv 2022. [Google Scholar] [CrossRef]
  153. Khosravi, H.; Farhadpour, S.; Grandhi, M.; Raihan, A.S.; Das, S.; Ahmed, I. Strategic Data Augmentation with CTGAN for Smart Manufacturing: Enhancing Machine Learning Predictions of Paper Breaks in Pulp-and-Paper Production. arXiv 2023. [Google Scholar] [CrossRef]
  154. Liaskovska, S.; Tyskyi, S.; Martyn, Y.; Augousti, A.T.; Kulyk, V. Systematic Generation and Evaluation of Synthetic Production Data for Industry 5.0 Optimization. Technologies 2025, 13, 84. [Google Scholar] [CrossRef]
  155. Singh, R.; Gill, S.S. Edge AI: A Survey. Internet Things Cyber-Phys. Syst. 2023, 3, 71–92. [Google Scholar] [CrossRef]
  156. Lim, W.Y.B.; Luong, N.C.; Hoang, D.T.; Jiao, Y.; Liang, Y.-C.; Yang, Q.; Niyato, D.; Miao, C. Federated Learning in Mobile Edge Networks: A Comprehensive Survey. arXiv 2019. [Google Scholar] [CrossRef]
  157. Texas Instruments. TIDA-010955 Arc Fault Detection Using Embedded AI Models Reference Design. 2024. Available online: https://www.ti.com/tool/TIDA-010955 (accessed on 21 May 2025).
  158. Texas Instruments. Motor Fault Detection Using Embedded AI Models. 2024. Available online: https://dev.ti.com/tirex/explore/node?node=A__AYjCIAmJIjRiZJ7OuRmv0w__motor_control_c2000ware_sdk_software_package__0.jXikd__LATEST (accessed on 21 May 2025).
  159. Xu, C.; Zhu, G. Intelligent Manufacturing Lie Group Machine Learning: Real-Time and Efficient Inspection System Based on Fog Computing. J. Intell. Manuf. 2021, 32, 237–249. [Google Scholar] [CrossRef]
  160. Guerra, R.H.; Quiza, R.; Villalonga, A.; Arenas, J.; Castano, F. Digital Twin-Based Optimization for Ultraprecision Motion Systems With Backlash and Friction. IEEE Access 2019, 7, 93462–93472. [Google Scholar] [CrossRef]
  161. ABB. ABB Ability™ Digital Powertrain—Condition Monitoring of Rotating Equipment Fitted with ABB Ability™ Smart Sensors (EN). Available online: https://new.abb.com/service/motion/data-and-advisory-services/condition-monitoring-for-rotating-equipment (accessed on 21 May 2025).
  162. ABB. ABB Ability Predictive Maintenance for Grinding. Available online: https://new.abb.com/mining/services/advanced-digital-services/predictive-maintenance-grinding (accessed on 21 May 2025).
  163. IBM. IBM Maximo Application Suite. 2025. Available online: https://www.ibm.com/docs/en/masv-and-l/cd?topic=overview-maximo-application-suite-technical (accessed on 21 May 2025).
  164. PTC. ThingWorx: Industrial IoT Software | IIoT Platform. Available online: https://www.ptc.com/en/products/thingworx (accessed on 21 May 2025).
  165. Microsoft Azure. Azure IoT—Internet of Things Platform. Available online: https://azure.microsoft.com/en-us/solutions/iot (accessed on 21 May 2025).
  166. Uptake. Predictive Maintenance. Available online: https://uptake.com/ (accessed on 21 May 2025).
  167. MathWorks. Predictive Maintenance Toolbox. Available online: https://www.mathworks.com/products/predictive-maintenance.html (accessed on 21 May 2025).
  168. Cognex. VisionPro Software. Available online: https://www.cognex.com/products/machine-vision/vision-software/visionpro-software (accessed on 21 May 2025).
  169. KEYENCE. Vision Systems | KEYENCE America. Available online: https://www.keyence.com/products/vision/vision-sys/ (accessed on 21 May 2025).
  170. NI. What Is the NI Vision Development Module. Available online: https://www.ni.com/en/shop/data-acquisition-and-control/add-ons-for-data-acquisition-and-control/what-is-vision-development-module.html (accessed on 21 May 2025).
  171. Integrys. Matrox Imaging Library (MIL) Developing Machine Vision. Available online: https://integrys.com/product/matrox-imaging-library-mil/ (accessed on 21 May 2025).
  172. ZEISS. ZEISS PiWeb|Quality Data Management. Available online: https://www.zeiss.com/metrology/us/software/zeiss-piweb.html (accessed on 21 May 2025).
  173. Mao, M.; Hong, M. YOLO Object Detection for Real-Time Fabric Defect Inspection in the Textile Industry: A Review of YOLOv1 to YOLOv11. Sensors 2025, 25, 2270. [Google Scholar] [CrossRef]
  174. Razavi, M.; Mavaddati, S.; Koohi, H. ResNet Deep Models and Transfer Learning Technique for Classification and Quality Detection of Rice Cultivars. Expert Syst. Appl. 2024, 247, 123276. [Google Scholar] [CrossRef]
  175. Yadao, G.G.; Julkipli, O.M.Y.; Manlises, C.O. Performance Analysis of EfficientNet for Rice Grain Quality Control—An Evaluation against YOLOv7 and YOLOv8. In Proceedings of the 2024 7th International Conference on Information and Computer Technologies (ICICT), Honolulu, HI, USA, 15–17 March 2024; IEEE: Piscataway, NJ, USA, 2024; pp. 93–98. [Google Scholar] [CrossRef]
  176. Amazon Web Services. Computer Vision SDK—AWS Panorama. Available online: https://aws.amazon.com/panorama/ (accessed on 21 May 2025).
  177. Edge Impulse. The Leading Edge AI Platform. Available online: https://edgeimpulse.com/ (accessed on 21 May 2025).
  178. Coral. Available online: https://coral.ai/ (accessed on 21 May 2025).
  179. Siemens. Plant Simulation Software|Siemens Software. Available online: https://plm.sw.siemens.com/en-US/tecnomatix/plant-simulation-software/ (accessed on 21 May 2025).
  180. Rockwell Automation. Arena Simulation Software. Available online: https://www.rockwellautomation.com/de-de/products/software/arena-simulation.html (accessed on 21 May 2025).
  181. AnyLogic. Simulation Modeling Software Tools & Solutions. Available online: https://www.anylogic.com/ (accessed on 21 May 2025).
  182. AspenTech. Aspen Plus|Leading Process Simulation Software. Available online: https://www.aspentech.com/en/products/engineering/aspen-plus (accessed on 21 May 2025).
  183. AspenTech. Aspen HYSYS|Process Simulation Software. Available online: https://www.aspentech.com/en/products/engineering/aspen-hysys (accessed on 21 May 2025).
  184. Ansys. Ansys Twin Builder|Create and Deploy Digital Twin Models. Available online: https://www.ansys.com/products/digital-twin/ansys-twin-builder (accessed on 21 May 2025).
  185. Schmid, J.; Teichert, K.; Chioua, M.; Schindler, T.; Bortz, M. Simulation and Optimal Control of the Williams-Otto Process Using Pyomo. arXiv 2020. [Google Scholar] [CrossRef]
  186. Akiba, T.; Sano, S.; Yanase, T.; Ohta, T.; Koyama, M. Optuna: A Next-generation Hyperparameter Optimization Framework. In Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, Anchorage, AK, USA, 4–8 August 2019; ACM: New York, NY, USA, 2019; pp. 2623–2631. [Google Scholar] [CrossRef]
  187. Eckman, D.J.; Henderson, S.G.; Shashaani, S. SimOpt: A Testbed for Simulation-Optimization Experiments. INFORMS J. Comput. 2023, 35, 495–508. [Google Scholar] [CrossRef]
  188. Raffin, A.; Hill, A.; Gleave, A.; Kanervisto, A.; Ernestus, M.; Dormann, N. Stable-Baselines3: Reliable Reinforcement Learning Implementations. J. Mach. Learn. Res. 2021, 22, 1–8. [Google Scholar]
  189. Ray Project. RLlib: Industry-Grade, Scalable Reinforcement Learning—Ray 2.46.0. Available online: https://docs.ray.io/en/latest/rllib/index.html (accessed on 21 May 2025).
  190. General Electric (GE). Predix Overview. Available online: https://www.gevernova.com/software/documentation/predix-platforms/PDFs/Predix%20Overview.pdf (accessed on 21 May 2025).
  191. Siemens. Insights Hub Capability Packages Product Sheet. 2024. Available online: https://plm.sw.siemens.com/en-US/insights-hub/resources/product-sheets/ (accessed on 21 May 2025).
  192. Intel. Intel® Distribution of OpenVINO™ Toolkit. Available online: https://www.intel.com/content/www/us/en/developer/tools/openvino-toolkit/overview.html (accessed on 21 May 2025).
  193. NVIDIA. TensorRT. Available online: https://developer.nvidia.com/tensorrt (accessed on 21 May 2025).
  194. NVIDIA. Dynamo. Available online: https://developer.nvidia.com/dynamo (accessed on 21 May 2025).
  195. Amazon Web Services. IoT Edge, Open Source Edge—AWS IoT Greengrass. Available online: https://aws.amazon.com/greengrass/ (accessed on 21 May 2025).
  196. Microsoft Azure. Digital Twins—Modeling and Simulations. Available online: https://azure.microsoft.com/en-us/products/digital-twins (accessed on 21 May 2025).
Figure 1. Year-wise publication trends (2015–2024) on ML integration in PdM, QC, PO, DT, and Edge AI within industrial automation.
Figure 2. Classification of machine learning algorithms.
Figure 3. Schematic diagram of a Digital Twin with an AI component [81].
Figure 4. Five-dimensional architecture of a comprehensive Digital Twin system.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.