Article

A Framework for a Digital Twin of Inspection Robots

1 Department of Civil and Mechanical Engineering, University of Cassino and Southern Lazio, Via Di Biasio 43, 03043 Cassino, Italy
2 Department of Mechanical, Chemical and Materials Engineering, University of Cagliari, Via Marengo 2, 09123 Cagliari, Italy
3 Department of Mechatronics, Silesian University of Technology, ul. Akademicka 10A, 44-100 Gliwice, Poland
* Author to whom correspondence should be addressed.
Appl. Sci. 2026, 16(2), 650; https://doi.org/10.3390/app16020650
Submission received: 13 December 2025 / Revised: 30 December 2025 / Accepted: 6 January 2026 / Published: 8 January 2026

Abstract

The study addresses the design and implementation of a modular, scalable platform for specialized inspection tasks, highlighting its suitability for future research activities. The work presents a fully validated methodology that encompasses both the physical robot and its digital twin. Specifically, the objective of this work is to design and develop a sensor-equipped mobile robot for inspection and surveillance tasks. Particular emphasis is placed on the robot’s actuation system, the design and implementation of its control architecture, and the creation of a PC-based control interface. Additionally, suitable sensors can be integrated to enable future capabilities in automatic obstacle detection and autonomous navigation. The paper presents a digital shadow/DT-enabling framework to support inspection and surveillance operations, grounded in the digital representation of the robot.

1. Introduction

Robotic systems have become integral to modern manufacturing, where they are primarily utilized to execute repetitive, labor-intensive, and high-precision tasks. The global industrial robotics market has demonstrated significant growth, with an estimated 2.1 million new installations recorded between 2018 and 2021, as reported in [1]. Advanced robotics deals with systems distinguished by a high level of autonomy, precision, and adaptability. These robots typically incorporate sophisticated control and planning algorithms that enable them to execute complex tasks within dynamic and unstructured environments. In conventional industrial systems, real-time monitoring and inspection of equipment safety and the prompt identification of potential safety issues are fundamental prerequisites for the stable, reliable, and efficient operation of substations [2]. Robots are used for inspecting components in the manufacturing and remanufacturing industries [3]. Moreover, to address the growing complexity of modern manufacturing requirements, robotic systems integrated with Digital Twin (DT) technologies are being increasingly deployed in advanced applications, such as Human–Robot Collaboration (HRC) and Human–Robot Interaction (HRI) [4,5].
A Digital Twin is the functionally correct, predictable, and reproducible digital representation of a product or system under development at the appropriate level of fidelity to perform verification, performance analysis, and system validation tasks [6]. An overview of the main components of a digital twin for robotics, encompassing the physical element, the virtual element, middleware, and the service and communication (transport) layers, is proposed in [7].
The DT concept varies significantly depending on whether the DT is perceived as a model or simulation technology or in relation to Cyber-Physical Systems (CPSs). These differences primarily arise from the degree of integration and the mode of data exchange between physical and virtual entities, as reported in [8]. While both the CPS and the DT aim to bridge the physical and virtual worlds, a key distinction lies in their respective emphases: the DT centers on the virtual model and associated data to accurately represent and simulate the behavior of a product, whereas the CPS focuses on the direct integration and coordination of sensors and actuators within the physical system [9]. The interaction and data exchange between the physical and virtual models can occur in either manual or automated modes. When data transfer between the two models is performed manually, changes in one model do not trigger automatic updates in the other, and the virtual representation is referred to as a digital model. When the data exchange between the physical and virtual models is fully automated, so that changes in one model are automatically reflected in the other, the system is classified as a digital twin. When data transfer is automated only from the physical model to the virtual model, without reciprocal updates, the virtual representation is referred to as a digital shadow, as defined in [1].
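This three-way distinction reduces to a decision on which data-flow directions are automated. The sketch below paraphrases the definitions above as a small classifier; it is an illustration of the taxonomy, not code taken from [1]:

```python
def classify(physical_to_virtual_automated: bool,
             virtual_to_physical_automated: bool) -> str:
    """Classify a virtual representation by the automation of its data
    exchange with the physical system (digital model / shadow / twin)."""
    if physical_to_virtual_automated and virtual_to_physical_automated:
        return "digital twin"    # fully automated in both directions
    if physical_to_virtual_automated:
        return "digital shadow"  # automated only physical -> virtual
    return "digital model"       # manual exchange
```

For example, a simulation updated by hand from measurement logs is a digital model, while one that ingests sensor data automatically but cannot command the robot back is a digital shadow.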
Embodied Artificial Intelligence (EAI) refers to the advancement of intelligent systems that integrate perception, cognition, and physical interaction to operate effectively within dynamic, unstructured, and uncertain real-world environments [10]. The authors of [11] examine how the integration of DTs with EAI can mitigate the simulation-to-reality gap by enabling virtual environments to function as dynamic, data-driven platforms for continuous learning, adaptation, and validation of real-world robotic behaviors. In Ref. [12], a system architecture integrating digital twins and Virtual Reality was proposed to enhance human–robot collaborative systems, implementing a prototype that allows end-users to simulate and optimize disassembly operations in a safe, virtual environment. Although robots have been utilized in industrial settings and for inspection tasks, enabling technologies, applications, practical challenges, and trends remain areas of ongoing research, as noted in [13], which proposes a systematic review of the state of the art in robot development. Current applications of the pre-DT are related to programming and action generation for robots in industrial settings, particularly for disassembly operations, as proposed in [14]. In Ref. [15], the integration of a collaborative industrial robot into a DT was proposed, using a game engine and developing a ROS package for the specific system.
Several mobile robots can carry suitable sensors for inspection and monitoring for industrial and non-conventional applications. A power generator inspection robot was proposed in [16] and was based on a crawler design. A robot for inspection of a thermal power plant was detailed in [17]. An inspection robot was proposed in [18] for inspecting power plants, with a focus on obstacle recognition and avoidance using a vision system. In Ref. [19], a wheeled robot was proposed to determine the damage of a large concrete structure strengthened with fiber-reinforced polymer laminates using diagnostic sensors and an active thermographic method.
The authors of [20] review current robotic inspection technologies, assess their performance, and outline key trends shaping the future of autonomous infrastructure monitoring. A four-wheeled pipeline inspection robot was presented in [21]. A sewer system inspection robot was proposed in [22]. The development of an in-pipe inspection robot system designed for large-diameter water pipes was proposed in [23].
Referring to indoor and outdoor mobility by ground, mobile robots are generally classified according to their motion capability, being wheeled/tracked [24,25,26], legged [27,28], or combinations of locomotion types, named hybrid mobile robots [29,30]. Synthesis of mechanisms is often used to enhance a system’s capabilities in overcoming obstacles; methodologies are proposed in [31]. This paper proposes the design, simulation, and experimentation of a mobile robot solution for inspection and surveillance. Its development was carried out to be linked to the realization of the DT for mission planning and adjustments in response to challenging conditions. Robots are poised to play a central role in the future of infrastructure inspection. Fully harnessing their potential requires the development of digital twins that accurately replicate both the robots and the demanding, often harsh environments in which they operate [32,33]. These digital counterparts enable a rapid and iterative process for robot design, development, and testing, thereby supporting their industrial deployment and standardization. For effective operation, digital twins must interact seamlessly with their physical counterparts through an integrated digital infrastructure, allowing for realistic simulation, emulation, and goal-oriented design optimization for specific missions. This work focuses on the development of a modular and extensible platform for dedicated inspection tasks, with demonstrated potential for further research applications. It presents a practical and experimentally validated approach, spanning from the robotic prototype to its digital twin, thereby constituting a concrete contribution to applied robotics.

2. Design of the Inspection Robot

Inspection, maintenance, and assessment tasks that typically require trained staff have increasingly been performed by mobile robots, particularly in harsh environments where deploying human operators is not feasible. These automatic systems can perform specific operations very efficiently, sparing human operators the risks associated with physical interaction with the surrounding environment. Additionally, robots can be equipped with several sensors to obtain additional information regarding other process variables while maintaining a compact structure.
The design of a mobile robot for inspection purposes should be adequate for the terrains or locations where it is expected to operate, which may include physical barriers or obstacles that it must overcome or avoid. In these complex scenarios, the teleoperation task can become increasingly difficult for the human controller, as there are more actions and movements of the robot to control. Additionally, awareness of the robot’s environment may be diminished by poor visibility. To overcome these issues, it is possible to automate some or all locomotion functions of mobile robots, thereby decreasing the time the robot needs to complete its inspection function. The actions and movements of a robotic system can be controlled via teleoperation, where a pilot remotely controls a robot operating at a distance [34]. Inspection tasks using teleoperation generally involve a land-based mobile robot equipped with image cameras for visual assessment by the operator. Thus, the performance of an inspection task by teleoperation depends not only on the robot but also on the human operator’s skill. Regarding autonomous mobile robots, one of the main functionalities inherent to their navigation systems is the capacity to detect obstacles in their path. The information retrieved from the sensors mounted on a mobile robot is used as input data for obstacle detection and further processed to define subsequent control actions. Based on the results obtained at this stage, the robot’s response can be either avoidance or traversal of the obstacle. Obstacle avoidance is a recurring theme in the development of autonomous systems, with several solutions being implemented to find alternative paths around physical obstacles. As robotic systems continue to evolve, inspection robots play a critical role in modern industries due to their technological sophistication and operational advantages. Their key benefits include:
Efficiency: Inspection robots are capable of performing repetitive tasks with higher speed, precision, and consistency compared to human operators, thereby enhancing productivity and operational efficiency across multiple industrial sectors.
Safety: These systems can operate in hazardous or inaccessible environments, such as nuclear facilities, chemical plants, or offshore oil platforms, effectively mitigating risks to human health and safety.
Data Acquisition and Analysis: Equipped with advanced sensor suites, inspection robots can capture high-resolution data and conduct real-time analysis, providing actionable insights for monitoring and decision-making processes.
Cost-effectiveness: By enabling continuous, unmanned operation without the need for rest or shift rotations, inspection robots significantly reduce labor costs and improve overall operational uptime.
Predictive Maintenance: Through continuous monitoring and data-driven diagnostics, inspection robots facilitate predictive maintenance, allowing for the early detection and resolution of potential faults, thereby minimizing downtime and reducing maintenance expenditures.
Considering the application of inspecting critical infrastructures that require indoor and outdoor explorations, the design objectives for an inspection rover shown in Figure 1 and Figure 2 are as follows:
Ensure outdoor mobility across both natural terrains, such as grass and sand, and urban surfaces, such as cobblestones and asphalt;
Enable effective navigation over uneven or irregular surfaces.
A wheeled robot has been selected for its robustness and adaptability to various terrains; nevertheless, the proposed solution can be extended to tracked vehicles or hybrid robots, with the actuation and control schemes, as well as the modeling of the DT, being the main objectives of the paper.
The wheeled robot model and its prototype, which constitute the DT, are shown in Figure 1a,b, respectively. The proposed framework is illustrated in Figure 1c, which schematically depicts the architecture of a wireless-controlled wheeled robotic system integrated with a Digital Twin (DT) environment for enhanced monitoring and simulation. Two electronic boards are each dedicated to controlling one motor group (either the left or the right); they operate as microcontrollers (e.g., Arduino) interfaced with motor drivers and receive commands via wireless communication.
The third electronic board serves as the sensor hub, collecting environmental data and interfacing with the Digital Twin system. It acts as a bridge between physical sensing and virtual simulation. Additional sensors can be added to capture real-time data (e.g., distance, temperature, orientation), which is transmitted to Electronic Board 3 for processing and further DT synchronization.
The Wireless Router operates as the central node for communication, enabling data exchange between the tablet, electronic boards, and the Digital Twin system.
The tablet interface is used for user interaction, remote control, and visualization. It communicates wirelessly with the robot and may also display DT feedback, as further described in Section 3.
The DT, represented by a computer and a 3D model, mirrors the robot’s physical state in a virtual environment. It enables real-time simulation, predictive diagnostics, and performance analysis.
To design and realize the DT of the system, a test bed for experimental activity and sensorization was established, as shown in Figure 2. A further objective of the test bed was to develop a distributed control system for four DC motors equipped with incremental encoders. The system comprises a set of interconnected hardware and software modules that work together to enable coordinated control of the four-motor robot via Wi-Fi. Figure 2 illustrates the overall structure of the system, with the main elements, numbered 1 to 11, described below. The design philosophy integrates robust and cost-effective solutions with the capability for DT implementation, specifically aimed at mission planning and dynamic updates.
The components of the testbed, which will be part of the wheeled robot actuation and control, are the following:
1. Wi-Fi router: The router provides the local network needed for communication between the user interface and the control boards. All devices are connected to this network, allowing them to communicate via HTTP requests.
2. Power supply (batteries): The batteries provide the electrical power for the whole system, powering both the motors and the Arduino boards. Each battery has a capacity of 2000 mAh and a voltage of 12 V.
3. Motor 1 of the right pair.
4. Motor 2 of the right pair: The two motors on the right side of the robot are controlled by the Arduino Uno R4 board via a DC motor driver (H-bridge), as specified in point 8. Each motor is equipped with a magnetic encoder for measuring rotation speed, which is essential for the feedback of the PID control implemented in the firmware.
5. Motor 1 of the left pair.
6. Motor 2 of the left pair: Similarly, the two motors on the left side are controlled by the Arduino board specified in point 7. Here too, the encoders enable instantaneous speed determination and precise, balanced regulation between the robot’s left and right sides.
7. Arduino “Secondary Left” board: This board is dedicated to controlling the left pair of motors. It receives HTTP commands sent from the graphical interface and calculates the motor control PWM signals in real time, implementing a PID control based on the feedback provided by the encoders. Communication occurs via the WiFiS3 library, which allows for the exposure of a local web server on port 80.
8. Arduino R4 WiFi “Primary Right” board: This board functions similarly to the previous one but acts independently on its own pair of motors. It also processes the PID control locally, managing the PWM power signals and publishing its status data (RPM, errors, and power) via the /status web endpoint. Both boards (7 and 8) receive commands from the web interface via HTTP.
9. Arduino “Additional Board”: This third board serves as a system expansion board. It is designed to manage auxiliary devices such as LEDs, sensors, or servomotors. It is also network-accessible and integrated into the graphical interface, allowing for the control of digital outputs, PWM signals, and servos. This modular structure allows for easy system expansion, for example, to control a robotic arm or other future rover modules.
10. Connection breadboard: The breadboard houses the electrical connections between the boards, encoders, and motor drivers, serving as an intermediate platform for the distribution of power and control signals.
11. H-bridges (4 modules): Each motor is driven by a dedicated H-bridge that receives PWM signals from the Arduino boards, allowing for complete control over the direction and magnitude of the applied torque.
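To make the communication pattern between the interface and the motor boards concrete, the sketch below models the HTTP exchange in Python. Only the /status endpoint name appears in the text; the command URL pattern and the status field names are assumptions for illustration, not the actual firmware API:

```python
from urllib.parse import urlencode


def command_url(board_ip: str, left_rpm: int, right_rpm: int) -> str:
    """Build a hypothetical GET request carrying target RPM values
    for one board's motor pair (illustrative path and parameters)."""
    query = urlencode({"left": left_rpm, "right": right_rpm})
    return f"http://{board_ip}/command?{query}"


def parse_status(payload: dict) -> tuple:
    """Extract the kind of fields the boards publish via /status
    (RPM, error, power); the key names here are assumed."""
    return payload["rpm"], payload["error"], payload["power"]
```

A host on the same Wi-Fi network would issue such a GET request to each board and poll /status to monitor tracking error and power draw.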
Accordingly, a control algorithm was developed, as shown in Figure 3. It controls the rotational speed of the four DC motors via a Proportional-Integral-Derivative (PID) controller. The system is implemented on a microcontroller-based control board (e.g., Arduino) and uses suitable encoders to detect real-time actuator velocities, enabling the PID to compensate for deviations from the reference signal.
More specifically, the controller for the actuators is realized using two Arduino UNO R4 Wi-Fi boards, each connected to a pair of motors and managed through a specifically designed interface. Each board, therefore, operates as an HTTP server. A schematic flow-chart for the motors’ control is depicted in Figure 3.
A closed-loop RPM (Revolutions Per Minute) control is executed for each of the two drive wheels. Movement instructions (in the form of desired RPM) are received via HTTP GET requests. After calculating the target RPM, a Progressive Ramp is applied for smooth acceleration.
The system uses real-time data from Hall-effect encoders to calculate the actual motor RPM. This value is compared with the target RPM, and the difference (error) feeds into a PID Control algorithm. The PID output is summed with the base PWM value to generate the final PWM (Pulse Width Modulation) signal sent to the motor driver, minimizing steady-state error and ensuring a stable dynamic response.
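As an illustration only, not the firmware actually running on the boards, the loop described above (encoder counts converted to RPM, a progressive setpoint ramp, and a PID correction added to the base PWM) can be sketched as follows; the gain values are placeholders:

```python
def encoder_rpm(counts: int, counts_per_rev: int, dt: float) -> float:
    """Convert encoder counts accumulated over dt seconds to RPM."""
    return (counts / counts_per_rev) / dt * 60.0


def ramped_target(current: float, final: float, step: float) -> float:
    """Progressive ramp: move the setpoint toward the final RPM in
    bounded steps per control cycle, for smooth acceleration."""
    delta = max(-step, min(step, final - current))
    return current + delta


class PID:
    """Discrete PID on the RPM error; the gains are illustrative
    placeholders, not the values calibrated on the robot."""

    def __init__(self, kp: float, ki: float, kd: float):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target_rpm: float, measured_rpm: float, dt: float) -> float:
        error = target_rpm - measured_rpm
        self.integral += error * dt                      # integral term
        derivative = (error - self.prev_error) / dt      # derivative term
        self.prev_error = error
        # The returned correction would be summed with the base PWM value.
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

In the real firmware this loop runs each control cycle: the PID output is added to the base PWM and the result is clamped to the driver's valid duty-cycle range before being written to the H-bridge.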
The control interface was implemented as a web-based application (HTML/JavaScript), accessible via any browser and designed for use on mobile devices. The interface serves as the central command unit, translating user input into movement instructions (Figure 4). The main control logic (visible in the flowchart) is based on the continuous acquisition of values from the Forward/Backward and Steering sliders; these values are combined in real time to calculate the base PWM values for the two motor pairs ML and MR, respectively. These commands are sent to the Arduino boards via HTTP GET requests (issued with the JavaScript fetch API). The interface also manages auxiliary controls (Digital Outputs, PWM, Servo), sending separate commands to each of the three dedicated Arduino boards (Primary, Secondary, Additional) (Figure 5).
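The slider-mixing step can be sketched as a standard differential-drive (arcade) mix. The formula below is a common convention and an assumption here, since the interface source code is not shown in the paper:

```python
def mix_sliders(forward: float, steering: float) -> tuple:
    """Combine Forward/Backward and Steering slider values (each assumed
    normalized to [-1, 1]) into base PWM magnitudes for the left (ML)
    and right (MR) motor pairs."""
    left = forward + steering
    right = forward - steering
    # Normalize so neither channel exceeds the 0-255 PWM range while
    # preserving the left/right ratio; the sign selects the H-bridge
    # direction.
    scale = max(1.0, abs(left), abs(right))
    return round(255 * left / scale), round(255 * right / scale)
```

For instance, full forward with no steering drives both pairs at full duty cycle, while full forward plus full steering saturates one side and stops the other, producing a tight turn.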
A third Arduino UNO R4 Wi-Fi board (Additional Board) is dedicated to managing auxiliary Input/Output, ensuring system expandability for future components such as sensors, LEDs, or actuators (Figure 6).
This board can also be managed by the specifically designed web interface. Upon receiving an HTTP GET request, the firmware is capable of decoding three main command types:
  • Digital Outputs: To activate/deactivate digital pins (set pin mode to OUTPUT).
  • PWM Outputs: To set duty cycle values (0–255) on PWM pins.
  • Servo Motors: To control the rotation angle of a servo motor (0° to 180°), dynamically managing the attach and angle write operations.
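A minimal sketch of such a command decoder is shown below. The URL paths and parameter names (/digital, /pwm, /servo, pin, state, value, angle) are illustrative assumptions, not the board's actual endpoints; only the three command types and their value ranges come from the text:

```python
from urllib.parse import urlparse, parse_qs


def decode_command(request_url: str) -> tuple:
    """Decode one of the three command types handled by the additional
    board: digital output, PWM output, or servo angle."""
    parsed = urlparse(request_url)
    params = {k: v[0] for k, v in parse_qs(parsed.query).items()}
    if parsed.path == "/digital":
        # Activate/deactivate a digital pin (pin mode set to OUTPUT).
        return ("digital", int(params["pin"]), params["state"] == "on")
    if parsed.path == "/pwm":
        duty = int(params["value"])
        if not 0 <= duty <= 255:
            raise ValueError("PWM duty cycle must be in 0-255")
        return ("pwm", int(params["pin"]), duty)
    if parsed.path == "/servo":
        angle = int(params["angle"])
        if not 0 <= angle <= 180:
            raise ValueError("servo angle must be in 0-180 degrees")
        return ("servo", int(params["pin"]), angle)
    raise ValueError(f"unknown command path: {parsed.path}")
```

On the real board the equivalent parsing is done in the Arduino firmware, which then performs the corresponding pinMode/analogWrite or servo attach-and-write operations.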

3. Simulations and Experimental Tests

To evaluate and optimize the performance of the control system, an experimental test bed, as depicted in Figure 7, was developed. The bench is powered by two 12 V, 2000 mAh batteries connected in series, providing a 24 V DC bus, and carries its own router. This enables it to function autonomously and allows for remote control via the web interface, as well as management of the Arduino boards. The optimization of the PID controller was crucial to ensure a robust and precise dynamic response. The parameters (Kp, Ki, Kd) were calibrated empirically through a series of targeted experimental tests. The comparative results of the step response and the error trajectory for each motor pair are presented in Figure 8, Figure 9 and Figure 10, illustrating the three distinct operating scenarios.
  • Optimized PID Case (Figure 8): The responses (Figure 8a,b) show that both motor pairs follow the ideal motion profile almost perfectly, with a rapid settling time and negligible overshoot. The error analysis (Figure 8c) confirms this effectiveness, displaying only reduced initial peaks during the transient phase.
  • Non-Optimized PID Case (Figure 9): This scenario presented a marked transient overshoot problem (Figure 9a,b). Despite the initial instability, the control mechanism successfully ensured good settling behavior in the steady-state response, which converged to the ideal value.
  • No PID Case (Figure 10): The system operated in open loop, resulting in a response (Figure 10a,b) characterized by a large overshoot and, crucially, a complete failure to settle at the reference value, so the ideal profile was not followed.
The analysis of Figure 8, Figure 9 and Figure 10 demonstrates the significant improvement in terms of settling time and reduction in steady-state error achieved through PID parameter refinement, confirming the necessity of an adequately calibrated closed-loop control system.
A further fundamental step for validation involved using simulations within a CAD environment to model the robot’s behavior (Figure 11). Two distinct simulations were performed on an identical straight-line path:
  • Ideal Simulation: The theoretical motion profile (the ideal step command the robot should follow) was imposed on the robot model.
  • Real Simulation: The model was simulated by inserting the real-world speed and time data—exported from the Arduino Serial Monitor during the physical test—into the four motors (i.e., the actual motor output after the “ideal” step command was applied).
Overlaying the results of these two simulations, as shown in Figure 12 and Figure 13, enables the quantification of the difference between the theoretical trajectory and the experimental trajectory obtained through a dynamic simulation. This approach establishes the foundation for developing a Digital Twin.
Using real data from the physical system to feed a virtual model enables more accurate prediction of the robot’s performance and endurance under load, offering an essential mechanism for updating the mission in real time or periodically based on the system’s actual behavior, as in the case of the mechatronic system described in [35]. The EMG49 motors used in the system are equipped with Hall-effect incremental encoders, providing 980 counts per revolution of the output shaft, corresponding to an angular resolution of 0.37° and a quantization noise of ±2 RPM (approximately 1.6% of the nominal speed).
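The quoted encoder figures can be checked directly from the count density; note that the nominal-speed value below is inferred from the quoted percentage, not taken from a datasheet:

```python
# EMG49 Hall-effect encoder: 980 counts per revolution of the output shaft.
counts_per_rev = 980

# Angular resolution in degrees per count: 360 / 980, about 0.37 degrees,
# matching the value quoted in the text.
angular_resolution = 360.0 / counts_per_rev

# The quoted +/-2 RPM quantization noise is said to be about 1.6% of the
# nominal speed, which implies a nominal speed near 2 / 0.016 = 125 RPM
# (an inference from the quoted figures, not a measured value).
nominal_rpm = 2.0 / 0.016
```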
The comparison between the numerical simulation and the experimental data (Figure 12 and Figure 13) reveals significant discrepancies in dynamics. However, these discrepancies do not compromise the system’s basic functionality; instead, the system maintains a stable and predictable response, thanks to the effectiveness of the PID control in ensuring a steady-state average velocity [36,37]. A sensitivity analysis of the PID controller showed that higher proportional (Kp > 2.5) or integral (Ki > 0.6) gains increase the oscillation amplitude, while the optimized configuration (Kp = 1.8, Ki = 0.4, Kd = 0.05) ensures a stable response with velocity oscillations below ±2% and steady-state error below 1%.
Velocity and Acceleration (Figure 12): Figure 12a (Linear Velocity) shows that while the real data (Arduino data) exhibits more contained speed peaks and a higher degree of oscillation (ripple) compared to the ideal response, the average speed value is successfully attained. Figure 12b (Linear Acceleration) highlights a notable noise and variability in the real data, which is attributable to the inherent noise of the encoders and the continuous corrective action of the PID.
Power and Position (Figure 13): Figure 13a (Power Consumption) is particularly relevant, demonstrating that the power drawn by the real system is significantly more variable than the constant consumption expected under ideal conditions. Despite this fluctuation, Figure 13b (Center of Mass Position—COM) shows that the trajectory remains linear, indicating the system’s stability in the steady state. Nevertheless, the trajectory based on real data is phase-shifted and exhibits a slightly shallower slope than the ideal motion. This offset in the COM position is fundamental as it highlights how even a small deviation in velocity response translates into a cumulative positioning error over time, confirming the utility of the Digital Twin for correcting such positional drift during the mission.
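The link between a small velocity deviation and a growing COM offset is simply the time integral of the velocity error. A minimal sketch, with hypothetical velocity samples rather than the paper's measured data:

```python
def com_drift(ideal_v, real_v, dt: float) -> float:
    """Integrate the velocity error over time to obtain the cumulative
    center-of-mass position offset (meters, for velocities in m/s and
    dt in seconds)."""
    drift = 0.0
    for vi, vr in zip(ideal_v, real_v):
        drift += (vi - vr) * dt  # each sample's shortfall accumulates
    return drift
```

For example, a persistent 1% shortfall at 0.5 m/s accumulates to 5 cm of positional drift over 10 s, which is exactly the kind of error the Digital Twin comparison is intended to expose and correct during a mission.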

4. Conclusions

This study presents the design, implementation, and experimental validation of a mobile system intended for inspection and monitoring applications. Employing an integrated engineering methodology, the project developed a robotic platform capable of moving on variable terrain. The experimental phase, conducted on a test bench, confirmed the effectiveness of the control architecture. The optimization of the PID controller demonstrated significant improvements, ensuring that the motors maintain a velocity trajectory stably aligned with the predefined motion profile, characterized by a rapid settling time and minimal steady-state error. A fundamental step towards validating dynamic performance was the use of simulations in a CAD environment, fed directly with real-world speed and time data, realizing the basis of a DT. This approach allowed for the quantification of deviations between the system’s ideal and experimental behavior, particularly in terms of power consumption and cumulative drift in the COM position. While the system operates stably at steady state, this analysis revealed the impact of encoder noise and continuous corrective action on dynamic loads. This method of direct comparison between real data and the virtual model establishes a concrete foundation for the development of a DT. The integration of the Digital Twin is the next crucial enhancement, which is expected to further improve mission efficiency, optimize paths in real time, and enhance adaptability through predictive simulation in unpredictable field conditions. In summary, this work presents a digital twin framework that couples a physical platform with distributed control, comprehensive data acquisition, and a virtual simulation calibrated using experimental data. Furthermore, the paper explores the technical feasibility of a mobile robotic platform for inspection tasks and establishes a methodological framework for controlling and integrating technologies to realize the DT.

Author Contributions

Conceptualization, P.R., E.O., M.K. and Z.K.; methodology, P.R., E.O., M.K., Z.K. and J.B.; software, C.P., M.K. and Z.K.; validation, C.P., P.R. and R.D.B.; formal analysis, C.P., R.D.B., P.R., E.O., M.K., Z.K. and J.B.; investigation, C.P.; resources, E.O.; data curation, P.R. and R.D.B.; writing—original draft preparation, C.P., P.R. and E.O.; writing—review and editing, M.K., Z.K. and J.B.; visualization, P.R.; supervision, E.O. and P.R.; project administration, E.O.; funding acquisition, E.O. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The original contributions presented in this study are included in the article. Further inquiries can be directed to the corresponding author.

Acknowledgments

This paper is part of a project funded by NATO, Science for Peace and Security Programme Multi-Year Project Application, NATO SPS MYP G61140 “Advanced technologies for Physical ResIlience Of cRitical Infrastructures (APRIORI)”.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
CAD  Computer-Aided Design
COM  Center of Mass
CPS  Cyber-Physical System
DT   Digital Twin
EAI  Embodied Artificial Intelligence
HRC  Human–Robot Collaboration
HRI  Human–Robot Interaction
PID  Proportional-Integral-Derivative
PWM  Pulse Width Modulation
RPM  Revolutions per Minute

Figure 1. The proposed framework for the DT: (a) 3D model and (b) the prototype of the mobile robot; (c) the proposed framework.
Figure 2. An overall scheme for the actuation control of the 4-wheeled robot in Figure 1.
Figure 3. A schematic flow-chart for controlling the speed of the four DC motors via a Proportional-Integral-Derivative (PID) controller.
Figure 4. A flowchart for the control interface implemented as a web-based application, accessible via any browser, designed for use on mobile devices.
Figure 5. Web interface for robot control: (a) motion control (forward and steering); (b) additional Arduino commands (digital outputs and PWM); (c) command control for the left (Slave) and right (Master) Arduino boards.
Figure 6. A flowchart of the additional Arduino board code dedicated to managing auxiliary Input/Output, ensuring system expandability for future components such as sensors.
Figure 7. Experimental test-bed for motor control.
Figure 8. Performance comparison using optimized PID parameters: (a) left motor step response, speed (RPM); (b) right motor step response, speed (RPM); (c) left motor speed error, error trajectory (target minus actual RPM) for the left motor pair, illustrating effective minimization of the steady-state error.
Figure 9. Performance comparison using non-optimized PID parameters: (a) left motor step response, speed (RPM); (b) right motor step response, speed (RPM); (c) left motor speed error, error trajectory (target minus actual RPM) for the left motor pair.
Figure 10. Performance comparison without PID control: (a) left motor step response, speed (RPM), illustrating the large offset and lack of correction inherent to the open-loop system; (b) right motor step response, speed (RPM); (c) left motor speed error (target minus actual RPM) for the left motor pair, demonstrating the system’s baseline open-loop error.
Figure 11. A top view of the simulation environment for motor control tests.
Figure 12. Comparison of numerical vs. experimental tests: (a) velocity; (b) acceleration.
Figure 13. Comparison of numerical vs. experimental tests: (a) power consumption; (b) COM position.

Share and Cite

Pelagalli, C.; Rea, P.; Di Bona, R.; Ottaviano, E.; Kciuk, M.; Kowalik, Z.; Bijak, J. A Framework for a Digital Twin of Inspection Robots. Appl. Sci. 2026, 16, 650. https://doi.org/10.3390/app16020650
