A Cyber-Physical Integrated Framework for Developing Smart Operations in Robotic Applications
Abstract
1. Introduction
- Inconsistent communication protocols: Robots from different brands (such as ABB, KUKA, and Fanuc) use different communication protocols and interfaces, which makes it challenging to standardize data formats and control commands when integrating the virtual and physical systems.
- Limited data access: Much robot-arm data (such as joint angles, loads, temperatures, and error codes) cannot be read directly from the controller, or requires a vendor-specific SDK/API. Some manufacturers lock data inside proprietary software, increasing the difficulty of integration.
- High real-time requirements: If the virtual system (such as a digital twin) cannot reflect the arm's status in real time, the accuracy of prediction and control suffers. Excessive delay may cause motion deviation or even damage the equipment.
- Complex digital twin modeling: High-fidelity modeling is required to accurately simulate the kinematics, dynamics, and collision behavior of the robot arm. If the simulation is inconsistent with the physical entity, maintenance or process problems cannot be predicted accurately.
2. Materials and Methods
2.1. System Architecture
- Hardware: A six-axis robot arm (Niryo Ned) and a depth camera (Intel RealSense D435i) are used in this research. A computer acts as the main control unit, connected wirelessly to the Niryo Ned robotic arm via TCP/IP; the depth camera is connected via a USB cable.
- Software: The system integration framework emphasizes a modular and standardized approach to managing robotic systems, so a ROS environment is essential. Although ROS 1 is being phased out, it still offers mature tools and easier setup, making it a suitable choice for prototyping; ROS 2 is now the recommended platform for all new development. The framework's workflow is implemented in the ROS environment and comprises several steps. It begins with real-time object recognition using the depth camera and the YOLO object detection model. Next, object positions are simulated, and the Niryo Ned robot model and its kinematic calculation method are established. The robot path is planned using the artificial potential field method, and the object information and path-planning results are integrated. Feasibility is verified and collisions are detected in the CoppeliaSim simulator. If there are no collisions, the physical robot executes the planned path through ROS nodes, and the RViz visualization tool is used to monitor the operation status. The entire architecture of the robotic system is shown in Figure 2.
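The workflow above reduces to a detect → plan → verify → execute loop. The following sketch shows that control flow only; every function is an illustrative stub standing in for the real component (the YOLO detector node, the artificial-potential-field planner, the CoppeliaSim collision check, and the Niryo Ned driver), and all names and values are assumptions, not the paper's actual code.

```python
from dataclasses import dataclass

@dataclass
class DetectedObject:
    label: str
    position: tuple  # (x, y, z) in the robot base frame

# Stubs for the real subsystems; in the framework these are ROS nodes.
def detect_objects():
    # YOLO + depth camera would populate this list at runtime
    return [DetectedObject("cube", (0.25, -0.06, 0.05))]

def plan_path(start, goal):
    # the real planner (artificial potential field) inserts waypoints
    return [start, goal]

def path_is_collision_free(path):
    # answered by replaying the path in the CoppeliaSim simulation
    return True

def execute_on_robot(path):
    # in the real system the path is published to the arm via ROS
    pass

def run_cycle(home=(0.0, 0.0, 0.2)):
    """One detect -> plan -> verify -> execute cycle; returns executed paths."""
    executed = []
    for obj in detect_objects():
        path = plan_path(home, obj.position)
        if path_is_collision_free(path):
            execute_on_robot(path)
            executed.append((obj.label, path))
    return executed
```

The key design point mirrored here is that execution is gated on the simulator's collision verdict: the physical arm only moves after the virtual check passes.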
2.2. Experimental Setup
2.3. Depth Camera Principle
2.4. YOLO Model Training
- Input Preprocessing
- Resize the input image to a fixed size (e.g., 416 × 416 or 640 × 640 pixels).
- Normalize pixel values.
- Pass the image through the CNN backbone.
- Feature Extraction
- Use a deep CNN to extract hierarchical feature maps from the image.
- Different YOLO versions use different backbones (e.g., Darknet-53 in YOLOv3, CSPDarknet in YOLOv4).
- Grid Division
- Divide the image into an S × S grid.
- Each grid cell is responsible for detecting objects whose center falls within that cell.
- Bounding Box Prediction
- Each grid cell predicts bounding boxes, where each box includes:
- Center coordinates (x, y) (relative to the grid cell),
- Width (w) and height (h) (relative to the whole image),
- Confidence score = Pr(Object) × IoU(pred, truth).
- Class Probability Prediction
- Each grid cell also predicts C conditional class probabilities, Pr(Class_i | Object).
- Final Predictions
- Multiply the confidence score by the class probabilities to obtain class-specific confidence scores.
- Final score = Pr(Class_i) × IoU(pred, truth).
- Post-Processing
- Thresholding: Discard bounding boxes below a confidence threshold.
- Non-Maximum Suppression (NMS): Eliminate overlapping boxes to keep only the most confident prediction per object.
- Output
- A list of detected objects, each with:
- Class label,
- Bounding box coordinates (x, y, w, h),
- Confidence score.
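The post-processing steps above (confidence thresholding followed by non-maximum suppression) can be sketched as a minimal NumPy implementation. This is an illustrative version of the standard greedy NMS algorithm, not the framework's built-in routine, and the thresholds are typical defaults rather than values from the paper.

```python
import numpy as np

def iou(a, b):
    """Intersection over union of two boxes given as [x1, y1, x2, y2]."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def nms(boxes, scores, conf_thresh=0.5, iou_thresh=0.45):
    """Discard low-confidence boxes, then greedily keep the highest-scoring
    box and suppress any remaining box that overlaps it too much."""
    order = [i for i in np.argsort(scores)[::-1] if scores[i] >= conf_thresh]
    keep = []
    while order:
        i = order.pop(0)
        keep.append(i)
        order = [j for j in order if iou(boxes[i], boxes[j]) < iou_thresh]
    return keep
```

For example, two heavily overlapping detections of the same object collapse to the single higher-scoring box, while a distant box survives untouched.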
2.5. Coordinate Conversion
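Coordinate conversion maps a detected pixel plus its depth reading into camera coordinates (pinhole deprojection) and then into the robot base frame (a calibrated homogeneous transform). As a hedged sketch: the intrinsics and the camera-to-base transform below are illustrative placeholders, not the calibrated values used in this work.

```python
import numpy as np

# Assumed pinhole intrinsics (fx, fy, cx, cy); real values come from
# the RealSense D435i calibration, not these illustrative numbers.
FX, FY, CX, CY = 615.0, 615.0, 320.0, 240.0

def pixel_to_camera(u, v, depth):
    """Deproject a pixel (u, v) with a depth reading (metres) into
    3-D camera coordinates using the pinhole model."""
    x = (u - CX) * depth / FX
    y = (v - CY) * depth / FY
    return np.array([x, y, depth])

# Assumed camera-to-robot-base homogeneous transform (identity rotation,
# purely illustrative translation from a hand-eye calibration).
T_base_cam = np.eye(4)
T_base_cam[:3, 3] = [0.3, 0.0, 0.5]

def camera_to_base(p_cam):
    """Express a camera-frame point in the robot base frame."""
    p = np.append(p_cam, 1.0)  # homogeneous coordinates
    return (T_base_cam @ p)[:3]
```

A pixel at the principal point deprojects straight along the optical axis, which is a quick sanity check for the intrinsics.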
2.6. Robotic Kinematics
2.7. Obstacle Avoidance Path Planning
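The artificial potential field method steers the robot by summing an attractive force toward the goal and repulsive forces away from obstacles inside an influence radius. The sketch below is a minimal 2-D illustration of that classic formulation; all gains, the workspace, and the obstacle layout are assumptions, and the paper's planner operates on the robot arm rather than a point in the plane.

```python
import numpy as np

def apf_step(q, goal, obstacles, k_att=1.0, k_rep=0.01, rho0=0.2, step=0.02):
    """One normalized gradient-descent step of the artificial potential field."""
    force = -k_att * (q - goal)  # attractive term pulls toward the goal
    for obs in obstacles:
        rho = np.linalg.norm(q - obs)
        if 1e-9 < rho < rho0:    # repulsion acts only inside the influence radius
            force += k_rep * (1.0 / rho - 1.0 / rho0) / rho**2 * (q - obs) / rho
    norm = np.linalg.norm(force)
    return q + step * force / norm if norm > 1e-9 else q

# Illustrative scenario: move from the origin to the goal while
# skirting a single point obstacle near the straight-line path.
start, goal = np.array([0.0, 0.0]), np.array([1.0, 0.0])
obstacles = [np.array([0.5, 0.12])]

q, path = start, [start]
for _ in range(300):
    q = apf_step(q, goal, obstacles)
    path.append(q)
    if np.linalg.norm(q - goal) < 0.02:
        break
```

The path bends away from the obstacle while the attractive term keeps pulling it forward; the well-known limitation, addressed by the improved variants cited in the references, is that the summed forces can trap the robot in a local minimum.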
3. Experimental Results
3.1. YOLO Model Training Result
3.2. Coordinate Conversion Result
3.3. Virtual Model Establishment and Kinematic Calculation
3.4. Artificial Potential Field Path Planning
3.5. Resultant Demonstration
3.6. Discussion
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Sbaragli, A.; Ghafoorpoor, P.Y.; Thiede, S.; Pilati, F. A cyber-physical architecture to monitor human-centric reconfigurable manufacturing systems. J. Intell. Manuf. 2025.
- Baheti, R.; Gill, H. Cyber-physical systems. In The Impact of Control Technology, 1st ed.; Samad, T., Annaswamy, A., Eds.; IEEE Control Systems Society: Piscataway, NJ, USA, 2011; pp. 161–166.
- Lou, S.; Hu, Z.; Zhang, Y.; Feng, Y.; Zhou, M.; Lv, C. Human-Cyber-Physical System for Industry 5.0: A Review From a Human-Centric Perspective. IEEE Trans. Autom. Sci. Eng. 2024, 22, 494–511.
- Lee, J.; Bagheri, B.; Kao, H.A. A cyber-physical systems architecture for Industry 4.0-based manufacturing systems. Manuf. Lett. 2015, 3, 18–23.
- Rasheed, A.; San, O.; Kvamsdal, T. Digital Twin: Values, Challenges and Enablers from a Modeling Perspective. IEEE Access 2020, 8, 21980–22012.
- Shaaban, M.; Carfì, A.; Mastrogiovanni, F. Digital Twins for Human-Robot Collaboration: A Future Perspective. arXiv 2023, arXiv:2311.02421.
- Rolofs, G.; Wilking, F.; Goetz, S.; Wartzack, S. Integrating Digital Twins and Cyber-Physical Systems for Flexible Energy Management in Manufacturing Facilities: A Conceptual Framework. Electronics 2024, 13, 4964.
- Caiza, G.; Sanz, R. An Immersive Digital Twin Applied to a Manufacturing Execution System for the Monitoring and Control of Industry 4.0 Processes. Appl. Sci. 2024, 14, 4125.
- Baniqued, P.; Bremner, P.; Sandison, M.; Harper, S.; Agrawal, S.; Bolarinwa, J.; Blanche, J.; Jiang, Z.; Johnson, T.; Mitchell, D.; et al. Multimodal immersive digital twin platform for cyber-physical robot fleets in nuclear environments. J. Field Robot. 2024, 41, 1521–1540.
- Wang, C.Y.; Yeh, I.H.; Liao, H.Y. YOLOv9: Learning What You Want to Learn Using Programmable Gradient Information. arXiv 2024, arXiv:2402.13616.
- Rohmer, E.; Singh, S.P.; Freese, M. V-REP: A versatile and scalable robot simulation framework. In Proceedings of the 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, Tokyo, Japan, 3–7 November 2013; pp. 1321–1326.
- Belda, K.; Venkrbec, L.; Jirsa, J. Modelling, Control Design and Inclusion of Articulated Robots in Cyber-Physical Factories. Actuators 2025, 14, 129.
- Tailor, P.; Roy, D.; Jagtap, K.; Ali Bhojani, K.; Vasant Badhan, K.; Prakash, A.; Atpadkar, V. Mono Camera-based Localization of Objects to Guide Real-time Grasp of a Robotic Manipulator. In Proceedings of the Advances in Robotics-5th International Conference of the Robotics Society, Kanpur, India, 30 June–4 July 2021; Association for Computing Machinery: New York, NY, USA, 2021; Volume 42, pp. 1–8.
- Tadic, V.; Toth, A.; Vizvari, Z.; Klincsik, M.; Sari, Z.; Sarcevic, P.; Sarosi, J.; Biro, I. Perspectives of RealSense and ZED Depth Sensors for Robotic Vision Applications. Machines 2022, 10, 183.
- Bochkovskiy, A.; Wang, C.Y.; Liao, H.Y.M. YOLOv4: Optimal Speed and Accuracy of Object Detection. arXiv 2020, arXiv:2004.10934.
- Nakamura, T. Real-time 3-D object tracking using Kinect sensor. In Proceedings of the 2011 IEEE International Conference on Robotics and Biomimetics, Phuket, Thailand, 7–11 December 2011; IEEE: Piscataway, NJ, USA, 2011; pp. 784–788.
- Denavit, J.; Hartenberg, R.S. A kinematic notation for lower-pair mechanisms based on matrices. J. Appl. Mech. 1955, 22, 215–221.
- Almeida, J.; Rosa, D.; Viegas, G. Direct and Inverse Kinematics of Serial Manipulators (Nyrio One 6-Axis Robotic Arm). 2021. Available online: https://www.scribd.com/document/731672012/Direct-and-Inverse-Kinematics-of-Serial-Manipulators-Nyrio-One-6-axis-Robotic-Arm (accessed on 15 July 2024).
- Khatib, O. Real-Time Obstacle Avoidance for Manipulators and Mobile Robots. Int. J. Robot. Res. 1986, 5, 90–98.
- Yao, Q.; Zheng, Z.; Qi, L.; Yuan, H.; Guo, X.; Zhao, M.; Liu, Z.; Yang, T. Path planning method with improved artificial potential field—A reinforcement learning perspective. IEEE Access 2020, 8, 135513–135523.
- Xia, X.; Li, T.; Sang, S.; Cheng, Y.; Ma, H.; Zhang, Q.; Yang, K. Path Planning for Obstacle Avoidance of Robot Arm Based on Improved Potential Field Method. Sensors 2023, 23, 3754.
- Wang, C.Y.; Bochkovskiy, A.; Liao, H.Y. YOLOv7: Trainable Bag-of-Freebies Sets New State-of-the-Art for Real-Time Object Detectors. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Vancouver, BC, Canada, 17–24 June 2023; pp. 7464–7475.
- ISO 10218-1:2025; Robotics—Safety Requirements, Part 1: Industrial Robots, Edition 3. 2025. Available online: https://www.iso.org/standard/73933.html (accessed on 15 July 2024).
- ISO 10218-2:2025; Robotics—Safety Requirements, Part 2: Industrial Robot Applications and Robot Cells, Edition 2. 2025. Available online: https://www.iso.org/standard/73934.html (accessed on 15 July 2024).
- ISO/TS 15066:2016; Robots and Robotic Devices—Collaborative Robots, Edition 1. 2016. Available online: https://www.iso.org/standard/62996.html (accessed on 15 July 2024).
5C architecture layers: Connection Layer, Conversion Layer, Cyber Layer, Cognitive Layer, and Configuration Layer.
Joint i | d_i (m) | a_i (m) | α_i | Range of Rotation Angles
---|---|---|---|---
1 | 0.175 | 0 | 90° | (−170°, 170°)
2 | 0 | 0.221 | 0° | (−120°, 35°)
3 | 0 | 0.0325 | 90° | (−77°, 90°)
4 | 0.235 | 0 | −90° | (−120°, 120°)
5 | 0 | 0 | 90° | (−100°, 55°)
6 | 0.04 | 0 | 0° | (−145°, 145°)
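Reading the numeric columns of the table above as the Denavit–Hartenberg parameters d_i, a_i, and α_i (with θ_i the joint variable), forward kinematics chains one homogeneous transform per joint. The sketch below uses the standard DH transform; the column interpretation is an assumption inferred from the table, not a statement of the authors' exact convention.

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Standard Denavit-Hartenberg homogeneous transform for one joint."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

# (d, a, alpha) rows as read from the table above; metres and radians.
DH = [
    (0.175, 0.0,    np.pi / 2),
    (0.0,   0.221,  0.0),
    (0.0,   0.0325, np.pi / 2),
    (0.235, 0.0,   -np.pi / 2),
    (0.0,   0.0,    np.pi / 2),
    (0.04,  0.0,    0.0),
]

def forward_kinematics(joints):
    """Chain the per-joint transforms into the base-to-end-effector pose."""
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(joints, DH):
        T = T @ dh_transform(theta, d, a, alpha)
    return T

T = forward_kinematics([0.0] * 6)  # pose at the zero configuration
```

A quick structural check on the result: the rotation block must stay orthonormal regardless of the joint angles, which catches sign errors in the transform.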
Point i | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 |
---|---|---|---|---|---|---|---|---|---|
Actual value | 0.252 | 0.25 | 0.247 | 0.304 | 0.302 | 0.301 | 0.359 | 0.357 | 0.357 |
Predicted value | 0.252 | 0.249 | 0.247 | 0.303 | 0.304 | 0.301 | 0.358 | 0.36 | 0.355 |
Error | ~0 | 0.001 | ~0 | 0.001 | −0.002 | ~0 | 0.001 | −0.003 | 0.002 |
Point i | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 |
---|---|---|---|---|---|---|---|---|---|
Actual value | −0.061 | 0 | 0.062 | −0.06 | 0.001 | 0.063 | −0.058 | 0.002 | 0.064 |
Predicted value | −0.061 | 0 | 0.062 | −0.06 | −0.001 | 0.064 | −0.057 | 0.002 | 0.065 |
Error | ~0 | ~0 | ~0 | ~0 | −0.002 | −0.001 | −0.001 | ~0 | −0.001 |
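The error rows in these tables follow error = actual value − predicted value. As a check, the first table's error row can be recomputed from its actual and predicted rows (values copied verbatim from the table; "~0" entries correspond to a computed error of 0):

```python
# Actual and predicted coordinates from the first table above
actual    = [0.252, 0.250, 0.247, 0.304, 0.302, 0.301, 0.359, 0.357, 0.357]
predicted = [0.252, 0.249, 0.247, 0.303, 0.304, 0.301, 0.358, 0.360, 0.355]

# error = actual - predicted, rounded to the table's 1 mm resolution
errors = [round(a - p, 3) for a, p in zip(actual, predicted)]
mean_abs_error = round(sum(abs(e) for e in errors) / len(errors), 4)
```

The recomputed row matches the table, and the mean absolute error over the nine points is on the order of a millimetre.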
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Liu, T.-L.; Chen, P.-C.; Chao, Y.-H.; Huang, K.-C. A Cyber-Physical Integrated Framework for Developing Smart Operations in Robotic Applications. Electronics 2025, 14, 3130. https://doi.org/10.3390/electronics14153130