Industrial Robotic Setups: Tools and Technologies for Tracking and Analysis in Industrial Processes
Abstract
1. Introduction
2. Materials and Methods
- The focus of the articles should be on industrial environments, referencing robotic manufacturing, manipulation, and assembly.
- Articles must address machine vision techniques for object detection, localization, or recognition in an industrial robotic arm setup.
- Articles must address other ML tracking methods for tools, parts, or RM during operation.
- The focus of the articles should be on the discussion or evaluation of particular tools, sensors, or technologies.
- Research must address and describe the use of data acquisition systems.
- Articles must have been published within the last 5 years, unless the work is unique.
- Articles focusing on non-industrial or mobile robotic applications, unless they have the potential to be adopted in industrial robotics.
- Studies that involve only simulation and do not address industrial process constraints or hardware validation.
- Articles discussing general ML or vision algorithms without specifically adapting or applying them to industrial robotic systems.
- Papers on robotic control, navigation, or path planning, unless they are directly related to process tracking or machine vision.
- Articles lacking technical depth.
- Papers with insufficient citations, a poorly structured methodology, or an unclear experimental setup.
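The inclusion and exclusion criteria above amount to a screening predicate over article metadata. A minimal illustrative sketch of how such a screen could be applied programmatically (all field names, example titles, and the cut-off year are assumptions for illustration, not part of the review protocol):

```python
from dataclasses import dataclass

CURRENT_YEAR = 2025  # assumed review cut-off year

@dataclass
class Article:
    title: str
    year: int
    industrial_focus: bool       # robotic manufacturing, manipulation, or assembly
    vision_or_tracking: bool     # machine vision or ML tracking in a robotic setup
    hardware_validated: bool     # not simulation-only
    unique_contribution: bool = False  # overrides the 5-year recency rule

def passes_screening(a: Article) -> bool:
    """Apply the inclusion/exclusion criteria listed above."""
    recent = (CURRENT_YEAR - a.year) <= 5 or a.unique_contribution
    return all([recent, a.industrial_focus, a.vision_or_tracking, a.hardware_validated])

# Hypothetical candidate records
candidates = [
    Article("YOLO-based bin picking", 2023, True, True, True),
    Article("Simulation-only grasp study", 2022, True, True, False),
    Article("Classic AAE pose estimation", 2020, True, True, True, unique_contribution=True),
]
selected = [a.title for a in candidates if passes_screening(a)]
print(selected)  # the simulation-only study is excluded
```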
- Machine vision for object detection and recognition in industrial robotics;
- Tracking and analysis in industrial robotic machining and manipulation.
3. Machine Vision and Recognition in an Industrial Robotics Setup
3.1. YOLO Models for Industrial Robotic Vision
3.2. Other ML Models Used for Industrial Robotic Vision
4. Industrial Robotic Process Tracking and Analysis in Machining and Manipulation
4.1. Machining Tasks
4.2. Manipulation Tasks
5. Discussions
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
Abbreviation | Definition |
---|---|
IoT | Internet of Things |
ML | Machine Learning |
RM | Robotic Manipulators |
CNN | Convolutional Neural Network |
VR | Virtual Reality |
ToF | Time-of-Flight |
RGBD | Red, Green, Blue, and Depth |
AI | Artificial Intelligence |
YOLO | You Only Look Once |
CSI | Camera Serial Interface |
ROC | Receiver Operating Characteristic |
R-Bbox | Rectangular Bounding Boxes |
VGG | Visual Geometry Group |
ROS | Robot Operating System |
DL | Deep Learning |
DOF | Degrees of Freedom |
SSD | Single-Shot Detector |
SGD | Stochastic Gradient Descent |
AAE | Augmented Autoencoder |
LSTM | Long Short-Term Memory |
DNN | Dynamic Neural Network |
IK | Inverse Kinematics |
ANN | Artificial Neural Network |
GLT | Glued Laminated Timber |
BIM | Building Information Modeling |
IFC | Industry Foundation Classes |
ICP | Iterative Closest Point |
ILC | Iterative Learning Control |
DE | Differential Evolution |
RNN | Recurrent Neural Network |
CLR | Cyclical Learning Rate |
RMSE | Root Mean Square Error |
DKF | Disturbance Kalman Filter |
GP | Gaussian Process |
BANSAI | Bridging the AI Adoption Gap via Neurosymbolic AI |
TTO | Task Time Optimizer |
PSO | Particle Swarm Optimization |
RRT | Rapidly-exploring Random Tree |
References
- Fathi, M.; Sepehri, A.; Ghobakhloo, M.; Iranmanesh, M.; Tseng, M.L. Balancing assembly lines with industrial and collaborative robots: Current trends and future research directions. Comput. Ind. Eng. 2024, 193, 110254. [Google Scholar] [CrossRef]
- Ayasrah, F.T.M.; Abu-Alnadi, H.J.; Al-Said, K.; Shrivastava, G.; Mohan, G.K.; Muniyandy, E.; Chandra, U. IoT Integration for Machine Learning System using Big Data Processing. Int. J. Intell. Syst. Appl. Eng. 2024, 12, 591–599. [Google Scholar]
- Mazhar, A.; Tanveer, A.; Izhan, M.; Khan, M.Z.T. Robust Control Approaches and Trajectory Planning Strategies for Industrial Robotic Manipulators in the Era of Industry 4.0: A Comprehensive Review. Eng. Proc. 2023, 56, 75. [Google Scholar] [CrossRef]
- Makulavičius, M.; Petkevičius, S.; Rožėnė, J.; Dzedzickis, A.; Bučinskas, V. Industrial Robots in Mechanical Machining: Perspectives and Limitations. Robotics 2023, 12, 160. [Google Scholar] [CrossRef]
- Su, C.; Li, B.; Zhang, W.; Tian, W.; Liao, W. An analysis and reliability-based optimization design method of trajectory accuracy for industrial robots considering parametric uncertainties. Reliab. Eng. Syst. Saf. 2025, 254, 110626. [Google Scholar] [CrossRef]
- Pandiyan, V.; Murugan, P.; Tjahjowidodo, T.; Caesarendra, W.; Manyar, O.M.; Then, D.J.H. In-process virtual verification of weld seam removal in robotic abrasive belt grinding process using deep learning. Robot. Comput. Integr. Manuf. 2019, 57, 477–487. [Google Scholar] [CrossRef]
- Wang, N.; Shi, X.; Zhong, K.; Zhang, X.; Chen, W. A Path Correction Method Based on Global and Local Matching for Robotic Autonomous Systems. J. Intell. Robot. Syst. Theory Appl. 2022, 104, 1–12. [Google Scholar] [CrossRef]
- Hull, B.; John, V. Non-Destructive Testing; Springer Inc.: New York, NY, USA, 1988. [Google Scholar]
- Almadhoun, R.; Taha, T.; Seneviratne, L.; Dias, J.; Cai, G. A survey on inspecting structures using robotic systems. Int. J. Adv. Robot. Syst. 2016, 13, 1–18. [Google Scholar] [CrossRef]
- Mineo, C.; Montinaro, N.; Fustaino, M.; Pantano, A.; Cerniglia, D. Fine Alignment of Thermographic Images for Robotic Inspection of Parts with Complex Geometries. Sensors 2022, 22, 6267. [Google Scholar] [CrossRef]
- Cuevas, E.; López, M.; García, M. Ultrasonic Techniques and Industrial Robots: Natural Evolution of Inspection Systems. In Proceedings of the 4th International Symposium on NDT in Aerospace, Berlin, Germany, 13–15 November 2012. [Google Scholar]
- Mineo, C.; Riise, J.; Summan, R.; Macleod, C.N.; Pierce, S.G. Index-based triangulation method for efficient generation of large three-dimensional ultrasonic C-scans. Insight-Non-Destr. Test. Cond. Monit. 2018, 60, 183–189. [Google Scholar] [CrossRef]
- Soori, M.; Arezoo, B.; Dastres, R. Internet of things for smart factories in industry 4.0, a review. Internet Things Cyber-Phys. Syst. 2023, 3, 192–204. [Google Scholar] [CrossRef]
- Droukas, L.; Doulgeri, Z.; Tsakiridis, N.L.; Triantafyllou, D.; Kleitsiotis, I.; Mariolis, I.; Giakoumis, D.; Tzovaras, D.; Kateris, D.; Bochtis, D. A Survey of Robotic Harvesting Systems and Enabling Technologies. J. Intell. Robot. Syst. 2023, 107, 1–29. [Google Scholar] [CrossRef]
- Rahmati, M. Dynamic role-adaptive collaborative robots for sustainable smart manufacturing: An AI-driven approach. J. Intell. Manuf. Spec. Equip. 2025; ahead-of-print. [Google Scholar] [CrossRef]
- Nenna, F.; Zanardi, D.; Gamberini, L. Enhanced Interactivity in VR-based Telerobotics: An Eye-tracking Investigation of Human Performance and Workload. Int. J. Hum. Comput. Stud. 2023, 177, 103079. [Google Scholar] [CrossRef]
- Amobonye, A.; Lalung, J.; Mheta, G.; Pillai, S. Writing a Scientific Review Article: Comprehensive Insights for Beginners. Sci. World J. 2024, 2024, 7822269. [Google Scholar] [CrossRef] [PubMed]
- Maideen, A.; Mohanarathinam, A. Computer Vision-Assisted Object Detection and Handling Framework for Robotic Arm Design Using YOLOV5. ADCAIJ: Adv. Distrib. Comput. Artif. Intell. J. 2023, 12, e31586. [Google Scholar] [CrossRef]
- Zhao, H.; Tang, Z.; Li, Z.; Dong, Y.; Si, Y.; Lu, M.; Panoutsos, G. Real-Time Object Detection and Robotic Manipulation for Agriculture Using a YOLO-Based Learning Approach. In Proceedings of the IEEE International Conference on Industrial Technology, Bristol, UK, 25–27 March 2024. [Google Scholar] [CrossRef]
- Wang, Y.; Zhou, Y.; Wei, L.; Li, R. Design of a Four-Axis Robot Arm System Based on Machine Vision. Appl. Sci. 2023, 13, 8836. [Google Scholar] [CrossRef]
- Zhong, T.; Gan, Y.; Han, Z.; Gao, H.; Li, A. A Lightweight Object Detection Network for Industrial Robot Based YOLOv5. In Proceedings of the 2023 China Automation Congress (CAC 2023), Chongqing, China, 17–19 November 2023; pp. 4685–4690. [Google Scholar] [CrossRef]
- Kondratev, S.; Pikalov, V.; Belokopytov, R.; Muravyev, A.; Boikov, A. Designing an Advanced Control System for ABB IRB 140 Robotic Manipulator: Integrating Machine Learning and Computer Vision for Enhanced Object Manipulation. In Proceedings of the 2023 5th International Conference on Control Systems, Mathematical Modeling, Automation and Energy Efficiency (SUMMA), Lipetsk, Russia, 8–10 November 2023. [Google Scholar] [CrossRef]
- Li, L.; Cherouat, A.; Snoussi, H.; Wang, T.; Lou, Y.; Wu, Y. Vision-Based Deep Learning for Robot Grasping Application in Industry 4.0. In Proceedings of the Technological Systems, Sustainability and Safety (TS3), Paris, France, 6–7 February 2024. [Google Scholar]
- Govi, E.; Sapienza, D.; Toscani, S.; Cotti, I.; Franchini, G.; Bertogna, M. Addressing challenges in industrial pick and place: A deep learning-based 6 Degrees-of-Freedom pose estimation solution. Comput. Ind. 2024, 161, 104130. [Google Scholar] [CrossRef]
- Sundermeyer, M.; Marton, Z.C.; Durner, M.; Triebel, R. Augmented Autoencoders: Implicit 3D Orientation Learning for 6D Object Detection. Int. J. Comput. Vis. 2020, 128, 714–729. [Google Scholar] [CrossRef]
- Ma, Z.; Dong, N.; Gu, J.; Cheng, H.; Meng, Z.; Du, X. STRAW-YOLO: A detection method for strawberry fruits targets and key points. Comput. Electron. Agric. 2025, 230, 109853. [Google Scholar] [CrossRef]
- Kang, S.; Hu, Z.; Liu, L.; Zhang, K.; Cao, Z. Object Detection YOLO Algorithms and Their Industrial Applications: Overview and Comparative Analysis. Electronics 2025, 14, 1104. [Google Scholar] [CrossRef]
- Addy, C.; Nadendla, V.S.S.; Awuah-Offei, K. YOLO-Based Miner Detection Using Thermal Images in Underground Mines. Min. Met. Explor. 2025, 42, 1369–1386. [Google Scholar] [CrossRef]
- Qi, K.; Yang, Z.; Fan, Y.; Song, H.; Liang, Z.; Wang, S.; Wang, F. Detection and classification of Shiitake mushroom fruiting bodies based on Mamba YOLO. Sci. Rep. 2025, 15, 1–14. [Google Scholar] [CrossRef]
- Zhu, C.; Li, Z.; Liu, W.; Wu, P.; Zhang, X.; Wang, S. YOLO-VDS: Accurate detection of strawberry developmental stages for embedded agricultural robots. Eng. Res. Express 2025, 7, 015274. [Google Scholar] [CrossRef]
- Zheng, F.; Yin, A.; Zhou, C. YOLO with feature enhancement and its application in intelligent assembly. Rob. Auton. Syst. 2025, 183, 104844. [Google Scholar] [CrossRef]
- Vaghela, R.; Vaishnani, D.; Sarda, J.; Thakkar, A.; Nasit, Y.; Brahma, B.; Bhoi, A.K. Optimizing object detection for autonomous robots: A comparative analysis of YOLO models. Measurement 2026, 257, 118676. [Google Scholar] [CrossRef]
- Jalayer, R.; Chen, Y.; Jalayer, M.; Orsenigo, C.; Tomizuka, M. Testing human-hand segmentation on in-distribution and out-of-distribution data in human–robot interactions using a deep ensemble model. Mechatronics 2025, 110, 103365. [Google Scholar] [CrossRef]
- Sapkota, R.; Karkee, M. Object Detection with Multimodal Large Vision-Language Models: An In-depth Review. TechRxiv 2025. [Google Scholar] [CrossRef]
- Wan, D.; Deng, L.; Dong, J.; Guo, M.; Yin, J.; Liu, C.; Liu, H. An algorithm for multi-directional text detection in natural scenes. Digit. Signal Process 2026, 168, 105482. [Google Scholar] [CrossRef]
- Roveda, L.; Maroni, M.; Mazzuchelli, L.; Praolini, L.; Shahid, A.A.; Bucca, G.; Piga, D. Robot End-Effector Mounted Camera Pose Optimization in Object Detection-Based Tasks. J. Intell. Robot. Syst. Theory Appl. 2022, 104, 1–21. [Google Scholar] [CrossRef]
- Lavrenov, R.; Kidiraliev, E. Using optical sensors for industrial robot-human interactions in a Gazebo environment. Proc. Int. Conf. Artif. Life Robot. 2023, 28, 174–177. [Google Scholar] [CrossRef]
- Secil, S.; Ozkan, M. Minimum distance calculation using skeletal tracking for safe human-robot interaction. Robot. Comput. Integr. Manuf. 2022, 73, 102253. [Google Scholar] [CrossRef]
- Tashtoush, T.; Garcia, L.; Landa, G.; Amor, F.; Nicolas, A.; Oliva, D.; Safar, F. Human-Robot Interaction and Collaboration (HRI-C) Utilizing Top-View RGB-D Camera System. Int. J. Adv. Comput. Sci. Appl. 2023, 12, 10–17. [Google Scholar] [CrossRef]
- Bilal, D.K.; Unel, M.; Tunc, L.T.; Gonul, B. Development of a vision based pose estimation system for robotic machining and improving its accuracy using LSTM neural networks and sparse regression. Robot. Comput. Integr. Manuf. 2022, 74, 102262. [Google Scholar] [CrossRef]
- Chern, O.Z.; Hoe, H.K.; Chua, W. Towards Industry 4.0: Color-Based Object Sorting Using a Robot Arm and Real-Time Object. Ind. Manag. Adv. 2023, 1, 125. [Google Scholar] [CrossRef]
- Lao, D.; Quan, Y.; Wang, F.; Liu, Y. Error Modeling and Parameter Calibration Method for Industrial Robots Based on 6-DOF Position and Orientation. Appl. Sci. 2023, 13, 10901. [Google Scholar] [CrossRef]
- Tan, S.; Yang, J.; Ding, H. A prediction and compensation method of robot tracking error considering pose-dependent load decomposition. Robot. Comput. Integr. Manuf. 2023, 80, 102476. [Google Scholar] [CrossRef]
- Eren, B.; Demir, M.H.; Mistikoglu, S. Recent developments in computer vision and artificial intelligence aided intelligent robotic welding applications. Int. J. Adv. Manuf. Technol. 2023, 126, 4763–4809. [Google Scholar] [CrossRef]
- Zhu, Z.; Tang, X.; Chen, C.; Peng, F.; Yan, R.; Zhou, L.; Li, Z.; Wu, J. High precision and efficiency robotic milling of complex parts: Challenges, approaches and trends. Chin. J. Aeronaut. 2022, 35, 22–46. [Google Scholar] [CrossRef]
- Bilal, M.T.; Tyapin, I.; Choux, M.M.H. Enhancing Object Localization Accuracy by using Multiple Camera Viewpoints for Disassembly Systems. In Proceedings of the IECON Proceedings (Industrial Electronics Conference) 2022, Brussels, Belgium, 17–20 October 2022. [Google Scholar] [CrossRef]
- Lee, K.W.; Ko, D.K.; Lim, S.C. Toward Vision-Based High Sampling Interaction Force Estimation with Master Position and Orientation for Teleoperation. IEEE Robot. Autom. Lett. 2021, 6, 6640–6646. [Google Scholar] [CrossRef]
- Srinivasamurthy, C.; Sivavenkatesh, R.; Gunasundari, R. Six-Axis Robotic Arm Integration with Computer Vision for Autonomous Object Detection using TensorFlow. In Proceedings of the 2023 2nd International Conference on Advances in Computational Intelligence and Communication, ICACIC 2023, Puducherry, India, 7–8 December 2023. [Google Scholar] [CrossRef]
- Deshpande, T.R.; Sapkal, S.U. Development of Vision Enabled Articulated Robotic Arm with Grasping Strategies for Simple Objects. In Proceedings of the 2021 IEEE Bombay Section Signature Conference, IBSSC 2021, Gwalior, India, 18–20 November 2021. [Google Scholar] [CrossRef]
- Gu, X. Design of delay compensation system for visual tracking control of industrial robots. In Proceedings of the Fourth International Conference on Mechanical, Electronics, and Electrical and Automation Control (METMS 2024), Xi’an, China, 26–28 January 2024; Volume 13163, pp. 2138–2143. [Google Scholar] [CrossRef]
- Deng, Z.Y.; Kang, L.W.; Chiang, H.H.; Li, H.C. Integration of Robotic Vision and Automatic Tool Changer Based on Sequential Motion Primitive for Performing Assembly Tasks. IFAC-PapersOnLine 2023, 56, 5320–5325. [Google Scholar] [CrossRef]
- Mateo, C.M.; Gil, P.; Torres, F. Visual perception for the 3D recognition of geometric pieces in robotic manipulation. Int. J. Adv. Manuf. Technol. 2016, 83, 1999–2013. [Google Scholar] [CrossRef]
- Nair, D.; Pakdaman, A.; Plöger, P.G. Performance Evaluation of Low-Cost Machine Vision Cameras for Image-Based Grasp Verification. arXiv 2020, arXiv:2003.10167. [Google Scholar]
- D’Avella, S.; Avizzano, C.A.; Tripicchio, P. ROS-Industrial based robotic cell for Industry 4.0: Eye-in-hand stereo camera and visual servoing for flexible, fast, and accurate picking and hooking in the production line. Robot. Comput. Integr. Manuf. 2023, 80, 102453. [Google Scholar] [CrossRef]
- Zhang, M.; Tong, W.; Li, P.; Hou, Y.; Xu, X.; Zhu, L.; Wu, E.Q. Robust Neural Dynamics Method for Redundant Robot Manipulator Control With Physical Constraints. IEEE Trans. Ind. Inf. 2023, 19, 11721–11729. [Google Scholar] [CrossRef]
- Lai, Z.; Xiong, R.; Wu, H.; Guan, Y. Integration of Visual Information and Robot Offline Programming System for Improving Automatic Deburring Process. In Proceedings of the 2018 IEEE International Conference on Robotics and Biomimetics, ROBIO 2018, Kuala Lumpur, Malaysia, 12–15 December 2018; pp. 1132–1137. [Google Scholar] [CrossRef]
- Zhao, X.; Zhang, Y.; Wang, H.; Liu, Y.; Zhang, B.; Hu, S. Research on Trajectory Recognition and Control Technology of Real-Time Tracking Welding. Sensors 2022, 22, 8546. [Google Scholar] [CrossRef]
- Wu, X.; Tian, R.; Lei, Y.; Gao, H.; Fang, Y. Real-Time Space Trajectory Judgment for Industrial Robots in Welding Tasks. Machines 2024, 12, 360. [Google Scholar] [CrossRef]
- Srinivasu, P.N.; Bhoi, A.K.; Jhaveri, R.H.; Reddy, G.T.; Bilal, M. Probabilistic Deep Q Network for real-time path planning in censorious robotic procedures using force sensors. J. Real. Time Image Process 2021, 18, 1773–1785. [Google Scholar] [CrossRef]
- Almusawi, A.R.J.; Dülger, L.C.; Kapucu, S. Artificial Neural Network Based Kinematics: Case Study on Robotic Surgery. Mech. Mach. Sci. 2019, 73, 1839–1848. [Google Scholar] [CrossRef]
- Iskandar, M.; Ott, C.; Albu-Schaffer, A.; Siciliano, B.; Dietrich, A. Hybrid Force-Impedance Control for Fast End-Effector Motions. IEEE Robot. Autom. Lett. 2023, 8, 3931–3938. [Google Scholar] [CrossRef]
- Fekik, A.; Azar, A.T.; Hamida, M.L.; Denoun, H.; Kais, D.; Saidi, S.M.; Bousbaine, A.; Kasim, I.; Kamal, N.A.; Al Mhdawi, A.K.; et al. Sliding Mode Control of the PUMA 560 Robot. In Proceedings of the 2023 International Conference on Control, Automation and Diagnosis, ICCAD 2023, Rome, Italy, 10–12 May 2023. [Google Scholar] [CrossRef]
- Gao, R.; Zhang, W.; Wang, G.; Wang, X. Experimental Research on Motion Analysis Model and Trajectory Planning of GLT Palletizing Robot. Buildings 2023, 13, 966. [Google Scholar] [CrossRef]
- Li, S.; Zhang, X. Research on planning and optimization of trajectory for underwater vision welding robot. Array 2022, 16, 100253. [Google Scholar] [CrossRef]
- Wang, J.; Wen, K.; Lei, T.; Xiao, Y.; Pan, Y. Automatic Aluminum Alloy Surface Grinding Trajectory Planning of Industrial Robot Based on Weld Seam Recognition and Positioning. Actuators 2023, 12, 170. [Google Scholar] [CrossRef]
- Mineo, C.; Pierce, S.G.; Nicholson, P.I.; Cooper, I. Robotic path planning for non-destructive testing–A custom MATLAB toolbox approach. Robot. Comput. Integr. Manuf. 2016, 37, 1–12. [Google Scholar] [CrossRef]
- Ma, K.; Han, L.; Sun, X.; Liang, C.; Zhang, S.; Shi, Y.; Wang, X. A Path Planning Method of Robotic Belt Grinding for Workpieces with Complex Surfaces. IEEE/ASME Trans. Mechatron. 2020, 25, 728–738. [Google Scholar] [CrossRef]
- Wang, W.; Yun, C. A Path Planning Method for Robotic Belt Surface Grinding. Chin. J. Aeronaut. 2011, 24, 520–526. [Google Scholar] [CrossRef]
- Cheng, C.; Lv, X.; Zhang, J.; Zhang, M. Robot Arm Path Planning Based on Improved RRT Algorithm. In Proceedings of the 2021 3rd International Symposium on Robotics and Intelligent Manufacturing Technology, ISRIMT 2021, Changzhou, China, 24–26 September 2021; pp. 243–247. [Google Scholar] [CrossRef]
- Li, T.; Meng, S.; Lu, C.; Wu, Y.; Liu, J. A novel BIM and vision-based robotic welding trajectory planning method for complex intersection curves. Measurement 2025, 253, 117587. [Google Scholar] [CrossRef]
- Li, B.; Tian, W.; Zhang, C.; Hua, F.; Cui, G.; Li, Y. Positioning error compensation of an industrial robot using neural networks and experimental study. Chin. J. Aeronaut. 2022, 35, 346–360. [Google Scholar] [CrossRef]
- Chen, Y.; Chu, B.; Freeman, C.T. Iterative Learning Control for Robotic Path Following With Trial-Varying Motion Profiles. IEEE/ASME Trans. Mechatron. 2022, 27, 4697–4706. [Google Scholar] [CrossRef]
- Bhattarai, U.; Sapkota, R.; Kshetri, S.; Mo, C.; Whiting, M.D.; Zhang, Q.; Karkee, M. A vision-based robotic system for precision pollination of apples. Comput. Electron. Agric. 2025, 234, 110158. [Google Scholar] [CrossRef]
- Juříček, M.; Parák, R.; Kůdela, J. Evolutionary Computation Techniques for Path Planning Problems in Industrial Robotics: A State-of-the-Art Review. Computation 2023, 11, 245. [Google Scholar] [CrossRef]
- He, L.; Sun, Y.; Chen, L.; Feng, Q.; Li, Y.; Lin, J.; Qiao, Y.; Zhao, C. Advance on Agricultural Robot Hand–Eye Coordination for Agronomic Task: A Review. Engineering 2025, 51, 263–279. [Google Scholar] [CrossRef]
- Sidhik, S.; Sridharan, M.; Ruiken, D. An adaptive framework for trajectory following in changing-contact robot manipulation tasks. Rob. Auton. Syst. 2024, 181, 104785. [Google Scholar] [CrossRef]
- Sabique, P.V.; Pasupathy, G.; Ramachandran, S. A data driven recurrent neural network approach for reproduction of variable visuo-haptic force feedback in surgical tool insertion. Expert. Syst. Appl. 2024, 238, 122221. [Google Scholar] [CrossRef]
- Kruzic, S.; Music, J.; Kamnik, R.; Papic, V. Estimating robot manipulator end-effector forces using deep learning. In Proceedings of the 2020 43rd International Convention on Information, Communication and Electronic Technology, MIPRO 2020-Proceedings, Opatija, Croatia, 28 September–2 October 2020; pp. 1163–1168. [Google Scholar] [CrossRef]
- Roveda, L.; Riva, D.; Bucca, G.; Piga, D. External joint torques estimation for a position-controlled manipulator employing an extended kalman filter. In Proceedings of the 2021 18th International Conference on Ubiquitous Robots, UR 2021, Gangneung, Republic of Korea, 12–14 July 2021; pp. 101–107. [Google Scholar] [CrossRef]
- Meng, Q.; Lai, X.; Yan, Z.; Su, C.Y.; Wu, M. Motion Planning and Adaptive Neural Tracking Control of an Uncertain Two-Link Rigid-Flexible Manipulator With Vibration Amplitude Constraint. IEEE Trans. Neural Netw. Learn. Syst. 2022, 33, 3814–3828. [Google Scholar] [CrossRef] [PubMed]
- Gao, H.; An, H.; Lin, W.; Yu, X.; Qiu, J. Trajectory Tracking of Variable Centroid Objects Based on Fusion of Vision and Force Perception. IEEE Trans. Cybern. 2023, 53, 7957–7965. [Google Scholar] [CrossRef] [PubMed]
- Semnani, S.H.; De Ruiter, A.H.J.; Liu, H.H.T. Force-Based Algorithm for Motion Planning of Large Agent. IEEE Trans. Cybern. 2022, 52, 654–665. [Google Scholar] [CrossRef]
- Liu, M.; Shang, M. Orientation Tracking Incorporated Multicriteria Control for Redundant Manipulators With Dynamic Neural Network. IEEE Trans. Ind. Electron. 2024, 71, 3801–3810. [Google Scholar] [CrossRef]
- Wu, D.; Zhao, Q.; Fan, J.; Qi, J.; Zheng, P.; Hu, J. H2R Bridge: Transferring vision-language models to few-shot intention meta-perception in human robot collaboration. J. Manuf. Syst. 2025, 80, 524–535. [Google Scholar] [CrossRef]
- Wang, W.; Tian, W.; Liao, W.; Li, B.; Hu, J. Error compensation of industrial robot based on deep belief network and error similarity. Robot. Comput. Integr. Manuf. 2022, 73, 102220. [Google Scholar] [CrossRef]
- Zhang, J.; Yan, W. 6-DOF UR3 Robot Manipulation Based on Deep Learning. In Proceedings of the 16th International Conference on Advanced Computer Theory and Engineering, ICACTE 2023, Hefei, China, 15–17 September 2023; pp. 237–241. [Google Scholar] [CrossRef]
- Ma, S.; Deng, K.; Lu, Y.; Xu, X. Robot error compensation based on incremental extreme learning machines and an improved sparrow search algorithm. Int. J. Adv. Manuf. Technol. 2023, 125, 5431–5443. [Google Scholar] [CrossRef]
- Wrütz, T.; Group, V.; Biesenbach, R. Robot Offline Programming Tool (RoBO-2L) for Model-Based Design with MATLAB. In Proceedings of the 2nd International Conference on Engineering Science and Innovative Technology, ESIT 2016, Phuket, Thailand, 21–23 April 2016; pp. 1–5. [Google Scholar]
- Golz, J.; Wruetz, T.; Eickmann, D.; Biesenbach, R. RoBO-2L, a Matlab interface for extended offline programming of KUKA industrial robots. In Proceedings of the 2016 11th France-Japan and 9th Europe-Asia Congress on Mechatronics, MECATRONICS 2016/17th International Conference on Research and Education in Mechatronics, REM 2016, Compiègne, France, 15–17 June 2016; pp. 64–67. [Google Scholar] [CrossRef]
- Al-Mahasneh, A.J.; Falkenhain, J.; Mousa, M.; Biesenbach, R.; Al-Mahasneh, A.; Baniyounis, M. Development of ANFIS Controller for Trajectory Tracking Control Using ROBO2L MATLAB Toolbox for KUKA Industrial Robot via RSI. In Proceedings of the 2024 21st International Multi-Conference on Systems, Signals & Devices (SSD), Erbil, Iraq, 22–25 April 2024. [Google Scholar] [CrossRef]
- Mousa, M.A.A.; Elgohr, A.T.; Khater, H.A. Trajectory Optimization for a 6 DOF Robotic Arm Based on Reachability Time. Ann. Emerg. Technol. Comput. 2024, 8, 22–35. [Google Scholar] [CrossRef]
- Barhaghtalab, M.H.; Meigoli, V.; Haghighi, M.R.G.; Nayeri, S.A.; Ebrahimi, A. Dynamic analysis, simulation, and control of a 6-DOF IRB-120 robot manipulator using sliding mode control and boundary layer method. J. Cent. South. Univ. 2018, 25, 2219–2244. [Google Scholar] [CrossRef]
- Roveda, L.; Forgione, M.; Piga, D. Robot control parameters auto-tuning in trajectory tracking applications. Control Eng. Pract. 2020, 101, 104488. [Google Scholar] [CrossRef]
- Hu, J.; Xiong, R. Contact Force Estimation for Robot Manipulator Using Semiparametric Model and Disturbance Kalman Filter. IEEE Trans. Ind. Electron. 2018, 65, 3365–3375. [Google Scholar] [CrossRef]
- Wei, Y.; Li, W.; Yang, Y.; Yu, X.; Guo, L. Decoupling Observer for Contact Force Estimation of Robot Manipulators Based on Enhanced Gaussian Process Model. In Proceedings of the 2022 8th IEEE International Conference on Cloud Computing and Intelligence Systems, CCIS 2022, Chengdu, China, 26–28 November 2022; pp. 1–7. [Google Scholar] [CrossRef]
- Xiao, Y. Integrating CNN and RANSAC for improved object recognition in industrial robotics. Syst. Soft Comput. 2025, 7, 200240. [Google Scholar] [CrossRef]
- Alt, B.; Dvorak, J.; Katic, D.; Jäkel, R.; Beetz, M.; Lanza, G. BANSAI: Towards Bridging the AI Adoption Gap in Industrial Robotics with Neurosymbolic Programming. Procedia CIRP 2024, 130, 532–537. [Google Scholar] [CrossRef]
- Song, Y.; Liu, M.; Lian, B.; Qi, Y.; Wang, Y.; Wu, J.; Li, Q. Industrial serial robot calibration considering geometric and deformation errors. Robot. Comput. Integr. Manuf. 2022, 76, 102328. [Google Scholar] [CrossRef]
- Righettini, P.; Strada, R.; Cortinovis, F. Neural Network Mapping of Industrial Robots’ Task Times for Real-Time Process Optimization. Robotics 2023, 12, 143. [Google Scholar] [CrossRef]
- Li, L.; Ren, X.; Feng, H.; Chen, H.; Chen, X. A novel material removal rate model based on single grain force for robotic belt grinding. J. Manuf. Process 2021, 68, 1–12. [Google Scholar] [CrossRef]
- Lv, Y.; Peng, Z.; Qu, C.; Zhu, D. An adaptive trajectory planning algorithm for robotic belt grinding of blade leading and trailing edges based on material removal profile model. Robot. Comput. Integr. Manuf. 2020, 66, 101987. [Google Scholar] [CrossRef]
- Zhu, D.; Feng, X.; Xu, X.; Yang, Z.; Li, W.; Yan, S.; Ding, H. Robotic grinding of complex components: A step towards efficient and intelligent machining–challenges, solutions, and applications. Robot. Comput. Integr. Manuf. 2020, 65, 101908. [Google Scholar] [CrossRef]
- Gao, K.; Chen, H.; Zhang, X.; Ren, X.K.; Chen, J.; Chen, X. A novel material removal prediction method based on acoustic sensing and ensemble XGBoost learning algorithm for robotic belt grinding of Inconel 718. Int. J. Adv. Manuf. Technol. 2019, 105, 217–232. [Google Scholar] [CrossRef]
- Abeywardena, S.; Yuan, Q.; Tzemanaki, A.; Psomopoulou, E.; Droukas, L.; Melhuish, C.; Dogramadzi, S. Estimation of Tool-Tissue Forces in Robot-Assisted Minimally Invasive Surgery Using Neural Networks. Front. Robot. AI 2019, 6, 457392. [Google Scholar] [CrossRef]
- Batty, T.; Ehrampoosh, A.; Shirinzadeh, B.; Zhong, Y.; Smith, J. A Transparent Teleoperated Robotic Surgical System with Predictive Haptic Feedback and Force Modelling. Sensors 2022, 22, 9770. [Google Scholar] [CrossRef]
- Beetz, M.; Kazhoyan, G.; Vernon, D. Robot manipulation in everyday activities with the CRAM 2.0 cognitive architecture and generalized action plans. Cogn. Syst. Res. 2025, 92, 101375. [Google Scholar] [CrossRef]
Algorithm | Application | Details | Ref. |
---|---|---|---|
YOLO V5 | Apple recognition | Integrated into a Raspberry Pi 4B with an 8 MP camera; ROC values of 0.98 and 0.9488 | [18] |
YOLO V3 | Harvesting item localization | Uses R-Bbox and VGG models for image characteristics | [19] |
YOLO V7 | RM trajectory analysis | Reduces manufacturing cost and power consumption, recognition accuracy of 95.2% | [20] |
YOLO V5 | 3D object detection | OpenVINO-based model deployment, 70% reduction in inference time | [21] |
YOLO ROS | Industrial robot control | Uses Simulink environment, ROS with C270 Digital Webcam | [22] |
YOLO V4, YOLO V7 | Color recognition, robotic arm grasping | Uses an EPSON C4-A601S, a RealSense Depth Camera D435i, and 5 Tesla T4 GPUs | [23] |
YOLO V7 | 6 DOF pose estimation | New synthetic dataset, fine-tuning method | [24] |
STRAW-YOLO (YOLOv8-based) | Target and key-point accuracy | Key-point P, R, and mAP@50 of 91.6%, 91.7%, and 96.0%, respectively; 3.4%, 1.0%, and 2.3% higher than the baseline | [26] |
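Several of the scores reported in the table (P, R, mAP@50) rest on IoU-thresholded matching of predicted and ground-truth boxes. A minimal sketch of that evaluation step, with boxes as (x1, y1, x2, y2) tuples; the greedy matching here is a simplification of full mAP computation, not the evaluation code of any cited work:

```python
def iou(a, b):
    """Intersection-over-Union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def precision_recall(preds, truths, thr=0.5):
    """Greedy one-to-one matching at an IoU threshold (mAP@50-style)."""
    matched = set()
    tp = 0
    for p in preds:
        best, best_iou = None, thr
        for i, t in enumerate(truths):
            if i not in matched and iou(p, t) >= best_iou:
                best, best_iou = i, iou(p, t)
        if best is not None:
            matched.add(best)
            tp += 1
    precision = tp / len(preds) if preds else 0.0
    recall = tp / len(truths) if truths else 0.0
    return precision, recall

# One detection overlaps its ground truth well; the other matches nothing
preds = [(0, 0, 10, 10), (20, 20, 30, 30)]
truths = [(1, 1, 10, 10), (50, 50, 60, 60)]
print(precision_recall(preds, truths))  # one true positive out of two boxes each
```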
Algorithm | Application | Details | Ref. |
---|---|---|---|
ICP Canny algorithm | Deburring operation | OLP system, OpenCV Library; motion RobSim; | [56] |
Original extraction algorithm and the correction algorithm | Welding | Recognition rate of 97.0%; adaptive feature extraction in 0.04 s | [57] |
Original trajectory judgment algorithm | Welding, real-time status monitoring | RPP up to ±0.04 mm; APP only ±0.5 mm | [58] |
Deep Q Networks model reinforcement learning algorithm | Robotic surgery for censorious surgeries in real-time | Learning rate 0.0 and 1.0 | [59] |
Neural Network and Genetic Algorithms | Robotics assisted minimally invasive surgery | ANN architecture | [60] |
Original Cartesian impedance control algorithm | Industrial | Damping ratio 0.7; translational Cartesian impedance 1500 N/m | [65] |
Sequential quadratic programming with filter | Welding | Optimize the time of the quintic B-spline curve trajectory | [64] |
Particle swarm optimization (PSO) refinement algorithm | Grinding | Measurement accuracy of 50 nm; Measurement range of 200 mm in diameter | [67] |
Curve optimization algorithm | Grinding | Accuracy 10% | [68] |
XGBoost learning algorithm | Grinding | Max error 10.9%; material removal rate 14.4% | [100] |
Trajectory planning | Grinding | Ra values reach 0.277 μm and 0.264 μm; profile errors at blade leading and trailing edges of 0.0319 mm and 0.0342 mm; standard deviations on the convex and concave surfaces of 0.0232 and 0.0216 | [101] |
Cartesian architecture, point cloud matching algorithm; tool deflection compensation algorithm | Grinding | Accuracy 0.005 mm; repeatability 0.02 mm | [102] |
Combined acoustic sensing and ensemble XGBoost learning algorithm | Grinding | Absolute percentage error of 4.373% | [103] |
RRT | Path planning | Success rate 97.85% | [69] |
BIM-based | Welding | Repeat accuracy of 0.1 mm | [70] |
Spatial ILC | Micro-scale application | Accuracy level of 10⁻³ | [72] |
Neural network algorithm; force estimation algorithm | Testing for the prediction in a force feedback system | Haptic feedback in robotic surgery; execution time of the code should be improved for online estimation | [104] |
Exponentially weighted recursive least squares | Surgical | Measured slave force reduced to 0.076 N; estimates the Kelvin–Voigt (KV) and Hunt–Crossley (HC) force-model parameters to 0.356 N and 0.560 N, respectively | [105] |
LSTM-based RNN | Estimating forces on the surface and internal layers | The RNN-LSTM + DR + CLR framework shows improvements of 9.23% and 3.8% in real-time force prediction accuracy, and of 7.11% and 1.68% | [77] |
Extended Kalman Filter; control architecture in real sensorless robotic applications | Industrial manipulators | Optimal switching impact/force controller is under investigation | [79] |
Genetic algorithm | Robot manipulator | SMC with SMCBL to eliminate chattering | [92] |
Bayesian optimization algorithm | Robot manipulator | 25 parameters optimized | [93] |
Bridging the AI Adoption Gap via Neurosymbolic AI | Industrial robot | Described AI gap in industrial robot programming | [97] |
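Several entries above estimate contact forces with Kalman-type filters ([79], and the disturbance Kalman filter of Hu and Xiong). The cited works use extended or disturbance formulations over full manipulator dynamics, but the underlying predict/update cycle can be illustrated with a scalar filter; all noise values below are illustrative assumptions:

```python
import random

def kalman_step(x, P, z, Q=1e-4, R=0.04):
    """One predict/update cycle of a scalar Kalman filter.

    x, P : prior state estimate and its variance
    z    : new noisy measurement, e.g. a motor-current-based force proxy
    Q, R : process and measurement noise variances (illustrative values)
    """
    # Predict: constant-state model, variance grows by the process noise
    P = P + Q
    # Update: blend prediction and measurement via the Kalman gain
    K = P / (P + R)
    x = x + K * (z - x)
    P = (1.0 - K) * P
    return x, P

# Filter a noisy constant "contact force" of 5 N
random.seed(0)
x, P = 0.0, 1.0
for _ in range(200):
    z = 5.0 + random.gauss(0.0, 0.2)
    x, P = kalman_step(x, P, z)
print(round(x, 2))  # converges near 5.0
```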
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Makulavičius, M.; Petronienė, J.J.; Šutinys, E.; Bučinskas, V.; Dzedzickis, A. Industrial Robotic Setups: Tools and Technologies for Tracking and Analysis in Industrial Processes. Appl. Sci. 2025, 15, 10249. https://doi.org/10.3390/app151810249