Low-Code Mixed Reality Programming Framework for Collaborative Robots: From Operator Intent to Executable Trajectories
Abstract
1. Introduction
- We propose a novel “coarse-to-fine” interaction paradigm for MR robot programming that effectively resolves the prevailing trade-off between intuitiveness and precision.
- We design and implement a low-code system that realizes this paradigm, synergizing an intuitive MR front-end with a powerful ROS back-end for real-time, high-quality path generation from sparse user inputs.
- We validate our system’s effectiveness and efficiency through two industrial case studies and a comparative experiment, demonstrating its superiority over traditional programming methods for non-expert users.
2. Related Work
2.1. Traditional Robot Programming
2.2. Mixed Reality-Based Robot Programming
3. Materials and Methods
3.1. MR Low-Code Programming Overview
3.2. System Architecture
3.2.1. Operator Intent Module
3.2.2. Execution Core
3.3. Hardware and Software Setup
3.4. Core Workflow
3.4.1. Coarse-Grained Path Definition
3.4.2. Fine-Grained Trajectory Generation and Verification
4. System Implementation and Algorithms
4.1. HoloLens-to-Robot Coordinate System Registration
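Frame registration between an HMD and a robot base is classically solved by rigidly aligning corresponding 3D point pairs, and the article's abbreviation list (SVD) points at the Kabsch method. The sketch below is an illustrative implementation of that SVD-based alignment, not the paper's exact code; the function name and argument conventions are mine.

```python
import numpy as np

def kabsch_transform(P, Q):
    """Least-squares rigid transform (R, t) mapping point set P onto Q,
    via the Kabsch/SVD method: center both sets, take the SVD of the
    cross-covariance matrix, and fix the sign of the last singular
    direction so R is a proper rotation (det R = +1, no reflection)."""
    P, Q = np.asarray(P, float), np.asarray(Q, float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                  # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # reflection guard
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t
```

In practice P would hold calibration points measured in the HoloLens frame and Q the same points measured in the robot frame; three or more non-collinear pairs suffice.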
4.2. Adaptive Curvature-Based Spline Resampling (ACRSR)
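The paper's ACRSR algorithm is not reproduced here, but its name suggests a sampling density that adapts to local curvature: few waypoints on straight stretches, more on tight bends. The sketch below illustrates that principle only, under my own assumptions; the function and parameter names (`base_step`, `k_gain`) are hypothetical and the discrete-curvature proxy (turning angle between consecutive chords) is a common simplification, not necessarily the paper's.

```python
import numpy as np

def curvature_adaptive_resample(points, base_step, k_gain):
    """Walk along a dense polyline and emit a sample whenever the
    accumulated arc length exceeds a step size that shrinks where the
    local turning angle (a discrete curvature proxy) is large."""
    points = np.asarray(points, float)
    out = [points[0]]
    acc = 0.0
    for i in range(1, len(points) - 1):
        acc += np.linalg.norm(points[i] - points[i - 1])
        v1 = points[i] - points[i - 1]
        v2 = points[i + 1] - points[i]
        cosang = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-12)
        angle = np.arccos(np.clip(cosang, -1.0, 1.0))
        step = base_step / (1.0 + k_gain * angle)  # smaller step on sharp turns
        if acc >= step:
            out.append(points[i])
            acc = 0.0
    out.append(points[-1])
    return np.array(out)
```

Applied to a path with a straight run followed by an arc, this keeps a sparser set of samples on the straight run and a denser set on the arc while preserving both endpoints.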
4.3. Robot Motion Planning
Algorithm 1. Robot Motion Planning from ACRSR-Refined Paths.
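Turning sparse, ACRSR-refined waypoints into an executable trajectory requires interpolating tool orientations between them; the abbreviation list names SLERP, the standard technique for blending waypoint quaternions at constant angular rate. Below is the textbook SLERP formula as a hedged sketch, not code from the paper; a (w, x, y, z) quaternion order is assumed.

```python
import numpy as np

def slerp(q0, q1, s):
    """Spherical linear interpolation between unit quaternions q0 and q1
    for s in [0, 1]: interpolates along the shortest great-circle arc,
    falling back to normalized linear interpolation when the quaternions
    are nearly parallel (to avoid division by a tiny sin(theta))."""
    q0, q1 = np.asarray(q0, float), np.asarray(q1, float)
    dot = np.dot(q0, q1)
    if dot < 0.0:                # take the short arc
        q1, dot = -q1, -dot
    if dot > 0.9995:             # nearly parallel: lerp and renormalize
        q = q0 + s * (q1 - q0)
        return q / np.linalg.norm(q)
    theta = np.arccos(np.clip(dot, -1.0, 1.0))
    return (np.sin((1 - s) * theta) * q0 + np.sin(s * theta) * q1) / np.sin(theta)
```

Halfway (s = 0.5) between the identity and a 90° rotation about z, for example, this yields the 45° rotation about z.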
5. Experiments and Results
5.1. Experimental Platform and Protocol
5.2. System Accuracy Evaluation
5.2.1. Spatial Registration Accuracy
5.2.2. Trajectory Evaluation
5.3. System Functionality and Validation
5.3.1. Butt Joint Welding
5.3.2. Sealant Application
5.4. Performance Evaluation and Comparison
6. Conclusions
Author Contributions
Funding
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
| ACRSR | Adaptive Curvature-Based Spline Resampling |
| DKT | Demonstrative-Kinesthetic Teaching |
| FK | Forward Kinematics |
| GUI | Graphical User Interface |
| HMD | Head-Mounted Display |
| HRI | Human–Robot Interaction |
| IK | Inverse Kinematics |
| MAE | Mean Absolute Error |
| MR | Mixed Reality |
| MRTK | Mixed Reality Toolkit |
| OLP | Offline Programming |
| PbD | Programming by Demonstration |
| PTP | Point-to-Point |
| RMSD | Root Mean Squared Deviation |
| RMSE | Root Mean Square Error |
| ROS | Robot Operating System |
| R&D | Research and Development |
| SLERP | Spherical Linear Interpolation |
| SUS | System Usability Scale |
| SVD | Singular Value Decomposition |
| TCT | Task Completion Time |
| TCP | Tool Center Point |
| TP | Teach Pendant |
| Task | Method | TCT (s) |
|---|---|---|
| Butt Welding | MR System | 70.2 ± 11.4 |
| Butt Welding | Teach Pendant | 127.0 ± 9.4 |
| Sealant Application | MR System | 109.4 ± 9.3 |
| Sealant Application | Teach Pendant | 220.3 ± 20.0 |

| Questions | MR System | Teach Pendant |
|---|---|---|
| 1. I think that I would like to use this system frequently | 4.35 | 2.80 |
| 2. I found the system unnecessarily complex | 1.70 | 3.50 |
| 3. I thought the system was easy to use | 4.25 | 3.00 |
| 4. I think that I would need the support of a technical person to be able to use this system | 1.75 | 3.20 |
| 5. I found the various functions in this system were well integrated | 4.45 | 3.50 |
| 6. I thought there was too much inconsistency in this system | 1.70 | 3.00 |
| 7. I would imagine that most people would learn to use this system very quickly | 4.35 | 2.50 |
| 8. I found the system very cumbersome to use | 1.70 | 3.50 |
| 9. I felt very confident using the system | 4.28 | 3.00 |
| 10. I needed to learn a lot of things before I could get going with this system | 1.75 | 3.50 |
| Global SUS Score (0–100 Scale) | 82.7 | 45.3 |
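The Global SUS Score row follows Brooke's standard scoring: odd-numbered (positively worded) items contribute (response − 1), even-numbered items contribute (5 − response), and the summed contributions are multiplied by 2.5 to reach a 0–100 scale. Applying this to the mean item responses in the table reproduces the reported 82.7 and (to rounding) 45.3:

```python
def sus_score(responses):
    """System Usability Scale: ten 1-5 Likert items; odd items score
    (r - 1), even items score (5 - r); the sum times 2.5 gives 0-100."""
    assert len(responses) == 10
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return 2.5 * total

# Mean item responses from the table above, items 1-10 in order.
mr_system     = [4.35, 1.70, 4.25, 1.75, 4.45, 1.70, 4.35, 1.70, 4.28, 1.75]
teach_pendant = [2.80, 3.50, 3.00, 3.20, 3.50, 3.00, 2.50, 3.50, 3.00, 3.50]
```

Note that scoring the per-group mean responses equals the mean of per-participant SUS scores, since the scoring is linear in the responses.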
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.
Share and Cite
Wang, Z.; Li, Z.; Yu, H.; Pan, D.; Peng, S.; Liu, S. Low-Code Mixed Reality Programming Framework for Collaborative Robots: From Operator Intent to Executable Trajectories. Robotics 2026, 15, 9. https://doi.org/10.3390/robotics15010009