An Integrated Real-Time Hand Gesture Recognition Framework for Human–Robot Interaction in Agriculture
Abstract
1. Introduction
2. Materials and Methods
2.1. Hand Gesture Recognition
2.1.1. Data Acquisition
2.1.2. Data Preprocessing
2.1.3. Machine Learning Algorithms Tested for Classification Predictive Modeling
- Logistic regression (LR): Estimates the probability of a discrete outcome from a set of independent variables by fitting the data to a logit function;
- Linear discriminant analysis (LDA): Projects features from a higher-dimensional space onto a lower-dimensional one while preserving class separability, thereby avoiding the costs associated with high dimensionality;
- K-nearest neighbors (KNN): A pattern recognition algorithm that classifies a new sample according to its closest neighbors in the training dataset;
- Classification and regression trees (CART): A tree-based model that partitions the data through a sequence of “if-else” conditions;
- Naïve Bayes (NB): A probabilistic classifier that assumes the presence of a particular feature in a class is independent of the presence of any other feature;
- Support vector machine (SVM): Plots the raw data as points in an n-dimensional space (where n is the number of features), with each feature’s value tied to a specific coordinate; classification is then performed by finding the hyperplane that best separates the classes. A brief comparison sketch of these six classifiers is given after this list.
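Since scikit-learn [92] was used for the classification experiments, a minimal sketch of how the six classifiers might be benchmarked is shown below. The feature file names, hyperparameters, and the cross-validation setup are assumptions for illustration, not the authors’ exact configuration.

```python
# Minimal sketch of the six-classifier comparison, assuming the preprocessed
# MediaPipe hand-landmark features are stored as a NumPy array (one row per
# frame) with gesture labels in {0, ..., 4}. File names are hypothetical.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LogisticRegression
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC

models = {
    "LR": LogisticRegression(max_iter=1000),
    "LDA": LinearDiscriminantAnalysis(),
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "CART": DecisionTreeClassifier(),
    "NB": GaussianNB(),
    "SVM": SVC(kernel="rbf"),
}

X = np.load("hand_landmark_features.npy")  # hypothetical feature file
y = np.load("gesture_labels.npy")          # hypothetical label file

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=10, scoring="accuracy")
    print(f"{name}: mean accuracy = {scores.mean():.3f} (+/- {scores.std():.3f})")
```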
2.2. Real-Time Human–Robot Interaction Based on Hand Gesture Recognition
- Waits until the identified person performs a gesture. Once a gesture is registered, the tf publisher is initialized (class 0). Each identified human is assigned a unique identifier (ID). When one of the identified persons performs the “lock” gesture, the HRI node is activated, and that person is authorized to control the UGV and collaborate with it. For a person with a different ID to take control of the UGV, the “unlock” command must first be detected;
- “Unlocks” the person and removes the tf publisher (class 1);
- Enables a tf following sequence with obstacle avoidance. The UGV moves autonomously in the field while keeping a predefined safe distance of at least 0.9 m, similarly to [46], so as to remain within the so-called socially acceptable zone (class 2);
- Disables the tf following sequence, but does not “unlock” the person (class 3);
- Navigates the UGV to a specific predefined location (class 4). A minimal dispatch sketch covering these five classes is given after this list.
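To make the control flow of the HRI node concrete, the following Python sketch routes the five recognized classes to UGV actions. It is a minimal illustration under assumed names (`GestureHRINode`, `handle`); the authors’ node operates on ROS topics and tf frames rather than returning status strings.

```python
# Illustrative dispatcher for the five gesture classes; a sketch, not the
# authors' ROS implementation. Class numbering follows the text above:
# 0 = lock, 1 = unlock, 2 = follow, 3 = stop following, 4 = go to location.
SAFE_DISTANCE_M = 0.9  # lower bound of the socially acceptable zone [46]


class GestureHRINode:
    """Tracks which person controls the UGV and the current follow state."""

    def __init__(self):
        self.locked_id = None    # ID of the person currently in control
        self.following = False   # whether the tf following sequence is active

    def handle(self, person_id, gesture_class):
        if gesture_class == 0 and self.locked_id is None:
            self.locked_id = person_id           # initialize the tf publisher
            return f"locked to person {person_id}"
        if person_id != self.locked_id:
            return "ignored: person not in control"   # "unlock" needed first
        if gesture_class == 1:
            self.locked_id, self.following = None, False  # remove tf publisher
            return "unlocked"
        if gesture_class == 2:
            self.following = True   # follow at >= SAFE_DISTANCE_M, avoiding obstacles
            return "following"
        if gesture_class == 3:
            self.following = False  # stop following but keep the person locked
            return "stopped following"
        if gesture_class == 4:
            return "navigating to the predefined location"
        return "unknown gesture"
```

In the actual system, these transitions would trigger ROS actions (tf broadcasting, navigation goals) rather than return strings.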
3. Results
3.1. Comparison of the Machine Learning Algorithms’ Performance for Classification of the Hand Gestures
3.2. Demonstration of the Proposed System in Different Scenarios
4. Discussion and Main Conclusions
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Acknowledgments
Conflicts of Interest
References
- Oliveira, L.F.P.; Moreira, A.P.; Silva, M.F. Advances in Agriculture Robotics: A State-of-the-Art Review and Challenges Ahead. Robotics 2021, 10, 52. [Google Scholar] [CrossRef]
- Bechar, A. Agricultural Robotics for Precision Agriculture Tasks: Concepts and Principles. In Innovation in Agricultural Robotics for Precision Agriculture: A Roadmap for Integrating Robots in Precision Agriculture; Bechar, A., Ed.; Springer International Publishing: Cham, Switzerland, 2021; pp. 17–30. ISBN 9783030770365. [Google Scholar]
- Lampridi, M.; Benos, L.; Aidonis, D.; Kateris, D.; Tagarakis, A.C.; Platis, I.; Achillas, C.; Bochtis, D. The Cutting Edge on Advances in ICT Systems in Agriculture. Eng. Proc. 2021, 9, 46. [Google Scholar] [CrossRef]
- Liu, W.; Shao, X.-F.; Wu, C.-H.; Qiao, P. A systematic literature review on applications of information and communication technologies and blockchain technologies for precision agriculture development. J. Clean. Prod. 2021, 298, 126763. [Google Scholar] [CrossRef]
- Moysiadis, V.; Tsolakis, N.; Katikaridis, D.; Sørensen, C.G.; Pearson, S.; Bochtis, D. Mobile Robotics in Agricultural Operations: A Narrative Review on Planning Aspects. Appl. Sci. 2020, 10, 3453. [Google Scholar] [CrossRef]
- Benos, L.; Sørensen, C.G.; Bochtis, D. Field Deployment of Robotic Systems for Agriculture in Light of Key Safety, Labor, Ethics and Legislation Issues. Curr. Robot. Rep. 2022, 3, 49–56. [Google Scholar] [CrossRef]
- Marinoudi, V.; Sørensen, C.G.; Pearson, S.; Bochtis, D. Robotics and labour in agriculture. A context consideration. Biosyst. Eng. 2019, 184, 111–121. [Google Scholar] [CrossRef]
- Bechar, A.; Vigneault, C. Agricultural robots for field operations. Part 2: Operations and systems. Biosyst. Eng. 2017, 153, 110–128. [Google Scholar] [CrossRef]
- Vasconez, J.P.; Kantor, G.A.; Auat Cheein, F.A. Human–robot interaction in agriculture: A survey and current challenges. Biosyst. Eng. 2019, 179, 35–48. [Google Scholar] [CrossRef]
- Benos, L.; Bechar, A.; Bochtis, D. Safety and ergonomics in human-robot interactive agricultural operations. Biosyst. Eng. 2020, 200, 55–72. [Google Scholar] [CrossRef]
- Matheson, E.; Minto, R.; Zampieri, E.G.G.; Faccio, M.; Rosati, G. Human–Robot Collaboration in Manufacturing Applications: A Review. Robotics 2019, 8, 100. [Google Scholar] [CrossRef]
- Fang, H.C.; Ong, S.K.; Nee, A.Y.C. A novel augmented reality-based interface for robot path planning. Int. J. Interact. Des. Manuf. 2014, 8, 33–42. [Google Scholar] [CrossRef]
- Oudah, M.; Al-Naji, A.; Chahl, J. Hand Gesture Recognition Based on Computer Vision: A Review of Techniques. J. Imaging 2020, 6, 73. [Google Scholar] [CrossRef] [PubMed]
- Han, J.; Campbell, N.; Jokinen, K.; Wilcock, G. Investigating the use of Non-verbal Cues in Human-Robot Interaction with a Nao robot. In Proceedings of the IEEE 3rd International Conference on Cognitive Infocommunications (CogInfoCom), Kosice, Slovakia, 2–5 December 2012; pp. 679–683. [Google Scholar]
- Tran, D.-S.; Ho, N.-H.; Yang, H.-J.; Baek, E.-T.; Kim, S.-H.; Lee, G. Real-Time Hand Gesture Spotting and Recognition Using RGB-D Camera and 3D Convolutional Neural Network. Appl. Sci. 2020, 10, 722. [Google Scholar] [CrossRef]
- Varun, K.S.; Puneeth, I.; Jacob, T.P. Virtual Mouse Implementation using Open CV. In Proceedings of the 3rd International Conference on Trends in Electronics and Informatics (ICOEI), Tirunelveli, India, 23–25 April 2019; pp. 435–438. [Google Scholar]
- Cai, S.; Zhu, G.; Wu, Y.-T.; Liu, E.; Hu, X. A case study of gesture-based games in enhancing the fine motor skills and recognition of children with autism. Interact. Learn. Environ. 2018, 26, 1039–1052. [Google Scholar] [CrossRef]
- Rastgoo, R.; Kiani, K.; Escalera, S. Hand sign language recognition using multi-view hand skeleton. Expert Syst. Appl. 2020, 150, 113336. [Google Scholar] [CrossRef]
- Schulte, J.; Kocherovsky, M.; Paul, N.; Pleune, M.; Chung, C.-J. Autonomous Human-Vehicle Leader-Follower Control Using Deep-Learning-Driven Gesture Recognition. Vehicles 2022, 4, 243–258. [Google Scholar] [CrossRef]
- Pan, J.; Luo, Y.; Li, Y.; Tham, C.-K.; Heng, C.-H.; Thean, A.V.-Y. A Wireless Multi-Channel Capacitive Sensor System for Efficient Glove-Based Gesture Recognition with AI at the Edge. IEEE Trans. Circuits Syst. II Express Briefs 2020, 67, 1624–1628. [Google Scholar] [CrossRef]
- Dong, Y.; Liu, J.; Yan, W. Dynamic Hand Gesture Recognition Based on Signals from Specialized Data Glove and Deep Learning Algorithms. IEEE Trans. Instrum. Meas. 2021, 70, 2509014. [Google Scholar] [CrossRef]
- Huang, Y.; Yang, J. A multi-scale descriptor for real time RGB-D hand gesture recognition. Pattern Recognit. Lett. 2021, 144, 97–104. [Google Scholar] [CrossRef]
- Jaramillo-Yánez, A.; Benalcázar, M.E.; Mena-Maldonado, E. Real-Time Hand Gesture Recognition Using Surface Electromyography and Machine Learning: A Systematic Literature Review. Sensors 2020, 20, 2467. [Google Scholar] [CrossRef]
- Yamanoi, Y.; Togo, S.; Jiang, Y.; Yokoi, H. Learning Data Correction for Myoelectric Hand Based on “Survival of the Fittest”. Cyborg Bionic Syst. 2021, 2021, 9875814. [Google Scholar] [CrossRef]
- Bai, D.; Liu, T.; Han, X.; Yi, H. Application Research on Optimization Algorithm of sEMG Gesture Recognition Based on Light CNN + LSTM Model. Cyborg Bionic Syst. 2021, 2021, 9794610. [Google Scholar] [CrossRef]
- Jones, M.J.; Rehg, J.M. Statistical Color Models with Application to Skin Detection. Int. J. Comput. Vis. 2002, 46, 81–96. [Google Scholar] [CrossRef]
- Pun, C.-M.; Zhu, H.-M.; Feng, W. Real-Time Hand Gesture Recognition using Motion Tracking. Int. J. Comput. Intell. Syst. 2011, 4, 277–286. [Google Scholar] [CrossRef]
- Caputo, A.; Giachetti, A.; Soso, S.; Pintani, D.; D’Eusanio, A.; Pini, S.; Borghi, G.; Simoni, A.; Vezzani, R.; Cucchiara, R.; et al. SHREC 2021: Skeleton-based hand gesture recognition in the wild. Comput. Graph. 2021, 99, 201–211. [Google Scholar] [CrossRef]
- Li, Y. Hand gesture recognition using Kinect. In Proceedings of the IEEE International Conference on Computer Science and Automation Engineering, Beijing, China, 22–24 June 2012; pp. 196–199. [Google Scholar]
- Stergiopoulou, E.; Sgouropoulos, K.; Nikolaou, N.; Papamarkos, N.; Mitianoudis, N. Real time hand detection in a complex background. Eng. Appl. Artif. Intell. 2014, 35, 54–70. [Google Scholar] [CrossRef]
- Kakumanu, P.; Makrogiannis, S.; Bourbakis, N. A survey of skin-color modeling and detection methods. Pattern Recognit. 2007, 40, 1106–1122. [Google Scholar] [CrossRef]
- Molina, J.; Pajuelo, J.A.; Martínez, J.M. Real-time Motion-based Hand Gestures Recognition from Time-of-Flight Video. J. Signal Process. Syst. 2017, 86, 17–25. [Google Scholar] [CrossRef]
- Fu, L.; Gao, F.; Wu, J.; Li, R.; Karkee, M.; Zhang, Q. Application of consumer RGB-D cameras for fruit detection and localization in field: A critical review. Comput. Electron. Agric. 2020, 177, 105687. [Google Scholar] [CrossRef]
- De Smedt, Q.; Wannous, H.; Vandeborre, J.-P.; Guerry, J.; Saux, B.L.; Filliat, D. 3D hand gesture recognition using a depth and skeletal dataset: SHREC’17 track. In Proceedings of the Workshop on 3D Object Retrieval, Lyon, France, 23–24 April 2017; pp. 23–24. [Google Scholar]
- Chen, Y.; Luo, B.; Chen, Y.-L.; Liang, G.; Wu, X. A real-time dynamic hand gesture recognition system using kinect sensor. In Proceedings of the IEEE International Conference on Robotics and Biomimetics (ROBIO), Zhuhai, China, 6–9 December 2015; pp. 2026–2030. [Google Scholar]
- Xi, C.; Chen, J.; Zhao, C.; Pei, Q.; Liu, L. Real-time Hand Tracking Using Kinect. In Proceedings of the 2nd International Conference on Digital Signal Processing, Tokyo, Japan, 25–27 February 2018; pp. 37–42. [Google Scholar]
- Tang, A.; Lu, K.; Wang, Y.; Huang, J.; Li, H. A Real-Time Hand Posture Recognition System Using Deep Neural Networks. ACM Trans. Intell. Syst. Technol. 2015, 6, 1–23. [Google Scholar] [CrossRef]
- Mujahid, A.; Awan, M.J.; Yasin, A.; Mohammed, M.A.; Damaševičius, R.; Maskeliūnas, R.; Abdulkareem, K.H. Real-Time Hand Gesture Recognition Based on Deep Learning YOLOv3 Model. Appl. Sci. 2021, 11, 4164. [Google Scholar] [CrossRef]
- Agrawal, M.; Ainapure, R.; Agrawal, S.; Bhosale, S.; Desai, S. Models for Hand Gesture Recognition using Deep Learning. In Proceedings of the IEEE 5th International Conference on Computing Communication and Automation (ICCCA), Greater Noida, India, 30–31 October 2020; pp. 589–594. [Google Scholar]
- Niloy, E.; Meghna, J.; Shahriar, M. Hand Gesture-Based Character Recognition Using OpenCV and Deep Learning. In Proceedings of the International Conference on Automation, Control and Mechatronics for Industry 4.0 (ACMI), Rajshahi, Bangladesh, 8–9 July 2021; pp. 1–5. [Google Scholar]
- Devineau, G.; Moutarde, F.; Xi, W.; Yang, J. Deep Learning for Hand Gesture Recognition on Skeletal Data. In Proceedings of the 13th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2018), Xi’an, China, 15–19 May 2018; pp. 106–113. [Google Scholar]
- Zengeler, N.; Kopinski, T.; Handmann, U. Hand Gesture Recognition in Automotive Human-Machine Interaction Using Depth Cameras. Sensors 2019, 19, 59. [Google Scholar] [CrossRef] [PubMed]
- Wang, P.; Li, W.; Ogunbona, P.; Wan, J.; Escalera, S. RGB-D-based human motion recognition with deep learning: A survey. Comput. Vis. Image Underst. 2018, 171, 118–139. [Google Scholar] [CrossRef]
- Liu, H.; Wang, L. Gesture recognition for human-robot collaboration: A review. Int. J. Ind. Ergon. 2018, 68, 355–367. [Google Scholar] [CrossRef]
- Benos, L.; Tagarakis, A.C.; Dolias, G.; Berruto, R.; Kateris, D.; Bochtis, D. Machine Learning in Agriculture: A Comprehensive Updated Review. Sensors 2021, 21, 3758. [Google Scholar] [CrossRef] [PubMed]
- Vasconez, J.P.; Guevara, L.; Cheein, F.A. Social robot navigation based on HRI non-verbal communication: A case study on avocado harvesting. In Proceedings of the ACM Symposium on Applied Computing, Limassol, Cyprus, 8–12 April 2019; Association for Computing Machinery: New York, NY, USA, 2019; pp. 957–960. [Google Scholar]
- Hurtado, J.P.V. Human-Robot Interaction Strategies in Agriculture; Universidad Técnica Federico Santa María: Valparaíso, Chile, 2020. [Google Scholar]
- Zhang, P.; Lin, J.; He, J.; Rong, X.; Li, C.; Zeng, Z. Agricultural Machinery Virtual Assembly System Using Dynamic Gesture Recognitive Interaction Based on a CNN and LSTM Network. Math. Probl. Eng. 2021, 2021, 5256940. [Google Scholar] [CrossRef]
- Tsolakis, N.; Bechtsis, D.; Bochtis, D. AgROS: A Robot Operating System Based Emulation Tool for Agricultural Robotics. Agronomy 2019, 9, 403. [Google Scholar] [CrossRef]
- Benos, L.; Tsaopoulos, D.; Bochtis, D. A Review on Ergonomics in Agriculture. Part II: Mechanized Operations. Appl. Sci. 2020, 10, 3484. [Google Scholar] [CrossRef]
- Benos, L.; Kokkotis, C.; Tsatalas, T.; Karampina, E.; Tsaopoulos, D.; Bochtis, D. Biomechanical Effects on Lower Extremities in Human-Robot Collaborative Agricultural Tasks. Appl. Sci. 2021, 11, 11742. [Google Scholar] [CrossRef]
- Tagarakis, A.C.; Benos, L.; Aivazidou, E.; Anagnostis, A.; Kateris, D.; Bochtis, D. Wearable Sensors for Identifying Activity Signatures in Human-Robot Collaborative Agricultural Environments. Eng. Proc. 2021, 9, 5. [Google Scholar] [CrossRef]
- Anagnostis, A.; Benos, L.; Tsaopoulos, D.; Tagarakis, A.; Tsolakis, N.; Bochtis, D. Human activity recognition through recurrent neural networks for human-robot interaction in agriculture. Appl. Sci. 2021, 11, 2188. [Google Scholar] [CrossRef]
- Lugaresi, C.; Tang, J.; Nash, H.; McClanahan, C.; Uboweja, E.; Hays, M.; Zhang, F.; Chang, C.-L.; Yong, M.G.; Lee, J.; et al. MediaPipe: A Framework for Building Perception Pipelines. arXiv 2019, arXiv:1906.08172. [Google Scholar]
- Veluri, R.K.; Sree, S.R.; Vanathi, A.; Aparna, G.; Vaidya, S.P. Hand Gesture Mapping Using MediaPipe Algorithm. In Proceedings of the Third International Conference on Communication, Computing and Electronics Systems, Coimbatore, India, 28–29 October 2021; Bindhu, V., Tavares, J.M.R.S., Du, K.-L., Eds.; Springer Singapore: Singapore, 2022; pp. 597–614. [Google Scholar]
- Damindarov, R.; Fam, C.A.; Boby, R.A.; Fahim, M.; Klimchik, A.; Matsumaru, T. A depth camera-based system to enable touch-less interaction using hand gestures. In Proceedings of the International Conference “Nonlinearity, Information and Robotics” (NIR), Innopolis, Russia, 26–29 August 2021; pp. 1–7. [Google Scholar]
- Boruah, B.J.; Talukdar, A.K.; Sarma, K.K. Development of a Learning-aid tool using Hand Gesture Based Human Computer Interaction System. In Proceedings of the Advanced Communication Technologies and Signal Processing (ACTS), Rourkela, India, 15–17 December 2021; pp. 1–5. [Google Scholar]
- MediaPipe. MediaPipe Hands. Available online: https://google.github.io/mediapipe/solutions/hands.html (accessed on 13 April 2022).
- Chawla, N.; Bowyer, K.; Hall, L.; Kegelmeyer, W. SMOTE: Synthetic Minority Over-sampling Technique. J. Artif. Intell. Res. 2002, 16, 321–357. [Google Scholar] [CrossRef]
- Pedregosa, F.; Varoquaux, G.; Gramfort, A.; Michel, V.; Thirion, B.; Grisel, O.; Blondel, M.; Prettenhofer, P.; Weiss, R.; Dubourg, V.; et al. Scikit-learn: Machine Learning in Python. J. Mach. Learn. Res. 2011, 12, 2825–2830. [Google Scholar]
- Dash, S.S.; Nayak, S.K.; Mishra, D. A review on machine learning algorithms. In Proceedings of the Smart Innovation, Systems and Technologies, Virtual Event, 14–16 June 2021; Springer Science and Business Media: Berlin/Heidelberg, Germany, 2021; Volume 153, pp. 495–507. [Google Scholar]
- Singh, A.; Thakur, N.; Sharma, A. A review of supervised machine learning algorithms. In Proceedings of the 3rd International Conference on Computing for Sustainable Global Development (INDIACom), New Delhi, India, 16–18 March 2016; pp. 1310–1315. [Google Scholar]
- Sen, P.C.; Hajra, M.; Ghosh, M. Supervised Classification Algorithms in Machine Learning: A Survey and Review. In Emerging Technology in Modelling and Graphics; Mandal, J.K., Bhattacharya, D., Eds.; Springer Singapore: Singapore, 2020; pp. 99–111. [Google Scholar]
- NVIDIA Jetson: The AI platform for autonomous machines. Available online: https://www.nvidia.com/en-us/autonomous-machines/embedded-systems/ (accessed on 15 April 2022).
- ROS-Robot Operating System. Available online: https://www.ros.org/ (accessed on 13 December 2021).
- Hinas, A.; Ragel, R.; Roberts, J.; Gonzalez, F. A Framework for Multiple Ground Target Finding and Inspection Using a Multirotor UAS. Sensors 2020, 20, 272. [Google Scholar] [CrossRef] [PubMed]
- Tagarakis, A.C.; Filippou, E.; Kalaitzidis, D.; Benos, L.; Busato, P.; Bochtis, D. Proposing UGV and UAV Systems for 3D Mapping of Orchard Environments. Sensors 2022, 22, 1571. [Google Scholar] [CrossRef] [PubMed]
- Grimstad, L.; From, P.J. The Thorvald II Agricultural Robotic System. Robotics 2017, 6, 24. [Google Scholar] [CrossRef]
- Lin, T.-Y.; Maire, M.; Belongie, S.; Hays, J.; Perona, P.; Ramanan, D.; Dollár, P.; Zitnick, C.L. Microsoft COCO: Common Objects in Context. In Proceedings of the European Conference on Computer Vision (ECCV 2014), Zurich, Switzerland, 6–12 September 2014; Fleet, D., Pajdla, T., Schiele, B., Tuytelaars, T., Eds.; Springer International Publishing: Cham, Switzerland, 2014; pp. 740–755. [Google Scholar]
- Foote, T. tf: The transform library. In Proceedings of the IEEE Conference on Technologies for Practical Robot Applications (TePRA), Woburn, MA, USA, 22–23 April 2013; pp. 1–6. [Google Scholar]
- Navigation: Package Summary. Available online: http://wiki.ros.org/navigation (accessed on 28 June 2022).
- Zheng, K. ROS Navigation Tuning Guide. In Robot Operating System (ROS); Springer: Cham, Switzerland, 2021; pp. 197–226. [Google Scholar] [CrossRef]
- Kateris, D.; Kalaitzidis, D.; Moysiadis, V.; Tagarakis, A.C.; Bochtis, D. Weed Mapping in Vineyards Using RGB-D Perception. Eng. Proc. 2021, 9, 30. [Google Scholar] [CrossRef]
- Hershberger, D.; Gossow, D.; Faust, J.; William, W. RVIZ Package Summary. Available online: http://wiki.ros.org/rviz (accessed on 15 April 2022).
- Akalin, N.; Kristoffersson, A.; Loutfi, A. Do you feel safe with your robot? Factors influencing perceived safety in human-robot interaction based on subjective and objective measures. Int. J. Hum. Comput. Stud. 2022, 158, 102744. [Google Scholar] [CrossRef]
- Marinoudi, V.; Lampridi, M.; Kateris, D.; Pearson, S.; Sørensen, C.G.; Bochtis, D. The Future of Agricultural Jobs in View of Robotization. Sustainability 2021, 13, 12109. [Google Scholar] [CrossRef]
- Arena, P.; Baglio, S.; Fortuna, L.; Manganaro, G. Cellular Neural Networks: A Survey. IFAC Proc. Vol. 1995, 28, 43–48. [Google Scholar] [CrossRef]
- Arena, P.; Fortuna, L.; Frasca, M.; Patane, L. A CNN-based chip for robot locomotion control. IEEE Trans. Circuits Syst. I Regul. Pap. 2005, 52, 1862–1871. [Google Scholar] [CrossRef]
Recognized Hand Gesture | Corresponding Class | UGV Action
---|---|---
“Fist” | 0 | Lock
“Flat” | 1 | Unlock
“Okay” | 2 | Follow
“Rock” | 3 | Stop
“Victory” | 4 | Return to the target location
Algorithm | Accuracy | Precision | Recall | F1-Score
---|---|---|---|---
LR | 0.831 | 0.744 | 0.730 | 0.737
LDA | 0.875 | 0.811 | 0.810 | 0.811
KNN | 0.875 | 0.839 | 0.821 | 0.830
CART | 0.800 | 0.703 | 0.723 | 0.713
NB | 0.753 | 0.561 | 0.618 | 0.588
SVM | 0.872 | 0.829 | 0.810 | 0.819
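For reference, macro-averaged scores of the kind reported in this table can be computed from held-out predictions with scikit-learn’s metrics module; a minimal sketch follows, in which the prediction arrays are illustrative placeholders, not the study’s data.

```python
# Sketch of computing accuracy and macro-averaged precision/recall/F1 from
# held-out predictions; y_true and y_pred are placeholders, not study data.
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

y_true = [0, 1, 2, 3, 4, 2, 2, 0]   # ground-truth gesture classes
y_pred = [0, 1, 2, 3, 4, 2, 1, 0]   # classifier outputs

accuracy = accuracy_score(y_true, y_pred)
precision, recall, f1, _ = precision_recall_fscore_support(
    y_true, y_pred, average="macro", zero_division=0
)
print(f"Accuracy={accuracy:.3f} Precision={precision:.3f} "
      f"Recall={recall:.3f} F1={f1:.3f}")
```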