Terrain Perception Using Wearable Parrot-Inspired Companion Robot, KiliRo
Abstract
1. Introduction
2. Materials and Methods
- height < 250 mm;
- weight < 250 g;
- head rotation range: 180°;
- operating temperature range: 10 °C to 45 °C.
2.1. Feature Extraction and Classification
2.1.1. SURF
2.1.2. KNN
The classifier assigns a query sample the majority label among its k nearest neighbours, measured by the Euclidean distance D(X, Y) = sqrt(Σ_{i=1}^{N} (Xi - Yi)²), where:

- N is the dimension of the feature vector;
- D(X, Y) gives the Euclidean distance between points X and Y;
- Xi refers to the ith feature of X;
- Yi refers to the ith feature of Y.
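The distance defined above drives the classification: a query feature is assigned the majority label among its k nearest training features. The following minimal NumPy sketch illustrates the idea; the toy 2-D vectors and terrain labels are hypothetical stand-ins for the SURF descriptors used in the paper, not the actual experimental data.

```python
import numpy as np

def euclidean_distance(x, y):
    # D(X, Y) = sqrt(sum over i of (Xi - Yi)^2), for i = 1..N feature dimensions
    return np.sqrt(np.sum((np.asarray(x, dtype=float) - np.asarray(y, dtype=float)) ** 2))

def knn_classify(query, features, labels, k=3):
    """Return the majority label among the k nearest training features."""
    dists = [euclidean_distance(query, f) for f in features]
    nearest = np.argsort(dists)[:k]          # indices of the k smallest distances
    votes = [labels[i] for i in nearest]
    return max(set(votes), key=votes.count)  # majority vote

# Toy 2-D feature vectors standing in for SURF descriptors (hypothetical data)
features = [[0.0, 0.0], [0.1, 0.2], [5.0, 5.0], [5.1, 4.9]]
labels = ["grass", "grass", "road", "road"]
print(knn_classify([0.05, 0.1], features, labels, k=3))  # → grass
```

In practice each terrain image would contribute many high-dimensional SURF descriptors rather than a single 2-D point, but the voting logic is the same.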
2.1.3. Experimental Setup
3. Results
4. Discussion
5. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
| Robot Body Material | PLA (Poly Lactic Acid) |
|---|---|
| Dimensions (W × H) | 160 mm × 80 mm |
| Weight | 140 g |
| Head rotation | 180° |
| Head tilt | 45° |
| Hardware | Specification |
|---|---|
| Controller | Raspberry Pi |
| Servo motor | TowerPro SG90 |
| Servo controller | Pololu Micro Maestro 18-channel USB servo controller |
| Camera | Ai-Ball camera |
| Battery | Li-Po, 1200 mAh, 7.4 V |
| Power regulator | Dimension Engineering DE-SW033 |
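The Micro Maestro listed above accepts Pololu's compact serial protocol, in which a Set Target command is the byte 0x84 followed by the channel number and the target pulse width (in quarter-microseconds) split into two 7-bit bytes. The sketch below only shows this command framing; the channel assignment and pulse width are illustrative assumptions, not wiring details from the paper.

```python
def maestro_set_target(channel: int, target_quarter_us: int) -> bytes:
    """Build a Pololu Maestro 'Set Target' command (compact protocol).

    target_quarter_us is the servo pulse width in units of 0.25 µs,
    e.g. 6000 → 1.5 ms, a typical SG90 centre position.
    """
    return bytes([
        0x84,                              # Set Target command byte
        channel,                           # servo channel (0–17 on the 18-channel Maestro)
        target_quarter_us & 0x7F,          # low 7 bits of the target
        (target_quarter_us >> 7) & 0x7F,   # high 7 bits of the target
    ])

# Centre a head-rotation servo assumed to be on channel 0 (hypothetical wiring):
cmd = maestro_set_target(0, 6000)
print(cmd.hex())  # 8400702e
```

On the robot, this byte string would be written to the Maestro's USB serial port (for example with pyserial); only the protocol framing is sketched here.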
| Terrain | Grass | Interior | Pathway | Road | Staircase | Accuracy (%) |
|---|---|---|---|---|---|---|
| Grass | 45 | 0 | 0 | 0 | 0 | 100 |
| Interior | 0 | 35 | 0 | 0 | 0 | 100 |
| Pathway | 2 | 0 | 39 | 0 | 0 | 95.12 |
| Road | 0 | 0 | 0 | 45 | 0 | 100 |
| Staircase | 0 | 0 | 0 | 0 | 62 | 100 |
| Recall (%) | 95.74 | 100 | 100 | 100 | 100 | |
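The accuracy column and recall row of the confusion matrix can be reproduced directly from its counts: each row's accuracy is the diagonal entry divided by the row total (samples of that true terrain), and each column's recall is the diagonal entry divided by the column total (samples predicted as that terrain). A NumPy sketch using the table's values:

```python
import numpy as np

# Confusion matrix from the results table: rows = true terrain, columns = predicted terrain.
classes = ["Grass", "Interior", "Pathway", "Road", "Staircase"]
cm = np.array([
    [45,  0,  0,  0,  0],   # Grass
    [ 0, 35,  0,  0,  0],   # Interior
    [ 2,  0, 39,  0,  0],   # Pathway
    [ 0,  0,  0, 45,  0],   # Road
    [ 0,  0,  0,  0, 62],   # Staircase
])

diag = np.diag(cm)
row_accuracy = 100 * diag / cm.sum(axis=1)   # the table's per-terrain "Accuracy (%)" column
col_recall = 100 * diag / cm.sum(axis=0)     # the table's "Recall (%)" row
overall = 100 * diag.sum() / cm.sum()        # overall classification accuracy

print(np.round(row_accuracy, 2))  # [100. 100. 95.12 100. 100.]
print(np.round(col_recall, 2))    # [95.74 100. 100. 100. 100.]
```

Only the Pathway terrain is misclassified (two samples labelled Grass), which accounts for both the 95.12% row accuracy and the 95.74% Grass-column recall.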
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Bharatharaj, J.; Huang, L.; Al-Jumaily, A.M.; Kutty, S.K.S.; Krägeloh, C. Terrain Perception Using Wearable Parrot-Inspired Companion Robot, KiliRo. Biomimetics 2022, 7, 81. https://doi.org/10.3390/biomimetics7020081