Evaluation of a Remote-Controlled Drone System for Bedridden Patients Using Their Eyes Based on Clinical Experiment
Abstract
1. Introduction
2. Materials and Methods
2.1. Drone System
2.1.1. Overview
2.1.2. Operation Method and Control Screen
2.2. Clinical Experiment
2.2.1. Subjects
- (1) Preliminary Experiment
- (2) Principal Experiment
2.2.2. Methods of Indoor Experiment
- (i) The drone took off and rose to a height of 1.2 m, controlled by the experiment staff. After that, the patient looked at Area 3, according to the experiment staff’s instructions.
- (ii) After rotating the drone to the right at the preset rotational velocity, the patient looked at Area 4, according to the experiment staff’s instructions.
- (iii) After rotating the drone to the left at the preset rotational velocity, the patient looked at Area 5, according to the experiment staff’s instructions.
- (iv) After making the drone ascend at the preset translational velocity, the patient looked at Area 6, according to the experiment staff’s instructions.
- (v) After making the drone descend at the preset translational velocity, the patient looked at Area 2, according to the experiment staff’s instructions.
- (vi) After moving the drone forward at the preset translational velocity, the patient looked at Area 3, according to the experiment staff’s instructions.
- (vii) After rotating the drone to the right and seeing students holding flower bouquets, the patient looked at Area 1, according to the experiment staff’s instructions.
- (viii) After the patient hovered the drone and watched the students holding flower bouquets for a moment, the experiment staff landed the drone.
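The indoor protocol above implies a fixed mapping from gaze areas on the control screen to drone commands (for instance, looking at Area 3 appears to trigger a right rotation). As a rough illustration only, the sketch below shows one way a dwell-based gaze dispatcher could work; the area-to-command table, the dwell threshold, and all names are assumptions for illustration, not the authors’ implementation, whose control screen is described in Section 2.1.2.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical area-to-command table, loosely inferred from the indoor
# protocol above; the actual assignment is defined by the control screen
# in Section 2.1.2 of the paper.
AREA_COMMANDS = {
    1: "hover",
    2: "move_forward",
    3: "rotate_right",
    4: "rotate_left",
    5: "ascend",
    6: "descend",
}

# Assumed dwell time required to confirm a selection, so that brief
# glances and microsaccades do not trigger commands.
DWELL_THRESHOLD_S = 1.0

@dataclass
class GazeSample:
    area: int        # screen area (1-6) the gaze point falls in; 0 = none
    duration: float  # seconds the gaze has stayed inside that area

def select_command(sample: GazeSample) -> Optional[str]:
    """Return a drone command once the gaze has dwelt long enough,
    or None if no area is fixated or the dwell is too short."""
    if sample.area in AREA_COMMANDS and sample.duration >= DWELL_THRESHOLD_S:
        return AREA_COMMANDS[sample.area]
    return None
```

Under the assumed mapping, `select_command(GazeSample(area=3, duration=1.5))` yields `"rotate_right"`, while a 0.2 s glance at the same area is ignored.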
2.2.3. Methods of Outdoor Experiment
- (i) After the drone took off and rose to a height of 1.2 m, controlled by the experiment staff, the patient saw a student with a message board on which “Welcome to Tokai University” was written in Japanese (a), and then rotated the drone to the right, according to the experiment staff’s instructions.
- (ii) The patient then saw Tokai University’s buildings (b), students juggling (c), students holding message boards on which “Thank you for participating” and “Mt. Fuji is here” were written in Japanese (d), and students holding flowers and waving their hands, with Mt. Fuji in view (e). Finally, the experiment staff landed the drone.
3. Results
3.1. Indoor Experiment
3.1.1. Preliminary Experiment
3.1.2. Principal Experiment
3.2. Outdoor Experiment
3.2.1. Preliminary Experiment
3.2.2. Principal Experiment
4. Discussion
4.1. Could the Patients Operate the Drone Remotely?
4.2. Could the Patients Enjoy Viewing the Scenery and Talking with the Experiment Staff in the Distant Place?
4.3. Significance of the Principal Experiment
4.4. Limitations and Future Scope
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
| Experiment | Patient | ADL | Age | Gender | Eye Condition | Disease |
|---|---|---|---|---|---|---|
| Preliminary | A | J1 (Ambulatory) | 56 | Male | Naked eye | Schizophrenia |
| Preliminary | B | J2 (Ambulatory) | 58 | Male | Naked eye | Schizophrenia |
| Preliminary | C | J2 (Ambulatory) | 65 | Male | Glasses | Schizophrenia |
| Preliminary | D | B1 (Wheelchair) | 80 | Female | Naked eye | Bipolar disorder |
| Principal | E | C1 (Bedridden) | 62 | Female | Naked eye | Schizophrenia |
| Principal | F | C1 (Bedridden) | 85 | Female | Naked eye | Schizophrenia |
| Question | A | B | C | D |
|---|---|---|---|---|
|  | Yes | Yes | Yes | Yes |
|  | Yes | No | No | No |
|  | Yes | No | No | No |
|  | Suitable | Suitable | Suitable | Suitable |
|  | Yes | Yes | Yes | Neutral |
|  | Yes | Yes | Yes | Yes |
| Patient | A | B | C | D |
|---|---|---|---|---|
| Flight Time [s] | 150 | 164 | 193 | 132 |
| Patient | Max. Time Lag [s] | Avg. Time Lag [s] | Min. Time Lag [s] |
|---|---|---|---|
| A | 8.53 | 5.50 | 4.07 |
| B | 9.78 | 7.63 | 6.63 |
| C | 12.79 | 9.00 | 7.55 |
| D | 7.76 | 3.55 | 0.84 |
| Question | E | F |
|---|---|---|
|  | Neutral | Yes |
|  | Neutral | No |
|  | Neutral | No |
|  | Neutral | Slow |
|  | Neutral | Yes |
|  | Yes | Yes |
| Patient | E | F |
|---|---|---|
| Flight Time [s] | 129 | 166 |
| Patient | Max. Time Lag [s] | Avg. Time Lag [s] | Min. Time Lag [s] |
|---|---|---|---|
| E | 5.92 | 3.21 | 1.32 |
| F | 5.91 | 3.67 | 1.52 |
| Question | A | B | C | D |
|---|---|---|---|---|
| Did you have fun? | Yes | Yes | Yes | Yes |
| Do you want to fly the drone again? | Yes | Yes | Yes | Yes |
| Patient | A | B | C | D |
|---|---|---|---|---|
| Flight Time [s] | 123 | 121 | 100 | 90 |
| Patient | Max. Time Lag [s] | Avg. Time Lag [s] | Min. Time Lag [s] |
|---|---|---|---|
| A | 1.23 | 0.79 | 0.23 |
| B | 1.08 | 0.74 | 0.38 |
| C | 0.94 | 0.80 | 0.71 |
| D | 1.00 | 0.98 | 0.96 |
| Patient | Observed Behavior |
|---|---|
| A | Patient A bowed to the students holding message boards. |
| B | Patient B waved his hand to the students holding message boards. Patient B talked with the experiment staff about Mt. Fuji in winter. |
| C | Patient C responded to the experiment staff member’s instructions and questions. |
| D | Patient D responded to the experiment staff member’s instructions and waved her hand to the students. Patient D bowed at the end of the experiment. |
| Question | E | F |
|---|---|---|
| Did you have fun? | Yes | Yes |
| Do you want to fly the drone again? | Yes | Yes |
| Patient | E | F |
|---|---|---|
| Flight Time [s] | 102 | 202 |
| Patient | Max. Time Lag [s] | Avg. Time Lag [s] | Min. Time Lag [s] |
|---|---|---|---|
| E | 1.65 | 1.27 | 0.88 |
| F | 1.79 | 1.00 | 0.33 |
| Patient | Observed Behavior |
|---|---|
| E | Patient E enjoyed talking with the experiment staff. Patient E waved her hand to the students. Patient E said “I wish you the best.” Patient E bowed at the end of the experiment. |
| F | Patient F read the messages out loud. Patient F said “thank you” at the end of the experiment. |
| Patient | A | B | C | D |
|---|---|---|---|---|
| Number of Times | 3 | 5 | 3 | 3 |
| Patient | E | F |
|---|---|---|
| Number of Times | 4 | 3 |
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Kai, Y.; Seki, Y.; Suzuki, R.; Kogawa, A.; Tanioka, R.; Osaka, K.; Zhao, Y.; Tanioka, T. Evaluation of a Remote-Controlled Drone System for Bedridden Patients Using Their Eyes Based on Clinical Experiment. Technologies 2023, 11, 15. https://doi.org/10.3390/technologies11010015