Hands-Free Human–Machine Interfaces Using Piezoelectric Sensors and Accelerometers for Simulated Wheelchair Control in Older Adults and People with Physical Disabilities
Abstract
1. Introduction
2. Materials and Methods
2.1. Proposed Hands-Free HMI System Using Piezoelectric Sensors and Accelerometers for Wheelchair Control
2.2. Prototyping
2.2.1. Head-Mounted Device
2.2.2. Sensing Modules and Processor
- Displacement sensor modules
- Microprocessor
2.3. Proposed Actions and Commands
2.4. Proposed Algorithm
2.4.1. Data Preprocessing
2.4.2. Face–Machine Interface
- (1) Calibration and Parameter Setting
- (2) Decision Making and Command Translation
2.4.3. Head–Machine Interface
- (1) Calibration and Parameter Setting
- (2) Decision Making and Command Translation
3. Results
3.1. Participants
3.2. Experiment I: Verification of the Proposed Hands-Free HMI System
3.3. Experiment II: Performance of the Proposed Hands-Free HMI System for Real-Time Simulated Wheelchair Control
3.4. Satisfaction with the Proposed Hands-Free HMIs
4. Discussion
4.1. Study Limitations
- (1) The sample size is small because this study primarily aimed to conduct a preliminary evaluation of the proposed hands-free human–machine interface (HMI) system with older adults for feasibility, usability, and safety before expanding to a larger, more diverse cohort.
- (2) The experiment was conducted in a simulated environment, which may not precisely reflect real-world conditions.
4.2. Recommendations
- (1) A prototype of the head-mounted device must be designed and built for practical applications. The section housing the piezoelectric sensors should be flexible and shaped to fit the face.
- (2) Regarding hardware, the Arduino Nano ESP32 with Bluetooth was selected for rapid prototyping; a real-world implementation will require higher-performance hardware and faster communication.
- (3) The proposed hands-free HMIs require visual or audio feedback during training sessions and when issuing real-time commands.
- (4) Implementing machine-learning techniques could enhance the classification accuracy of both hands-free HMIs (a classifier sketch follows this list).
- (5) The hands-free HMIs require validation for controlling powered wheelchairs in real environments, and validation in patients with quadriplegia should also be performed.
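The following is a minimal, hypothetical sketch of what such a machine-learning stage could look like: windowed sensor features classified into the four steering commands with an off-the-shelf SVM. The feature names (PZ1–PZ3 peaks, head-tilt angles), window content, synthetic data, and choice of classifier are illustrative assumptions, not the authors' method.

```python
# Hypothetical sketch only: an SVM classifying windowed sensor features into
# the four steering commands. Feature names, window content, and the synthetic
# data below are illustrative assumptions, not the study's actual pipeline.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Placeholder feature vector per window: [PZ1 peak, PZ2 peak, PZ3 peak,
# x-axis tilt, z-axis tilt]; real features would come from the headset.
X = rng.normal(size=(400, 5))
y = rng.integers(0, 4, size=400)  # 0=left, 1=right, 2=forward, 3=backward

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

clf = SVC(kernel="rbf", C=1.0)    # simple baseline classifier
clf.fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```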
4.3. Future Work
- (1) Increasing the number of older participants and including individuals with quadriplegia in future studies will enhance the reliability of the results and the applicability of hands-free HMI systems.
- (2) The efficiency of command generation can be improved by implementing machine-learning techniques, which may enhance classification accuracy.
- (3) We will interface the prototype with a real wheelchair to test its usability, reliability, and adaptability in real-world environments. Additionally, a powered wheelchair will be modified for semi-automatic operation with an integrated obstacle-avoidance system.
- (4) We plan to upgrade the hardware to support more efficient real-time operation by adopting faster and more stable communication protocols, such as Wi-Fi and radio frequency (RF), and by increasing battery capacity for extended use (a minimal Wi-Fi link sketch follows this list).
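As a concrete illustration of the communication upgrade mentioned in item (4), the sketch below sends single-character steering commands over a Wi-Fi UDP socket. The address, port, and one-byte message format are assumptions made for illustration; they are not the protocol used in the prototype.

```python
# Hypothetical Wi-Fi command link (UDP). Address, port, and the one-byte
# command format are assumptions for illustration only.
import socket

HOST, PORT = "192.168.4.1", 5005   # placeholder wheelchair controller address

def send_command(cmd: str) -> None:
    """Send a single steering command ('L', 'R', 'F', or 'B') over UDP."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(cmd.encode("ascii"), (HOST, PORT))

if __name__ == "__main__":
    send_command("F")              # e.g., request forward motion
```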
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
| Command | Symbol | Face–Machine Interface Action | Target Sensors | Head–Machine Interface Action | Target Rotation |
|---|---|---|---|---|---|
| Turn Left | ← | Winking the left eye | PZ1 | Tilting the head left | (+) x-axis |
| Turn Right | → | Winking the right eye | PZ2 | Tilting the head right | (−) x-axis |
| Forward | ↑ | Pushing the tongue against the left/right cheek | PZ3 | Tilting the head forward | (+) z-axis |
| Backward | ↓ | Blinking both eyes | PZ1 and PZ2 | Tilting the head back | (−) z-axis |
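To make the mapping in the table above concrete, the sketch below translates sensor readings into the four commands for each interface. The threshold values, units, and function signatures are hypothetical placeholders; in the study, such parameters are set per user during the calibration step.

```python
# Sketch of the command mapping in the table above. Threshold values and
# sensor scaling are placeholders; the study calibrates them per user.
PZ_THRESHOLD = 0.5        # assumed normalized piezo activation threshold
TILT_THRESHOLD = 20.0     # assumed head-tilt threshold in degrees

def face_command(pz1: float, pz2: float, pz3: float) -> str | None:
    """Face-machine interface: piezo activations -> steering command."""
    left, right, cheek = (v > PZ_THRESHOLD for v in (pz1, pz2, pz3))
    if left and right:
        return "backward"   # blinking both eyes (PZ1 and PZ2)
    if cheek:
        return "forward"    # tongue push against a cheek (PZ3)
    if left:
        return "left"       # left-eye wink (PZ1)
    if right:
        return "right"      # right-eye wink (PZ2)
    return None             # no command detected

def head_command(x_tilt: float, z_tilt: float) -> str | None:
    """Head-machine interface: accelerometer tilt angles -> command."""
    if x_tilt > TILT_THRESHOLD:
        return "left"       # (+) x-axis: tilting the head left
    if x_tilt < -TILT_THRESHOLD:
        return "right"      # (−) x-axis: tilting the head right
    if z_tilt > TILT_THRESHOLD:
        return "forward"    # (+) z-axis: tilting the head forward
    if z_tilt < -TILT_THRESHOLD:
        return "backward"   # (−) z-axis: tilting the head back
    return None

print(face_command(0.9, 0.1, 0.0), head_command(0.0, 35.0))  # left forward
```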
Participant ID | Gender | Age (Years) | Physical Condition |
---|---|---|---|
1 | Male | 60 | No impairment |
2 | Female | 61 | Lower limb impairment |
3 | Female | 62 | No impairment |
4 | Female | 64 | No impairment |
5 | Female | 65 | No impairment |
6 | Male | 66 | No impairment |
7 | Male | 65 | No impairment |
8 | Male | 63 | Left arm weakness |
9 | Female | 69 | No impairment |
10 | Male | 69 | No impairment |
11 | Male | 67 | No impairment |
12 | Female | 68 | No impairment |
Sequence No. | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 |
---|---|---|---|---|---|---|---|---|---|---|---|---|
Commands | ← | ↑ | → | ↓ | ↑ | → | ← | ↓ | ↑ | ← | ↓ | → |
Average classification accuracy (%):

| Participant | Joystick | Face–Machine Interface | Head–Machine Interface |
|---|---|---|---|
| 1 | 100 | 91.7 | 100 |
| 2 | 100 | 83.3 | 100 |
| 3 | 100 | 75.0 | 100 |
| 4 | 100 | 87.5 | 100 |
| 5 | 100 | 70.8 | 100 |
| 6 | 100 | 87.5 | 100 |
| 7 | 95.8 | 83.3 | 91.7 |
| 8 | 95.8 | 87.5 | 91.7 |
| 9 | 95.8 | 83.3 | 87.5 |
| 10 | 100 | 87.5 | 100 |
| 11 | 100 | 70.8 | 100 |
| 12 | 100 | 91.7 | 100 |
| Mean ± S.D. | 99.0 ± 1.82 | 83.3 ± 7.32 | 97.6 ± 4.32 |
Face–machine interface: detected outputs per intended command, with classification metrics (%):

| Command | Output ← | Output → | Output ↑ | Output ↓ | Success | Precision | Sensitivity | Accuracy |
|---|---|---|---|---|---|---|---|---|
| ← | 57 | 6 | 0 | 3 | 79.2 | 89.1 | 86.4 | 93.8 |
| → | 9 | 58 | 0 | 0 | 80.6 | 86.6 | 86.6 | 93.2 |
| ↑ | 0 | 0 | 62 | 0 | 86.1 | 100 | 100 | 100 |
| ↓ | 2 | 3 | 0 | 63 | 87.5 | 95.5 | 92.6 | 97.0 |
| Mean |  |  |  |  | 83.3 | 92.8 | 91.4 | 96.0 |
Head–machine interface: detected outputs per intended command, with classification metrics (%):

| Command | Output ← | Output → | Output ↑ | Output ↓ | Success | Precision | Sensitivity | Accuracy |
|---|---|---|---|---|---|---|---|---|
| ← | 69 | 0 | 0 | 2 | 98.6 | 100 | 98.6 | 99.7 |
| → | 1 | 69 | 0 | 2 | 100 | 94.7 | 100 | 98.6 |
| ↑ | 1 | 0 | 71 | 0 | 95.8 | 97.2 | 97.2 | 98.6 |
| ↓ | 0 | 0 | 0 | 72 | 95.8 | 100 | 95.8 | 99.0 |
| Mean |  |  |  |  | 97.6 | 98.0 | 97.9 | 99.0 |
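For reference, the sketch below shows how per-command precision, sensitivity (recall), and one-vs-rest accuracy are typically derived from a confusion matrix, using the head–machine interface counts above as input. It applies the standard textbook definitions only; the paper's Success column and its handling of trials with no detected command are not modeled here, so the printed values are illustrative rather than an exact reproduction of the table.

```python
# Standard per-class metrics from a confusion matrix (rows: intended command,
# columns: detected output). Counts taken from the head-machine interface
# table above; the formulas are the usual one-vs-rest definitions.
import numpy as np

cm = np.array([
    [69, 0, 0, 2],    # intended left
    [1, 69, 0, 2],    # intended right
    [1, 0, 71, 0],    # intended forward
    [0, 0, 0, 72],    # intended backward
])
labels = ["left", "right", "forward", "backward"]
total = cm.sum()

for i, name in enumerate(labels):
    tp = cm[i, i]                 # correct detections of command i
    fn = cm[i, :].sum() - tp      # command i detected as something else
    fp = cm[:, i].sum() - tp      # other commands detected as i
    tn = total - tp - fn - fp
    precision = tp / (tp + fp)
    sensitivity = tp / (tp + fn)  # recall
    accuracy = (tp + tn) / total
    print(f"{name:9s} precision={precision:.1%} "
          f"sensitivity={sensitivity:.1%} accuracy={accuracy:.1%}")
```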
| Participant | Joystick Time (s) | Joystick Checkpoint | Face–Machine Interface Time (s) | Face–Machine Interface Checkpoint | Head–Machine Interface Time (s) | Head–Machine Interface Checkpoint |
|---|---|---|---|---|---|---|
| 1 | 40 | E | 99 | E | 62 | E |
| 2 | 63 | E | 305 | E | 104 | E |
| 3 | 70 | E | 360 * | B | 112 | E |
| 4 | 55 | E | 244 | E | 80 | E |
| 5 | 51 | E | 360 * | C | 78 | E |
| 6 | 48 | E | 343 | E | 92 | E |
| 7 | 57 | E | 343 | E | 191 | E |
| 8 | 62 | E | 143 | E | 117 | E |
| 9 | 53 | E | 354 | E | 262 | E |
| 10 | 49 | E | 202 | E | 78 | E |
| 11 | 95 | E | 360 * | A | 147 | E |
| 12 | 65 | E | 143 | E | 82 | E |
| Mean ± S.D. | 59.0 ± 14.1 |  | 241.8 ± 77.8 † |  | 117.1 ± 57.9 |  |
| Participant | Joystick Time (s) | Joystick Checkpoint | Face–Machine Interface Time (s) | Face–Machine Interface Checkpoint | Head–Machine Interface Time (s) | Head–Machine Interface Checkpoint |
|---|---|---|---|---|---|---|
| 1 | 67 | E | 217 | E | 97 | E |
| 2 | 118 | E | 720 * | D | 198 | E |
| 3 | 160 | E | 720 * | C | 242 | E |
| 4 | 105 | E | 405 | E | 154 | E |
| 5 | 107 | E | 720 * | B | 190 | E |
| 6 | 110 | E | 380 | E | 165 | E |
| 7 | 103 | E | 380 | E | 353 | E |
| 8 | 134 | E | 494 | E | 146 | E |
| 9 | 94 | E | 673 | E | 532 | E |
| 10 | 88 | E | 536 | E | 193 | E |
| 11 | 136 | E | 720 * | B | 193 | E |
| 12 | 102 | E | 431 | E | 174 | E |
| Mean ± S.D. | 110.3 ± 24.4 |  | 439.5 ± 133.4 † |  | 219.8 ± 116.2 |  |
Question No. | Survey Statement |
---|---|
1 | I can easily wear the device. |
2 | I feel comfortable while wearing the device. |
3 | I can easily create commands for steering the simulated wheelchair. |
4 | The HMI device and system demonstrate stability during operation. |
5 | I feel slightly fatigued after completing the experiment. |
6 | I feel confident operating the electric wheelchair with the device. |
7 | Overall, I am satisfied with my HMI system experience. |
| Question No. | HMI Modality | Score Range | Median | Mean ± S.D. |
|---|---|---|---|---|
| 1 | Face–machine interface | 4–5 | 5 | 4.58 ± 0.51 |
|  | Head–machine interface |  |  |  |
| 2 | Face–machine interface | 4–5 | 4 | 4.50 ± 0.52 |
|  | Head–machine interface |  |  |  |
| 3 | Face–machine interface | 2–4 | 4 | 3.58 ± 0.79 |
|  | Head–machine interface | 3–5 | 5 | 4.42 ± 0.90 |
| 4 | Face–machine interface | 3–5 | 4 | 4.25 ± 0.75 |
|  | Head–machine interface | 3–5 | 4 | 4.50 ± 0.43 |
| 5 | Face–machine interface | 3–5 | 3 | 3.67 ± 0.89 |
|  | Head–machine interface | 3–5 | 5 | 4.42 ± 0.79 |
| 6 | Face–machine interface | 3–5 | 5 | 4.42 ± 0.90 |
|  | Head–machine interface | 4–5 | 5 | 4.92 ± 0.29 |
| 7 | Face–machine interface | 3–5 | 4 | 4.00 ± 0.90 |
|  | Head–machine interface | 3–5 | 5 | 4.83 ± 0.58 |