A Photovoltaic Light Sensor-Based Self-Powered Real-Time Hover Gesture Recognition System for Smart Home Control
Abstract
1. Introduction
- The development of a novel supervised machine learning pipeline for a smart home hover gesture recognition system (a minimal illustrative sketch of such a pipeline follows this list).
- A feasibility demonstration of a PV light sensor-based gesture recognition system, comparing one-size-fits-all and user-specific models in offline and real-time testing studies.
- An evaluation of end-user experiences with the real-time gesture recognition system for smart home device control.
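The sketch below is a minimal, hypothetical illustration of such a supervised pipeline and is not the authors' implementation. It extracts simple statistical, Shannon-entropy, and DWT sub-band energy features from fixed-length PV voltage windows and trains a Random Forest, one of the classifier families listed in the Abbreviations. The window length, feature set, synthetic placeholder data, and choice of classifier are assumptions for illustration; NumPy, PyWavelets, SciPy, and scikit-learn are assumed to be available.

```python
# Minimal illustrative sketch of a supervised pipeline for PV-sensor hover
# gestures; NOT the authors' implementation. Window length, feature set,
# classifier, and the synthetic placeholder data are all assumptions.
import numpy as np
import pywt                               # PyWavelets, for DWT sub-band features
from scipy.stats import entropy           # Shannon entropy of a distribution
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def extract_features(window: np.ndarray) -> np.ndarray:
    """Statistical, entropy, and DWT-energy features from one PV voltage window."""
    coeffs = pywt.wavedec(window, "db4", level=3)           # multi-level DWT
    band_energy = [float(np.sum(c ** 2)) for c in coeffs]   # energy per sub-band
    hist, _ = np.histogram(window, bins=16, density=True)   # amplitude distribution
    return np.array([window.mean(), window.std(), window.min(), window.max(),
                     entropy(hist + 1e-12), *band_energy])

# Synthetic placeholder data standing in for recorded PV voltage windows:
# 11 gesture classes (G1-G11), 20 windows each, 256 samples per window.
rng = np.random.default_rng(0)
labels = np.repeat(np.arange(11), 20)
windows = rng.normal(size=(len(labels), 256))

X = np.vstack([extract_features(w) for w in windows])
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.2, stratify=labels, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)  # RF classifier
clf.fit(X_train, y_train)
print("offline accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

In this framing, a one-size-fits-all model would be trained on windows pooled from all users, whereas a user-specific model would be fitted and evaluated on a single participant's recordings.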
2. Related Work
2.1. Self-Powered Sensor Technologies for Gesture Recognition
2.2. Light-Sensor-Based Gesture Recognition
2.3. Machine Learning Pipeline for Gesture Recognition Using PV Light Sensors
3. A Self-Powered Real-Time Gesture Recognition System
3.1. Hover Gestures for PV Light Sensors
3.2. Experimental Prototype
4. One-Size-Fits-All Model Testing
5. Real-Time Testing User Study
5.1. Real-Time Testing Session
Real-Time Testing Results
5.2. User Experience Evaluation
5.2.1. Quantitative Survey Results
5.2.2. Qualitative Semi-Structured Interview Results
- (1) Natural Interaction: This theme shows that the participants felt the hovering hand gestures were natural and intuitive. For example, P1 said they are everyday actions (“I really thought it was like a daily expression of my body language. So, when I was interacting or doing the gestures, it was like I was performing natural everyday actions, like moving my hand left or right.”). P6 said they are normal (“I felt they were easy, quick to perform, and very natural, close to human movements in daily life, as when we moved forward, right or left, it felt like normal, everyday human motions.”). P4 said they come from common sense (“I do believe that they are intuitive and natural. Personally, I think most of the gestures come from common sense for different animations, like increasing or decreasing the TV volume, which I would use in my daily life.”).
- (2) Social Appropriateness: This theme shows that the participants viewed the hovering gestures as non-intrusive and socially appropriate in various private and public settings. For example, P10 said they are expressive: “They are appropriate and expressive. I do not think there is anything inappropriate about them, as they can be used anywhere and anytime. You can use them at home or in your car.” P6 said they are relatable: “I felt the gestures are relatable to all societies and to people in general. I did not feel they were limited to a specific group of people or a specific context, like just in homes or only in shopping malls. Especially as a mum with kids, I think it would even be easy for children to use at home. I feel it would simplify household tasks and definitely improve our quality of life at home by keeping all surfaces clean.” P1 said they are hygienic: “I think in public places it is nice, maybe because of some diseases or viruses, people do not want to touch things, so this will increase hygiene. But I think it will be more enjoyable at home because you are in your own comfort zone, so it would be convenient in both places.”
- (3) Comfort: This theme shows that the participants found the gestures comfortable to perform, requiring little physical effort and causing no strain on the body. For example, P4 said they are all comfortable (“No, they were all comfortable. I was comfortable applying them. I do not recall any gesture that I felt uncomfortable with.”). P7 said they cause no strain (“No, not at all. On the contrary, they were simple, easy, and comfortable. There was no issue or strain on the body.”). P10 said they do not need much physical effort (“No, I do not think so. They are all comfortable and easy, and they do not need any effort to perform.”). P5 said they are easy to perform (“For me, I thought it was really comfortable, even the gestures were not that hard. It was an easy opening. Nothing very tricky, you just need to remember it.”). P9 said they are simple to perform (“I felt completely comfortable, no effort at all. Compared to physically handling a device to perform actions like playing or skipping music, the gestures would be much simpler and not complicated at all.”).
- (4) Usability: This theme shows that the participants found the gestures easy to learn and perform after a brief description and demonstration. For example, P8 needed a short demonstration to learn (“After practicing the gestures a few times, around the third attempt, I learned how to execute them consistently without any mistakes. They were easy to learn and remember after a short period of use.”). P2 said the descriptions were self-explanatory (“They are all very easy to learn because just a couple of words description was pretty self-explanatory and you could tell straight away what the action needed to be.”). P7 said the gestures were easy to execute (“The gestures are completely fine, there was not any issue, it was clear and easy to execute.”). P10 said the gestures need no effort to perform (“They were all easy gestures, easy to perform. It needed no effort, actually.”). Overall, this theme shows that the participants found the gestures easy to learn, remember, recall, and perform.
- (5) Source of Frustration: This theme shows that frustration was largely absent among the participants. For example, P3 found no source of annoyance (“No, I think nothing annoyed me while making the gestures as it was easy.”). P8 found no inconvenience (“No, everything was clear, and there was no inconvenience during the experience. Even in the future, if these gestures are implemented, I will use them without any issues.”). However, some participants did express a degree of frustration. P1 was scared of making mistakes (“It was fun, and I liked it, and it was also exciting to do within the application, but I was a bit scared of making some mistakes with the sensor, which frustrated me a little.”). P5 was also worried about making a mistake (“Never, but I thought if I made a mistake in the move down and the move down quickly gestures, maybe the sensor would not understand it. This is frustrating, but other than that, everything was good for me.”). Nevertheless, both P1 and P5 also said they were excited and happy with the interaction.
6. Discussion
6.1. Machine Learning Pipeline for PV-Light-Sensor-Based Hand Gesture Recognition
6.2. User Experience of Using a Real-Time Hovering Gesture Recognition System
6.3. Limitations of the User Studies
6.4. Limitations of PV Light Sensors
6.5. Ethical and Privacy Concerns
6.6. A Portable Prototype
7. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
Abbreviation | Definition |
---|---|
ADCs | Analogue-to-Digital Converters |
BLE | Bluetooth Low Energy |
CFAR | Constant False Alarm Rate |
DT | Decision Tree |
DWT | Discrete Wavelet Transform |
DTW | Dynamic Time Warping |
EMG | Electromyography |
GB | Gradient Boosting |
GUI | Graphical User Interface |
HCI | Human–Computer Interaction |
KNN | K-Nearest Neighbor |
LR | Logistic Regression |
LED | Light-Emitting Diode |
ML | Machine Learning |
PV | Photovoltaic |
PC | Personal Computer |
PD | Photodiode |
RF | Random Forest |
PCA | Principal Component Analysis |
SVM | Support Vector Machine |
UEQ | User Experience Questionnaire |
UX | User Experience |
VR | Virtual Reality |
VR/AR | Virtual/Augmented Reality |
Appendix A
Gestures | Music Player Functions |
---|---|
G1: Move down | Decrease the volume of the song |
G2: Move up | Increase the volume of the song |
G3: Move right | Play the next song |
G4: Move left | Play the previous song |
G5: Closing fist | Toggle mute |
G6: Move forward | Stop the song |
G7: Clockwise | Skip the song forward 5 s |
G8: Counterclockwise | Rewind the song 5 s |
G9: Move down quickly | Toggle play and pause the song |
G10: Split hands | Select the music folder |
G11: Combine hands | Clear the song list |
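As a minimal, hypothetical sketch of how the gesture-to-function mapping in the table above could be wired to a playback backend (the MusicPlayer class and its method names are placeholders, not the authors' implementation):

```python
# Illustrative sketch only: dispatch a recognized gesture label (G1-G11) to the
# music-player action listed in the table above. MusicPlayer and its method
# names are hypothetical placeholders.
from typing import Callable, Dict

class MusicPlayer:
    """Minimal stand-in for the smart-home music player used in the study."""
    def change_volume(self, step: int) -> None: print(f"volume {step:+d}")
    def next_song(self) -> None: print("next song")
    def previous_song(self) -> None: print("previous song")
    def toggle_mute(self) -> None: print("toggle mute")
    def stop(self) -> None: print("stop")
    def seek(self, seconds: int) -> None: print(f"seek {seconds:+d} s")
    def toggle_play_pause(self) -> None: print("toggle play/pause")
    def select_folder(self) -> None: print("select music folder")
    def clear_playlist(self) -> None: print("clear song list")

player = MusicPlayer()
actions: Dict[str, Callable[[], None]] = {
    "G1": lambda: player.change_volume(-1),   # move down -> decrease volume
    "G2": lambda: player.change_volume(+1),   # move up -> increase volume
    "G3": player.next_song,                   # move right -> next song
    "G4": player.previous_song,               # move left -> previous song
    "G5": player.toggle_mute,                 # closing fist -> toggle mute
    "G6": player.stop,                        # move forward -> stop
    "G7": lambda: player.seek(+5),            # clockwise -> skip forward 5 s
    "G8": lambda: player.seek(-5),            # counterclockwise -> rewind 5 s
    "G9": player.toggle_play_pause,           # move down quickly -> play/pause
    "G10": player.select_folder,              # split hands -> select folder
    "G11": player.clear_playlist,             # combine hands -> clear song list
}

def on_gesture(label: str) -> None:
    """Invoke the mapped action for a classifier-predicted gesture label."""
    actions.get(label, lambda: print(f"unrecognized gesture: {label}"))()

on_gesture("G3")  # e.g., a predicted 'move right' plays the next song
```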
No. | Statement | Heuristics | Aspect |
---|---|---|---|
A1 | Gesture control will improve my overall experience with smart homes. | Perceived usefulness | Acceptability |
A2 | Gesture control will make interacting with smart homes easier. | Perceived ease of use | Acceptability |
A3 | I will easily get used to smart home interactions with the help of gestures. | Attitude towards using the technology | Acceptability |
A4 | Gestures will be a typical way of interacting with technology in the future. | Behavior towards intention of use | Acceptability |
U1 | The gestures are generally easy to remember. | Memorability | Usability |
U2 | Most gestures are easy to learn because they correspond well with the commands. | Learnability | Usability |
U3 | The gestures are generally very complex and complicated to perform. | Efficiency | Usability |
U4 | It is easy to make errors or mistakes with the current set of gestures. | Error | Usability |
U5 | I am generally satisfied with the gestures used for smart home interaction. | Satisfaction | Usability |
U6 | Most gestures are straining to the arms and hands. | Comfort | Usability |
S1 | Compared to other media (voice command and remote controls), I am open to using gestures to interact with smart homes. | Comparison to other available mode of interactions | Summary/Debriefing |
S2 | I would buy (or invest in) smart home devices to control my home. | Perceived investment | Summary/Debriefing |
S3 | I would buy (or invest in) gesture technologies to interact with my smart home. | Perceived investment | Summary/Debriefing |
# | Question |
---|---|
1 | How intuitive and natural do you think the gestures are while performing them? |
2 | How expressive and socially appropriate do you think these gestures are for use in different environments (e.g., at home or in public)? |
3 | How comfortable were you while performing the gestures? Are there any gestures that cause discomfort? |
4 | How easy was it to learn and perform the gestures? Did you encounter any difficulties? |
5 | Did you encounter any frustration while performing these gestures? If so, what were the causes of frustration? |
References
- Iqbal, M.Z.; Campbell, A.G. From luxury to necessity: Progress of touchless interaction technology. Technol. Soc. 2021, 67, 101796. [Google Scholar] [CrossRef]
- Pearson, J.; Bailey, G.; Robinson, S.; Jones, M.; Owen, T.; Zhang, C.; Reitmaier, T.; Steer, C.; Carter, A.; Sahoo, D.R.; et al. Can’t Touch This: Rethinking Public Technology in a COVID-19 Era. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, New Orleans, LA, USA, 29 April–5 May 2022. [Google Scholar] [CrossRef]
- Almania, N.; Alhouli, S.; Sahoo, D. User Preferences for Smart Applications for a Self-Powered Tabletop Gestural Interface. In Human Interaction and Emerging Technologies (IHIET-AI 2024): Artificial Intelligence and Future Applications; AHFE International: Honolulu, HI, USA, 2024. [Google Scholar] [CrossRef]
- Gerba, C.P.; Wuollet, A.L.; Raisanen, P.; Lopez, G.U. Bacterial contamination of computer touch screens. Am. J. Infect. Control 2016, 44, 358–360. [Google Scholar] [CrossRef]
- Iqbal, M.Z.; Campbell, A. The emerging need for touchless interaction technologies. Interactions 2020, 27, 51–52. [Google Scholar] [CrossRef]
- Meena, Y.K.; Seunarine, K.; Sahoo, D.R.; Robinson, S.; Pearson, J.; Zhang, C.; Carnie, M.; Pockett, A.; Prescott, A.; Thomas, S.K.; et al. PV-Tiles: Towards Closely-Coupled Photovoltaic and Digital Materials for Useful, Beautiful and Sustainable Interactive Surfaces. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; pp. 1–12. [Google Scholar] [CrossRef]
- Caon, M.; Carrino, S.; Ruffieux, S.; Khaled, O.A.; Mugellini, E. Augmenting interaction possibilities between people with mobility impairments and their surrounding environment. In Advanced Machine Learning Technologies and Applications, Proceedings of the First International Conference, AMLTA 2012, Cairo, Egypt, 8–10 December 2012; Proceedings 1; Springer: Berlin/Heidelberg, Germany, 2012; pp. 172–181. [Google Scholar] [CrossRef]
- Hincapié-Ramos, J.D.; Guo, X.; Moghadasian, P.; Irani, P. Consumed Endurance: A Metric to Quantify Arm Fatigue of Mid-Air Interactions. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Toronto, ON, Canada, 26 April–1 May 2014; pp. 1063–1072. [Google Scholar] [CrossRef]
- Ruiz, J.; Vogel, D. Soft-Constraints to Reduce Legacy and Performance Bias to Elicit Whole-body Gestures with Low Arm Fatigue. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, Seoul, Republic of Korea, 18–23 April 2015; pp. 3347–3350. [Google Scholar] [CrossRef]
- Vogiatzidakis, P.; Koutsabasis, P. Frame-Based Elicitation of Mid-Air Gestures for a Smart Home Device Ecosystem. Informatics 2019, 6, 23. [Google Scholar] [CrossRef]
- Panger, G. Kinect in the Kitchen: Testing Depth Camera Interactions in Practical Home Environments. In Proceedings of the CHI ’12 Extended Abstracts on Human Factors in Computing Systems, Austin, TX, USA, 5–10 May 2012; pp. 1985–1990. [Google Scholar] [CrossRef]
- He, W.; Martinez, J.; Padhi, R.; Zhang, L.; Ur, B. When Smart Devices Are Stupid: Negative Experiences Using Home Smart Devices. In Proceedings of the 2019 IEEE Security and Privacy Workshops (SPW), San Francisco, CA, USA, 19–23 May 2019; pp. 150–155. [Google Scholar] [CrossRef]
- Lu, Y.; Wang, X.; Gong, J.; Zhou, L.; Ge, S. Classification, application, challenge, and future of midair gestures in augmented reality. J. Sens. 2022, 2022, 3208047. [Google Scholar] [CrossRef]
- Modaberi, M. The Role of Gesture-Based Interaction in Improving User Satisfaction for Touchless Interfaces. Int. J. Adv. Hum. Comput. Interact. 2024, 2, 20–32. Available online: https://www.ijahci.com/index.php/ijahci/article/view/17 (accessed on 2 September 2025).
- Amaravati, A.; Xu, S.; Cao, N.; Romberg, J.; Raychowdhury, A. A Light-Powered Smart Camera With Compressed Domain Gesture Detection. IEEE Trans. Circuits Syst. Video Technol. 2018, 28, 3077–3085. [Google Scholar] [CrossRef]
- Kartsch, V.; Benatti, S.; Mancini, M.; Magno, M.; Benini, L. Smart Wearable Wristband for EMG based Gesture Recognition Powered by Solar Energy Harvester. In Proceedings of the 2018 IEEE International Symposium on Circuits and Systems (ISCAS), Florence, Italy, 27–30 May 2018; pp. 1–5. [Google Scholar] [CrossRef]
- Bono, F.M.; Polinelli, A.; Radicioni, L.; Benedetti, L.; Castelli-Dezza, F.; Cinquemani, S.; Belloli, M. Wireless Accelerometer Architecture for Bridge SHM: From Sensor Design to System Deployment. Future Internet 2025, 17, 29. [Google Scholar] [CrossRef]
- Thomas, G.; Pockett, A.; Seunarine, K.; Carnie, M. Simultaneous Energy Harvesting and Hand Gesture Recognition in Large Area Monolithic Dye-Sensitized Solar Cells. arXiv 2023, arXiv:2311.16284. [Google Scholar] [CrossRef]
- Li, J.; Xu, Q.; Xu, Z.; Ge, C.; Ruan, L.; Liang, X.; Ding, W.; Gui, W.; Zhang, X.P. PowerGest: Self-Powered Gesture Recognition for Command Input and Robotic Manipulation. In Proceedings of the 2024 IEEE 30th International Conference on Parallel and Distributed Systems (ICPADS), Belgrade, Serbia, 10–14 October 2024; pp. 528–535. [Google Scholar] [CrossRef]
- Li, J.; Xu, Q.; Zhu, Q.; Xie, Z.; Xu, Z.; Ge, C.; Ruan, L.; Fu, H.Y.; Liang, X.; Ding, W.; et al. Demo: SolarSense: A Self-powered Ubiquitous Gesture Recognition System for Industrial Human-Computer Interaction. In Proceedings of the 22nd Annual International Conference on Mobile Systems, Applications and Services, Tokyo, Japan, 3–7 June 2024; pp. 600–601. [Google Scholar] [CrossRef]
- Ma, D.; Lan, G.; Hassan, M.; Hu, W.; Upama, M.B.; Uddin, A.; Youssef, M. SolarGest: Ubiquitous and Battery-free Gesture Recognition using Solar Cells. In Proceedings of the 25th Annual International Conference on Mobile Computing and Networking, Los Cabos, Mexico, 21–25 October 2019. [Google Scholar] [CrossRef]
- Kyung-Chan, A.; Jun-Ying, L.; Chu-Feng, Y.; Timothy, N.S.E.; Varku, S.; Qinjie, W.; Kajal, P.; Mathews, N.; Basu, A.; Kim, T.T.H. A Dynamic Gesture Recognition Algorithm Using Single Halide Perovskite Photovoltaic Cell for Human-Machine Interaction. In Proceedings of the 2024 International Conference on Electronics, Information, and Communication (ICEIC), Taipei, Taiwan, 28–31 January 2024; pp. 1–4. [Google Scholar] [CrossRef]
- Sorescu, C.; Meena, Y.K.; Sahoo, D.R. PViMat: A Self-Powered Portable and Rollable Large Area Gestural Interface Using Indoor Light. In Proceedings of the Adjunct Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology, Virtual Event, 20–23 October 2020; pp. 80–83. [Google Scholar] [CrossRef]
- Almania, N.A.; Alhouli, S.Y.; Sahoo, D.R. Dynamic Hover Gesture Classification using Photovoltaic Sensor and Machine Learning. In Proceedings of the 2024 8th International Conference on Advances in Artificial Intelligence, London, UK, 17–19 October 2024; pp. 268–275. [Google Scholar] [CrossRef]
- Sandhu, M.M.; Khalifa, S.; Geissdoerfer, K.; Jurdak, R.; Portmann, M. SolAR: Energy Positive Human Activity Recognition using Solar Cells. In Proceedings of the 2021 IEEE International Conference on Pervasive Computing and Communications (PerCom), Kassel, Germany, 22–26 March 2021; pp. 1–10. [Google Scholar] [CrossRef]
- Li, Y.; Li, T.; Patel, R.A.; Yang, X.D.; Zhou, X. Self-Powered Gesture Recognition with Ambient Light. In Proceedings of the 31st Annual ACM Symposium on User Interface Software and Technology, Berlin, Germany, 14–17 October 2018; pp. 595–608. [Google Scholar] [CrossRef]
- Venkatnarayan, R.H.; Shahzad, M. Gesture Recognition Using Ambient Light. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2018, 2, 1–28. [Google Scholar] [CrossRef]
- Kaholokula, M. Reusing Ambient Light to Recognize Hand Gestures. Technical Report. 2016. Available online: https://digitalcommons.dartmouth.edu/senior_theses/105/ (accessed on 2 September 2025).
- Shahmiri, F.; Chen, C.; Waghmare, A.; Zhang, D.; Mittal, S.; Zhang, S.L.; Wang, Y.C.; Wang, Z.L.; Starner, T.E.; Abowd, G.D. Serpentine: A Self-Powered Reversibly Deformable Cord Sensor for Human Input. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019; pp. 1–14. [Google Scholar] [CrossRef]
- Zhu, J.; Liu, Z.; Sun, Y.; Yang, Y. SemiGest: Recognizing Hand Gestures via Visible Light Sensing with Fewer Labels. In Proceedings of the 2023 19th International Conference on Mobility, Sensing and Networking (MSN), Nanjing, China, 14–16 December 2023; pp. 837–842. [Google Scholar] [CrossRef]
- Wen, F.; Sun, Z.; He, T.; Shi, Q.; Zhu, M.; Zhang, Z.; Li, L.; Zhang, T.; Lee, C. Machine learning glove using self-powered conductive superhydrophobic triboelectric textile for gesture recognition in VR/AR applications. Adv. Sci. 2020, 7, 2000261. [Google Scholar] [CrossRef]
- Khalid, S.; Raouf, I.; Khan, A.; Kim, N.; Kim, H.S. A review of human-powered energy harvesting for smart electronics: Recent progress and challenges. Int. J. Precis. Eng.-Manuf.-Green Technol. 2019, 6, 821–851. [Google Scholar] [CrossRef]
- Chen, Z.; Gao, F.; Liang, J. Kinetic energy harvesting based sensing and IoT systems: A review. Front. Electron. 2022, 3. [Google Scholar] [CrossRef]
- Ding, Q.; Rasheed, A.; Zhang, H.; Ajmal, S.; Dastgeer, G.; Saidov, K.; Ruzimuradov, O.; Mamatkulov, S.; He, W.; Wang, P. A Coaxial Triboelectric Fiber Sensor for Human Motion Recognition and Rehabilitation via Machine Learning. Nanoenergy Adv. 2024, 4, 355–366. [Google Scholar] [CrossRef]
- Ni, T.; Sun, Z.; Han, M.; Xie, Y.; Lan, G.; Li, Z.; Gu, T.; Xu, W. REHSense: Towards Battery-Free Wireless Sensing via Radio Frequency Energy Harvesting. In Proceedings of the Twenty-Fifth International Symposium on Theory, Algorithmic Foundations, and Protocol Design for Mobile Networks and Mobile Computing, Athens, Greece, 14–17 October 2024; pp. 211–220. [Google Scholar] [CrossRef]
- Alijani, M.; Cock, C.D.; Joseph, W.; Plets, D. Device-Free Visible Light Sensing: A Survey. IEEE Commun. Surv. Tutor. 2025. Early Access. [Google Scholar] [CrossRef]
- Li, T.; An, C.; Tian, Z.; Campbell, A.T.; Zhou, X. Human Sensing Using Visible Light Communication. In Proceedings of the 21st Annual International Conference on Mobile Computing and Networking, Paris, France, 7–11 September 2015; pp. 331–344. [Google Scholar] [CrossRef]
- Li, T.; Liu, Q.; Zhou, X. Practical Human Sensing in the Light. In Proceedings of the 14th Annual International Conference on Mobile Systems, Applications, and Services, Singapore, 26–30 June 2016; pp. 71–84. [Google Scholar] [CrossRef]
- Hu, Q.; Yu, Z.; Wang, Z.; Guo, B.; Chen, C. ViHand: Gesture Recognition with Ambient Light. In Proceedings of the 2019 IEEE SmartWorld, Ubiquitous Intelligence & Computing, Advanced & Trusted Computing, Scalable Computing & Communications, Cloud & Big Data Computing, Internet of People and Smart City Innovation (SmartWorld/SCALCOM/UIC/ATC/CBDCom/IOP/SCI), Leicester, UK, 19–23 August 2019; pp. 468–474. [Google Scholar] [CrossRef]
- Mahmood, S.; Venkatnarayan, R.H.; Shahzad, M. Recognizing Human Gestures Using Ambient Light. In Proceedings of the 2020 IEEE Intl Conf on Dependable, Autonomic and Secure Computing, Intl Conf on Pervasive Intelligence and Computing, Intl Conf on Cloud and Big Data Computing, Intl Conf on Cyber Science and Technology Congress (DASC/PiCom/CBDCom/CyberSciTech), Calgary, AB, Canada, 17–22 August 2020; pp. 420–428. [Google Scholar] [CrossRef]
- Li, T.; Xiong, X.; Xie, Y.; Hito, G.; Yang, X.D.; Zhou, X. Reconstructing Hand Poses Using Visible Light. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2017, 1, 1–20. [Google Scholar] [CrossRef]
- Manabe, H.; Fukumoto, M. Touch sensing by partial shadowing of PV module. In Proceedings of the Adjunct 25th Annual ACM Symposium on User Interface Software and Technology, Cambridge, MA, USA, 7–10 October 2012; pp. 7–8. [Google Scholar] [CrossRef]
- Varshney, A.; Soleiman, A.; Mottola, L.; Voigt, T. Battery-free Visible Light Sensing. In Proceedings of the 4th ACM Workshop on Visible Light Communication Systems (VLCS’17), Snowbird, UT, USA, 16 October 2017; pp. 3–8. [Google Scholar] [CrossRef]
- Ma, D.; Lan, G.; Hassan, M.; Hu, W.; Upama, M.B.; Uddin, A.; Youssef, M. Gesture Recognition with Transparent Solar Cells: A Feasibility Study. In Proceedings of the 12th International Workshop on Wireless Network Testbeds, Experimental Evaluation & Characterization, New Delhi, India, 2 November 2018; pp. 79–88. [Google Scholar] [CrossRef]
- Berman, S.; Stern, H. Sensors for Gesture Recognition Systems. IEEE Trans. Syst. Man Cybern. Part C (Appl. Rev.) 2012, 42, 277–290. [Google Scholar] [CrossRef]
- Vogiatzidakis, P.; Koutsabasis, P. Address and command: Two-handed mid-air interactions with multiple home devices. Int. J. Hum.-Comput. Stud. 2022, 159, 102755. [Google Scholar] [CrossRef]
- Andrei, A.T.; Bilius, L.B.; Vatavu, R.D. Take a Seat, Make a Gesture: Charting User Preferences for On-Chair and From-Chair Gesture Input. In Proceedings of the CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 11–16 May 2024. [Google Scholar] [CrossRef]
- Hosseini, M.; Ihmels, T.; Chen, Z.; Koelle, M.; Müller, H.; Boll, S. Towards a Consensus Gesture Set: A Survey of Mid-Air Gestures in HCI for Maximized Agreement Across Domains. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, Hamburg, Germany, 23–28 April 2023. [Google Scholar] [CrossRef]
- Hosseini, M.; Mueller, H.; Boll, S. Controlling the Rooms: How People Prefer Using Gestures to Control Their Smart Homes. In Proceedings of the CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 11–16 May 2024. [Google Scholar] [CrossRef]
- Vogiatzidakis, P.; Koutsabasis, P. Mid-air gesture control of multiple home devices in spatial augmented reality prototype. Multimodal Technol. Interact. 2020, 4, 61. [Google Scholar] [CrossRef]
- Le Guennec, A.; Malinowski, S.; Tavenard, R. Data Augmentation for Time Series Classification using Convolutional Neural Networks. In Proceedings of the ECML/PKDD Workshop on Advanced Analytics and Learning on Temporal Data, Riva del Garda, Italy, 19–23 September 2016; Available online: https://shs.hal.science/halshs-01357973 (accessed on 2 September 2025).
- Islam, M.Z.; Hossain, M.S.; ul Islam, R.; Andersson, K. Static Hand Gesture Recognition using Convolutional Neural Network with Data Augmentation. In Proceedings of the 2019 Joint 8th International Conference on Informatics, Electronics & Vision (ICIEV) and 2019 3rd International Conference on Imaging, Vision & Pattern Recognition (icIVPR), Spokane, WA, USA, 30 May–2 June 2019; pp. 324–329. [Google Scholar] [CrossRef]
- Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423. [Google Scholar] [CrossRef]
- Chaovalit, P.; Gangopadhyay, A.; Karabatis, G.; Chen, Z. Discrete wavelet transform-based time series analysis and mining. ACM Comput. Surv. 2011, 43, 1–37. [Google Scholar] [CrossRef]
- Tallarida, R.J.; Murray, R.B. Chi-square test. In Manual of Pharmacologic Calculations: With Computer Programs; Springer: New York, NY, USA, 1987; pp. 140–142. [Google Scholar] [CrossRef]
- Ross, A.; Willson, V.L.; Ross, A.; Willson, V.L. Paired samples T-test. In Basic and Advanced Statistical Tests: Writing Results Sections and Creating Tables and Figures; SensePublishers: Rotterdam, The Netherlands, 2017; pp. 17–19. [Google Scholar] [CrossRef]
- Gałecki, A.; Burzykowski, T. Linear mixed-effects model. In Linear Mixed-Effects Models Using R: A Step-by-Step Approach; Springer: Berlin/Heidelberg, Germany, 2012; pp. 245–273. [Google Scholar] [CrossRef]
- Villanueva, M.; Drögehorn, O. Using Gestures to Interact with Home Automation Systems: A Socio-Technical Study on Motion Capture Technologies for Smart Homes. In Proceedings of the SEEDS International Conference-Sustainable, Ecological, Engineering Design for Society; 2018. Available online: https://hal.science/hal-02179898/ (accessed on 2 September 2025).
- Schrepp, M.; Hinderks, A.; Thomaschewski, J. Construction of a benchmark for the user experience questionnaire (UEQ). Int. J. Interact. Multimed. Artif. Intell. 2017, 4, 40–44. [Google Scholar] [CrossRef]
- Schrepp, M. User experience questionnaire handbook. In All You Need to Know to Apply the UEQ Successfully in Your Project; 2015; Volume 10, Available online: https://www.ueq-online.org/Material/Handbook.pdf (accessed on 2 September 2025).
- Gentile, V.; Adjorlu, A.; Serafin, S.; Rocchesso, D.; Sorce, S. Touch or touchless? evaluating usability of interactive displays for persons with autistic spectrum disorders. In Proceedings of the 8th ACM International Symposium on Pervasive Displays, Palermo, Italy, 12–14 June 2019. [Google Scholar] [CrossRef]
- Khamis, M.; Trotter, L.; Mäkelä, V.; Zezschwitz, E.v.; Le, J.; Bulling, A.; Alt, F. CueAuth: Comparing Touch, Mid-Air Gestures, and Gaze for Cue-based Authentication on Situated Displays. In ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies; Association for Computing Machinery: New York, NY, USA, 2018; Volume 2. [Google Scholar] [CrossRef]
- Guhr, N.; Werth, O.; Blacha, P.P.H.; Breitner, M.H. Privacy concerns in the smart home context. Appl. Sci. 2020, 2, 1–12. [Google Scholar] [CrossRef]
- Gomes, A.; Priyadarshana, L.L.; Visser, A.; Carrascal, J.P.; Vertegaal, R. Magicscroll: A rollable display device with flexible screen real estate and gestural input. In Proceedings of the 20th International Conference on Human-Computer Interaction with Mobile Devices and Services, Barcelona, Spain, 3–6 September 2018. [Google Scholar] [CrossRef]
Participant | Light Intensity During Data Collection (lux) | Light Intensity During Real-Time Testing (lux) | Offline Accuracy (%) | Real-Time Accuracy (%) |
---|---|---|---|---|
P1 | 1270 | 1075 | 83.64 | 82 |
P2 | 1966 | 1854 | 87.27 | 100 |
P3 | 1788 | 1495 | 67.36 | 64 |
P4 | 1430 | 1240 | 74.55 | 91 |
P5 | 1566 | 1780 | 85.45 | 73 |
P6 | 1725 | 1836 | 81.82 | 100 |
P7 | 1522 | 1231 | 69.09 | 73 |
P8 | 1224 | 1390 | 87.27 | 100 |
P9 | 1513 | 1400 | 89.09 | 100 |
P10 | 1304 | 1152 | 85.45 | 91 |
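As an illustrative sketch only, the per-participant accuracies in the table above can be summarized and compared with a paired-samples t-test (the test cited in the References). This is not the authors' analysis code; NumPy and SciPy are assumed to be available.

```python
# Illustrative sketch: summarize the per-participant accuracies from the table
# above and compare offline vs. real-time accuracy with a paired-samples t-test.
# Not the authors' analysis code; only shows how such a comparison could be run.
import numpy as np
from scipy.stats import ttest_rel

offline = np.array([83.64, 87.27, 67.36, 74.55, 85.45,
                    81.82, 69.09, 87.27, 89.09, 85.45])            # P1-P10, offline (%)
realtime = np.array([82, 100, 64, 91, 73,
                     100, 73, 100, 100, 91], dtype=float)          # P1-P10, real-time (%)

print(f"mean offline accuracy:   {offline.mean():.2f}%")
print(f"mean real-time accuracy: {realtime.mean():.2f}%")
t_stat, p_value = ttest_rel(realtime, offline)                     # paired-samples t-test
print(f"paired t-test: t = {t_stat:.3f}, p = {p_value:.3f}")
```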
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).