Automated Remote Detection of Falls Using Direct Reconstruction of Optical Flow Principal Motion Parameters
Abstract
Highlights
- Falls can be reliably detected using optical flow video processing algorithms.
- Real-time performance is enhanced by direct reconstruction of principal motion parameters (see the illustrative sketch after this list).
- The proposed algorithm allows for modular integration into existing patient care observation systems.
- It provides non-obstructive, maintenance-free, and privacy-respecting tools for safety.
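As a loose illustration of what "principal motion parameters" of an optical-flow field can look like, and not a reproduction of the authors' direct-reconstruction scheme or of GLORIA Net, the sketch below computes a dense Farnebäck flow with OpenCV and fits global translation, dilation, and rotation rates to it by ordinary least squares. The function name `global_motion_parameters` and all parameter values are arbitrary choices for this example.

```python
# Minimal sketch: dense optical flow + least-squares fit of global motion-group
# parameters (translation, dilation, rotation). Illustrative only; this is NOT
# the direct reconstruction method proposed in the paper.
import cv2
import numpy as np

def global_motion_parameters(prev_gray, next_gray):
    """Fit global translation, dilation and rotation rates to a dense flow field."""
    # Dense Farneback optical flow between two consecutive grayscale frames.
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, next_gray, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)

    h, w = prev_gray.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    xc = (xs - w / 2.0).ravel()   # x coordinate relative to image centre
    yc = (ys - h / 2.0).ravel()   # y coordinate relative to image centre

    # Similarity-motion model: v(r) = t + s*r + omega*perp(r), with perp(x, y) = (-y, x).
    # Full-resolution grid is used here for clarity; in practice one would subsample.
    ones = np.ones_like(xc)
    zeros = np.zeros_like(xc)
    A = np.block([
        [ones[:, None],  zeros[:, None], xc[:, None], -yc[:, None]],   # x-component equations
        [zeros[:, None], ones[:, None],  yc[:, None],  xc[:, None]],   # y-component equations
    ])
    b = np.concatenate([flow[..., 0].ravel(), flow[..., 1].ravel()])
    (tx, ty, dilation, rotation), *_ = np.linalg.lstsq(A, b, rcond=None)
    return tx, ty, dilation, rotation
```

A frame-by-frame time series of such parameters is the kind of compact motion descriptor that a temporal classifier such as an LSTM (Section 2.2) could consume; the actual features used in the paper are defined by the authors' method.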
1. Introduction
1.1. Motivation
1.2. Related Works
1.3. Organization
2. Materials and Methods
2.1. GLORIA Net
2.2. LSTM
2.3. Datasets
2.3.1. UP Fall Detection Dataset
2.3.2. LE2I Video Dataset
2.3.3. UR Fall Detection Dataset
2.4. Evaluation Parameters
- True positives (TP)—number of times we have correctly predicted an event as positive (e.g., our model predicts a fall, and a fall did indeed occur);
- False positives (FP)—number of times we have incorrectly predicted an event as positive (Type I error; e.g., our model predicts a fall, but no fall occurred);
- True negatives (TN)—number of times we have correctly predicted an event as negative (e.g., our model predicts no fall, and no fall occurred);
- False negatives (FN)—number of times we have incorrectly predicted an event as negative (Type II error; e.g., our model predicts no fall, but a fall did occur). The metrics reported in Section 3 are derived from these four counts, as sketched below.
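The performance measures reported in Section 3 (accuracy, sensitivity, specificity, precision, and F1-score) all follow from these four counts via the standard definitions. Below is a minimal sketch in plain Python; the function name `classification_metrics` and the example counts are hypothetical and only illustrate the formulas.

```python
# Minimal sketch: standard classification metrics from confusion-matrix counts.
# Assumes every denominator is non-zero (i.e., both classes occur in the test set).

def classification_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Compute the metrics reported in Section 3 from TP, FP, TN, FN counts."""
    accuracy    = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)          # recall / true-positive rate
    specificity = tn / (tn + fp)          # true-negative rate
    precision   = tp / (tp + fp)
    f1_score    = 2 * precision * sensitivity / (precision + sensitivity)
    return {
        "accuracy": accuracy,
        "sensitivity": sensitivity,
        "specificity": specificity,
        "precision": precision,
        "f1_score": f1_score,
    }

# Example: 10 fall and 10 non-fall clips, one missed fall and no false alarms.
print(classification_metrics(tp=9, fp=0, tn=10, fn=1))
```

The F1-score is the harmonic mean of precision and sensitivity, which is why it can differ noticeably from accuracy on imbalanced test sets.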
3. Results
3.1. GLORIA Net Results
3.2. Comparison in Processing Times
4. Discussion and Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
Abbreviation | Meaning |
---|---|
OF | Optical Flow |
SVM | Support Vector Machine |
CNN | Convolutional Neural Network |
ROC | Receiver Operating Characteristic |
AUC | Area Under the Curve |
FPS | Frames Per Second |
References
Dataset | Accuracy | Sensitivity | Specificity | Precision | F1-Score |
---|---|---|---|---|---|
UP-Fall + LE2I | 97.7% | 98.1% | 96.9% | 97.0% | 98.0% |
UR | 83.3% | 83.3% | 83.3% | 83.3% | 83.3% |

Dataset | Accuracy | Sensitivity | Specificity | Precision | F1-Score |
---|---|---|---|---|---|
UR (+LSTM) | 91.7% (↑ 8.4 pp) | 83.3% | 100.0% (↑ 16.7 pp) | 100.0% (↑ 16.7 pp) | 90.9% (↑ 7.6 pp) |
UR (no LSTM) | 83.3% | 83.3% | 83.3% | 83.3% | 83.3% |

Values in parentheses are absolute gains over the no-LSTM baseline in percentage points (pp), e.g., 91.7% − 83.3% = 8.4 pp.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Karpuzov, S.; Kalitzin, S.; Georgieva, O.; Trifonov, A.; Stoyanov, T.; Petkov, G. Automated Remote Detection of Falls Using Direct Reconstruction of Optical Flow Principal Motion Parameters. Sensors 2025, 25, 5678. https://doi.org/10.3390/s25185678