Automatic fall detection is a very active research area, which has grown rapidly since the 2010s, with a particular focus on elderly care. Rapid detection of falls enables prompt attention to the injured person, reducing a series of negative consequences for the health of the elderly. Currently, there are several fall detection systems (FDSs), mostly based on predictive and machine-learning approaches. These algorithms rely on different data sources, such as wearable devices, ambient sensors, or vision/camera-based approaches. While wearable devices like inertial measurement units (IMUs) and smartphones entail a dependence on their continuous use, most image-based devices like Kinect sensors generate video recordings, which may compromise the privacy of the user. Regardless of the device used, most of these FDSs have been tested only in controlled laboratory environments, and there is still no mass-market commercial FDS. This is partly due to the impossibility of obtaining, for ethical reasons, datasets generated by real falls of older adults. All public datasets generated in laboratories are based on falls simulated by young people, without considering the differences in acceleration and falling characteristics of older adults. Given the above, this article presents the eHomeSeniors dataset, a new public dataset that is innovative in at least three aspects: first, it collects data from two different privacy-friendly infrared thermal sensors; second, it is built with two types of volunteers: young people (as is usual) and performing artists, with the latter group assisted by a physiotherapist to emulate the real fall conditions of older adults; and third, the types of falls selected are the result of a thorough literature review.
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.