Preventive healthcare has received considerable attention for its role in managing chronic diseases such as diabetes, stroke, obesity, and cancer. Various automated systems are used for activity and food recognition in preventive healthcare, but existing systems lack sophisticated segmentation techniques and rely on multiple sensors, which are inconvenient to wear in real-life settings. To monitor activity and food together, our work presents a novel wearable system that employs the motion sensors in a smartwatch together with a piezoelectric sensor embedded in a necklace. The motion sensor generates distinct patterns for eight different physical activities, including eating. The piezoelectric sensor generates different signal patterns for six food types, as the ingestion of each food differs from the others owing to characteristics such as hardness, crunchiness, and tackiness. For effective representation of the signal patterns of the activities and foods, we employ dynamic segmentation. A novel algorithm called event similarity search (ESS) is developed to choose a segment of dynamic length, which represents signal patterns of different complexities equally well. Amplitude-based features and spectrogram-generated images from the activity and food segments are fed to convolutional neural network (CNN)-based activity and food recognition networks, respectively. Extensive experimentation showed that the proposed system outperforms state-of-the-art methods, recognizing eight activity types and six food categories with accuracies of 94.3% and 91.9% using a support vector machine (SVM) and a CNN, respectively.
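The abstract mentions feeding spectrogram-generated images from signal segments into a CNN. As a minimal illustration of that preprocessing step, the sketch below computes a magnitude spectrogram of a 1-D sensor signal with a sliding-window FFT; the window length, hop size, Hann window, and sampling rate are illustrative assumptions, not parameters taken from the paper.

```python
import numpy as np

def spectrogram(signal, win_len=64, hop=32):
    """Magnitude spectrogram of a 1-D sensor signal via a sliding-window FFT.

    win_len, hop, and the Hann window are illustrative choices; the paper's
    actual spectrogram parameters are not specified in the abstract.
    """
    window = np.hanning(win_len)
    n_frames = 1 + (len(signal) - win_len) // hop
    # Slice the signal into overlapping, windowed frames.
    frames = np.stack([signal[i * hop : i * hop + win_len] * window
                       for i in range(n_frames)])
    # Keep only the non-negative frequency bins and take magnitudes.
    return np.abs(np.fft.rfft(frames, axis=1))

# Example: a synthetic 1-second decaying 50 Hz burst sampled at 1 kHz,
# standing in for a chewing or motion-sensor segment.
t = np.linspace(0, 1, 1000, endpoint=False)
sig = np.sin(2 * np.pi * 50 * t) * np.exp(-3 * t)
spec = spectrogram(sig)
print(spec.shape)  # (n_frames, win_len // 2 + 1) → (30, 33)
```

In the paper's pipeline, such a 2-D time-frequency array would then be rendered as an image and passed to the food-recognition CNN.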
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.