In sensory evaluation, there have been many attempts to obtain responses from the autonomic nervous system (ANS) by analyzing heart rate, body temperature, and facial expressions. However, these methods tend to be intrusive, which interferes with consumers’ responses because participants become more aware of being measured. Furthermore, existing methods measure the different ANS responses independently, so the signals are not synchronized with one another. This paper discusses the development of an integrated camera system, paired with an Android PC application, to assess sensory evaluation and biometric responses simultaneously in the Cloud. Biometrics such as heart rate, blood pressure, facial expressions, and skin-temperature changes are obtained from video and thermal images acquired by the integrated system and analyzed through computer vision algorithms written in Matlab® and FaceReader™. All results can be analyzed through customized codes for multivariate data analysis based on principal component analysis and cluster analysis. The data collected can also be used for machine-learning modeling, with biometrics as inputs and self-reported data as targets. Previous studies using this integrated camera and analysis system have shown it to be a reliable, accurate, and convenient technique to complement traditional sensory analysis of both food and nonfood products, obtaining more information from consumers and/or trained panelists.
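The multivariate workflow described above (principal component analysis followed by cluster analysis of biometric measures) can be illustrated with a minimal sketch. This is not the authors' Matlab® code; it is a hypothetical Python example with simulated data, where the feature names, matrix shape, and choice of k-means with k = 2 are all assumptions for illustration only.

```python
# Hypothetical sketch: PCA followed by k-means clustering on a simulated
# participants-by-biometrics matrix (e.g., heart rate, blood pressure,
# skin temperature, facial-expression score). All data here are synthetic.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 4))  # 20 participants x 4 biometric measures

# Standardize each column, then compute principal components via SVD.
Xc = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T[:, :2]            # projections onto first two PCs
explained = s**2 / np.sum(s**2)      # fraction of variance per component

# Simple k-means (k = 2, assumed for illustration) on the PCA scores.
k = 2
centers = scores[rng.choice(len(scores), size=k, replace=False)]
for _ in range(50):
    dists = ((scores[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    labels = np.argmin(dists, axis=1)
    for j in range(k):
        if np.any(labels == j):      # keep old center if a cluster empties
            centers[j] = scores[labels == j].mean(axis=0)

print(scores.shape, sorted(set(labels.tolist())))
```

In practice, the PCA scores and loadings would be inspected to see which biometric measures drive the separation between samples, and the cluster labels would be compared against self-reported liking scores.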
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.