Review

Integrating Cutting-Edge Technologies in Food Sensory and Consumer Science: Applications and Future Directions

Department of Food Science and Nutrition, Dankook University, Cheonan-si 31116, Republic of Korea
*
Author to whom correspondence should be addressed.
Foods 2025, 14(24), 4169; https://doi.org/10.3390/foods14244169
Submission received: 30 October 2025 / Revised: 24 November 2025 / Accepted: 3 December 2025 / Published: 5 December 2025

Abstract

With the introduction of emerging digital technologies, sensory and consumer science has evolved beyond traditional laboratory-based and self-response-centered sensory evaluations toward more objective assessments that reflect real-world consumption contexts. This review examines recent trends and potential applications in sensory evaluation research focusing on key enabling technologies—artificial intelligence (AI) and machine learning (ML), extended reality (XR), biometrics, and digital sensors. Furthermore, it explores strategies for establishing personalized, multimodal, and intelligent–adaptive sensory evaluation systems through the integration of these technologies, as well as the applicability of sensory evaluation software. Recent studies report that AI/ML models used for sensory or preference prediction commonly achieve RMSE values of approximately 0.04–24.698, with prediction accuracy ranging from 79 to 100% (R2 = 0.643–0.999). In XR environments, presence measured by the IPQ (7-point scale) is generally considered adequate when scores exceed 3. Finally, the review discusses ethical considerations arising throughout data collection, interpretation, and utilization processes and proposes future directions for the advancement of sensory and consumer science research. This systematic literature review aims to identify emerging technologies rather than provide a quantitative meta-analysis and therefore does not cover domain-specific analytical areas such as chemometrics beyond ML approaches or detailed flavor and aroma chemistry.

1. Introduction

Sensory evaluation is a scientific discipline that employs assessors’ five senses to identify product attributes and to measure and interpret foods both qualitatively and quantitatively [1]. This approach has evolved through the integration of psychology, physiology, chemistry, physics, and statistics, and has been systematically implemented based on standardized procedures and consistent protocols to elucidate the interactions between stimuli and human responses, as well as to predict consumer preferences and behaviors [1,2]. However, because it relies on assessors’ subjective perception, the results may vary depending on inter-individual differences, cultural background, and affective state, thereby introducing potential cognitive biases [2,3]. In addition, the recruitment, training, and management of panels for descriptive analysis require considerable time and financial resources, imposing practical constraints on the widespread implementation of sensory evaluation [3]. Consequently, in real industrial and research settings, sensory evaluation often fails to achieve adequate levels of prediction accuracy, ecological validity, and throughput.
To overcome these limitations, researchers in sensory science have explored novel approaches, and the recent incorporation of digital technologies has demonstrated considerable potential to greatly enhance the objectivity, efficiency, and scalability of sensory evaluation processes [4]. In particular, artificial intelligence (AI) and machine learning (ML), biosignal measurement technologies, extended reality (XR), and digital sensing and automation systems are being actively applied in sensory and consumer research. AI and ML enable the processing and analysis of large-scale datasets, thereby refining and enhancing the prediction of consumer preferences and behaviors, while biosignal measurement techniques—such as electromyography (EMG), electroencephalography (EEG), eye tracking, skin conductance (SC), heart-rate variability (HRV), and facial-expression analysis (FEA)—capture implicit physiological responses, thereby compensating for the limitations of subjective self-report measures [5,6]. Furthermore, XR technologies including virtual reality (VR), augmented reality (AR), and mixed reality (MR) simulate realistic consumption contexts to enhance the ecological validity of sensory evaluations, whereas digital sensing and automation technologies such as the Internet of Things (IoT), robotics, electronic tongues (E-tongue), and electronic noses (E-nose) improve the precision and efficiency of data acquisition and analysis [7,8]. When used synergistically, these digital technologies can effectively address the intrinsic limitations of conventional sensory evaluation and significantly broaden its research and industrial applicability. Nevertheless, systematic reviews that synthesize sufficient evidence on the extent to which individual technologies improve specific aspects of sensory evaluation, and that clearly delineate their scope of application and limitations, remain scarce.
Therefore, it is necessary to systematically examine, for each digital technology currently used in sensory evaluation, whether adequate evidence has accumulated regarding its ability to enhance key indicators—such as prediction accuracy, ecological validity, and throughput—and to clarify its practical application range and constraints.
The purpose of this review is to provide a comprehensive overview of the development and significance of emerging technologies applied in sensory evaluation. Specifically, it aims to critically synthesize the evidence on how these technologies improve prediction accuracy, ecological validity, and throughput in sensory evaluation. To this end, this review seeks to address the following key questions.
(i)
Which major digital technologies (e.g., AI, XR, biometrics, digital sensors) are currently used in sensory and consumer evaluation, and how have they been applied in sensory studies?
(ii)
How do these technologies differ in their structural and functional characteristics, and in what ways do they improve key indicators in research and practice, such as predictive power, validity, and efficiency?
(iii)
How can the individual and integrated use of these technologies, together with sensory software, enhance resource-management efficiency and evaluation objectivity, and what ethical and practical issues arise in this process?
In this review, we synthesize recent research trends on AI, XR, biosignal measurement, and digital sensing technologies, and critically discuss the new possibilities these technologies offer for data analysis and interpretation. In addition to highlighting the roles of individual technologies, this review emphasizes the importance of their integrated application, examines the potential of sensory software, and explores strategies to enhance both efficiency in resource management and objectivity in evaluation. Moreover, it addresses ethical considerations associated with the adoption of these technologies and proposes future directions for the advancement of sensory science. Figure 1 schematically illustrates the overall conceptual framework of this review.

2. Methodology

In this study, we conducted a systematic literature review to synthesize and critically evaluate previous work applying cutting-edge digital technologies to sensory evaluation, consumer research, and related food studies. We searched major databases (Scopus, PubMed, Google Scholar) for articles published between 2015 and 2025 using combinations of sensory- and consumer-related terms (e.g., “sensory evaluation”, “consumer test”, “food”) and technology-related keywords (e.g., “artificial intelligence”, “machine learning”, “extended reality”, “biometrics”, “digital sensor”, “internet of things”). Targeted searches based on key terms and seminal papers identified in the initial search were additionally performed, and the reference lists of preliminarily eligible articles were screened to capture further relevant studies.
After removing duplicate records, studies were included if they (i) involved food or beverage products or food-related sensory stimuli, (ii) reported primary outcomes on sensory responses, liking, choice behavior, or food-quality monitoring and management, and (iii) incorporated at least one of the following technologies into the study design: AI, ML, XR, biometric measurements, digital sensors, or the IoT. Studies were excluded when the methodology was insufficiently described, the article was not written in English, or full-text access was not available, preventing adequate evaluation.
For the studies that met the inclusion criteria, we used standardized extraction forms to collect information on the study context, food category, technology applied, main objectives and outcomes, and implications for sensory and consumer science, and organized these data into tables by technology type. Based on this dataset, we compared research aims and key findings across technology categories, examined the potential for integrated use of these tools in sensory, consumer, and food-quality research, and evaluated related software tools as well as the methodological and ethical issues, advantages, limitations, and application domains of each technology, thereby identifying major gaps that warrant further investigation.

3. Advances in Technologies for Sensory and Consumer Sciences

3.1. Artificial Intelligence and Machine Learning

Artificial intelligence collects vast amounts of data and learns complex patterns through processes of rule learning and generalization [9,10]. In consumer sensory science, AI functions as a powerful analytical tool that can predict food taste and elucidate the interaction mechanisms between consumer preferences and the sensory attributes of foods [9,10]. As the food industry continues to advance and large volumes of product-related data are accumulated throughout food quality development processes, the necessity and significance of applying AI for food quality prediction have become increasingly recognized [11,12].
In recent years, research employing AI–based ML techniques has grown rapidly. ML has been utilized not only for predicting food quality but also for elucidating the correlations between consumer preferences and product sensory characteristics, as well as for developing predictive models that can complement or even substitute for human sensory evaluations. Moreover, with the increasing prevalence of natural language processing (NLP)–based large language models (LLMs), several studies have demonstrated their potential applications in product development through the analysis of consumer-generated text reviews.
More recently, the integration of molecular docking techniques with ML has been actively explored for the identification of peptide candidates capable of mimicking or replacing specific taste sensations. Therefore, these advances suggest that AI is becoming an indispensable tool in consumer sensory science. Figure 2 presents the application mechanisms of AI and ML models in this field.

3.1.1. Machine Learning Approaches for Sensory and Quality Prediction

Machine learning, a subfield of AI, learns highly complex patterns within large datasets and identifies relationships among data through training and testing processes. Owing to its high predictive accuracy, ML has become a powerful alternative to traditional predictive models, such as partial least squares (PLS) regression and Bayesian statistics, in the field of consumer sensory science [13,14]. Recently, a variety of ML algorithms—including Support Vector Regression (SVR), Support Vector Machine (SVM), Extreme Gradient Boosting (XGBoost), Random Forest (RF), K-Nearest Neighbors (KNN), Artificial Neural Networks (ANNs), and Deep Learning (DL)—have been applied to predict and classify sensory attributes and food taste [11,15,16]. Among these, SVM and SVR utilize kernel functions to process both linear and nonlinear data. SVM has shown excellent performance in classification problems involving small-sample and high-dimensional pattern recognition, while its extended variant, SVR, also demonstrates strong capability in regression analysis [17,18].
Random Forest, a bagging-based learning model, predicts complex patterns by training multiple decision trees and effectively prevents overfitting, thereby exhibiting excellent performance in classification tasks. In contrast, XGBoost, an ensemble algorithm that combines tree-based learning with gradient boosting, is widely applied to both classification and regression problems and is recognized for its relatively high predictive accuracy [14,18,19]. KNN, an instance-based learning algorithm, effectively captures data similarity without the need for complex training processes and can be utilized for both classification and regression. Meanwhile, ANN, with its multilayered neural architecture, is capable of learning nonlinear and complex data patterns, making it suitable for pattern recognition, classification, and regression tasks [18]. DL, a subfield of ML, is an AI technique that automatically learns key features from large-scale datasets and achieves high accuracy in data classification and prediction [11,15].
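To make the instance-based idea behind KNN concrete, the following minimal Python sketch predicts a consumer liking score for a new product as the mean score of its k most similar training samples. The feature values (panel-rated basic-taste intensities) and liking scores are hypothetical illustrations, not data from the studies cited above.

```python
import numpy as np

def knn_predict(X_train, y_train, x_new, k=3):
    """Predict a liking score for x_new as the mean score of its
    k nearest training samples (Euclidean distance in attribute space)."""
    dists = np.linalg.norm(X_train - x_new, axis=1)
    nearest = np.argsort(dists)[:k]
    return float(np.mean(y_train[nearest]))

# Hypothetical data: each row = [sweetness, acidity, bitterness] intensities
# from a trained panel; y = mean consumer liking on a 9-point hedonic scale.
X = np.array([[6.1, 2.0, 1.2],
              [5.8, 2.3, 1.0],
              [2.1, 5.5, 3.8],
              [2.4, 5.1, 4.0],
              [4.0, 3.5, 2.0]])
y = np.array([7.5, 7.2, 3.1, 3.4, 5.6])

# A sweet, low-acid prototype lands near the first two samples.
print(round(knn_predict(X, y, np.array([6.0, 2.1, 1.1]), k=2), 2))  # → 7.35
```

Because prediction requires no training phase, this approach mirrors the "no complex training process" property of KNN noted above, at the cost of scanning the full dataset at query time.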
Table 1 summarizes the key findings of previous ML-based studies across various food categories and the ML algorithms employed. It also presents the data size used for model development, the risk of overfitting, and the validation protocols adopted in each study.
As shown in Table 1, ML technologies have consistently demonstrated high predictive accuracy across various food categories, reinforcing their growing applicability in modern sensory evaluation research. By providing objective and rapid insights from large-scale datasets, ML serves as a promising complement to traditional sensory assessment. Recent studies particularly highlight the effectiveness of hybrid ML models and diverse validation protocols, which help reduce overfitting and enhance model stability as well as classification and prediction performance [14]. Consequently, these findings indicate that ML is rapidly expanding its role across the food industry and is becoming a key technological component in sensory and consumer research.

3.1.2. Text Mining-Based Natural Language Processing and Large Language Models

In food choice behavior, taste serves as a primary motivational factor for consumers; however, identifying the taste of foods and the degree of liking from consumers’ language is hindered by unstructured and ambiguous expressions, making it difficult to achieve a systematic understanding of taste through conventional consumer surveys [24]. Therefore, for food companies to discover new trends and gain a competitive advantage, it has become essential to collect vast amounts of consumer-related data through web-based platforms [25]. As one of the efficient and rapid AI-based approaches for this purpose, text mining and NLP have emerged as key trends in sensory science in recent years [25]. NLP, a subfield of text mining, is grounded in AI, computer science, and linguistics [24,26]. It represents a branch of AI that systematically analyzes language through text preprocessing, syntactic structures, and semantic analysis, enabling the interpretation of consumers’ free-form or ambiguous textual expressions [24,26]. LLMs are DL-based models trained on massive amounts of natural language text and images to generate human-like text outputs, with representative examples including GPT-4, ChatGPT-5.1, BERT, Claude, and Gemini [27,28,29]. Table 2 summarizes the key findings of previous studies that employed NLP-based LLMs, as well as the AI and ML algorithms utilized in those studies.
Recently, in the field of sensory evaluation, data mining for recipe development and consumer feedback analysis have been applied to food flavor characterization and product development [24,26,30]. In particular, analyses focusing on consumers’ emotions not only provide insights into product perception but also enable the prediction of consumer preferences, demonstrating increasing applicability and expansion in sensory and consumer research [24,26,30]. In the previous studies summarized in Table 2, the adoption of text-mining-based NLP and LLM approaches demonstrated potential for consumer-driven data analysis; however, their predictive accuracy remained somewhat lower compared to traditional ML models. Therefore, future studies should focus on developing methodological approaches that enhance the reliability and robustness of these models to facilitate their practical implementation and commercialization.
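As a deliberately simplified illustration of text-mining-based consumer feedback analysis, the sketch below scores toy reviews against a small hand-made sentiment lexicon after basic lowercasing and tokenization. The word lists and reviews are hypothetical; the studies discussed above use trained NLP or LLM models rather than a fixed lexicon.

```python
import re
from collections import Counter

# Hypothetical mini-lexicon; a stand-in for learned sentiment models.
POSITIVE = {"creamy", "fresh", "delicious", "smooth", "rich"}
NEGATIVE = {"bitter", "stale", "bland", "watery", "artificial"}

def sentiment_score(review: str) -> int:
    """Return positive-minus-negative lexicon hits after simple
    preprocessing (lowercasing + alphabetic tokenization)."""
    tokens = Counter(re.findall(r"[a-z]+", review.lower()))
    pos = sum(tokens[w] for w in POSITIVE)
    neg = sum(tokens[w] for w in NEGATIVE)
    return pos - neg

reviews = [
    "Creamy and fresh, really delicious yogurt.",
    "Bland and watery, with a bitter aftertaste.",
]
print([sentiment_score(r) for r in reviews])  # → [3, -3]
```

Even this crude pipeline shows why preprocessing choices (tokenization, normalization) shape downstream sentiment estimates, a concern that carries over to full NLP and LLM systems.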

3.1.3. Molecular Dynamics Simulation

Molecular Dynamics simulations play a crucial role in explaining the fundamental molecular mechanisms and the interactions between peptides and enzymes, while molecular docking serves as a molecular simulation technique that elucidates interaction mechanisms by calculating the binding sites between peptides and receptors based on receptor structures [16,33]. Traditional methods for screening peptides that contribute to specific taste sensations—such as sensory evaluation combined with filtration or chromatography techniques like High Performance Liquid Chromatography—are often time-consuming and costly [34,35]. Therefore, recent studies in the field of sensory evaluation have actively employed approaches that integrate peptide extraction from food sources with ML-based peptide screening and molecular docking to identify peptides contributing to specific taste perceptions. Table 3 summarizes the input data and ML models utilized in previous studies related to Molecular Dynamics (MD) simulations.
The previous studies summarized in Table 3 demonstrate that integrating ML with MD simulations enables rapid processing of complex peptide-screening procedures that traditionally required substantial time and effort, thereby accelerating the commercialization of bioactive or taste-contributing peptides. Moreover, ML-based MD simulation research in sensory evaluation has been increasingly active in recent years, and the scope of molecular-level investigations in sensory science is expected to further expand in the future.

3.1.4. Limitations of Artificial Intelligence and Machine Learning for Sensory and Consumer Science

Artificial intelligence technologies are emerging as an efficient alternative to traditional sensory evaluation methods. In particular, the use of ML has been progressively expanding across the food industry in recent years, including applications in MD simulations and molecular docking studies. However, the utilization of AI in this field still faces several challenges and limitations.
In ML models, as dataset size increases, issues such as overfitting may arise, which can diminish model accuracy [39]. To address these limitations, challenges related to data cleaning, missing values, class imbalance, data leakage, and noise must be resolved [40,41,42,43]. This requires appropriate data preprocessing procedures, including rebalancing, data transformation, normalization, and outlier removal [40,41,42,43]. Furthermore, a systematic validation protocol is essential for evaluating model performance and reliability. In the study by [44], external validation was performed to assess ML performance. The results showed that the model’s predictions aligned with actual human sensory evaluation outcomes within an acceptable error range, underscoring the importance of external validation in ensuring the reliability and stability of ML models [44].
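The preprocessing and validation principles above can be sketched as follows: a k-fold cross-validation loop in which normalization statistics are fitted on the training folds only, so that no test-fold information leaks into the model. The synthetic data and ordinary-least-squares model are illustrative assumptions, not methods from any cited study.

```python
import numpy as np

def kfold_rmse(X, y, k=5, seed=0):
    """Estimate generalization error via k-fold cross-validation with an
    ordinary least-squares model, z-scoring features per training fold."""
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(y)), k)
    errors = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        # Fit normalization on TRAIN statistics only (avoids data leakage).
        mu, sd = X[train].mean(axis=0), X[train].std(axis=0) + 1e-9
        Xtr, Xte = (X[train] - mu) / sd, (X[test] - mu) / sd
        coef, *_ = np.linalg.lstsq(np.c_[np.ones(len(Xtr)), Xtr],
                                   y[train], rcond=None)
        pred = np.c_[np.ones(len(Xte)), Xte] @ coef
        errors.append(np.sqrt(np.mean((pred - y[test]) ** 2)))
    return float(np.mean(errors))

# Synthetic example: 3 instrumental measurements predicting a sensory score.
rng = np.random.default_rng(1)
X = rng.normal(size=(60, 3))
y = X @ np.array([1.0, -0.5, 0.3]) + rng.normal(scale=0.1, size=60)
print(round(kfold_rmse(X, y), 3))
```

For a well-specified model, the cross-validated RMSE should approach the noise level of the synthetic data, whereas a large gap between training and cross-validated error would signal the overfitting discussed above.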
Additionally, real-world data—including consumer liking data—can experience domain shift and drift over time due to changes in data patterns and environmental conditions [39]. These shifts can degrade a model’s predictive performance, making regular performance monitoring and periodic recalibration essential for maintaining generalization capability, stability, and high predictive accuracy [39].
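One minimal way to operationalize the monitoring described above is a rolling-error check that flags a model for recalibration when its recent prediction error drifts well above the error observed at validation time. The function name, window size, and threshold factor below are illustrative assumptions.

```python
def needs_recalibration(errors, window=5, baseline_rmse=0.5, factor=1.5):
    """Flag drift when the rolling RMSE over the most recent `window`
    prediction errors exceeds `factor` x the validation-time baseline."""
    recent = errors[-window:]
    rolling_rmse = (sum(e * e for e in recent) / len(recent)) ** 0.5
    return rolling_rmse > factor * baseline_rmse

stable = [0.4, -0.3, 0.5, -0.2, 0.35]      # errors near the baseline
drifted = [1.1, -0.9, 1.3, 1.0, -1.2]      # errors after a domain shift
print(needs_recalibration(stable))   # → False
print(needs_recalibration(drifted))  # → True
```

In practice, such a trigger would initiate retraining on recent data; the appropriate window and threshold depend on how quickly consumer preferences and data-collection conditions change.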
Finally, when data collection relies solely on a single method, the classification and prediction accuracy of ML models may decrease. To overcome these limitations, the integration of diverse data sources—including electronic sensing devices (E-nose, E-tongue), spectroscopic methods, human sensory evaluation, and physicochemical indices—is essential. Moreover, combining multiple predictive models rather than relying on a single ML model can further enhance the reliability and robustness of the analytical outcomes.
In recent studies on text mining–based sensory evaluation technologies, ref. [31] reported that ChatGPT provided positive evaluations even for hypothetical products formulated with ingredients that may evoke somewhat negative consumer reactions. This finding suggests that LLMs may interpret product attributes differently from actual consumer perceptions, highlighting the necessity of human-based validation to ensure the reliability of such interpretations. Furthermore, ref. [31] emphasized the importance of consistency in the prompt environment and the standardization of evaluation procedures. This indicates that prompts should be structured in a uniform and consistent manner, and that clear instructions are essential for obtaining stable results [31].
According to [26], online review authors do not fully represent the broader consumer population and are often limited to specific linguistic and cultural groups. This implies that linguistic and cultural differences may introduce bias. Because web-based platforms include consumers from diverse backgrounds, linguistic and cultural variability, as well as the representativeness of consumer samples, must be carefully examined [9,26,28].
Although AI-based digital technologies continue to advance, clear limitations remain in fully understanding and analyzing consumer data that are rooted in complex and multidimensional emotions and experiences. To reduce the risk of model bias stemming from skewed training data, AI systems must incorporate a wide range of societal perspectives and information, and sufficient human verification is required to prevent the generation of misleading outputs [31,45]. Therefore, an integrated approach that combines AI-based analysis with human sensory data is essential.

3.2. Extended Reality: Virtual, Augmented and Mixed Reality

Food choice is the outcome of a complex interplay of factors and has long been treated as a major research topic in the fields of sensory and consumer science [46]. Recently, particular attention has been directed to the influence of consumption context on consumer behavior and on the perception of foods and beverages, and contextual cues—such as place of consumption, social situation, and auditory stimulation—have been reported to induce significant changes in perception and choice [46]. In line with the growing emphasis on these contextual factors, the fields of sensory and consumer science have actively pursued studies that apply immersive technologies—VR, AR, and MR—to reproduce real consumption situations [47]. This approach enables consumers to experience foods and beverages across diverse environments and contexts, thereby complementing the limitations of conventional sensory evaluation and providing an assessment environment that secures both internal validity and external validity [48].
Moreover, XR-based evaluation environments may represent a realistic alternative from a cost perspective. According to [49], studies conducted in real consumption settings required approximately 150% of the cost of tests performed in an already established sensory laboratory, whereas the relative cost of immersive consumption environments decreased as sample size increased, reaching about 125% of the laboratory cost at n = 120. These findings suggest that, despite the initial investment required for system setup, immersive environments can be more cost-efficient than traditional real-life consumption settings and, in consumer studies with sufficiently large sample sizes, may serve as an alternative or complementary tool to conventional sensory laboratories.
Against this background, the present review proposes a conceptual decision tree (Figure 3) to guide the selection of an appropriate test environment—conventional laboratory, VR, or AR/MR—for sensory and consumer evaluation. When the primary objective is to ensure high internal validity, a laboratory-based sensory test is most appropriate, whereas XR environments should be considered when the goal is to enhance external validity by incorporating realistic consumption contexts. Within XR, AR/MR-type hybrid settings are preferable when maintaining the real physical space, while overlaying limited virtual elements is desired, whereas fully virtual, VR-based environments are better suited when fine-grained control of contextual cues such as lighting, spatial layout, and social interactions is required.
Extended reality, also referred to as cross reality, is a concept that encompasses immersive technologies such as VR, AR, and MR, as well as human–machine interaction based on these technologies, metaverse environments, and spatial computing [50]. XR spans the continuum between real and virtual environments and realizes the convergence of the physical and digital worlds (Figure 4).
Virtual reality is a technology that, through a head-mounted display (HMD) that occludes the user’s field of view, replaces the real environment with three-dimensional or virtual imagery to provide a fully immersive experience [52]. In sensory evaluation, VR enables consumers to experience foods and beverages within a virtual environment, thereby allowing observation of perception and responses in contexts that approximate real-world conditions, while simultaneously imparting a sense of presence that supports interaction with the virtual environment [52]. AR is a technology that superimposes digital imagery or information onto physical space by means of devices such as smartphones, tablets, AR glasses, and headsets [53]. Through this approach, users can perceive virtual elements while remaining aware of their surrounding environment, and not only visual images but also auditory stimuli—such as spatial audio—can be combined with the real environment [54].
In augmented reality, interaction between the user and virtual stimuli is possible; however, unlike VR, direct interaction between virtual stimuli and elements of the real environment is limited [55]. MR is a concept situated between AR and VR, providing an environment in which real and virtual elements coexist and can interact in real time [56]. Users, wearing an HMD or AR glasses, can perceive physical elements—such as their hands and arms or evaluation samples—as integrated within the virtual scene and can manipulate virtual stimuli as if they were part of the physical environment, thereby affording a high degree of realism [53,55]. Table 4 summarizes the overall differences among VR, MR, and AR.
Extended reality presents new avenues for application in sensory evaluation and affords research opportunities to investigate consumer perception and behavior with greater precision. Notably, among XR-based immersive technologies, VR has been most actively applied, and recent prior studies have been reported in this domain. Ref. [58] demonstrated that enjoyment and purchase intention were significantly higher when virtual foods matched their real counterparts; this cross-reality effect was particularly salient for foods with delicate taste attributes and low familiarity, thereby evidencing that digital aesthetic cues can influence consumer perception and behavior. Ref. [59] reported that an immersive and realistic VR context elicited positive affective responses and increased beer acceptability, and machine-learning analyses identified affective responses as an important predictor of acceptability.
In a recent AR-based study, ref. [54] confirmed, using lasagna and strawberry yogurt, that visual- and auditory-based AR tools provide consumers with novel and memorable experiences; however, they further reported that nutritional information exerted a greater influence on changes in product perception irrespective of the presentation format. Ref. [60] showed that AR facilitates consumers’ mental simulation, thereby increasing desire for food and purchase likelihood, with these effects mediated by enhanced perceptions of personal relevance.
In a mixed reality study, ref. [56] found that congruency between product and contextual color/shape strengthened sourness perception; participants tended to associate red/pink with sweetness and green/yellow with sourness, and to link rounded shapes with sweetness and angular shapes with sourness, confirming MR as a useful tool for controlling visual cues to investigate taste perception. Ref. [61] demonstrated that sensory evaluation can be conducted in MR while maintaining immersion by employing a gesture-recognition-based virtual questionnaire and a customized wine glass.
Moreover, the implementation of a realistic virtual bar via audio and avatars, together with real-time gesture responses and wine-glass interactions, enabled multiple participants to provide data concurrently under full immersion. Collectively, these studies indicate that VR, AR, and MR technologies extend beyond mere contextual replication in sensory evaluation and open possibilities for the integrative investigation of consumer perception, cognition, and behavior—including purchasing behavior, perceptions of nutritional information, choice behavior, and crossmodal effects. In addition, Table 5 summarizes prior studies applying VR, AR, and MR technologies in sensory evaluation and presents the principal findings of each study.
Immersion, a core element of XR technologies, is a critical factor that separates users from the real environment and enables them to experience vivid and wide-ranging virtual worlds [71]. An effective immersive experience engenders a sense of presence—as if users exist in a world analogous to reality—which plays an important role in narrowing the gap between laboratory contexts and actual consumption contexts [74]. Presence is the user’s subjective perception of immersion, and it strengthens as the level of immersion increases [74]. High levels of immersion in VR, particularly when accompanied by mismatches between visual and bodily sensory cues, can induce cybersickness—manifesting as nausea, dizziness, headache, and eye strain—which, in the context of sensory evaluation, may reduce participants’ concentration, distort their responses, and increase the risk of dropout [75]. Accordingly, recent XR-based sensory studies have employed standardized questionnaires such as the Igroup Presence Questionnaire (IPQ), the System Usability Scale, and the Independent Television Commission Sense of Presence Inventory to assess immersion and presence and have used the Simulator Sickness Questionnaire (SSQ) to quantify cybersickness. Several studies have reported significantly higher presence in 360° VR and context-evoking conditions than in conventional laboratory or traditional control settings [7,66,76]; in particular, because the IPQ is rated on a 7-point scale, factor scores at or above the midpoint of 3 can be interpreted as reflecting a reasonably sufficient level of presence. Moreover, XR-based sensory-evaluation environments should be designed and managed such that total SSQ scores preferably remain below 10, corresponding to negligible or minimal levels of cybersickness [65].
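These two screening thresholds can be combined into a simple per-session check, sketched below in Python. The function name, input format, and example scores are illustrative; only the cut-offs (mean IPQ factor score of at least 3 on the 7-point scale, total SSQ below 10) come from the criteria discussed above.

```python
def xr_session_ok(ipq_item_scores, ssq_total):
    """Screen an XR session: presence is treated as adequate when the mean
    IPQ factor score is >= 3 (midpoint of the 7-point scale), and
    cybersickness as negligible when the total SSQ score is < 10."""
    presence_ok = sum(ipq_item_scores) / len(ipq_item_scores) >= 3
    sickness_ok = ssq_total < 10
    return presence_ok and sickness_ok

print(xr_session_ok([4, 3, 5, 4], ssq_total=7.5))   # → True (immersed, no sickness)
print(xr_session_ok([2, 3, 2, 2], ssq_total=7.5))   # → False (low presence)
print(xr_session_ok([4, 4, 4, 4], ssq_total=15.0))  # → False (cybersickness)
```

Sessions failing either criterion could be flagged for exclusion or follow-up, helping to standardize data-quality decisions across XR studies.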
In summary, XR-based sensory evaluation is a novel approach that helps balance the internal validity of traditional sensory booths with the external validity of real consumption environments. By employing VR, AR, and MR in line with specific research objectives, it enables the design of consumption contexts that are difficult to realize with conventional methods, and its applications are increasingly extending across the food domain.
There are several limitations inherent in applying XR technologies to sensory evaluation. First, wearing equipment such as HMDs can induce fatigue during prolonged experiments and, for some participants, physiological discomfort such as cybersickness [77]. Second, highly immersive media can markedly increase participants’ information-processing demands, potentially leading to cognitive overload, which may distort intrinsic sensory responses or reduce attentional focus [75]. Third, at early stages of exposure—particularly among users unfamiliar with virtual environments—a novelty effect may arise during adaptation to new interaction modalities, imposing additional cognitive burden and thereby lowering experimental reliability [78].
Finally, the costs of equipment and system deployment are high, and the practicality of XR for large-scale consumer studies remains limited. To address these constraints and expand the applicability of XR in sensory evaluation, further research should pursue improvements in equipment, refined measurement indices for immersion and presence, the implementation of multisensory stimuli, and strategies for scaling to large-sample consumer research.

3.3. Biometrics and Physiological Measurements

Traditional sensory evaluation has relied on respondents’ subjective reports, an approach limited by the many factors that can bias self-reported responses [79]. Therefore, research combining existing sensory evaluation protocols with emerging technologies has been conducted to reflect consumer responses more comprehensively and provide deeper insights [2], and the number of such studies is steadily increasing [80]. Among these innovations, biometrics has emerged as a particularly promising approach [1]. Biometric data provide additional information beyond explicit measurement data [81]. Biometrics is defined as “automated recognition of individuals based on their biological or behavioral characteristics” [82].
Biometric technologies utilized in sensory evaluation include EEG, functional near-infrared spectroscopy (fNIRS), functional magnetic resonance imaging (fMRI), electrodermal activity/galvanic skin response (EDA/GSR), electrocardiography (ECG), respiration, blood pressure, FEA, and EMG [80,83,84]. This section classifies the major biometric technologies applied in sensory evaluation by type and summarizes recent research trends.

3.3.1. Nerve and Brain Activity

Electroencephalogram signals are collected by measuring the potential difference between the active and reference electrodes [85]. Although the specific frequency ranges vary across studies, the recorded EEG signals are generally classified into five frequency bands: delta, theta, alpha, beta, and gamma [86]. Unlike early approaches that required drilling or perforating the skull to place electrodes, modern techniques allow the acquisition of high-quality EEG data through non-invasive methods. Furthermore, research has been conducted to develop smaller and wearable wireless EEG devices for practical applications [87]. According to [88], dry EEG systems provide a solution to overcome the limitations of wet EEG systems, which are unsuitable for long-term measurements, and the use of dry and wireless EEG systems is expected to increase. Compared to other neuroimaging techniques, EEG is considered a promising neuroscientific tool due to its relatively low cost and easy accessibility of devices for research purposes [89].
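The five-band decomposition described above can be sketched with a simple FFT periodogram. The band edges below are common approximations (exact ranges vary across studies, as noted), and real EEG pipelines add filtering, artifact rejection, and windowing that this sketch omits.

```python
import numpy as np

# Approximate band edges in Hz; exact ranges vary across studies.
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def band_powers(signal, fs):
    """Mean spectral power of an EEG trace in each frequency band,
    via a plain FFT periodogram (no windowing or artifact handling)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    return {name: psd[(freqs >= lo) & (freqs < hi)].mean()
            for name, (lo, hi) in BANDS.items()}

# Synthetic 10 Hz (alpha-band) oscillation, 4 s sampled at 256 Hz
fs = 256
t = np.arange(fs * 4) / fs
powers = band_powers(np.sin(2 * np.pi * 10 * t), fs)
```

With the synthetic 10 Hz input, the alpha band dominates the resulting power dictionary, which is the kind of band-level summary EEG studies typically feed into statistical comparisons.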
Functional near-infrared spectroscopy is defined in [90] as a “neuroimaging technique for mapping the functional activity of the human cerebral cortex.” It is a non-invasive and safe technology that uses laser diode or light-emitting diode light sources to deliver near-infrared light between the source and the detector through biological tissues. In addition, fNIRS has the advantage of being able to utilize relatively inexpensive, portable, and wireless devices compared to fMRI. fMRI is a technique that detects changes in the blood oxygenation level-dependent (BOLD) signal of MRI that arise from alterations in brain states induced by stimuli or tasks. Furthermore, fMRI is known as a useful tool capable of extracting new information from brain systems associated with complex responses through simple activation maps obtained under various experimental conditions [91].
Ref. [92] compared consumer survey responses with EEG results on citrus flavor and identified correlations, confirming consumers’ unconscious reactions to citrus flavor that were not revealed through self-reported evaluations. The study by [93] is noteworthy for providing the first insights into basic taste sensitivity using fNIRS and for exploring the predictability of consumer preference. In this study, consumer preference data for two types of chocolate were obtained through subjective evaluations using a 7-point Likert scale, while participants’ unconscious responses were measured by recording oxyhemoglobin (OxyHb) concentrations through an fNIRS headband. Regarding basic taste, neural activity tended to decrease in response to sweetness and increase in response to bitterness. In terms of preference, participants were classified as chocolate likers and dislikers, and a significant difference between the two groups was observed at channel 16. This study demonstrated that responses in certain prefrontal channels could predict consumer preferences, suggesting that fNIRS holds potential as a complementary tool for future consumer evaluation research.
Building on these studies, Ref. [94] further investigated the measurement of neural activation related to sweet taste perception using fNIRS. In this study, the perceived sweetness intensity and emotional valence of three concentrations of sucrose solution were evaluated using a 15-point scale and a 9-point scale, respectively. As the sucrose concentration increased, the number of significantly activated channels rose from seven to eleven, indicating an expansion of brain activation areas. The classification accuracy of positive emotional responses identified by fNIRS was approximately 72%.
In contrast, the composite similarity index of the implicit–explicit regression model showed high overall consistency, with values of 0.998 for sweetness intensity and 0.889 for emotional valence, suggesting that fNIRS could serve as a supplementary tool in consumer evaluation. The study further proposed that future research should address temporal–spatial resolution by expanding channels or integrating fNIRS with EEG and fMRI and verify the classification potential for other basic tastes. Additionally, the latest studies employing EEG, fNIRS, and fMRI technologies in consumer sensory evaluation are summarized in Table 6.

3.3.2. Autonomic Nervous System Responses

Indicators of autonomic nervous system (ANS) responses include EDA, HR, HRV, and skin temperature (ST) [107]. GSR is mostly used interchangeably with EDA [108] and is one of the most widely utilized and well-established recording methods and biosignals [109]. The EDA commonly used in research is typically measured using direct current (DC) or constant voltage methods with silver/silver chloride (Ag/AgCl) electrodes and sodium chloride or potassium chloride electrolytes [110]. In exosomatic recording methods, when the voltage is held constant, the signal is recorded in units of skin conductance (SC), whereas when the current is held constant, it is recorded in units of skin resistance (SR) [110]. The level of sympathetic nervous system activity can be assessed through variations in skin conductance, which reflect physiological responses of the organism corresponding to cognitive and emotional states of the central nervous system [108]. Depending on the temporal dynamics of change, skin conductance is categorized into tonic activity, referred to as skin conductance level (SCL), and phasic responses, referred to as skin conductance response (SCR) (Figure 5) [111].
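The tonic/phasic split illustrated in Figure 5 can be approximated with a simple moving-average filter: the slowly varying component estimates SCL, and the residual captures SCR-like events. This is only a rough sketch; published EDA pipelines typically use dedicated deconvolution or trough-to-peak methods, and all signal values below are synthetic.

```python
import numpy as np

def decompose_eda(sc, fs, window_s=4.0):
    """Split a skin-conductance trace into a slowly varying tonic
    component (SCL estimate) and a phasic residual (SCR estimate)
    using an edge-padded moving average."""
    win = int(window_s * fs)
    pad_l, pad_r = win // 2, win - win // 2
    padded = np.concatenate([np.full(pad_l, sc[0]), sc, np.full(pad_r, sc[-1])])
    tonic = np.convolve(padded, np.ones(win) / win, mode="valid")[: len(sc)]
    return tonic, sc - tonic

# Synthetic trace: slow upward drift plus one brief response burst
fs = 32
t = np.arange(fs * 30) / fs          # 30 s sampled at 32 Hz
sc = 5.0 + 0.02 * t                  # tonic drift
sc[10 * fs : 11 * fs] += 0.5         # 1 s phasic event at t = 10 s
tonic, phasic = decompose_eda(sc, fs)
```

Here the phasic residual peaks at the injected event around t = 10 s while staying near zero during the drifting baseline, mirroring the SCL/SCR distinction in the text.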
Electrocardiography is a technology that records cardiac signals with the highest degree of accuracy, and the voltage deflections that constitute the ECG correspond to the electromechanical phenomena of the heart and are represented by five letters [112]. Among ECG indices, heart rate (HR) generally indicates the number of heartbeats within 60 s [112], while heart rate variability (HRV) represents an immediate response of the autonomic nervous system to detected stimuli [113]. In particular, GSR and HRV sensors are among the most widely distributed biofeedback devices [114].
Ref. [115] measured ECG and GSR signals associated with red wines of varying “emotional power” to identify differences in autonomic nervous system activity. The study found that as positive judgments toward wine increased, GSR characteristics (total GSR, tonic GSR) also increased, and significant emotional differences were observed in ECG parameters such as HR, Standard Deviation of Normal-to-Normal intervals (SDNN), cardiac sympathetic index, and NN50. These findings support the hypothesis that specific foods or beverages may serve as drivers of positive dietary behavior and suggest the potential for expanded studies with more diverse sample groups.
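The time-domain indices mentioned above (HR, SDNN, NN50) can be computed directly from a series of normal-to-normal (RR) intervals. The sketch below assumes artifact-free intervals in milliseconds; the example values are illustrative only.

```python
import numpy as np

def hrv_summary(rr_ms):
    """Basic time-domain ECG indices from normal-to-normal (RR)
    intervals in milliseconds: mean heart rate, SDNN, and NN50."""
    rr = np.asarray(rr_ms, dtype=float)
    hr = 60000.0 / rr.mean()                      # mean heart rate (bpm)
    sdnn = rr.std(ddof=1)                         # SDNN (ms)
    nn50 = int(np.sum(np.abs(np.diff(rr)) > 50))  # successive diffs > 50 ms
    return hr, sdnn, nn50

# Example: six RR intervals averaging 810 ms (~74 bpm)
hr, sdnn, nn50 = hrv_summary([800, 810, 790, 860, 805, 795])
```

Longer recordings and artifact correction are required before such indices are interpretable; this merely shows how the quantities relate to the raw interval series.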
Ref. [116] examined the effects of HR, HRV, skin conductance, and frontal alpha asymmetry in response to solutions differing in both individual and general levels of preference. As a result, significant differences were observed between preferred and non-preferred beverages in HR and skin conductance, particularly in latency, while no significant differences were found in frontal alpha asymmetry. The study by [117] measured the SCR for samples with sweet, bitter, and astringent tastes and analyzed SC using implicit testing methods, exploring its potential as an auxiliary tool for understanding individual differences in sensory responses. Furthermore, the results suggested that SCR measurement can provide valuable insights into the relationship between physiological responses and gustatory perception.
Ref. [118] combined facial emotion recognition (FER), GSR, and cardiac pulse measurements to predict consumer acceptability for chewing gums representing five basic tastes. The study introduced a system capable of predicting consumer acceptability of new food products and proposed a novel artificial intelligence–based approach combining facial emotion recognition and biosignal data. It was confirmed that FER alone was limited in predicting consumer acceptability, whereas prediction accuracy improved when GSR and cardiac pulse were incorporated. Among these measures, GSR was identified as the most significant variable for predicting consumer acceptance. Additionally, the latest studies utilizing EDA/GSR, ECG, and ST measurement technologies in consumer sensory evaluation are summarized in Table 7.

3.3.3. Eye Movements and Visual Responses

When consumers purchase food products in stores—whether in physical markets or online shops—the visual features of the products serve as crucial determinants [125]. Consequently, efforts have continued to investigate consumers’ visual responses to various stimuli [126]. Eye-trackers used for gaze tracking are defined as “devices used to estimate the direction of the eyes relative to the position of the head or gaze.” There are several types, including video-based eye trackers, electro-oculography, and scleral search coils [127], and they are generally classified into two main types: desktop-based and mobile-based systems [126].
Eye-tracking measures several parameters such as pupil dilation, smooth pursuit, microsaccades, saccades, blinks, and fixations [128]. Most studies in sensory and consumer science utilizing eye-tracking have focused primarily on fixation characteristics, which are interpreted as indicators of visual attention [126]. Eye-tracking technology determines the position of the gaze on a screen based on the pupil center and corneal reflection produced by infrared light and is regarded as the most accurate non-invasive method for measuring gaze position [81].
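Fixation identification of the kind discussed above is often performed with a dispersion-threshold (I-DT) algorithm. The simplified sketch below groups consecutive gaze samples whose spread stays within a threshold for a minimum duration; the thresholds and units are illustrative, and commercial eye-tracking software applies more robust event-detection methods.

```python
def detect_fixations(x, y, fs, max_disp=1.0, min_dur=0.1):
    """Simplified dispersion-threshold (I-DT-style) fixation detector.
    Returns half-open sample-index ranges (start, end) where gaze
    dispersion (x-range + y-range) stayed within `max_disp` for at
    least `min_dur` seconds."""
    min_len = int(min_dur * fs)
    fixations, start = [], 0
    for end in range(1, len(x) + 1):
        disp = (max(x[start:end]) - min(x[start:end])
                + max(y[start:end]) - min(y[start:end]))
        if disp > max_disp:          # dispersion exceeded: close the window
            if (end - 1) - start >= min_len:
                fixations.append((start, end - 1))
            start = end - 1          # restart from the offending sample
    if len(x) - start >= min_len:    # flush the final window
        fixations.append((start, len(x)))
    return fixations

# Two steady gaze positions of 0.5 s each, sampled at 100 Hz
fixes = detect_fixations([0.0] * 50 + [10.0] * 50,
                         [0.0] * 50 + [10.0] * 50, fs=100)
```

Fixation duration in seconds is (end − start) / fs; in the example above, both detected fixations last 0.5 s, and their durations are the kind of attention indicator most sensory studies analyze.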
Ref. [129] employed the eye-tracking method to measure participants’ gaze patterns toward lemon-based food images under repeated olfactory stimulation with lemon scent. The results showed that the scent continuously directed participants’ visual attention to the product; however, under repeated exposure, while preference for the scent remained stable, its influence on product choice diminished. Ref. [130] found through eye-tracking that nutritional information attracted the highest visual attention, yet the main factor influencing purchase intention was the “Product of New Zealand” logo. This finding indicated that the orange juice package attracting the most visual attention was not necessarily the most preferred product.
In the study by [131], eye-tracking and preference evaluations were conducted simultaneously using high-calorie and low-calorie food images. Participants recognized high-calorie foods more quickly and maintained longer visual attention on them. Ref. [132] demonstrated that consumers’ pupil responses statistically mediated the relationship between assortment size and product preference in display shelves, revealing that larger assortments led to pupil constriction and reduced cognitive processing capacity. Moreover, pupil size varied in the same direction as the price level of the chosen option but in the opposite direction of familiarity. Additionally, the latest studies utilizing eye-tracking technology in consumer sensory evaluation are summarized in Table 8.
Ref. [137] emphasized the importance of using advanced technologies to measure emotional responses in order to better understand consumer behavior and decision-making. Facial emotion recognition technology, which enables the identification of emotional expressions, can measure both cognitive and affective characteristics, particularly the degree of positive or negative emotions perceived by the participant [138]. Moreover, facial expressions (FEs) serve as indicators of immediate emotional responses elicited by sensory stimuli such as the taste, texture, and appearance of food. The process of food consumption can be observed and recorded in real time, allowing specific emotions to be quantitatively analyzed [139]. According to [140], EMG technology analyzes the activity of specific facial muscles or muscle groups and classifies the measured facial behaviors into distinct emotions based on activation patterns. EMG records muscle activity by attaching electrodes to the skin surface, enabling observation of muscle activation associated with particular emotions [141].
Ref. [142] analyzed FE within the context of consumer product choice and measured the emotions elicited using FEA. The study demonstrated that using FE data improved the prediction of consumers’ product choices compared to relying solely on a single hedonic scale. Furthermore, by examining the role of emotions in the selection of credence goods such as organic and vintage wines, the study clarified the relationship between consumer emotions and product choice. For future research, it was suggested to identify the optimal method for emotion measurement and to explain why positive emotions influenced the selection of only specific wine attributes, which was further extended by [143]. In their study, consumer emotional responses to white, red, and port wines were distinguished using FaceReader, confirming that the type of wine significantly influenced the formation of emotional responses.
The study by [144] is noteworthy in that, unlike previous research which mainly analyzed single time points or averaged data, it investigated the dynamic correspondence between subjective evaluations and facial responses using EMG signals during meals. The research reflected hedonic experiences with temporal profiles occurring during the consumption of three types of gel-based foods. A negative correlation was observed between dynamic value ratings and electromyographic signals of mastication-related muscles, demonstrating that facial EMG signals can predict hedonic responses during eating.
In the study conducted by [145], EMG was measured while participants consumed five types of jellies representing sweet, salty, sour, bitter, and umami tastes, to examine muscle response tendencies according to taste. The results showed that the taste of the jelly influenced muscle synergy and symmetry, with marked differences observed for certain taste stimuli. Although the sample size was small (n = 5) and statistical significance was not achieved, the findings suggested an interaction between taste perception and muscle activity, thereby providing a foundation for future sensory perception research utilizing muscular responses. Additionally, the latest studies employing FEA and EMG technologies in consumer sensory evaluation are summarized in Table 9.

3.3.4. Limitations of Biometric and Physiological Measures in Sensory Evaluation

Various limitations arise when applying biometric technologies in sensory evaluation. Participants may unconsciously recognize that they are being continuously monitored by researchers, which can influence their behavioral and emotional responses [155,156]. Studies have shown that both mental and physical stress factors can affect biometric indicators. Moreover, in many studies, participants experience movement restrictions during biometric measurement due to chinrests [157,158] and electrode channels [159]. In some experiments involving biometric measurements, participants are instructed to minimize head and hand movements, keep their head fixed, and maintain a static or seated posture [102,109,141,160].
Such strict limitations on body movement can hinder participants from consuming food and beverages in a natural and unrestricted manner [93]. Additionally, the size and fixed nature of wired measurement equipment can restrict its usability and experimental environment [109,161]. Consequently, concerns have been raised about the ecological validity of such studies, as the testing environment often differs from actual consumer consumption contexts.
Another limitation is that physical movements during food consumption can generate artifacts, leading to increased data loss. One study found that jaw clenching and biting behaviors produced the most severe artifacts in scalp and ear electrodes [162]. Furthermore, consumer behavior studies utilizing biometric technologies are often limited by small sample sizes [163]. Ref. [156] reported that the number of participants who could be measured simultaneously was restricted due to equipment limitations. Another issue is susceptibility to external factors. According to [79], in food-related studies using biometric measurements, variables such as the type of product evaluated and the cultural background of participants or panels may influence the results.

3.4. Digital Sensing Technologies: IoT, Robotics, and Electronic Sensing Systems

3.4.1. Application of Robotics Technology in the Food Industry

With the advancement of robotics technology, research on the integration of robotics into the food industry has been actively increasing [164]. The food industry involves multiple processes from initial product concept development to final product release, and robotic technologies can be incorporated at each stage. In particular, at the sensory evaluation stage—prior to product manufacturing—combining human sensory testing with robotic technology can enhance the objectivity and accuracy of sensory data. Furthermore, applying robotics in the final production stage offers several advantages, including consistent product quality, reduced production time, and improved operational efficiency. Table 10 presents previous studies related to robotics technologies applied in the food industry and modern sensory evaluation techniques.
According to the previous studies summarized in Table 10, the use of robotic technologies has emerged as an important complementary tool that supports and enhances human roles in sensory evaluation research and the food industry. Recently, the use of robots to replace human labor has been increasing in food-related service sectors such as cafés, cafeterias, and restaurants. Consequently, human–robot interaction has been expanding its utility across various aspects of modern sensory evaluation, product development, and overall food industry operations [164].

3.4.2. Integration of Electronic Sensory Sensors and the Internet of Things

Quality control and improvement in the food industry are essential factors for the production of high-quality foods, and in recent years, the IoT framework—connecting physical objects through the internet—has been actively utilized for this purpose [167]. IoT enables the collection of sensory data through electronic sensing devices such as E-nose and E-tongue as well as physical and chemical sensors, allowing real-time data analysis and feedback to automate the system [168]. This integration simplifies many complex procedures required during food production, offering advantages such as reducing time and cost while enhancing operational efficiency [168]. Table 11 presents previous studies that have applied IoT technologies in the food industry and consumer sensory science.
In particular, with the recent advancement of IoT, the field of consumer sensory science has enabled real-time product monitoring, quality control during production, and stability assessment during storage, thereby facilitating the establishment of a more precise and systematic product manufacturing framework (Figure 6). The previous studies presented in Table 11 demonstrate that IoT technologies play an important role in areas such as food freshness evaluation, quality control, and storage environment monitoring within the food industry.
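The threshold-based feedback that such IoT systems automate can be illustrated with a toy cold-chain check: readings outside an acceptable band trigger alerts that a monitoring system could act on. The temperature limits and readings here are hypothetical.

```python
def check_storage(readings, low=0.0, high=4.0):
    """Return (index, value) pairs for temperature readings (in deg C)
    outside the acceptable band -- the kind of threshold rule an IoT
    storage-monitoring loop could evaluate on each new sensor report.
    The band limits are hypothetical."""
    return [(i, t) for i, t in enumerate(readings) if not low <= t <= high]

# Two out-of-range readings trigger alerts
alerts = check_storage([2.5, 3.1, 5.2, 3.8, -0.5])
```

In a deployed system the same rule would run continuously on streamed sensor data and feed alerts back to actuators or operators, which is the automation loop described above.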

3.5. Comparison of Digital Technologies in Food Industry

Advancements in digital technologies—such as AI, XR, biometrics, and digital sensors—are introducing new methodological approaches to sensory science. Across sensory evaluation and the broader food industry, each technology is applied in distinct ways, and when integrated, they provide complementary strengths that enhance the precision, depth, and overall effectiveness of sensory analysis.
In sensory science, AI technologies primarily include ML, text-mining–based NLP, and LLMs. Despite limitations such as the risk of overfitting and reduced predictive performance resulting from single-method data collection [40,41,42,43], AI has been applied not only to analyze and predict consumer data but also to improve product quality through taste and quality prediction and to identify peptides that contribute to flavor perception [10]. AI learns patterns from large-scale data to construct models capable of prediction and interpretation and can be defined as a technology that mimics human cognitive processes [9,10].
Extended Reality technologies play an important role in capturing consumers’ sensory responses as they occur in natural environments by narrowing the gap between controlled laboratory settings and real-world consumption contexts [46,52,65]. In sensory evaluation research, XR has been applied for various purposes, including immersive sensory and consumer testing, evaluating contextual effects on perception, and constructing virtual food-choice environments [46]. Although XR still faces limitations such as discomfort caused by HMDs, cognitive overload, and high implementation costs, its accessibility and technological sophistication continue to improve [65]. XR can therefore be described as a technology that simulates real consumption environments and provides immersive experiences through virtual spaces [52,65].
In sensory evaluation research, biometric technologies measure a wide range of consumer responses, including emotional reactions, physiological and neurological responses, attention levels, and perceptual states, thereby providing additional consumer information [52,81,86,155]. Although limitations such as restrictions from wired devices, limited movement, and relatively small participant groups still exist [93,155,156], biometric technologies are expected to become a key tool in future sensory evaluation research due to their ability to capture consumer responses in a more comprehensive manner [22]. Thus, biometric technologies can be defined as major emerging digital tools that enhance the understanding of consumer behavior and preferences based on physiological responses and strengthen the prediction of consumer reactions [52,53].
Digital sensor technologies—including robotics, E-nose, E-tongue, and IoT systems—are increasingly being utilized in sensory evaluation research, with their applications expanding across the broader food industry. Robotics, based on human–robot interaction, is widely applied not only in sensory evaluation but also throughout various stages of food production processes [164]. E-nose and E-tongue technologies, which mimic human sensory organs, enable the objective measurement of the physicochemical properties of food products [52]. IoT technologies are primarily used for real-time monitoring and quality management within the food supply chain, and they play an important role in enhancing product quality through automated systems that provide rapid feedback [168,171].
Despite their expanding use, digital sensor technologies still have limitations that prevent them from fully replacing human sensory evaluation in both sensory research and the food industry. To overcome these limitations, it is necessary to utilize integrated data derived from various digital sensor technologies [52,164]. In conclusion, digital sensor technologies can be primarily defined as tools that complement human sensory evaluation and enhance product quality in sensory assessment and across the broader food industry [52].
Table 12 summarizes the advantages, limitations, and application areas of state-of-the-art digital technologies in the food industry.

4. Integrated Application of Advanced Sensory Evaluation Technologies

4.1. Digital Sensing Technologies and Machine Learning

Electronic sensory systems offer the advantage of being free from human subjectivity while detecting subtle sensory characteristics that are imperceptible to humans and generating chemical data that contribute to sensory attributes [175]. Accordingly, in the field of consumer sensory science, such systems have been utilized as complementary tools to human-based sensory evaluations, and recent research has increasingly focused on combining these systems with ML technologies for quality prediction, classification, and as potential alternatives to conventional sensory evaluation data. Table 13 presents a summary of previous studies that utilized electronic sensory sensors as input data for ML models.
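As a minimal illustration of pairing electronic-sensor data with ML, the sketch below classifies hypothetical three-channel E-nose readings with a nearest-centroid rule standing in for the more sophisticated models (e.g., ANN, SVM) used in the studies summarized in Table 13. All readings and class labels are fabricated for illustration.

```python
import numpy as np

# Hypothetical 3-channel E-nose readings (rows = samples, fabricated values)
fresh = np.array([[0.20, 0.80, 0.10],
                  [0.30, 0.70, 0.20],
                  [0.25, 0.75, 0.15]])
spoiled = np.array([[0.90, 0.20, 0.70],
                    [0.80, 0.30, 0.80],
                    [0.85, 0.25, 0.75]])

# Nearest-centroid rule: each class is summarized by its mean sensor profile
centroids = {"fresh": fresh.mean(axis=0), "spoiled": spoiled.mean(axis=0)}

def classify(reading):
    """Assign a reading to the class whose centroid is closest (Euclidean)."""
    return min(centroids, key=lambda c: np.linalg.norm(reading - centroids[c]))

label = classify(np.array([0.82, 0.28, 0.72]))
```

Real studies train on far larger sensor arrays and validate against human panel data; the point here is only the pipeline shape: sensor vector in, predicted quality class out.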

4.2. Integrating Advanced Sensory Evaluation Technologies

Although both XR and AI are based on computer technologies, they differ in their applications: XR provides virtual environments that offer consumers experiences similar to real-world settings, while AI collects and processes vast amounts of data through computational models inspired by the structure of the human brain [179]. Recently, research has increasingly focused on integrating IoT technologies—which enable real-time monitoring and data automation—with XR or AI systems, along with the convergence of various other sensory evaluation technologies.
In particular, studies that collect sensory data through robotics and electronic sensory systems and apply them across different sensory evaluation frameworks are now reaching a stage where the automation of sensory assessment can be practically realized. Therefore, the integrated approach of these technologies can enhance the precision of traditional sensory evaluation and more realistically simulate and analyze consumers’ product experiences, indicating high potential for practical application in the future food industry. Table 14 presents a summary of previous studies that integrated multiple advanced sensory evaluation technologies.

5. Developments in Sensory Software

Paper-based questionnaires, traditionally employed in sensory evaluation, require considerable time and labor to review responses and to enter and organize data after the evaluation has concluded [184]. These procedures, depending on the tools and modes adopted, increase the researcher’s burden and reduce the efficiency of data management. To overcome these limitations, the use of sensory-evaluation software has been proposed, and it has recently been reported that, when combined with electronic survey tools, the efficiency of data collection and analysis can be improved [184].
According to the classification framework presented in prior research [185], currently commercialized sensory-evaluation software can be grouped into three functional categories. The first category comprises integrated programs that support the entire workflow—from experimental design and panel management to data analysis and reporting—including Compusense (version 25.0.30), Fizz, RedJade, EyeQuestion (version 6), and SIMS Sensory Software Cloud (version 6); more recently, cloud-based extensions such as Compusense Cloud have also been utilized. The second category includes programs specialized for statistical analysis and visualization, among which Senstools, XLSTAT, and SensoMineR are widely used; multivariate analysis tools such as Unscrambler X and FactoMineR likewise support the interpretation and visualization of complex sensory data. The third category consists of software aimed at evaluating panel performance and test reliability—such as PanelCheck, V-Power, and SensCheck PanelView—which analyze assessor consistency, discriminative ability, and repeatability to enhance panel operations and improve test reliability.
Recently, sensory-evaluation software and platforms incorporating advanced technologies have been developed. Among these, the BioSensory App simultaneously presents questionnaire items on a tablet PC and records participants’ responses on video, which are analyzed by computer-vision algorithms and ML techniques to derive biosignals such as affective and physiological responses [186]. Ref. [186] proposed a system that integrates eye tracking and affective biosignals using the BioSensory App, and [187] employed it to collect and analyze self-reported responses and biosignals in immersive terrestrial and space environments, thereby demonstrating new possibilities for sensory evaluation. In parallel, iMotions is a platform capable of integrative acquisition and analysis of multiple biosignals—EEG, GSR, HRV, and eye tracking—and has been widely utilized to elucidate relationships between sensory stimuli and physiological responses.
In addition, the Affectiva Affdex SDK automatically analyzes FEs to quantify consumers’ emotional responses; Andrade and Bastos [146] used Affectiva Affdex SDK 4.0 together with iMotions to capture both implicit and explicit affective responses during children’s food tasting. Alpha MOS’s dedicated software for electronic-nose and E-tongue systems (AlphaSoft) provides an integrated platform for instrument control, data acquisition, preprocessing, and statistical analysis, and is employed to quantify flavor attributes and explore quality differences in foods. Recent studies have attempted to combine data collected with AlphaSoft with algorithms or to link such data to IoT-based production monitoring systems. Ref. [188] used AlphaSoft to fuse GC–E-nose, E-tongue, and electronic eye (E-eye) measurements for a comprehensive evaluation of the aroma, taste, and color of black tea.
Furthermore, Ref. [177] applied NIR and E-nose measurements to coffee samples, combined these with HS-SPME–GC–MS and QDA, and implemented regression-based ML (ANN) models to predict volatile compounds and sensory attributes according to fermentation and roasting levels. XR-based sensory-evaluation software remains at an early research stage and relies more on research-oriented platforms designed by investigators than on commercial products. Ref. [55] constructed a virtual sensory-evaluation booth to examine differences from traditional evaluation environments, and [64] implemented an olfactory identification procedure using a Unity3D-based VR platform, thereby indicating new possibilities for the assessment of olfactory stimuli.
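The regression-based prediction mentioned above—mapping fused electronic-sensor signals to panel-rated sensory attributes with an ANN—can be sketched in minimal form. The example below is illustrative only: it uses synthetic stand-ins for sensor features and panel scores and a small one-hidden-layer network trained from scratch, not the data or model architecture of the cited study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for fused E-nose/E-tongue features (30 samples x 8 channels)
# and a panel-rated sensory attribute (hypothetical data for illustration).
X = rng.normal(size=(30, 8))
true_w = rng.normal(size=8)
y = X @ true_w + 0.1 * rng.normal(size=30)   # latent relation + panel noise

# One-hidden-layer ANN (tanh activation) trained by plain gradient descent.
W1 = rng.normal(scale=0.1, size=(8, 6))
b1 = np.zeros(6)
W2 = rng.normal(scale=0.1, size=6)
b2 = 0.0
lr = 0.05

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, h @ W2 + b2

losses = []
for _ in range(2000):
    h, pred = forward(X)
    err = pred - y
    losses.append(float(np.mean(err ** 2)))
    # Backpropagate the mean-squared-error loss through both layers.
    grad_pred = 2 * err / len(y)
    gW2 = h.T @ grad_pred
    gb2 = grad_pred.sum()
    grad_h = np.outer(grad_pred, W2) * (1 - h ** 2)
    gW1 = X.T @ grad_h
    gb1 = grad_h.sum(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

rmse = losses[-1] ** 0.5
print(f"final training RMSE: {rmse:.3f}")
```

In practice, such models are evaluated on held-out samples, and the RMSE and R2 ranges reported in the reviewed studies refer to validation performance rather than training fit.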
Taken together, these contemporary software systems enable multidimensional data collection and analysis through the convergence of AI, ML, biosensing, digital sensors, and XR technologies, extending the scope of sensory evaluation from conventional questionnaire-based approaches to immersive and automation-oriented paradigms. Moreover, as sensory-evaluation software has been combined with electronic survey tools and remote-session capabilities, it has evolved to support online assessments that minimize temporal and spatial constraints. In practice, ref. [189] compared three remote consumer-evaluation methods and examined differences in acceptability, engagement, and practicality, while [190] assessed the validity and reproducibility of remote testing relative to laboratory testing and proposed guidelines for remote operation. These studies indicate that electronic-survey and remote-session functionalities constitute important development trajectories for sensory software.

6. Ethical Considerations

6.1. AI Technology

Artificial intelligence has emerged as a new paradigm that can advance traditional sensory evaluation methodologies. Human perception and description of food “taste” lack absolute reference points and often exhibit diverse and unpredictable patterns. Given this inherent complexity, the adoption of AI for large-scale data analysis has become an increasingly effective methodological approach for sensory evaluation. However, the implementation of AI in the field of consumer sensory science requires careful ethical consideration.
Ref. [29] pointed out that when sensory data are applied to ML models, data collected within specific countries or cultural contexts cannot be universally generalized across all markets. In addition, according to [191], LLMs such as ChatGPT may be influenced by biases in their human-generated training data, potentially resulting in biased or distorted outcomes. The authors further raised concerns about data security and the risk of personal privacy violations during model training [191,192].
As the era of AI progresses, the protection of personal privacy has become an increasingly critical issue, suggesting that the indiscriminate adoption of AI technologies may elicit consumer resistance due to ethical concerns [192]. Moreover, when collecting sensory data for AI-based applications, it is essential to account for cross-national and cultural differences. Since AI cannot fully capture the diversity and complexity of human expression, continuous monitoring and methodological refinement are necessary to ensure the reliability and validity of research outcomes.

6.2. XR (VR, AR, MR)

While XR offers new opportunities for sensory evaluation, it simultaneously entails a range of ethical concerns, including privacy protection, physical and psychological harm, bias, abuse within virtual spaces, and accessibility.
In particular, in sensory evaluation, VR continuously collects not only gaze, motion, and usage patterns but also physiological signals such as HR and SC and analyzes these data to tailor virtual environments to user behaviors [193]. If nonverbal cues that readily identify individuals are collected over extended periods and the transparency of their processing and sharing is not ensured, the risk of constructing profiles of individuals’ preferences and behaviors increases [193]. Moreover, advanced features such as location-tracking cameras and built-in microphones have been identified as additional concerns, insofar as some data may be captured even when these functions are nominally deactivated [194]. Accordingly, a privacy framework that centers on purpose limitation, data minimization, secure storage, controls on third-party transfers, transparent notice, and explicit consent is required. Furthermore, the participant information sheet and informed consent form should include clear language specifying the types of information that will be collected and recorded in the XR environment, the planned data-retention period, the procedures for data processing, and whether and under what conditions secondary use of the data will be permitted. In addition, the consent procedure should explicitly inform participants of their right to withdraw from the study at any time, whether and to what extent they may request deletion of their personal data, and the limitations of any de-identification measures applied. Physical and psychological harm likewise constitute important ethical considerations.
Physically, immersive systems pose risks such as disorientation and collisions; psychologically, users may experience cybersickness or excessive strain from intense affective stimuli. To minimize these risks, standardized procedures—pre-screening, safe-environment setup, conservative exposure scheduling (short sessions with breaks), real-time monitoring, predefined stopping criteria, and post-exposure recovery support—are essential [50]. Finally, high equipment costs, language proficiency, physical conditions, and cognitive demands can restrict participation, undermining sample representativeness and exacerbating economic and digital divides [193,195]. Consequently, accessibility-first design and the reduction in participation barriers are needed to ensure the inclusion of diverse users [196]. Therefore, the responsible implementation of XR-based sensory evaluation requires the institutionalization of minimum standards for privacy, safety, and accessibility, as well as the explicit documentation of policies on data-retention periods, de-identification, and secondary use of data. In addition, governance structures should be in place to systematically assess and disclose sample representativeness and potential sources of bias.
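The procedural safeguards listed above—session caps, scheduled breaks, real-time monitoring, and predefined stopping criteria—can be encoded directly in session-management software. The sketch below shows one minimal decision rule; all thresholds are hypothetical and would need to be set from pre-screening and pilot data rather than taken as validated values.

```python
from dataclasses import dataclass

@dataclass
class ExposurePolicy:
    max_session_s: int = 600      # conservative session cap (hypothetical)
    break_after_s: int = 300      # mandatory break point (hypothetical)
    sickness_stop_score: int = 7  # 0-10 discomfort rating threshold (hypothetical)

def next_action(policy: ExposurePolicy, elapsed_s: int, sickness_score: int) -> str:
    """Map real-time monitoring inputs to a predefined exposure decision."""
    if sickness_score >= policy.sickness_stop_score:
        return "stop"      # predefined stopping criterion reached
    if elapsed_s >= policy.max_session_s:
        return "stop"      # session length cap reached
    if elapsed_s >= policy.break_after_s:
        return "break"     # scheduled rest to limit cybersickness risk
    return "continue"

policy = ExposurePolicy()
print(next_action(policy, 120, 2))  # early in session, low discomfort -> continue
print(next_action(policy, 350, 3))  # past break threshold -> break
print(next_action(policy, 400, 8))  # discomfort exceeds stop criterion -> stop
```

Encoding the stopping rule in software makes the criteria auditable, which supports the documentation requirements discussed above.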

6.3. Biometrics Technology

Biometric technologies complement traditional explicit sensory evaluations by helping to prevent cognitive bias among study participants [195], and they hold potential as alternatives to conventional sensory evaluation [197]. However, they also present ethical limitations such as the risk of participant re-identification, lack of inclusivity, and insufficient standards for risk assessment.
First, there is a possibility of re-identifying research participants. In particular, it has been pointed out that individuals can be identified through gaze response patterns such as eye-tracking and pupil size [198], raising concerns about potential invasion of participant privacy. As biometric recordings inherently capture unique physiological and behavioral patterns, consent forms should clearly specify the types of data collected, the purposes of analysis, conditions for secondary use, and participants’ rights to request data deletion [198,199,200]. Second, biometric technologies may introduce sample bias due to issues of inclusivity regarding people with disabilities, specific ethnicities, or demographic groups. One study reported that certain participants with disabilities experienced considerable discomfort when removing their glasses, and that glare reflections while wearing glasses interfered with the iris recognition process [201].
Additionally, ref. [199] found differences in accuracy related to participants’ race during facial recognition data analysis. Similarly, ref. [202] revealed that at low false match rate (FMR) levels, age-related loss of skin elasticity in older adults can degrade image capture performance. Third, existing Institutional Review Board (IRB) or human-subject research protocols are insufficient for assessing risks associated with biometric data. The current IRB framework was not designed for large-scale data collection and may overlook issues such as long-term data storage. Accordingly, data retention should be restricted to the minimum period necessary to fulfill the research objectives [198]. It has been suggested that including experts in biometric technologies and cybersecurity on IRB review panels could enhance the effectiveness of risk assessment for biometric research [200].

6.4. Digital Sensor

Advancements in robotics and electronic sensory technologies have established these systems as complementary tools to human-based sensory evaluation; however, several factors must still be considered before they can fully replace human sensory assessment. Ref. [164] reported that participants who tasted coffee extracted by a robotic barista exhibited a significant reduction in their food technology neophobia scale scores compared with those who tasted coffee brewed by a human barista. Emotional response analyses further indicated that consumers experienced distinct emotional states depending on whether the coffee was prepared by a robot or a human. Similarly, ref. [203] demonstrated that electronic sensory devices such as E-tongues were able to detect wine flavors with greater sensitivity than human sensory panels.
Nevertheless, the authors pointed out that certain flavor attributes perceptible to humans may not be captured by the E-tongue. These findings suggest that when commercializing robotic and electronic sensory technologies, it is essential to address potential issues such as consumer unfamiliarity and resistance. Moreover, human sensory evaluation remains indispensable for comprehensively capturing the subtle and multidimensional aspects of sensory perception. Therefore, when applying electronic sensory systems to ML models, it is essential to incorporate cross-validation procedures to verify the degree of agreement between sensor-based outputs and human sensory data [18].
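The cross-validation check recommended above can be implemented as a k-fold scheme that calibrates a sensor-to-panel mapping on training folds and scores agreement on held-out folds. The sketch below assumes sensor channels and mean panel scores are available as arrays; the synthetic data and the simple least-squares calibration are illustrative assumptions, not the procedure of the cited study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins: E-tongue channel responses and human panel scores
# sharing a latent structure plus panel noise (hypothetical data).
X = rng.normal(size=(40, 5))                 # 40 samples x 5 sensor channels
beta = np.array([1.0, -0.5, 0.3, 0.0, 0.8])
y = X @ beta + 0.3 * rng.normal(size=40)     # panel mean scores

def kfold_agreement(X, y, k=5):
    """Cross-validated Pearson correlation between sensor-predicted and
    human panel scores, using a least-squares linear calibration per fold."""
    idx = np.arange(len(y))
    rs = []
    for test in np.array_split(idx, k):
        train = np.setdiff1d(idx, test)
        Xt = np.column_stack([X[train], np.ones(len(train))])  # add intercept
        w, *_ = np.linalg.lstsq(Xt, y[train], rcond=None)
        pred = np.column_stack([X[test], np.ones(len(test))]) @ w
        rs.append(np.corrcoef(pred, y[test])[0, 1])
    return float(np.mean(rs))

r = kfold_agreement(X, y)
print(f"mean held-out sensor-panel correlation: r = {r:.2f}")
```

Because agreement is computed only on held-out samples, a high correlation here indicates that the sensor outputs generalize to panel judgments rather than merely fitting them.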
In the food industry, the IoT demonstrates outstanding capabilities in large-scale data collection, real-time monitoring, and automation; yet several challenges persist. IoT systems do not operate in isolation: they depend on acquiring vast amounts of data, which introduces multiple concerns. According to [171], major issues include the protection of personal data collected during consumer monitoring, the increased consumer burden caused by higher supply-chain management costs, and the need to comply with legal and regulatory requirements for food certification and safety management.
To effectively and responsibly implement IoT technologies in the food sector, strict management of personal data is essential to build consumer trust, while strategies are needed to mitigate the financial burden placed on customers. Furthermore, as emphasized by [204], the establishment of formal governmental regulations and policy frameworks is crucial for addressing potential ethical issues and ensuring that systematic procedures are in place to manage the broader challenges associated with IoT adoption in food production and quality control.

7. Future Directions

Future sensory evaluation research is expected to evolve into a more sophisticated and contextually realistic assessment environment through the convergence of emerging technologies such as AI and ML, XR, biometrics, and IoT sensor networks. Conventional sensory analysis and consumer evaluation have inherent limitations in ecological and external validity due to the constraints of controlled laboratory settings [205]. In contrast, XR-based environments enable researchers to control identical environmental variables for participants while providing higher ecological validity than conventional laboratory settings, thus offering the potential to overcome the limitations of traditional sensory evaluations [46].
Biometric technologies quantify participants’ unconscious emotional responses to food by measuring physiological signals, thereby compensating for cognitive biases that arise when relying solely on self-reports [156]. They also provide deeper insights into the psychophysiological processes associated with food preference and intake [206]. Ref. [81] suggested that the value of biometrics in sensory and consumer research will increase when applied under more realistic research designs.
AI-based sensory and consumer science research utilizes ML to integratively analyze sensory data such as food acceptability, sensory attributes, and electronic sensor signals, enabling predictive modeling and automation of the evaluation process. Furthermore, AI plays an important role as an effective tool for collecting, exploring, and analyzing both instrumental and human data in sensory and consumer research [207]. Technological advancements in AI and ML are expected to move beyond fragmented analyses of individual evaluation parameters toward the integrated interpretation of consumers’ multidimensional data on food.
The incorporation of IoT sensor networks enables the interconnection of various sensor nodes that collect data on environmental variables, biosignals, and product characteristics through wireless communication systems, thereby facilitating real-time data collection, transmission, and analysis [208]. The application of IoT sensor networks is expected to create evaluation environments capable of real-time control of physical factors such as temperature, noise, aroma, and illumination, while linking these data with biometric signals to quantitatively correct the effects of environmental variables.
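One simple form of the quantitative correction described above is covariate adjustment: regress hedonic scores on the logged environmental variables and analyze the residuals. The sketch below uses synthetic temperature and noise readings with hypothetical effect sizes; it is a minimal illustration of the idea, not a proposed analysis standard.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 60

# Hypothetical IoT readings logged during evaluation sessions.
temp = rng.normal(22, 2, n)    # ambient temperature, degC
noise = rng.normal(50, 5, n)   # noise level, dB

# Hedonic scores contaminated by environmental effects (assumed effect sizes).
true_liking = rng.normal(6, 1, n)                       # 9-point scale
score = true_liking - 0.2 * (temp - 22) - 0.05 * (noise - 50)

# Fit a linear model of score on centered covariates; subtracting the fitted
# environmental component yields environment-adjusted scores.
Z = np.column_stack([temp - temp.mean(), noise - noise.mean(), np.ones(n)])
w, *_ = np.linalg.lstsq(Z, score, rcond=None)
adjusted = score - Z[:, :2] @ w[:2]

raw_r = np.corrcoef(score, true_liking)[0, 1]
adj_r = np.corrcoef(adjusted, true_liking)[0, 1]
print(f"raw-vs-true r:      {raw_r:.2f}")
print(f"adjusted-vs-true r: {adj_r:.2f}")
```

In this simulation the adjusted scores recover the underlying liking more faithfully than the raw scores, which is the intended effect of linking IoT environment logs to sensory data.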
Research in sensory and consumer science utilizing emerging technologies should move beyond the independent use of each technology and progress toward building an intelligent and adaptive consumer evaluation system integrated within a unified framework. Such a system would employ AI to collect and analyze data, use XR to enhance ecological validity, utilize biometric data to increase the predictability and reliability of results through unconscious emotional responses, and integrate diverse data via IoT sensor networks to enable more precise environmental control during evaluations.
Moreover, future research should examine the integration of advanced computational approaches—such as foundation models, multimodal generative models, and reinforcement learning frameworks—into sensory and consumer evaluation pipelines. These methods may enable autonomous scenario adaptation, personalized stimulus optimization, and real-time adjustment of XR environments based on biometric feedback, ultimately supporting the development of closed-loop, intelligent-adaptive evaluation systems [209,210].
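A closed-loop adjustment of the kind envisioned above can be reduced, in its simplest form, to a feedback controller. The sketch below uses a proportional rule that maps heart-rate deviation from a target to scene brightness; the target, gain, and the HR-to-brightness mapping are all illustrative assumptions, and a real system would require a validated psychophysiological model.

```python
def adjust_brightness(brightness: float, hr_bpm: float,
                      target_hr: float = 70.0, gain: float = 0.005) -> float:
    """Proportional controller: dim the virtual scene when heart rate rises
    above target, brighten it when heart rate falls below (hypothetical rule)."""
    brightness -= gain * (hr_bpm - target_hr)
    return min(1.0, max(0.0, brightness))   # clamp to displayable range

# Simulated feedback loop: sustained elevated HR gradually lowers brightness.
b = 0.8
for hr in [85, 90, 88, 75, 70]:
    b = adjust_brightness(b, hr)
print(f"brightness after adaptation: {b:.3f}")   # prints 0.510
```

More sophisticated variants would replace the proportional rule with learned policies, which is where the reinforcement-learning frameworks mentioned above become relevant.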

8. Conclusions

This review systematically summarized and organized the latest research in sensory and consumer science incorporating digital technologies into four categories: AI/ML, XR, biosensing, and digital sensors. Furthermore, it examined the literature on data integration and analysis through AI and ML, quantification of unconscious responses and reduction in self-report bias through biosignals, enhancement of ecological validity and immersion through XR, and precise control and real-time data acquisition through digital and IoT sensors. Based on these findings, an integrated framework linking multimodal signals with consumer perception was proposed. Additionally, it was inferred that the convergence of these technologies will promote the efficiency, quantification, and real-time capability of future sensory evaluations while improving ecological validity. Future research should extend applications to a wider range of food products and systematically investigate the methodological, ethical, and institutional validity and reliability surrounding these convergent technologies, with a view toward the development of emotion-based and personalized foods.

Author Contributions

Conceptualization, Y.L.; writing—original draft preparation, D.L., H.J. and Y.K.; writing—review and editing, Y.L.; visualization, D.L., H.J. and Y.K.; supervision, Y.L.; project administration, Y.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

AI: Artificial intelligence
ML: Machine learning
NLP: Natural language processing
LLMs: Large language models
PLS: Partial least squares
SVR: Support Vector Regression
SVM: Support Vector Machine
XGBoost: Extreme Gradient Boosting
RF: Random Forest
NN: Neural Network
KNN: K-Nearest Neighbors
ANN: Artificial Neural Network
DL: Deep Learning
RMSE: Root mean square error
DT: Decision tree
GC–MS: Gas chromatography–mass spectrometry
EEM: Excitation–Emission Matrix
ENR: Elastic Net Regression
PLS-DA: Partial least squares discriminant analysis
BP: Back-propagation
XR: Extended reality
VR: Virtual reality
AR: Augmented reality
MR: Mixed reality
HMD: Head-mounted display
EEG: Electroencephalogram
fNIRS: Functional near-infrared spectroscopy
fMRI: Functional magnetic resonance imaging
BOLD: Blood oxygenation level-dependent
HR: Heart rate
HRV: Heart rate variability
ANS: Autonomic nervous system
EDA/GSR: Electrodermal activity/Galvanic skin response
SC: Skin conductance
SCL: Skin conductance level
SR: Skin resistance
ECG: Electrocardiography
B.P.: Blood pressure
FEA: Facial expression analysis
EMG: Electromyography
IoT: Internet of Things
E-nose: Electronic noses
E-tongue: Electronic tongues

References

  1. Wang, J.; Wang, J.; Qiao, L.; Zhang, N.; Sun, B.; Li, H.; Chen, H. From traditional to intelligent, a review of application and progress of sensory analysis in alcoholic beverage industry. Food Chem. X 2024, 23, 101542. [Google Scholar] [CrossRef]
  2. Torrico, D.D.; Mehta, A.; Borssato, A.B. New methods to assess sensory responses: A brief review of innovative techniques in sensory evaluation. Curr. Opin. Food Sci. 2023, 49, 100978. [Google Scholar] [CrossRef]
  3. Chen, B.; Lin, X.; Liang, Z.; Chang, X.; Wang, Z.; Huang, M.; Zeng, X.A. Advances in food flavor analysis and sensory evaluation techniques and applications: Traditional vs emerging. Food Chem. 2025, 494, 146235. [Google Scholar] [CrossRef]
  4. Cosme, F.; Rocha, T.; Marques, C.; Barroso, J.; Vilela, A. Innovative approaches in sensory food science: From digital tools to virtual reality. Appl. Sci. 2025, 15, 4538. [Google Scholar] [CrossRef]
  5. Jo, D.M.; Han, S.J.; Ko, S.C.; Kim, K.W.; Yang, D.; Kim, J.Y.; Khan, F. Application of artificial intelligence in the advancement of sensory evaluation of food products. Trends Food Sci. Technol. 2025, 165, 105283. [Google Scholar] [CrossRef]
  6. Kraemer, P.M.; Weilbächer, R.A.; Mechera-Ostrovsky, T.; Gluth, S. Cognitive and neural principles of a memory bias on preferential choices. Curr. Res. Neurobiol. 2022, 3, 100029. [Google Scholar] [CrossRef] [PubMed]
  7. Colla, K.; Keast, R.; Mohebbi, M.; Russell, C.G.; Liem, D.G. Testing the validity of immersive eating environments against laboratory and real life settings. Food Qual. Prefer. 2023, 103, 104717. [Google Scholar] [CrossRef]
  8. Vanaraj, R.; IP, B.; Mayakrishnan, G.; Kim, I.S.; Kim, S.C. A Systematic Review of the Applications of Electronic Nose and Electronic Tongue in Food Quality Assessment and Safety. Chemosensors 2025, 13, 161. [Google Scholar] [CrossRef]
  9. Queiroz, L.P.; Nogueirac, I.B.R.; Ribeiro, A.M. Flavor Engineering: A comprehensive review of biological foundations, AI integration, industrial development, and socio-cultural dynamics. Food Res. Int. 2024, 196, 115100. [Google Scholar] [CrossRef] [PubMed]
  10. Cui, Z.; Qi, C.; Zhou, T.; Yu, Y.; Wang, Y.; Zhang, Z.; Zhang, Y.; Wang, W.; Liu, Y. Artificial intelligence and food flavor: How AI models are shaping the future and revolutionary technologies for flavor food development. Compr. Rev. Food Sci. Food Saf. 2025, 24, e70068. [Google Scholar] [CrossRef]
  11. Ji, H.; Pu, D.; Yan, W.; Zhang, Q.; Zuo, M.; Zhang, Y. Recent advances and application of machine learning in food flavor prediction and regulation. Trends Food Sci. Technol. 2023, 138, 738–751. [Google Scholar] [CrossRef]
  12. Zou, W.; Pan, F.; Yi, J.; Peng, W.; Tian, W.; Zhou, L. Targeted prediction of sensory preference for fermented pomegranate juice based on machine learning. LWT 2024, 201, 116260. [Google Scholar] [CrossRef]
  13. Keydana, S.; Chollet, F.; Kalinowski, T.; Allaire, J.J. Deep Learning with R, 2nd ed.; Manning Publications: Shelter Island, NY, USA, 2022. [Google Scholar]
  14. Saenz-Navajas, M.P.; Ferreira, C.; Bastian, S.E.; Jeffery, D.W. Bagging and boosting machine learning algorithms for modelling sensory perception from simple chemical variables: Wine mouthfeel as a case study. Food Qual. Prefer. 2025, 129, 105494. [Google Scholar] [CrossRef]
  15. Su, L.; Ji, H.; Kong, J.; Yan, W.; Zhang, Q.; Li, J.; Zuo, M. Recent advances and applications of deep learning, electroencephalography, and modern analysis techniques in screening, evaluation, and mechanistic analysis of taste peptides. Trends Food Sci. Technol. 2024, 150, 104607. [Google Scholar] [CrossRef]
  16. Ji, H.; Pu, D.; Su, L.; Zhang, Q.; Yan, W.; Kong, J.; Zuo, M.; Zhang, Y. Computational approaches for decoding structure-saltiness enhancement and aroma perception mechanisms of odorants: From machine learning to molecular simulation. Food Res. Int. 2025, 202, 115707. [Google Scholar] [CrossRef]
  17. Zhu, M.; Wang, M.; Gu, J.; Deng, Z.; Zhang, W.; Pan, Z.; Luo, G.; Wu, R.; Qin, J.; Gomi, K. Machine learning-assisted aroma profile prediction in Jiang-flavor baijiu. Food Chem. 2025, 478, 143661. [Google Scholar] [CrossRef] [PubMed]
  18. Jiang, J.; Ji, S.; Pan, G.; Tao, X.; An, F.; Liu, Q.; Wu, J.; Wu, R. Machine learning combined with sensory evaluation and multi-sensor technology to evaluate the overall quality of commercial soybean paste in China. J. Futur. Foods 2025, 6, 554–564. [Google Scholar] [CrossRef]
  19. Breiman, L. Random forests. Mach. Learn. 2001, 45, 5–32. [Google Scholar] [CrossRef]
  20. Lu, H.; Yao, C.; An, L.; Song, A.; Ling, F.; Huang, Q.; Cai, Y.; Liu, Y.; Kang, D. Classification and identification of chicken-derived adulteration in pork patties: A multi-dimensional quality profile and machine learning-based approach. Food Control 2025, 176, 111381. [Google Scholar] [CrossRef]
  21. Shuai, Y.; Zhang, K.; Zhang, T.; Zhu, H.; Jin, S.; Hu, T.; Yu, Z.; Liang, X. Decoding drinking water flavor: A pioneering and interpretable machine learning approach. J. Water Process Eng. 2025, 72, 107577. [Google Scholar] [CrossRef]
  22. Chitra Devi, V.; Devanampriyan, R.; Kayethri, D.; Sankari, R.; Premalatha, J.; Sathish Raam, R.; Mothil, S. Optimization and Process Validation of Freeze-Structured Meat Substitute Using Machine Learning Models. J. Food Process Eng. 2025, 48, e70071. [Google Scholar] [CrossRef]
  23. Harris, N.; Gonzalez Viejo, C.; Zhang, J.; Pang, A.; Hernandez-Brenes, C.; Fuentes, S. Enhancing beer authentication, quality, and control assessment using non-invasive spectroscopy through bottle and machine learning modeling. J. Food Sci. 2025, 90, e17670. [Google Scholar] [CrossRef]
  24. Miller, C.; Hamilton, L.; Lahne, J. Sensory Descriptor Analysis of Whisky Lexicons through the Use of Deep Learning. Foods 2021, 10, 1633. [Google Scholar] [CrossRef]
  25. Chen, Z.; Gurdian, C.; Sharma, C.; Prinyawiwatkul, W.; Torrico, D.D. Exploring Text Mining for Recent Consumer and Sensory Studies about Alternative Proteins. Foods 2021, 10, 2537. [Google Scholar] [CrossRef]
  26. Asseo, K.; Niv, M.Y. Harnessing Food Product Reviews for Personalizing Sweetness Levels. Foods 2022, 11, 1872. [Google Scholar] [CrossRef]
  27. Achiam, J.; Adler, S.; Agarwal, S.; Ahmad, L.; Akkaya, I.; Aleman, F.L.; McGrew, B. Gpt-4 technical report. arXiv 2023, arXiv:2303.08774. [Google Scholar] [CrossRef]
  28. Wang, Q.J.; Pellegrino, R. Automating chemosensory creativity assessment with large language models. Food Qual. Prefer. 2025, 132, 105599. [Google Scholar] [CrossRef]
  29. Ilieva, I.; Terziyska, M.; Dimitrova, T. From Words to Ratings: Machine Learning and NLP for Wine Reviews. Beverages 2025, 11, 80. [Google Scholar] [CrossRef]
  30. Visalli, M.; Symoneaux, R.; Mursic, C.; Touret, M.; Lourtioux, F.; Coulibaly, K.; Mahieu, B. Can natural language processing or large language models replace human operators for pre-processing word and sentence based free comments sensory evaluation data? Food Qual. Prefer. 2025, 127, 105456. [Google Scholar] [CrossRef]
  31. Torrico, D.D. The Potential Use of ChatGPT as a Sensory Evaluator of Chocolate Brownies: A Brief Case Study. Foods 2025, 14, 464. [Google Scholar] [CrossRef]
  32. Thomas, A.T.; Yee, A.; Mayne, A.; Mathur, M.B.; Jurafsky, D.; Gligoric, K. What Can Large Language Models Do for Sustainable Food? arXiv 2025, arXiv:2503.04734. [Google Scholar]
  33. Wang, Y.M.; Zhang, Z.; Sheng, Y.; Chi, C.F.; Wang, B. A systematic review on marine umami peptides: Biological sources, preparation methods, structure-umami relationship, mechanism of action and biological activities. Food Biosci. 2024, 57, 103637. [Google Scholar] [CrossRef]
  34. Liu, M.; Wang, K.; Zhang, Y.; Zhou, X.; Li, W.; Han, W. Mechanistic Study of Protein Interaction with Natto Inhibitory Peptides Targeting Xanthine Oxidase: Insights from Machine Learning and Molecular Dynamics Simulations. J. Chem. Inf. Model. 2025, 65, 3682–3696. [Google Scholar] [CrossRef]
  35. Mao, J.; Liu, Y.; Ma, D.; Zhou, Z. Virtual screening of umami peptides during sufu ripening based on machine learning and molecular docking to umami receptor T1R1/T1R3. Food Chem. 2025, 486, 144684. [Google Scholar] [CrossRef]
  36. Fu, B.; Li, M.; Chang, Z.; Yi, J.; Cheng, S.; Du, M. Identification of novel umami peptides from oyster hydrolysate and the mechanisms underlying their taste characteristics using machine learning. Food Chem. 2025, 473, 142970. [Google Scholar] [CrossRef]
  37. Geng, H.; Xu, C.; Ma, H.; Dai, Y.; Jiang, Z.; Yang, M.; Zhu, D. In Silico Discovery and Sensory Validation of Umami Peptides in Fermented Sausages: A Study Integrating Deep Learning and Molecular Modeling. Foods 2025, 14, 2422. [Google Scholar] [CrossRef] [PubMed]
  38. Mei, S.; Zhang, L.; Li, Y.; Zhang, X.; Li, W.; Wu, T. Machine learning-based exploration of Umami peptides in Pixian douban: Insights from virtual screening, molecular docking, and post-translational modifications. Food Chem. 2025, 478, 143672. [Google Scholar] [CrossRef]
  39. Patchipala, S. Tackling data and model drift in AI: Strategies for maintaining accuracy during ML model inference. Int. J. Sci. Res. Arch. 2023, 10, 1198–1209. [Google Scholar] [CrossRef]
  40. Jian, C.; Gao, J.; Ao, Y. A new sampling method for classifying imbalanced data based on support vector machine ensemble. Neurocomputing 2016, 193, 115–122. [Google Scholar] [CrossRef]
  41. Jing, X.; Wu, F.; Dong, X.; Qi, F.; Xu, B. Heterogeneous Cross-Company Defect Prediction by Unified Metric Representation and CCA-Based Transfer Learning. In Proceedings of the 2015 10th Joint Meeting on Foundations of Software Engineering (ESEC/FSE 2015), Bergamo, Italy, 30 August–4 September 2015. [Google Scholar]
  42. Ryu, D.; Jang, J.I.; Baik, J. A hybrid instance selection using nearest-neighbor for cross-project defect prediction. J. Comput. Sci. Technol. 2015, 30, 969–980. [Google Scholar] [CrossRef]
  43. Felix, E.A.; Lee, S.P. Systematic literature review of preprocessing techniques for imbalanced data. IET Softw. 2019, 13, 479–496. [Google Scholar] [CrossRef]
  44. Yang, H.; Wang, Y.; Zhao, J.; Li, P.; Wang, F. A machine learning method for juice human sensory hedonic prediction using electronic sensory features. Curr. Res. Food Sci. 2023, 7, 100576. [Google Scholar] [CrossRef]
  45. Ferrara, E. Should ChatGPT be biased? challenges and risks of bias in large language models. arXiv 2023, arXiv:2304.03738. [Google Scholar] [CrossRef]
  46. Ribeiro, J.C.; Rocha, C.; Barbosa, B.; Lima, R.C.; Cunha, L.M. Sensory Analysis Performed within Augmented Virtuality System: Impact on Hedonic Scores, Engagement, and Presence Level. Foods 2024, 13, 2456. [Google Scholar] [CrossRef]
  47. Low, J.Y.; Diako, C.; Lin, V.H.; Yeon, L.J.; Hort, J. Investigating the relative merits of using a mixed reality context for measuring affective response and predicting tea break snack choice. Food Res. Int. 2021, 150, 110718. [Google Scholar] [CrossRef]
  48. Gouton, M.A.; Dacremont, C.; Trystram, G.; Blumenthal, D. Effect of perceptive enrichment on the efficiency of simulated contexts: Comparing virtual reality and immersive room settings. Food Res. Int. 2023, 165, 112492. [Google Scholar] [CrossRef]
  49. Lichters, M.; Möslein, R.; Sarstedt, M.; Scharf, A. Segmenting consumers based on sensory acceptance tests in sensory labs, immersive environments, and natural consumption settings. Food Qual. Prefer. 2021, 89, 104138. [Google Scholar] [CrossRef]
  50. Qamar, S.; Anwar, Z.; Afzal, M. A systematic threat analysis and defense strategies for the metaverse and extended reality systems. Comput. Secur. 2023, 128, 103127. [Google Scholar] [CrossRef]
  51. Milgram, P.; Kishino, F. A taxonomy of mixed reality visual displays. IEICE Trans. Inf. Syst. 1994, 77, 1321–1329. [Google Scholar]
  52. Zulkarnain, A.H.B.; Gere, A. Virtual Reality Sensory Analysis Approaches for Sustainable Food Production. Appl. Food Res. 2025, 5, 100780. [Google Scholar] [CrossRef]
  53. Fuentes, S.; Tongson, E.; Viejo, C.G. Novel digital technologies implemented in sensory science and consumer perception. Curr. Opin. Food Sci. 2021, 41, 99–106. [Google Scholar] [CrossRef]
  54. Botinestean, C.; Melios, S.; Crofton, E. Exploring Consumer Perception of Augmented Reality (AR) Tools for Displaying and Understanding Nutrition Labels: A Pilot Study. Multimodal Technol. Interact. 2025, 9, 97. [Google Scholar] [CrossRef]
  55. Lee, H. A conceptual model of immersive experience in extended reality. Comput. Hum. Behav. Rep. 2025, 18, 100663. [Google Scholar] [CrossRef]
  56. Guberman, M.A.; Sakdavong, J.C.; Galmarini, M. Modulating taste perception through color and shape: A mixed reality study on solid foods. Front. Comput. Sci. 2025, 7, 1512931. [Google Scholar] [CrossRef]
  57. Rokhsaritalemi, S.; Sadeghi-Niaraki, A.; Choi, S.M. A review on mixed reality: Current trends, challenges and prospects. Appl. Sci. 2020, 10, 636. [Google Scholar] [CrossRef]
  58. Liu, M.; Chen, Z.; Huang, J.; Wan, X. From virtual reality visuals to real food perception: Uncovering the link between food aesthetics and taste in Chinese female consumers. Food Qual. Prefer. 2025, 131, 105569. [Google Scholar] [CrossRef]
  59. da Silva, F.N.; Minim, L.A.; Lima Filho, T.; Costa, A.A.D.S.X.; Vidigal, M.C.T.R.; Minim, V.P.R. Immersive virtual contexts, engagement, and emotions: How do these factors influence sensory acceptance? Food Res. Int. 2025, 207, 116106. [Google Scholar] [CrossRef] [PubMed]
  60. Fritz, W.; Hadi, R.; Stephen, A. From tablet to table: How augmented reality influences food desirability. J. Acad. Mark. Sci. 2023, 51, 503–529. [Google Scholar] [CrossRef]
  61. Barker, I.; Yang, Q.; Flintham, M.; Ankeny, R.; Ford, R. Improving immersive consumption contexts using virtual & mixed reality. Sci. Talks 2024, 10, 100346. [Google Scholar] [CrossRef]
  62. Alba-Martínez, J.; Alcañiz, M.; Martínez-Monzó, J.; Cunha, L.M.; García-Segovia, P. Beyond Reality: Exploring the effect of different virtual reality environments on visual assessment of cakes. Food Res. Int. 2024, 179, 114019. [Google Scholar] [CrossRef]
  63. Zulkarnain, A.H.B.; Kókai, Z.; Gere, A. Immersive sensory evaluation: Practical use of virtual reality sensory booth. MethodsX 2024, 12, 102631. [Google Scholar] [CrossRef]
  64. Zulkarnain, A.H.B.; Radványi, D.; Szakál, D.; Kókai, Z.; Gere, A. Unveiling aromas: Virtual reality and scent identification for sensory analysis. Curr. Res. Food Sci. 2024, 8, 100698. [Google Scholar] [CrossRef]
  65. Zulkarnain, A.H.B.; Kókai, Z.; Gere, A. Assessment of a virtual sensory laboratory for consumer sensory evaluations. Heliyon 2024, 10, e25498. [Google Scholar] [CrossRef]
  66. Man, K.; Patterson, J.A.; Simons, C. The impact of personally relevant consumption contexts during product evaluations in virtual reality. Food Qual. Prefer. 2023, 109, 104912. [Google Scholar] [CrossRef]
  67. Jeganathan, K.; Szymkowiak, A. Playing with food–The effects of augmented reality on meal perceptions. Food Qual. Prefer. 2023, 111, 104969. [Google Scholar] [CrossRef]
68. Mellos, I.; Probst, Y. Evaluating augmented reality for ‘real life’ teaching of food portion concepts. J. Hum. Nutr. Diet. 2022, 35, 1245–1254. [Google Scholar] [CrossRef]
  69. Dong, Y.; Sharma, C.; Mehta, A.; Torrico, D.D. Application of augmented reality in the sensory evaluation of yogurts. Fermentation 2021, 7, 147. [Google Scholar] [CrossRef]
  70. Halabi, O.; Saleh, M. Augmented reality flavor: Cross-modal mapping across gustation, olfaction, and vision. Multimedia Tools Appl. 2021, 80, 36423–36441. [Google Scholar] [CrossRef] [PubMed]
  71. Long, J.W.; Masters, B.; Sajjadi, P.; Simons, C.; Masterson, T.D. The development of an immersive mixed-reality application to improve the ecological validity of eating and sensory behavior research. Front. Nutr. 2023, 10, 1170311. [Google Scholar] [CrossRef] [PubMed]
  72. Low, J.Y.; Lin, V.H.; Yeon, L.J.; Hort, J. Considering the application of a mixed reality context and consumer segmentation when evaluating emotional response to tea break snacks. Food Qual. Prefer. 2021, 88, 104113. [Google Scholar] [CrossRef]
73. Fuchs, K.; Haldimann, M.; Grundmann, T.; Fleisch, E. Supporting food choices in the Internet of People: Automatic detection of diet-related activities and display of real-time interventions via mixed reality headsets. Future Gener. Comput. Syst. 2020, 113, 343–362. [Google Scholar] [CrossRef]
  74. Li, H.; Ding, Y.; Zhao, B.; Xu, Y.; Wei, W. Effects of immersion in a simulated natural environment on stress reduction and emotional arousal: A systematic review and meta-analysis. Front. Psychol. 2023, 13, 1058177. [Google Scholar] [CrossRef] [PubMed]
  75. Breves, P.; Stein, J.P. Cognitive load in immersive media settings: The role of spatial presence and cybersickness. Virtual Real. 2023, 27, 1077–1089. [Google Scholar] [CrossRef]
  76. Wen, H.; Leung, X.Y. Virtual wine tours and wine tasting: The influence of offline and online embodiment integration on wine purchase decisions. Tour. Manag. 2021, 83, 104250. [Google Scholar] [CrossRef]
77. Cossio, S.; Chiappinotto, S.; Dentice, S.; Moreal, C.; Magro, G.; Dussi, G.; Galazzi, A. Cybersickness and discomfort from head-mounted displays delivering fully immersive virtual reality: A systematic review. Nurse Educ. Pract. 2025, 85, 104376. [Google Scholar] [CrossRef]
  78. Miguel-Alonso, I.; Checa, D.; Guillen-Sanz, H.; Bustillo, A. Evaluation of the novelty effect in immersive virtual reality learning experiences. Virtual Real. 2024, 28, 27. [Google Scholar] [CrossRef]
  79. Rodrigues, S.S.; Dias, L.G.; Teixeira, A. Emerging methods for the evaluation of sensory quality of food: Technology at service. Curr. Food Sci. Technol. Rep. 2024, 2, 77–90. [Google Scholar] [CrossRef]
  80. Cong, L.; Luan, S.; Young, E.; Mirosa, M.; Bremer, P.; Torrico, D.D. The application of biometric approaches in agri-food marketing: A systematic literature review. Foods 2023, 12, 2982. [Google Scholar] [CrossRef]
  81. Wagner, J.; Hort, J. A practical evaluation of biometric measures for understanding the consumer experience during direct product evaluation: Current and future perspectives. Curr. Opin. Food Sci. 2025, 63, 101311. [Google Scholar] [CrossRef]
  82. National Institute of Standards and Technology. Biometrics. Computer Security Resource Center Glossary. Available online: https://csrc.nist.gov/glossary/term/biometrics (accessed on 8 October 2025).
  83. Adhikari, K. Application of selected neuroscientific methods in consumer sensory analysis: A review. J. Food Sci. 2023, 88, 53–64. [Google Scholar] [CrossRef]
  84. Modesti, M.; Tonacci, A.; Sansone, F.; Billeci, L.; Bellincontro, A.; Cacopardo, G.; Sanmartin, C.; Taglieri, I.; Venturi, F. E-Senses, Panel Tests and Wearable Sensors: A Teamwork for Food Quality Assessment and Prediction of Consumer’s Choices. Chemosensors 2022, 10, 244. [Google Scholar] [CrossRef]
  85. Bringas Vega, M.L.; Guo, Y.; Tang, Q.; Razzaq, F.A.; Calzada Reyes, A.; Ren, P.; Valdes Sosa, P.A. An age-adjusted EEG source classifier accurately detects school-aged barbadian children that had protein energy malnutrition in the first year of life. Front. Neurosci. 2019, 13, 1222. [Google Scholar] [CrossRef]
  86. Wang, J.; Wang, M. Review of the emotional feature extraction and classification using EEG signals. Cogn. Robot. 2021, 1, 29–40. [Google Scholar] [CrossRef]
  87. Niso, G.; Romero, E.; Moreau, J.T.; Araujo, A.; Krol, L.R. Wireless EEG: A survey of systems and studies. NeuroImage 2023, 269, 119774. [Google Scholar] [CrossRef] [PubMed]
  88. Lopez-Gordo, M.A.; Sanchez-Morillo, D.; Valle, F.P. Dry EEG Electrodes. Sensors 2014, 14, 12847–12870. [Google Scholar] [CrossRef]
  89. Bazzani, A.; Ravaioli, S.; Trieste, L.; Faraguna, U.; Turchetti, G. Is EEG suitable for marketing research? A systematic review. Front. Neurosci. 2020, 14, 594566. [Google Scholar] [CrossRef]
90. Ferrari, M.; Quaresima, V. A brief review on the history of human functional near-infrared spectroscopy (fNIRS) development and fields of application. NeuroImage 2012, 63, 921–935. [Google Scholar] [CrossRef]
  91. Gore, J.C. Principles and practice of functional MRI of the human brain. J. Clin. Investig. 2003, 112, 4–9. [Google Scholar] [CrossRef]
  92. Zhao, Q.; Yang, P.; Wang, X.; Ye, Z.; Xu, Z.; Chen, J.; Cheng, H. Unveiling brain response mechanisms of citrus flavor perception: An EEG-based study on sensory and cognitive responses. Food Res. Int. 2025, 206, 116096. [Google Scholar] [CrossRef] [PubMed]
  93. Meyerding, S.G.; He, X.; Bauer, A. Neuronal correlates of basic taste perception and hedonic evaluation using functional Near-Infrared Spectroscopy (fNIRS). Appl. Food Res. 2024, 4, 100477. [Google Scholar] [CrossRef]
  94. Mai, J.; Li, S.; Wei, Z.; Sun, Y. Implicit Measurement of Sweetness Intensity and Affective Value Based on fNIRS. Chemosensors 2025, 13, 36. [Google Scholar] [CrossRef]
  95. Fu, Y.; Wang, C.; He, G.; Fan, S.; Wang, D.; Lv, G.; Huangfu, J. Neurophysiological and multimodal sensory evaluation of baijiu and food pairing liking. Food Res. Int. 2025, 221, 117451. [Google Scholar] [CrossRef]
  96. Stickel, L.; Grunert, K.G.; Lähteenmäki, L. Implicit and explicit liking of a snack with health-versus taste-related information. Food Qual. Prefer. 2025, 122, 105293. [Google Scholar] [CrossRef]
  97. Yang, T.; Jiang, W.; Luo, C.; Hu, J.; Xu, W.; Zhang, P.; Yang, Y. Multidimensional analysis of sensory perception and neurophysiological responses to marinated beef cooked by diverse methods. Innov. Food Sci. Emerg. Technol. 2025, 103, 104058. [Google Scholar] [CrossRef]
  98. Artêncio, M.M.; Giraldi, J.D.M.E.; de Oliveira, J.H.C. A cup of black coffee with GI, please! Evidence of geographical indication influence on a coffee tasting experiment. Physiol. Behav. 2022, 245, 113671. [Google Scholar]
  99. Bilucaglia, M.; Bellati, M.; Fici, A.; Russo, V.; Zito, M. Tuning into Flavour: Predicting Coffee Sensory Attributes from EEG with Boosted-Tree Regression Models. Front. Hum. Neurosci. 2025, 19, 1661214. [Google Scholar] [CrossRef]
  100. Wang, G.; Wang, X.; Cheng, H.; Li, H.; Qin, Z.; Zheng, F.; Sun, B. Application of electroencephalogram (EEG) in the study of the influence of different contents of alcohol and Baijiu on brain perception. Food Chem. 2025, 462, 140969. [Google Scholar] [CrossRef]
  101. Monciatti, A.M.; Lapini, M.; Gemignani, J.; Frediani, G.; Carpi, F. Unpleasant odors compared to pleasant ones cause higher cortical activations detectable by fNIRS and observable mostly in females. APL Bioeng. 2025, 9, 016101. [Google Scholar] [CrossRef]
  102. Jezierska, K.; Cymbaluk-Płoska, A.; Zaleska, J.; Podraza, W. Gustatory-Visual Interaction in Human Brain Cortex: fNIRS Study. Brain Sci. 2025, 15, 92. [Google Scholar] [CrossRef] [PubMed]
  103. Suen, J.L.K.; Yeung, A.W.K.; Wu, E.X.; Leung, W.K.; Tanabe, H.C.; Goto, T.K. Effective connectivity in the human brain for sour taste, retronasal smell, and combined flavour. Foods 2021, 10, 2034. [Google Scholar] [CrossRef]
  104. Gómez-Carmona, D.; Muñoz-Leiva, F.; Paramio, A.; Liébana-Cabanillas, F.; Cruces-Montes, S. What do you want to eat? Influence of menu description and design on consumer’s mind: An fMRI study. Foods 2021, 10, 919. [Google Scholar] [CrossRef]
  105. Faridi Esfanjani, A.; Mohebbi, M. Enhancing saltiness perception by chemosensory interaction: An fMRI study. Sci. Rep. 2023, 13, 11128. [Google Scholar] [CrossRef]
  106. Mastinu, M.; Thaploo, D.; Warr, J.; Hummel, T. Cortical Representation of Food-Related Odors in Gustatory Areas Differs According to Their Taste Association: An fMRI Study. Brain Sci. 2025, 15, 418. [Google Scholar] [CrossRef]
  107. Kim, J.; André, E. Emotion recognition based on physiological changes in music listening. IEEE Trans. Pattern Anal. Mach. Intell. 2008, 30, 2067–2083. [Google Scholar] [CrossRef]
  108. Cox, O.D.; Munjal, A.; McCall, W.V.; Miller, B.J.; Baeken, C.; Rosenquist, P.B. A review of clinical studies of electrodermal activity and transcranial magnetic stimulation. Psychiatry Res. 2023, 329, 115535. [Google Scholar] [CrossRef]
  109. Cowley, B.; Filetti, M.; Lukander, K.; Torniainen, J.; Henelius, A.; Ahonen, L.; Jacucci, G. The psychophysiology primer: A guide to methods and a broad review with a focus on human–computer interaction. Found. Trends® Hum.–Comput. Interact. 2016, 9, 151–308. [Google Scholar] [CrossRef]
  110. Boucsein, W.; Fowles, D.C.; Grimnes, S.; Ben-Shakhar, G.; Roth, W.T.; Dawson, M.E.; Filion, D.L. Publication recommendations for electrodermal measurements. Psychophysiology 2012, 49, 1017–1034. [Google Scholar] [CrossRef]
  111. Benedek, M.; Kaernbach, C. A continuous measure of phasic electrodermal activity. J. Neurosci. Methods 2010, 190, 80–91. [Google Scholar] [CrossRef]
  112. Quigley, K.S.; Gianaros, P.J.; Norman, G.J.; Jennings, J.R.; Berntson, G.G.; de Geus, E.J. Publication guidelines for human heart rate and heart rate variability studies in psychophysiology—Part 1: Physiological underpinnings and foundations of measurement. Psychophysiology 2024, 61, e14604. [Google Scholar] [CrossRef]
  113. Ishaque, S.; Khan, N.; Krishnan, S. Trends in heart-rate variability signal analysis. Front. Digit. Health 2021, 3, 639444. [Google Scholar] [CrossRef]
  114. Borgianni, Y.; Maccioni, L. Review of the use of neurophysiological and biometric measures in experimental design research. Artif. Intell. Eng. Des. Anal. Manuf. 2020, 34, 248–285. [Google Scholar] [CrossRef]
  115. Tonacci, A.; Scalzini, G.; Díaz-Guerrero, P.; Sanmartin, C.; Taglieri, I.; Ferroni, G.; Venturi, F. Chemosensory analysis of emotional wines: Merging of explicit and implicit methods to measure emotions aroused by red wines. Food Res. Int. 2024, 190, 114611. [Google Scholar] [CrossRef]
  116. Lagast, S.; De Steur, H.; Gadeyne, S.; Hödl, S.; Staljanssens, W.; Vonck, K.; De Herdt, V. Heart rate, electrodermal responses and frontal alpha asymmetry to accepted and non-accepted solutions and drinks. Food Qual. Prefer. 2020, 82, 103893. [Google Scholar] [CrossRef]
  117. Spinelli, S.; Pierguidi, L.; Gavazzi, G.; Dinnella, C.; De Toffoli, A.; Prescott, J.; Monteleone, E. Skin conductance responses to oral stimuli: The role of taste quality and intensity, and personality traits. Food Qual. Prefer. 2023, 109, 104917. [Google Scholar] [CrossRef]
  118. Álvarez-Pato, V.M.; Sánchez, C.N.; Domínguez-Soberanes, J.; Méndoza-Pérez, D.E.; Velázquez, R. A multisensor data fusion approach for predicting consumer acceptance of food products. Foods 2020, 9, 774. [Google Scholar] [CrossRef]
  119. Tang, B.; Zhu, M.; Wu, Y.; Guo, G.; Hu, Z.; Ding, Y. Autonomic Responses Associated with Olfactory Preferences of Fragrance Consumers: Skin Conductance, Respiration, and Heart Rate. Sensors 2024, 24, 5604. [Google Scholar] [CrossRef]
  120. Stuldreher, I.V.; Van der Burg, E.; Velut, S.; Toet, A.; van Os, D.E.; Hiraguchi, H.; Brouwer, A.M. Electrodermal activity as an index of food neophobia outside the lab. Front. Neuroergon. 2024, 4, 1297722. [Google Scholar] [CrossRef]
  121. Larrañaga-Ayastuy, E.; Mora, M.; Romeo-Arroyo, E.; Esteban, E.; Vázquez-Araújo, L. Electrodermal response and its relationship with explicit response in controlled and real contexts: A case study with different beer styles. J. Sens. Stud. 2023, 38, e12799. [Google Scholar] [CrossRef]
  122. Verastegui-Tena, L.; van Trijp, H.; Piqueras-Fiszman, B. Heart rate and skin conductance responses to taste, taste novelty, and the (dis) confirmation of expectations. Food Qual. Prefer. 2018, 65, 1–9. [Google Scholar] [CrossRef]
123. He, W.; De Wijk, R.A.; De Graaf, C.; Boesveldt, S. Implicit and explicit measurements of affective responses to food odors. Chem. Senses 2016, 41, 661–668. [Google Scholar] [CrossRef]
  124. Samant, S.S.; Seo, H.S. Influences of sensory attribute intensity, emotional responses, and non-sensory factors on purchase intent toward mixed-vegetable juice products under informed tasting condition. Food Res. Int. 2020, 132, 109095. [Google Scholar] [CrossRef]
  125. Gidlöf, K.; Anikin, A.; Lingonblad, M.; Wallin, A. Looking is buying. How visual attention and choice are affected by consumer preferences and properties of the supermarket shelf. Appetite 2017, 116, 29–38. [Google Scholar] [CrossRef]
  126. Motoki, K.; Saito, T.; Onuma, T. Eye-tracking research on sensory and consumer science: A review, pitfalls and future directions. Food Res. Int. 2021, 145, 110389. [Google Scholar] [CrossRef]
  127. Hessels, R.S.; Nuthmann, A.; Nyström, M.; Andersson, R.; Niehorster, D.C.; Hooge, I.T. The fundamentals of eye tracking part 1: The link between theory and research question. Behav. Res. Methods 2024, 57, 16. [Google Scholar] [CrossRef]
  128. Ehinger, B.V.; Groß, K.; Ibs, I.; König, P. A new comprehensive eye-tracking test battery concurrently evaluating the Pupil Labs glasses and the EyeLink 1000. PeerJ 2019, 7, e7086. [Google Scholar] [CrossRef]
  129. Fontana, L.; Albayay, J.; Zurlo, L.; Ciliberto, V.; Zampini, M. Olfactory modulation of visual attention and preference towards congruent food products: An eye tracking study. Food Qual. Prefer. 2025, 124, 105373. [Google Scholar] [CrossRef]
  130. Mehta, A.; Serventi, L.; Kumar, L.; Torrico, D.D. Exploring the effects of packaging on consumer experience and purchase behaviour: Insights from eye tracking and facial expressions on orange juice. Int. J. Food Sci. Technol. 2024, 59, 8445–8460. [Google Scholar] [CrossRef]
  131. Garza, R.; Galindo, D.; Garcia, K.P.; Gutierrez, T. Ecological harshness cues modulate food preferences and visual attention: An eye-tracking study. Food Qual. Prefer. 2025, 133, 105632. [Google Scholar] [CrossRef]
  132. Shi, S.W. Assortment levels, pupillary response, and product preference. J. Mark. Manag. 2022, 38, 2035–2054. [Google Scholar] [CrossRef]
  133. Ren, Y.; Liu, Q.; Wu, G.; Loy, J.P. Consumer preferences for sugar-sweetened beverages: Evidence from online surveys and laboratory eye-tracking choice experiments. Food Policy 2025, 130, 102791. [Google Scholar] [CrossRef]
  134. Yasui, Y.; Tanaka, J.; Kakudo, M.; Tanaka, M. Relationship between preference and gaze in modified food using eye tracker. J. Prosthodont. Res. 2019, 63, 210–215. [Google Scholar] [CrossRef]
  135. Malheiros, B.A.; Spers, E.E.; Contreras Castillo, C.J.; Aroeira, C.N.; de Lima, L.M. The Role of Visual Attention and Quality Cues in Consumer Purchase Decisions for Fresh and Cooked Beef: An Eye-Tracking Study. Appl. Sci. 2025, 15, 7360. [Google Scholar] [CrossRef]
  136. Yang, X.; Zandstra, E.H.; Boesveldt, S. How sweet odors affect healthy food choice: An eye-tracking study. Food Qual. Prefer. 2023, 109, 104922. [Google Scholar] [CrossRef]
  137. Höfling, T.T.A.; Alpers, G.W. Automatic facial coding predicts self-report of emotion, advertisement and brand effects elicited by video commercials. Front. Neurosci. 2023, 17, 1125983. [Google Scholar] [CrossRef]
  138. Salmi, A.; Li, J.; Holtta-Otto, K. Automatic Facial Expression Analysis as a Measure of User-Designer Empathy. ASME J. Mech. Des. 2023, 145, 031403. [Google Scholar] [CrossRef]
  139. Shu, Y.; Gao, H.; Wang, Y.; Wei, Y. Food Emotional Perception and Eating Willingness Under Different Lighting Colors: A Preliminary Study Based on Consumer Facial Expression Analysis. Foods 2025, 14, 3440. [Google Scholar] [CrossRef]
  140. Danner, L.; Duerrschmid, K. Automatic facial expressions analysis in consumer science. In Methods in Consumer Research; Woodhead Publishing: Cambridge, UK, 2018; Volume 2, pp. 231–252. [Google Scholar]
  141. Kulke, L.; Feyerabend, D.; Schacht, A. A comparison of the Affectiva iMotions Facial Expression Analysis Software with EMG for identifying facial expressions of emotion. Front. Psychol. 2020, 11, 329. [Google Scholar] [CrossRef]
  142. Rahmani, D.; Loureiro, M.L.; Escobar, C.; Gil, J.M. Choice experiments with facial expression analysis: How do emotions affect wine choices? J. Choice Model. 2024, 51, 100490. [Google Scholar] [CrossRef]
  143. Marques, C.; Vilela, A. FaceReader Insights into the Emotional Response of Douro Wines. Appl. Sci. 2024, 14, 10053. [Google Scholar] [CrossRef]
  144. Sato, W.; Ishihara, S.; Ikegami, A.; Kono, M.; Nakauma, M.; Funami, T. Dynamic concordance between subjective and facial EMG hedonic responses during the consumption of gel-type food. Curr. Res. Food Sci. 2025, 10, 101107. [Google Scholar] [CrossRef]
145. Rohatgi, B.; Ramadoss, R.; Nitya, K.; Sundar, S.; Selvam, S.P.; Shree, K.H. Taste perception and muscular response: EMG based experimental evaluation. J. Oral Biol. Craniofac. Res. 2025, 15, 472–477. [Google Scholar] [CrossRef]
  146. Galler, M.; Grendstad, Å.R.; Ares, G.; Varela, P. Capturing food-elicited emotions: Facial decoding of children’s implicit and explicit responses to tasted samples. Food Qual. Prefer. 2022, 99, 104551. [Google Scholar] [CrossRef]
  147. Mehta, A.; Sharma, C.; Kanala, M.; Thakur, M.; Harrison, R.; Torrico, D.D. Self-reported emotions and facial expressions on consumer acceptability: A study using energy drinks. Foods 2021, 10, 330. [Google Scholar] [CrossRef]
  148. Katsikari, A.; Pedersen, M.E.; Berget, I.; Varela, P. Use of face reading to measure oral processing behaviour and its relation to product perception. Food Qual. Prefer. 2024, 119, 105209. [Google Scholar] [CrossRef]
  149. Mena, B.; Torrico, D.D.; Hutchings, S.; Ha, M.; Ashman, H.; Warner, R.D. Understanding consumer liking of beef patties with different firmness among younger and older adults using FaceReader™ and biometrics. Meat Sci. 2023, 199, 109124. [Google Scholar] [CrossRef]
  150. Wakihira, T.; Morimoto, M.; Higuchi, S.; Nagatomi, Y. Can facial expressions predict beer choices after tasting? A proof of concept study on implicit measurements for a better understanding of choice behavior among beer consumers. Food Qual. Prefer. 2022, 100, 104580. [Google Scholar] [CrossRef]
  151. Sodhi, N.S.; Dhillon, B.; Pandey, V.K.; Rustagi, S.; Dash, K.K.; Sharma, S.; Singh, A. Applicability of electromyography (EMG) as a prospective technique for textural evaluation of different types of biscuits. J. Agric. Food Res. 2024, 16, 101089. [Google Scholar] [CrossRef]
  152. Wagner, J.; Wilkin, J.D.; Szymkowiak, A.; Grigor, J. Sensory and affective response to chocolate differing in cocoa content: A TDS and facial electromyography approach. Physiol. Behav. 2023, 270, 114308. [Google Scholar] [CrossRef]
  153. Sato, W.; Minemoto, K.; Ikegami, A.; Nakauma, M.; Funami, T.; Fushiki, T. Facial EMG correlates of subjective hedonic responses during food consumption. Nutrients 2020, 12, 1174. [Google Scholar] [CrossRef]
  154. D’Adamo, G.; Andreani, G.; Ardizzi, M.; Ferroni, F.; De Marco, D.; Asioli, D.; Umiltà, M.A. The physiological mechanisms underlying consumer preferences towards organic food. Appetite 2025, 207, 107865. [Google Scholar] [CrossRef]
  155. Van Der Mee, D.J.; Gevonden, M.J.; Westerink, J.H.; De Geus, E.J.C. Validity of electrodermal activity-based measures of sympathetic nervous system activity from a wrist-worn device. Int. J. Psychophysiol. 2021, 168, 52–64. [Google Scholar] [CrossRef]
  156. Viejo, C.G.; Fuentes, S.; Howell, K.; Torrico, D.D.; Dunshea, F.R. Integration of non-invasive biometrics with sensory analysis techniques to assess acceptability of beer by consumers. Physiol. Behav. 2019, 200, 139–147. [Google Scholar] [CrossRef]
  157. Pimpini, L.; Kochs, S.; van Zoest, W.; Jansen, A.; Roefs, A. Food captures attention, but not the eyes: An eye-tracking study on mindset and BMI’s impact on attentional capture by high-caloric visual food stimuli. J. Cogn. 2022, 5, 19. [Google Scholar] [CrossRef]
  158. Graham, D.J.; Jeffery, R.W. Predictors of nutrition label viewing during food purchase decision making: An eye tracking investigation. Public Health Nutr. 2012, 15, 189–197. [Google Scholar] [CrossRef]
  159. Minematsu, Y.; Ueji, K.; Yamamoto, T. Activity of frontal pole cortex reflecting hedonic tone of food and drink: fNIRS study in humans. Sci. Rep. 2018, 8, 16197. [Google Scholar] [CrossRef]
  160. Mehlhose, C.; Risius, A. Signs of warning: Do health warning messages on sweets affect the neural prefrontal cortex activity? Nutrients 2020, 12, 3903. [Google Scholar] [CrossRef]
  161. Wagner, R.E.; Plácido da Silva, H.; Gramann, K. Validation of a low-cost electrocardiography (ECG) system for psychophysiological research. Sensors 2021, 21, 4485. [Google Scholar] [CrossRef]
  162. Kappel, S.L.; Looney, D.; Mandic, D.P.; Kidmose, P. Physiological artifacts in scalp EEG and ear-EEG. Biomed. Eng. Online 2017, 16, 103. [Google Scholar] [CrossRef]
  163. Costa-Feito, A.; González-Fernández, A.M.; Rodríguez-Santos, C.; Cervantes-Blanco, M. Electroencephalography in consumer behaviour and marketing: A science mapping approach. Humanit. Soc. Sci. Commun. 2023, 10, 1–13. [Google Scholar] [CrossRef]
  164. Park, S.; Park, M.K.; Heo, J.; Hwang, J.S.; Hwang, S.; Kim, D.; Chung, S.J.; Kwak, H.S. Robot versus human barista: Comparison of volatile compounds and consumers’ acceptance, sensory profile, and emotional response of brewed coffee. Food Res. Int. 2023, 172, 113119. [Google Scholar] [CrossRef]
  165. Ortiz, C.; Blanes, C.; Gonzalez-Planells, P.; Rovira-Más, F. Non-Destructive Evaluation of White-Flesh Dragon Fruit Decay with a Robot. Horticulturae 2023, 9, 1286. [Google Scholar] [CrossRef]
  166. Ciui, B.; Martin, A.; Mishra, R.K.; Nakagawa, T.; Dawkins, T.J.; Lyu, M.; Cristina, C.; Sandulescu, R.; Wang, J. Chemical sensing at the robot fingertips: Toward automated taste discrimination in food samples. ACS Sens. 2018, 3, 2375–2384. [Google Scholar] [CrossRef]
  167. Subramanian, M.; Kumar, T.S.; Sowmiya, K.; Prashithaa, N. IoT-Enhanced Quality Bread Assurance System (IQBAS). In Proceedings of the 2023 Intelligent Computing and Control for Engineering and Business Systems (ICCEBS), Chennai, India, 14–15 December 2023; IEEE: Piscataway, NJ, USA, 2023. [Google Scholar] [CrossRef]
  168. Irfanullah; Razaullah; Aslam, S.; Muqeem, F. Internet of Things Platform for Real Time Automated Safety System Based on Multi Sensor Network and Bluetooth Module. In Proceedings of the 2022 5th Conference on Cloud and Internet of Things (CIoT), Marrakech, Morocco, 28–30 March 2022; IEEE: Piscataway, NJ, USA, 2022. [Google Scholar] [CrossRef]
  169. Kok, C.L.; Chew, B.K.; Chan, Y.C.; Ong, M.K.; Toh, P.L.; Teoh, S.K.; Goh, H.G. Quality and Shelf-Life Extension of Vegetables Using Precision Control Storage System. E3S Web Conf. 2025, 603, 01026. [Google Scholar] [CrossRef]
  170. Damdam, A.N.; Ozay, L.O.; Ozcan, C.K.; Alzahrani, A.; Helabi, R.; Salama, K.N. IoT-Enabled Electronic Nose System for Beef Quality Monitoring and Spoilage Detection. Foods 2023, 12, 2227. [Google Scholar] [CrossRef]
  171. Bhat, M.A.; Rather, M.Y.; Singh, P.; Hassan, S.; Hussain, N. Advances in smart food authentication for enhanced safety and quality. Trends Food Sci. Technol. 2025, 155, 104800. [Google Scholar] [CrossRef]
  172. Torrico, D.D.; Sharma, C.; Dong, W.; Fuentes, S.; Viejo, C.G.; Dunshea, F.R. Virtual reality environments on the sensory acceptability and emotional responses of no-and full-sugar chocolate. LWT 2021, 137, 110383. [Google Scholar] [CrossRef]
  173. Meijers, M.H.; Smit, E.S.; de Wildt, K.; Karvonen, S.G.; van der Plas, D.; van der Laan, L.N. Stimulating sustainable food choices using virtual reality: Taking an environmental vs. health communication perspective on enhancing response efficacy beliefs. Environ. Commun. 2022, 16, 1–22. [Google Scholar] [CrossRef]
  174. Otsuki, M.; Okuma, T. Development and evaluation of a restaurant virtual reality training system for enhancing awareness and priority-setting skills. Sci. Rep. 2025, 15, 18673. [Google Scholar] [CrossRef] [PubMed]
  175. Aliya; Liu, S.; Zhang, D.; Cao, Y.; Sun, J.; Jiang, S.; Liu, Y. Research on the Evaluation of Baijiu Flavor Quality Based on Intelligent Sensory Technology Combined with Machine Learning. Chemosensors 2024, 12, 125. [Google Scholar] [CrossRef]
  176. Ma, C.; Ying, Y.; Xie, L. Development of a visuo-tactile sensor for non-destructive peach firmness and contact force measurement suitable for robotic arm applications. Food Chem. 2025, 467, 142282. [Google Scholar] [CrossRef]
177. Wu, H.; Gonzalez Viejo, C.; Fuentes, S.; Dunshea, F.R.; Suleria, H.A. Evaluation of spontaneous fermentation impact on the physicochemical properties and sensory profile of green and roasted arabica coffee by digital technologies. Food Res. Int. 2024, 176, 113800. [Google Scholar] [CrossRef]
  178. Viejo, C.G.; Fuentes, S.; Howell, K.; Torrico, D.; Dunshea, F.R. Robotics and computer vision techniques combined with non-invasive consumer biometrics to assess quality traits from beer foamability using machine learning: A potential for artificial intelligence applications. Food Control 2018, 92, 72–79. [Google Scholar] [CrossRef]
  179. Reiners, D.; Davahli, M.R.; Karwowski, W.; Cruz-Neira, C. The combination of artificial intelligence and extended reality: A systematic review. Front. Virtual Real. 2021, 2, 721933. [Google Scholar] [CrossRef]
  180. Subramanian, R.R.; Krishna, G.V.; Yeswanth, G.S.; Charan, G.; Sashavali, G. Smart packaging: IoT-integrated sensors for food freshness. In Proceedings of the 2025 International Conference on Computational Robotics, Testing and Engineering Evaluation (ICCRTEE), Virudhunagar, India, 28–30 May 2025; IEEE: Piscataway, NJ, USA, 2025. [Google Scholar] [CrossRef]
  181. Da Silva, T.H.B.; de Almeida Neto, M.C.; da Silva, M.F.B.; Diniz, P.K.; Galvão Filho, A.R.; Andrade, C.H. Enhancing immersive technologies with olfactory IoT devices: A new sensory frontier. In Proceedings of the 1st IEEE Latin American Conference on Internet of Things (LCIoT 2025), Fortaleza, Brazil, 23–25 April 2025; IEEE: Piscataway, NJ, USA, 2025. [Google Scholar]
  182. Xu, J.; Ma, R.; Stankovski, S.; Liu, X.; Zhang, X. Intelligent Dynamic Quality Prediction of Chilled Chicken with Integrated IoT Flexible Sensing and Knowledge Rules Extraction. Foods 2022, 11, 836. [Google Scholar] [CrossRef]
  183. Nair, K.; Sekhani, B.; Shah, K.; Karamchandani, S. Expiry Prediction and Reducing Food Wastage using IoT and ML. Int. J. Electr. Comput. Eng. Syst. 2021, 12, 155–162. [Google Scholar] [CrossRef]
  184. Mahagan, K.T.; Garmyn, A.J.; Legako, J.F.; Miller, M.F. A Comparison of Consumer Responses Using Paper and Digital Ballots for Eating Quality Assessment of Beef Steaks. Meat Muscle Biol. 2022, 5, 12611. [Google Scholar] [CrossRef]
  185. Sipos, L.; Nyitrai, Á.; Hitka, G.; Friedrich, L.F.; Kókai, Z. Sensory panel performance evaluation—Comprehensive review of practical approaches. Appl. Sci. 2021, 11, 11977. [Google Scholar] [CrossRef]
  186. Fuentes, S.; Gonzalez Viejo, C.; Torrico, D.D.; Dunshea, F.R. Digital integration and automated assessment of eye-tracking and emotional response data using the BioSensory App to maximize packaging label analysis. Sensors 2021, 21, 7641. [Google Scholar] [CrossRef]
  187. Gonzalez Viejo, C.; Harris, N.; Tongson, E.; Fuentes, S. Exploring consumer acceptability of leafy greens in earth and space immersive environments using biometrics. npj Sci. Food 2024, 8, 81. [Google Scholar] [CrossRef]
  188. Wang, L.; Xie, J.; Wang, Q.; Hu, J.; Jiang, Y.; Wang, J.; Yang, Y. Evaluation of the quality grade of Congou black tea by the fusion of GC-E-Nose, E-tongue, and E-eye. Food Chem. X 2024, 23, 101519. [Google Scholar] [CrossRef]
  189. Tapia, M.A.; Lee, S.Y. Variations in consumer acceptance, sensory engagement and method practicality across three remote consumer-testing modalities. Food Qual. Prefer. 2022, 100, 104616. [Google Scholar] [CrossRef]
  190. Dinnella, C.; Pierguidi, L.; Spinelli, S.; Borgogno, M.; Toschi, T.G.; Predieri, S.; Monteleone, E. Remote testing: Sensory test during Covid-19 pandemic and beyond. Food Qual. Prefer. 2022, 96, 104437. [Google Scholar] [CrossRef]
  191. Motoki, K.; Low, J.; Velasco, C. Generative AI framework for sensory and consumer research. Food Qual. Prefer. 2025, 133, 105600. [Google Scholar] [CrossRef]
  192. Velasco, C.; Reinoso-Carvalho, F.; Barbosa Escobar, F.; Gustafsson, A.; Petit, O. Paradoxes, challenges, and opportunities in the context of ethical customer experience management. Psychol. Mark. 2024, 41, 2506–2524. [Google Scholar] [CrossRef]
  193. Raja, U.S.; Al-Baghli, R. Ethical concerns in contemporary virtual reality and frameworks for pursuing responsible use. Front. Virtual Real. 2025, 6, 1451273. [Google Scholar] [CrossRef]
  194. Giaretta, A. Security and privacy in virtual reality: A literature survey. Virtual Real. 2024, 29, 10. [Google Scholar] [CrossRef]
  195. Lythreatis, S.; Singh, S.K.; El-Kassar, A.N. The digital divide: A review and future research agenda. Technol. Forecast. Soc. Change 2022, 175, 121359. [Google Scholar] [CrossRef]
  196. Gerling, K.; Meiners, A.-L.; Schumm, L.; Rixen, J.; Wolf, M.; Yildiz, Z.; Alexandrovsky, D.; Opp, M. An equitable experience? How HCI research conceptualizes accessibility of virtual reality in the context of disability. ACM Trans. Access. Comput. 2025; in press. [Google Scholar] [CrossRef]
  197. Torrico, D.D. Novel techniques to measure the sensory, emotional, and physiological responses of consumers toward foods. Foods 2021, 10, 2620. [Google Scholar] [CrossRef] [PubMed]
  198. Koch, R. What Are You Looking at? Emergency Privacy Concerns with Eye Tracking in Virtual Reality. Colo. Technol. Law J. 2021, 21, 109. [Google Scholar]
199. Kotwal, K.; Marcel, S. Review of demographic fairness in face recognition. IEEE Trans. Biom. Behav. Identity Sci. 2025. [Google Scholar] [CrossRef]
  200. Wang, X.; Wu, Y.C.; Zhou, M.; Fu, H. Beyond surveillance: Privacy, ethics, and regulations in face recognition technology. Front. Big Data 2024, 7, 1337465. [Google Scholar] [CrossRef]
  201. Ólafsdóttir, B.; Rathgeb, C.; Kolberg, J. A Case Study on the Inclusiveness of Biometric Technologies for Individuals with Congenital Disabilities. In Proceedings of the 13th Nordic Conference on Human-Computer Interaction, Uppsala, Sweden, 13–16 October 2024; pp. 1–7. [Google Scholar]
  202. Krishnan, A.; Almadan, A.; Rattani, A. Probing fairness of mobile ocular biometrics methods across gender on VISOB 2.0 dataset. In International Conference on Pattern Recognition; Springer International Publishing: Cham, Switzerland, 2021; pp. 229–243. [Google Scholar]
  203. Potter, R.I.; Warren, C.A.; Lee, J.; Ross, C.F. Comparative assessment of Riesling wine fault development by the electronic tongue and a sensory panel. J. Food Sci. 2024, 89, 3006–3018. [Google Scholar] [CrossRef]
  204. Aamer, A.M.; Al-Awlaqi, M.A.; Affia, I.; Arumsari, S.; Mandahawi, N. The internet of things in the food supply chain: Adoption challenges. Benchmarking: Int. J. 2021, 28, 2521–2541. [Google Scholar] [CrossRef]
  205. Low, J.Y.; Antlej, K.; Garvey, E.C.; Wang, Q.J. Recreating digital context: Navigating the future of food sensory studies through recent advances and applications. Curr. Opin. Food Sci. 2024, 57, 101176. [Google Scholar] [CrossRef]
  206. Pedersen, H.; Quist, J.S.; Jensen, M.M.; Clemmensen, K.K.B.; Vistisen, D.; Jørgensen, M.E.; Finlayson, G. Investigation of eye tracking, electrodermal activity and facial expressions as biometric signatures of food reward and intake in normal weight adults. Food Qual. Prefer. 2021, 93, 104248. [Google Scholar] [CrossRef]
  207. Nunes, C.A.; Ribeiro, M.N.; de Carvalho, T.C.; Ferreira, D.D.; de Oliveira, L.L.; Pinheiro, A.C. Artificial intelligence in sensory and consumer studies of food products. Curr. Opin. Food Sci. 2023, 50, 101002. [Google Scholar] [CrossRef]
  208. Gil, M.; Rudy, M.; Duma-Kocan, P.; Stanisławczyk, R. Electronic Sensing Technologies in Food Quality Assessment: A Comprehensive Literature Review. Appl. Sci. 2025, 15, 1530. [Google Scholar] [CrossRef]
  209. Liang, P.P. Foundations of multisensory artificial intelligence. arXiv 2024, arXiv:2404.18976. [Google Scholar] [CrossRef]
  210. Zhao, Q.; Ye, Z.; Deng, Y.; Chen, J.; Chen, J.; Liu, D.; Cheng, H. An Advance in Novel Intelligent Sensory Technologies: From an Implicit-Tracking Perspective of Food Perception. Compr. Rev. Food Sci. Food Saf. 2024, 23, e13327. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Overall workflow for cutting-edge technology-integrated sensory evaluation.
Figure 2. AI and ML in food sensory evaluation.
Figure 3. Conceptual decision tree for selecting lab- and XR-based (VR, MR/AR) test environments in sensory evaluation.
Figure 4. Spectrum of reality–virtuality: representation of XR, VR, AR, and MR (Adapted from [51], with permission from IEICE; Copyright ©1994 IEICE).
Figure 5. Classification of EDA into SCL and SCR.
Figure 6. IoT framework for food quality and safety monitoring.
Table 1. Summary of previous studies employing ML across various food categories and the ML algorithms used.
Technology | Food | Objective | Key Findings/Summary | AI/ML Algorithms (Validation Protocol) | Reference
AI/ML | Red wine | To predict wine sensory attributes from simple chemical data (voltammetry, EEM, absorbance) using ML and PLS. | ML methods (RF and XGBoost) accurately predict wine mouthfeel from simple chemical data and outperform PLS (RF: R2 = 83~85%, RMSE = 0.280~0.354; XGBoost: R2 = 91~92%, RMSE = 0.206~0.230), offering a fast and low-cost approach for sensory prediction. | [RF, XGBoost]; DS: n = 30; OR: RF = low/XGBoost = high (k-fold cross-validation) | [14]
AI/ML | Jiang-flavor baijiu (JFB) | To develop a predictive strategy for the global aroma profile of JFB by integrating volatile compound data with ML algorithms. | ML (NN) showed the best performance in predicting JFB aroma (R2 > 0.99), identifying 18 key flavor compounds, which were further validated through spiking and omission tests. | [NN, DT, PLS, RF, SVM]; DS: n = 27 (dataset: n = 96); OR: RF = low/NN, DT, PLS, SVM = high (5-fold cross-validation) | [17]
AI/ML | Meat | To identify pork patty samples containing different levels of chicken adulteration using ML techniques. | BP-ANN demonstrated the highest accuracy in predicting chicken adulteration levels in pork patties (99.52%), and SHAP analysis identified key discriminant indicators (e.g., Thr, C*, His). | [PLS-DA, SVM, BP-ANN]; DS: n = 300 (dataset: n = 43); OR: PLS-DA = low to medium/SVM = medium to high/BP-ANN = high (SVM: 5-fold cross-validation) | [20]
AI/ML | Drinking water | To develop a reliable predictive model for drinking water flavor by integrating diverse water quality indicators with ML techniques. | XGBoost showed the highest accuracy in predicting drinking water flavor (R2 = 0.916, RMSE = 0.482), and SHAP analysis identified key water-quality indicators; a simplified model using only 10 parameters also maintained strong performance. | [PLS, ENR, SVR, RF, DT, XGBoost]; DS: n = 78 (dataset: n = 110); OR: PLS, ENR, RF = low/SVR = medium to high/DT, XGBoost = high (5-fold cross-validation) | [21]
AI/ML | Freeze-structured meat | To optimize the PPI-ISP-VWG blend using RSM and evaluate freeze-structured plant-based meat with chicken-like texture. | ML models accurately predicted meat analog properties, with top performance from Gradient Boosting (hardness: R2 = 0.986, RMSE = 24.698), AdaBoost (springiness: R2 = 0.940, RMSE = 0.019), and XGBoost (water activity: R2 = 0.985, RMSE = 0.002). | [DT, KNN, XGBoost, RF, Gradient Boosting, AdaBoost]; DS: n = 16; OR: RF = low/AdaBoost = medium to high/DT, KNN, XGBoost, Gradient Boosting = high (RMSE and R2 values for cross-validation) | [22]
AI/ML | Beer | To develop an NIR-based and ML-driven method for beer authentication, quality evaluation, and control through the bottle. | NIR spectroscopy combined with ANN models accurately authenticated beer and predicted sensory attributes and volatile compounds through unopened bottles, offering a fast, non-destructive tool for quality control and fraud detection (model 1: 99%; model 2: R = 0.92; model 3: R = 0.94). | [ANN]; DS: n = 25; OR: ANN = high (neuron trimming test) | [23]
AI/ML | Fermented pomegranate juice (FPJ) | To use ML and SHAP analysis to identify key physicochemical factors influencing sensory preference in FPJs. | Gradient Boosting achieved the highest accuracy in predicting FPJ preference, and SHAP analysis identified TSS, CD, and LAB as the key influencing features (CPS/WPS model: R2 = 0.81). | [LR, RR, KNN, SVR, RF, AdaBoost, Gradient-boosted aggregation, ANN]; DS: n = 90; OR: LR, RR, RF = low/KNN, SVR, AdaBoost = medium to high/Gradient-boosted aggregation, ANN = high (3-, 5- and 10-fold cross-validation) | [12]
EEM = Excitation-Emission Matrix; RMSE = Root Mean Square Error; DS = Data Size; OR = Overfitting Risk; NN = Neural Network; BP-ANN = Back-Propagation Artificial Neural Networks; SHAP = Shapley Additive Explanations; PLS-DA = Partial least squares discriminant analysis; ENR = Elastic Net Regression; PPI = Pea Protein Isolate; ISP = Isolated Soy Protein; VWG = Vital Wheat Gluten; RSM = Response Surface Methodology; NIR = Near-Infrared; TSS = Total Soluble Solids; CD = Color Density; LAB = Lactic Acid Bacteria; LR = Linear Regression; RR = Ridge Regression.
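The k-fold cross-validation protocols reported in Table 1 can be illustrated with a minimal sketch: the data are split into k folds, a model is fit on k−1 folds, and RMSE and R2 are computed on the held-out fold. The code below is a hypothetical example using synthetic data and an ordinary least-squares model, not the data or models of the cited studies.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in: 30 samples of 4 chemical measurements (X)
# and a continuous sensory score (y).
X = rng.normal(size=(30, 4))
true_w = np.array([1.5, -2.0, 0.7, 0.0])
y = X @ true_w + rng.normal(scale=0.3, size=30)

K = 5
folds = np.array_split(np.arange(len(y)), K)  # 5 held-out index sets

rmse_scores, r2_scores = [], []
for test_idx in folds:
    train_idx = np.setdiff1d(np.arange(len(y)), test_idx)
    # Fit ordinary least squares on the training folds only.
    w, *_ = np.linalg.lstsq(X[train_idx], y[train_idx], rcond=None)
    pred = X[test_idx] @ w
    resid = y[test_idx] - pred
    rmse_scores.append(float(np.sqrt(np.mean(resid ** 2))))
    ss_res = float(np.sum(resid ** 2))
    ss_tot = float(np.sum((y[test_idx] - y[test_idx].mean()) ** 2))
    r2_scores.append(1.0 - ss_res / ss_tot)

mean_rmse = sum(rmse_scores) / K
mean_r2 = sum(r2_scores) / K
```

Averaging the per-fold RMSE and R2 gives the kind of summary statistics quoted in Table 1; reporting only a single train/test split tends to understate the overfitting risk flagged in the OR column.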
Table 2. Summary of previous studies on NLP-based LLMs in sensory/consumer research and the food industry.
Technology | Food | Objective | Key Findings/Summary | AI/ML Algorithms | Reference
NLP/LLM | Madeleine | To evaluate how different FC data formats (words vs. sentences) and preprocessing methods affect the quality and reliability of results. | ChatGPT and the expert system performed well on word-based FC data but showed lower performance than human experts on sentence-based FC data, and preprocessing methods led to large differences in reproducibility and discriminative power. | [NLP, LLM] | [30]
NLP/LLM | Wine | To demonstrate how NLP and ML techniques can be used to analyze expert-written Bulgarian wine descriptions and extract patterns related to wine quality and style. | NLP and ML enabled automatic extraction of quality and style patterns from Bulgarian wine descriptions, with BERT-based models showing high performance in predicting wine style and ratings (R2 = 0.643~0.656). | [BERT, SVM, RF, XGBoost, MLP]; dataset: n = 5807 | [29]
NLP/LLM | Chocolate brownies | To evaluate the potential use of ChatGPT as a sensory evaluator for hypothetical chocolate brownie formulations. | ChatGPT provided highly positive and overly favorable sensory descriptions for all brownie formulations, showing sentiment bias and requiring validation against human sensory panels. | [NLP, LLM] | [31]
NLP/LLM | Sustainable protein foods | To investigate how LLMs can support sustainable food development by evaluating their performance across key design and prediction tasks and integrating them with optimization methods. | LLMs, when combined with optimization techniques, can generate food choices that reduce greenhouse gas emissions by up to 79% while maintaining user satisfaction, demonstrating their potential to support sustainable food design. | [LLM] | [32]
NLP/LLM | Sweetness | To analyze sweetness levels, liking, and ingredient information from online food reviews to gain insights into sensory nutrition and identify opportunities to reconcile the palatability-healthiness tension. | Oversweetness was found in 7–16% of sweetness-related reviews and was consistently linked to lower liking, indicating a clear opportunity for developing reduced-sweetness product versions (XGBoost accuracy: 79–84%). | [NLP, XGBoost]; dataset: n (total) = about 550,000 sweetness-related reviews | [26]
NLP/LLM | Whisky | To identify and extract unique sensory descriptors from existing whisky reviews to build a flavor language. | LSTM and GloVe-based DL models accurately extracted whisky flavor descriptors from review texts with 99% accuracy, demonstrating that a flavor language can be programmatically learned. | [NLP, LSTM, GloVe]; dataset: n = 8036 (English whisky reviews) | [24]
FC = Free Comment; BERT = Bidirectional Encoder Representations from Transformers; MLP = Multi-Layer Perceptron Regressor; LSTM = Long Short-Term Memory; GloVe = Global Vectors for Word Representation.
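Review-mining pipelines like those summarized in Table 2 typically start with simple text normalization and cue matching before any ML model is trained. The sketch below is purely illustrative, flagging "oversweet" mentions with a hypothetical keyword rule; the cue list and reviews are invented, not taken from the cited studies.

```python
# Hypothetical cues signaling that a product was perceived as oversweet.
OVERSWEET_CUES = ("too sweet", "overly sweet", "cloying", "sickly sweet")

def flag_oversweet(review: str) -> bool:
    """Return True if the lowercased review contains any oversweetness cue."""
    text = review.lower()
    return any(cue in text for cue in OVERSWEET_CUES)

reviews = [
    "Great texture but way too sweet for my taste.",
    "Balanced sweetness, would buy again.",
    "Cloying aftertaste ruined it.",
]
flags = [flag_oversweet(r) for r in reviews]
share_oversweet = sum(flags) / len(flags)
```

In practice, such rule-based labels are often used to bootstrap training data for classifiers (e.g., the XGBoost model in [26]), which then generalize beyond the fixed cue list.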
Table 3. Summary of previous studies on AI/ML applications in molecular docking and physicochemical-based food research.
Technology | Food | Objective | Key Findings/Summary | AI/ML Algorithms (Validation Protocol) | Reference
ML | Oyster | To rapidly identify oyster-derived umami peptides using ML and to clarify their umami and salt-enhancing mechanisms through molecular docking and sensory analysis. | Three oyster-derived umami peptides were identified using ML, and molecular docking confirmed their binding to T1R1/T1R3 and TMC4, revealing strong umami and salt-enhancing properties. | [iUmami-SCM, Umami_YYDS, TastePeptides-DM]; dataset: n = 159 | [36]
ML | Saltiness | To predict the saltiness-enhancing intensity of savory odorants using an XGBoost regression model and to elucidate their structural and receptor-binding mechanisms through SHAP analysis and molecular simulations. | XGBoost accurately predicted saltiness-enhancing intensity (R2 = 0.96), SHAP identified key structural groups (phenyl, aldehyde), and molecular simulations revealed key OR1A1/OR1D2 binding sites explaining odor-induced salt enhancement. | [XGBoost]; dataset: n = 81; OR: high (5-fold cross-validation) | [16]
ML | Sufu | To elucidate the formation mechanism of umami peptides during sufu fermentation and to establish a rapid screening model using peptidomics, ML, and molecular docking. | Peptidomics and ML identified 637 umami peptides, and molecular docking with sensory validation confirmed five novel peptides that bind T1R1/T1R3 and impart actual umami taste. | [Umami-MRNN, UMPred-FRL, Umami_YYDS]; dataset: n = 637 | [35]
ML | Sausage | To develop an integrated DL-based framework combined with metagenomics and molecular docking to efficiently predict, screen, and validate potential umami peptides in fermented sausages. | Integrated DL and metagenomics enabled high-throughput screening of umami peptides, identifying top candidates that showed stable T1R1/T1R3 binding and strong umami taste validated by molecular docking, MD simulation, and sensory evaluation (accuracy: CNN = 82.4%, Transformer = 79.4%, LSTM = 81.4%, Attention = 81.6%). | [CNN, Transformer, LSTM, Attention architectures]; dataset: n = 508; OR: high (80/20 split with a balanced dataset and an all-model consensus ensemble) | [37]
ML | Pixian Doubanjiang (PXDB) | To identify umami peptides in aged PXDB using ML and molecular docking, and to elucidate their sensory mechanisms and biosynthetic pathways. | ML identified 69 potential umami peptides from PXDB, with VEGGLR confirmed to have a very low umami threshold and strong T1R1/T1R3 binding, while PTM profiling suggested regulatory roles in umami peptide biosynthesis. | [Umami-MRNN]; dataset: n = 117 | [38]
SHAP = Shapley Additive Explanations; OR = Overfitting Risk; CNN = Convolutional Neural Network; LSTM = Long Short-Term Memory; PTM = Post-Translational Modification.
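The umami-peptide screeners named in Table 3 (e.g., iUmami-SCM, Umami-MRNN) all require peptide sequences to be encoded as numeric features. One common, simple encoding is amino-acid composition, sketched below for VEGGLR, the peptide highlighted in [38]; this feature choice is illustrative only, as the cited tools use richer descriptors.

```python
from collections import Counter

# The 20 standard amino acids, one-letter codes in alphabetical order.
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def aa_composition(peptide: str) -> list:
    """Encode a peptide as its 20-dimensional amino-acid frequency vector."""
    counts = Counter(peptide.upper())
    n = len(peptide)
    return [counts.get(aa, 0) / n for aa in AMINO_ACIDS]

# VEGGLR: the low-threshold umami peptide reported for PXDB in Table 3.
features = aa_composition("VEGGLR")
```

A vector like this would then be passed to a trained classifier or regressor to rank candidate peptides before docking and sensory confirmation.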
Table 4. Descriptions of VR, AR and MR (Adapted from [57], with permission from MDPI, ©2020).
 | Virtual Reality | Augmented Reality | Mixed Reality
Display device | Special HMD or smart glasses required. | Smartphones, tablets, AR glasses or headsets (optional). | HMD or AR glasses (optional handheld or projection devices).
Image source | Computer graphics or real images produced by a computer. | Combination of computer-generated images and real-life objects. | Combination of computer-generated images and real-life objects.
Environment | Fully digital. | Physical surroundings with overlaid virtual content. | Real and virtual elements coexist and interact in real time.
Perspective | Virtual objects adjust in size and position according to the user’s viewpoint in the virtual world. | Virtual objects align with the user’s real-world viewpoint. | Virtual objects align and interact with the user’s real-world viewpoint.
Presence | Feeling of being transported somewhere else with no sense of the real world. | Feeling of still being in the real world, but with new elements and objects superimposed. | Feeling of still being in the real world, but with new elements and objects superimposed.
Awareness | Highly rendered virtual objects may be indistinguishable from reality. | Highly rendered virtual objects may be indistinguishable from reality. | Virtual objects may be indistinguishable from real ones and can be manipulated as part of the physical environment.
Table 5. Summary of literature on VR, AR, and MR.
Technology | Food | Objective | Key Findings/Summary | Reference
VR | Virtual cake | To assess the effects of VR on visual liking and hedonic responses to cakes across two immersive, photogrammetry-based contexts. | No main effect of context on liking; visual liking differed significantly by the context–cake interaction, age, and subjective hunger. | [62]
VR | Chocolate biscuits, orange juice | To design a VR-based sensory booth to complement sensory evaluation and expand applications in sensory science. | Feasible VR-based sensory booth enabling multiple sensory methods for evaluation and perception research. | [63]
VR | Scent sticks | To test whether VR food imagery modulates odor identification and perception via a VR-integrated olfactory task. | Olfactory augmentation in VR heightened presence, enhanced recall, improved comfort and affect, and influenced consumer behavior. | [64]
VR | Bakery items, scented sticks | To evaluate a VR-based sensory laboratory integrating conventional sensory methods to examine differences in consumer responses. | SSQ, virtual reality sickness questionnaire, and virtual reality neuroscience questionnaire scores indicate the virtual sensory laboratory is suitable for consumer sensory evaluation. | [65]
VR | Granola bar | To assess how consumption-environment personal relevance (usage frequency) shapes perception and acceptance. | Personal relevance increased data repeatability, yielding more reliable consumer insights. | [66]
VR | Apple juice | To compare presence, liking, beverage desire, intake, and choice across real, lab, and two immersive contexts. | 360VR induced stronger café presence than a picture-based context, while liking remained comparable to laboratory ratings. | [7]
VR | Sandwich | To compare responses across different experimental setups. | Immersion ranked: real-life > simulated environments > scenario-based booth; pattern consistent with external validity. | [48]
AR | Chicken meal | To assess effects of AR simulated control and environmental embedding on mental imagery, evaluation ease, liking, and purchase intention. | Environmental embedding’s effect on product liking was fully mediated by mental imagery quality (no direct effect). | [67]
AR | Dessert | To assess whether AR superimposition enhances mental simulation, increasing desire and purchase intention. | AR raised mental simulation, which mediated higher desire and purchase likelihood. | [60]
AR | 10 different food images | To develop and validate an AR tool for food portion estimation. | AR improved portion-size estimation accuracy. | [68]
AR | Yogurt | To evaluate how AR environments influence consumer sensory responses to different yogurts. | Significant yogurt–environment interaction for appearance, flavor, sweetness, mouthfeel, aftertaste, and overall liking. | [69]
AR | Beverage | To build and evaluate a wearable AR–olfaction system to test how visual and scent cues modulate taste perception. | Olfaction exerted a stronger influence on flavor perception than vision. | [70]
MR | Snack and real foods | To assess utility and ecological validity of an HMD-passthrough MR app for interacting with real foods. | Experts rated the virtual restaurant more acceptable than a sensory booth, but less acceptable than a real restaurant. | [71]
MR | Tea break snack | To examine how consumption context, including MR, shapes consumers’ emotional responses to tea-break snacks. | Incorporating context is crucial for consumer emotional-response data collection. | [72]
MR | Tea break snack | To compare consumer affective responses to snacks across a sensory booth, an MR-evoked café, and a real café to assess MR’s ecological validity. | Affective ratings in the MR café matched the real café (p ≥ 0.10), supporting MR as an ecologically valid setting for consumer testing. | [47]
MR | Snack foods, beverages | To develop an MR HMD–camera computer vision system to detect diet-related actions and trigger real-time visual interventions that promote healthier choices. | Current neural networks achieve high-accuracy food item detection in real-world settings. | [73]
Table 6. Summary of literature on nerve and brain activity.
Technology | Food | Objective | Key Findings/Summary | Reference
EEG | D-limonene, essential oils | To compare the brain’s sensory and cognitive responses to various citrus flavors using EEG. | Left-right asymmetry of alpha waves and intensity of delta waves in the prefrontal cortex showed a significant correlation with liking ratings for citrus flavors. | [92]
EEG | Baijiu | To evaluate the predictive validity of brainwave attentiveness and facial expressions for pairing preferences. | Sweetness and saltiness were key drivers of preference; white liquors with elegant flavors matched sweetness, while spicy ones paired well with diverse tastes (umami, saltiness, sweetness). | [95]
EEG | Protein chocolate milk | To compare the effects of health- and taste-related perceptions on explicit and implicit food preferences. | EEG and fMRI results indicated that health-related perceptions reduced explicit preferences compared to taste-related perceptions, but did not affect implicit preferences. | [96]
EEG | Marinated beef | To examine neural responses to different cooking methods using EEG. | High-heat cooking elicited stronger α, β, and γ activity, linked to pleasure, appetite, and cognitive engagement. | [97]
EEG | Food samples with unpleasant/pleasant aromas | To integrate brainwave technology and pattern recognition techniques to provide objective, quantified physiological data reflecting responses to odor stimuli. | Developed an experimental paradigm capable of collecting olfactory EEG responses to eight distinct odors and proposed a new olfactory perception dimensional space theory. | [66]
EEG | Coffee | To examine gender differences in coffee preference based on GI information using EEG analysis. | Men preferred coffee with GI information, while women favored coffee without it; EEG results contrasted with self-reported preferences. | [98]
EEG | Coffee | To predict the sensory characteristics of coffee using EEG and ML technologies. | Signals from the parietal lobe, central lobe, and frontal lobe regions showed the highest predictive power. | [99]
EEG | Alcohol solutions and baijiu | To compare brain responses to white liquor and alcohol of equal concentration. | White liquor showed significantly higher brain signal activity (increased δ and α waves and heightened frontal lobe, parietal lobe, and right temporal lobe activation). | [100]
fNIRS | Sucrose solutions | To identify brain regions associated with sweetness intensity and emotional value. | As sweetness increased, the number of activated channels rose from 7 to 11; a positive correlation was found between participants’ self-reported sweetness intensity data and implicit data. | [94]
fNIRS | Chocolate | To confirm differences in brain activity between chocolate lovers and non-lovers. | The introduction of fNIRS to sensory evaluation demonstrated that sweetness and bitterness, respectively, decrease and increase neural activity. | [93]
fNIRS | Thermal water, orange essential oil, mineral water | To investigate gender differences in brain responses to pleasant and unpleasant odors. | Compared with pleasant odors, unpleasant ones elicited a significantly greater increase in oxygenated hemoglobin; women showed higher fNIRS responses than men, with stronger activation to unpleasant odors. | [101]
fNIRS | Distilled water and coffee | To examine the relationship between perceived bitterness and brain oxygenation changes. | Bitter samples showed ΔoxyHb increases in taste regions; women displayed higher ΔoxyHb for water, suggesting a link between vision and taste. | [102]
fMRI | Three solutions (sour taste, mango smell, and flavor of sour taste plus mango smell) | To identify brain regions involved in the integration of taste and olfactory signals and to clarify the neural mechanism underlying their interaction. | Sour taste and odor were integrated in the anterior insula and rolandic operculum, key regions activated during taste stimulation. | [103]
fMRI | Pleasant/unpleasant dishes | To investigate how plate design aesthetics influence consumers’ neural and emotional responses to food. | Pleasing designs enhanced product attitudes by activating reward and attention regions, while unpleasant designs triggered inhibition and rejection areas linked to negative evaluations. | [104]
fMRI | NaCl solutions | To examine how odor cues (MSG and cheddar cheese) affect preference for saltiness and related brain activation. | Saltiness preference increased with these odors; high-salt stimuli activated the rolandic operculum, while preference-related activation appeared in the rectus gyrus, medial orbitofrontal cortex, and substantia nigra. | [105]
fMRI | Marshmallow, caramel, grapefruit, quinine | To explore brain activation induced by odor stimuli related to taste perception. | Odors activated the insula and frontal operculum; sour odors showed stronger activity in the angular gyrus, orbitofrontal cortex, caudate, and nucleus accumbens. | [106]
Table 7. Summary of literature on autonomic nervous system responses.
Technology | Food | Objective | Key Findings/Summary | Reference
EDA/GSR | Sweet gums | To assess food acceptance by integrating FER, GSR, and heart rate measurements. | Integrating FER, GSR, and heart rate improved prediction of food acceptance; GSR and pulse enhanced accuracy beyond FER alone. | [118]
EDA/GSR | Peppermint, jasmine, sweet orange, and lavender essential oils | To examine physiological responses to olfactory preferences using EDA. | EDA collected SC, respiration, and HR; olfactory preference affected respiration and HR, but not skin conductance. | [119]
EDA/GSR | Hotdog, tofu | To investigate the relationship between food neophobia and physiological responses using SCR. | SCR positively correlated with food neophobia; elevated pre-presentation signals indicated expectancy toward food. | [120]
EDA/GSR | Beer | To determine whether samples can be distinguished using EDA-derived skin conductance data. | Skin conductance alone distinguished samples; explicit symbolism and value showed negative correlation with EDA measures. | [121]
ECG | Red wine | To examine the relationship between ECG-measured emotions and sensory attributes. | ECG-measured emotions highly correlated with quantitative and hedonic sensory attributes; specific aromatic molecules induced positive or negative emotions. | [115]
ECG | Universally/personally accepted/non-accepted solutions | To enhance understanding of consumers’ food experiences using HR, HRV, SC, and EEG measurements. | HR, HRV, SC, and EEG clarified food experience responses; non-accepted solutions increased HR and shortened SC response latency. | [116]
ECG | Sucrose, quinine | To investigate physiological responses to expectation confirmation and violation during tasting. | HR decreased during second tasting; expectation-confirming tastes increased HR, while expectation-violating tastes decreased it; SC unaffected and lower in second session. | [122]
Skin temperature | Mushroom, fish, chocolate, caramel, cucumber, orange, apple | To examine how early facial and autonomic responses reflect olfactory arousal. | Early facial and ANS responses reflected olfactory arousal; explicit measures linked to conscious processing and odor salience. | [123]
Skin temperature | Vegetable juice | To analyze emotional and physiological indicators influencing purchase intention. | Significant differences in state anxiety inventory, negative sensory feedback, emotional quotient, and FE related to purchase behavior; negative emotions low and positive emotions high in self-reports and FE. | [124]
Table 8. Summary of literature on eye movements and visual responses.
Technology | Food | Objective | Key Findings/Summary | Reference
Eye-tracking | Orange juice | To investigate how packaging elements influence visual attention and purchase intent. | Visual attention focused on nutritional information, but purchase intent was mainly driven by the New Zealand logo. | [130]
Eye-tracking | Sugar-sweetened beverages | To examine the relationship between nutritional awareness, visual attention, and purchase intention for beverages. | As awareness of nutrition increased, visual attention to product attributes mediated a growing preference for reduced- or no-sugar beverages. | [133]
Eye-tracking | Test food on a tray | To investigate eye movement patterns and consumption behavior across age groups. | Total fixation time and frequency increased according to food preference, with the highest intake observed across all age groups. | [134]
Eye-tracking | Beef | To assess how visual and informational attributes of beef affect consumer attention and purchase intent. | Deep red color increased purchase intent; brown color and Nellore breed decreased it; color, breed, marbling, and price affected fixation metrics. | [135]
Eye-tracking | Apple, honey melon, chocolate, caramel | To examine how odor stimuli influence attention and food choice behavior. | More frequent selection of healthy foods regardless of odor; longer initial attention span under healthy odor stimuli. | [136]
Table 9. Summary of literature on FE and responses.
Technology | Food | Objective | Key Findings/Summary | Reference
FEA | Added sugar and surprise flavor | To evaluate facial emotion decoding as a tool for distinguishing food samples. | Facial decoding distinguished samples through anger and disgust; it uniquely identified the effect of added sugar; and emotions were reflected in explicit evaluations. | [146]
FEA | Energy drinks | To compare explicit and implicit emotional responses to different energy drinks. | Positive emotions were observed in both beverages; Energy Drink A elicited greater implicit emotional engagement than Energy Drink B. | [147]
FEA | Oat bread | To analyze oral response patterns to different bread types using facial recognition. | Bread type affected chewing duration and frequency; facial recognition data aligned with explicit satiety results. | [148]
FEA | Beef patty | To compare age-related differences in facial expressiveness during sensory evaluation. | Younger consumers showed greater facial expressiveness; blank expression was most frequent, with age-related differences observed. | [149]
FEA | Beer | To validate FE measurement for predicting beer selection. | FE metrics predicted beer choice; ‘lip suck’ negatively and ‘lip press’ positively influenced selection. | [150]
FEA | Orange juice | To analyze the relationship between visual attention and purchase intention using FE and eye-tracking data. | Nutritional information captured most attention, but the New Zealand logo determined purchase intent, showing attention and liking were misaligned. | [147]
EMG | Biscuits | To assess the feasibility of using EMG signals to evaluate chewing behavior and texture in biscuits. | EMG replicated chewing behavior; chewing time reflected texture attributes, suggesting utility for texture evaluation in baking. | [151]
EMG | Chocolate | To analyze facial muscle activity in response to different taste profiles and chocolate preferences. | Facial muscle activity differed between bitter and sweet segments; activity varied by preferred cocoa content during consumption. | [152]
EMG | Gel-type solid food | To investigate the relationship between muscle activity and subjective responses to solid food. | Preference, desire, and value for solid food negatively correlated with masseter muscle EMG activity. | [153]
EMG | Pear juice | To examine the effect of organic labeling on muscle activation and consumer response. | Hyoid muscle activation was observed during pre-observation of organic-labeled products; shorter reaction times for organic juices indicated label influence on preference. | [154]
Table 10. Summary of previous studies on robotics applications in the food industry.
Technology | Food | Objective | Key Findings/Summary | Reference
Robot | Coffee | To compare volatile compounds (GC–MS), consumer acceptance, sensory profiles, and emotional responses between robot- and human-brewed coffee. | The study suggested that robot baristas could serve as an efficient alternative to human baristas, and indicated the potential expansion of human–robot collaborative models across the coffee and broader food industries. | [164]
Robot | White-flesh dragon fruit | To develop and validate a robot-based sensor system for non-destructive evaluation of texture degradation in dragon fruit. | The robot-based measurement method estimated the internal decay of dragon fruit with about 84% accuracy, demonstrating the potential for integrating robotics with sensory evaluation techniques. | [165]
Robot | Food | To implement an automated taste system by integrating chemical sensors into a robotic finger, enabling rapid discrimination of food flavors and additives. | The robotic finger successfully distinguished various tastes from food samples, enabling rapid evaluation and suggesting the potential of robots to replace human sensory assessment. | [166]
Table 11. Summary of previous studies on IoT applications in food quality monitoring and preservation.
Technology | Food | Objective | Key Findings/Summary | Reference
IoTLettuceTo design and evaluate an IoT-based temperature and humidity control storage system to reduce postharvest losses and extend the shelf life of lettuce.Application of the IoT-based smart storage system improved the shelf life and consumer preference of lettuce, suggesting its potential contribution to quality management and food loss reduction.[169]
IoT,
E-nose
BeefTo evaluate volatile organic
compound (VOC) concentrations
for identifying beef spoilage
levels, an IoT-based E-nose system
was proposed.
The correlation between bacterial growth and VOC generation in beef spoilage evaluation was identified, demonstrating that the IoT-based E-nose system can serve as a real-time tool for food spoilage detection.[170]
IoTBreadTo control bread production
quality, an integrated system
combining various sensors and
IoT technologies was proposed.
The integrated system combining various sensors and IoT technologies enabled faster and more efficient quality control and real-time monitoring compared to traditional food production management methods.[167]
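The IoT studies summarized above share a common pattern: sensor readings streamed from a monitoring node are compared against calibrated thresholds to trigger quality alerts in real time. The sketch below illustrates that pattern only; the sensor names and the VOC threshold are hypothetical and are not taken from the cited studies.

```python
# Illustrative sketch (not from the reviewed studies): a minimal rule-based
# spoilage alert of the kind an IoT E-nose node might evaluate locally.
# Sensor channel names ("mq135", "mq3") and the threshold are hypothetical.

SPOILAGE_VOC_THRESHOLD = 350.0  # hypothetical calibrated sensor units

def check_spoilage(readings: dict) -> bool:
    """Flag spoilage when the mean VOC reading across gas-sensor
    channels exceeds the calibrated threshold."""
    mean_voc = sum(readings.values()) / len(readings)
    return mean_voc > SPOILAGE_VOC_THRESHOLD

fresh_sample = {"mq135": 120.0, "mq3": 90.0}
spoiled_sample = {"mq135": 420.0, "mq3": 380.0}
print(check_spoilage(fresh_sample))    # False
print(check_spoilage(spoiled_sample))  # True
```

In a deployed system the threshold would be learned from labeled spoilage data (as in the beef E-nose study) rather than fixed by hand, and the alert would be pushed over the network rather than printed.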
Table 12. Summary of strengths, limitations, and applications of digital technologies in the food industry.

| Technology | Strengths | Limitations | Applications | Reference |
| --- | --- | --- | --- | --- |
| AI | Rapid analysis of large-scale data, capability to predict food taste, ability to interpret unstructured consumer expressions, analysis of consumer preference–sensory interactions. | Overfitting risk, domain shift, drift, lower accuracy with single-method data, limits of text-based review analysis, not fully representative. | Consumer preference prediction, food quality prediction, consumer data analysis, peptide screening for taste perception. | [9,10,11,24,25,26,39,40,41,42,43] |
| XR | Higher ecological validity, realistic consumption contexts, greater engagement and immersion, flexible context design. | HMD-induced fatigue/cybersickness, cognitive overload, novelty effects, high cost and low scalability. | Immersive sensory/consumer tests, eating-environment effects, virtual food-choice environments, VR-based food/nutrition training. | [39,45,47,55,57,71,72,172,173,174] |
| Biometrics | High temporal or spatial resolution, objective physiological indicators, sensitivity to subtle sensory differences, enhanced prediction of liking and choice. | Awareness of monitoring, restricted movement, intake-related artifacts, small participant cohorts, limitations of wired devices, susceptibility to external influences. | Measurement of emotional responses, physiological and neural reactions, attention and perception, formulation-related responses, integrated experience, and measurements to predict choice behavior. | [73,77,78,80,86,87,89,90,91,102,109,117,118,120,135,149,150,151,152,153,154,155,156,157,158] |
| Digital sensor | Complements human sensory evaluation, reduces production time, improves quality control, enables system automation. | Cannot rely on data alone, cannot fully replace human sensory evaluation, high cost. | Sensory/consumer tests, food quality monitoring and management. | [164,165,166,167,169,170] |
Table 13. Summary of previous studies on digital sensing technologies and ML.

| Technology | Food | Objective | Key Findings/Summary | AI/ML Algorithms (Validation Protocol) | Reference |
| --- | --- | --- | --- | --- | --- |
| Digital sensing technologies/ML | Soybean | To establish correlations between human sensory evaluations and multi-sensor instrument data using ML models, based on various commercial soybean paste products. | Using multi-sensor data and ML, the SVR model achieved the best performance (prediction set: R2 = 0.997, RMSE = 0.536), accurately predicting and distinguishing the sensory quality of commercial soybean pastes. | SVR, RF, XGBoost, BRR, RR, KNN, ANN; DS: n = 99 (grid search combined with 10-fold cross-validation) | [18] |
| Digital sensing technologies/ML | Peaches | To design an integrated visuo-tactile sensing system and composite DL model capable of non-destructively predicting peach firmness while incorporating markers to improve contact-force estimation. | The visuo-tactile sensor enabled non-destructive prediction of peach firmness (R2 = 0.878, RMSE = 0.732) and contact force (R2 = 0.942, RMSE = 1.115), demonstrating strong feasibility for robotic-arm-based agricultural applications. | SVR, KNR, CNN, CNN-LSTM; DS: n = 1660 (validation-set protocol) | [176] |
| Digital sensing technologies/ML | Coffee | To assess volatile and sensory differences between fermented and unfermented coffee using digital sensors (NIR, E-nose) and to build ML models predicting aroma compounds and sensory intensities. | Digital sensing combined with ANN models accurately predicted volatile compounds (R(1) up to 0.98) and sensory descriptor intensities (R(1) = 0.91), effectively distinguishing fermented from unfermented coffee across roasting levels. | ANN; DS: n = 16 (validation-set protocol, Bayesian regularization, and neuron trimming) | [177] |
| Digital sensing technologies/ML | Baijiu | To integrate E-nose/E-tongue data with human sensory evaluations to develop ML models for predicting and classifying the flavor quality of strong-aroma Baijiu. | E-nose/E-tongue data combined with ML enabled highly accurate Baijiu flavor prediction (R2 > 0.999) and 100% classification accuracy, effectively distinguishing 42 strong-aroma Baijiu samples across origins, alcohol levels, and grades. | LR, DT, RF, GBT, SVR, KNN, SVM, Naive Bayes; DS: n = 42 (test-set validation and model ensembling) | [175] |
| Digital sensing technologies/ML | Fruit juice | To fuse electronic sensory features with ANNs to model and predict human sensory attributes and hedonic responses to fruit juice. | Fused e-sensory features combined with ANN accurately predicted human sensory hedonic responses to fruit juice (best model R2 = 0.95, RMSE = 0.04), demonstrating strong potential to supplement human sensory evaluation. | ANN; DS: n = 287 (hyperparameter optimization) | [44] |
| Digital sensing technologies/ML | Beer | To evaluate consumer acceptance and perceived quality of beer based on visual foam characteristics and to develop an ANN model predicting liking using biometric and physical parameters. | Biometric and physical metrics integrated with visual foam attributes enabled accurate prediction of beer liking (ANN accuracy 82%), effectively distinguishing consumer preference for different beer types. | ANN; DS: n = 15 (70-15-15 train/validation/test split and cross-entropy validation) | [178] |

(1) correlation coefficient; RMSE = Root Mean Square Error; BRR = Bayesian Ridge Regression; RR = Ridge Regression; DS = Data Size; KNR = K-Neighbors Regressor; CNN-LSTM = Convolutional Neural Network–Long Short-Term Memory; NIR = Near-Infrared; LR = Logistic Regression; GBT = Gradient Boosting Tree.
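Several entries in Table 13 validate their models with k-fold cross-validation and report R2 and RMSE on held-out data. The stdlib sketch below illustrates that evaluation protocol on synthetic data, with a simple least-squares line standing in for the SVR/ANN models of the cited studies; none of the numbers correspond to those studies.

```python
# Minimal illustration of 10-fold cross-validation with RMSE and R2,
# the validation pattern reported in Table 13. Data are synthetic and the
# model is ordinary least squares, a stand-in for the SVR/ANN models.
import random

def fit_line(xs, ys):
    """Ordinary least squares fit for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

def kfold_metrics(xs, ys, k=10):
    """Return mean RMSE and mean R2 over k held-out folds."""
    idx = list(range(len(xs)))
    random.Random(0).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    rmses, r2s = [], []
    for fold in folds:
        train = [i for i in idx if i not in fold]
        a, b = fit_line([xs[i] for i in train], [ys[i] for i in train])
        pred = [a * xs[i] + b for i in fold]
        true = [ys[i] for i in fold]
        sse = sum((p - t) ** 2 for p, t in zip(pred, true))
        mean_true = sum(true) / len(true)
        ss_tot = sum((t - mean_true) ** 2 for t in true)
        rmses.append((sse / len(fold)) ** 0.5)
        r2s.append(1 - sse / ss_tot)
    return sum(rmses) / k, sum(r2s) / k

# Synthetic "instrument vs. sensory score" data: y = 2x + 1 plus noise.
rng = random.Random(1)
xs = [i / 10 for i in range(100)]
ys = [2.0 * x + 1.0 + rng.gauss(0, 0.3) for x in xs]
rmse, r2 = kfold_metrics(xs, ys)
print(f"mean RMSE={rmse:.3f}, mean R2={r2:.3f}")
```

The same loop generalizes to any regressor: only `fit_line` and the prediction step change, while the fold construction and the RMSE/R2 bookkeeping stay identical, which is why cross-validated RMSE and R2 are comparable across the heterogeneous models in the table.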
Table 14. Summary of studies integrating advanced sensory evaluation technologies.

| Technology | Food | Objective | Key Findings/Summary | Reference |
| --- | --- | --- | --- | --- |
| MIP, ML, IoT | Foods | To describe an integrated IoT, MIP-sensor, and ML solution for improving the performance of a food spoilage detection system. | The MIP sensor detected VOCs related to food spoilage, and ML models classified freshness with up to 95% accuracy. An IoT framework enabled real-time freshness prediction and spoilage alerts. | [180] |
| IoT, AI, VR, Biometric sensor | Olfactory stimuli | To implement an immersive experience that synchronizes olfactory and visual stimuli through an olfactory virtual reality system integrating VR, IoT, and AI technologies. | Cat-Seg, MAFT, and YOLOv11m-seg models performed odor classification; Cat-Seg showed the best results. An IoT device controlled chemical release, and biofeedback dynamically adjusted olfactory stimuli. | [181] |
| IoT, AI | Chicken | To develop a quality prediction model for the cold chain of chilled chicken by integrating IoT and flexible sensors. | Environmental and physicochemical indicators (e.g., temperature, humidity, color, TVB-N) were analyzed with sensory data. Knowledge rule–based analysis enabled real-time food quality prediction. | [182] |
| E-nose, IoT, ML | Banana | To detect food spoilage and determine shelf life using an IoT- and ML-based E-nose system. | A low-cost E-nose with IoT connectivity (ESP8266, MOS-based sensors) detected fruit ripeness and spoilage. The SVC model achieved 97.05% accuracy in freshness classification. | [183] |

MIP = Molecularly Imprinted Polymer; VOCs = Volatile Organic Compounds; TVB-N = Total Volatile Basic Nitrogen; SVC = Support Vector Classifier.
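The E-nose pipelines in Table 14 end in a supervised classifier that maps sensor features to freshness classes. As a minimal stdlib illustration of that final step, the sketch below uses a nearest-centroid rule in place of the SVC reported in the banana study; the two-channel "MOS sensor" readings and class labels are invented for the example.

```python
# Illustrative nearest-centroid classifier for E-nose freshness classes.
# This is a stand-in for the SVC used in the cited banana study; the
# training readings below are hypothetical, not measured data.
import math

def centroid(samples):
    """Mean feature vector of a list of equal-length tuples."""
    n = len(samples)
    return tuple(sum(s[i] for s in samples) / n for i in range(len(samples[0])))

def classify(sample, centroids):
    """Assign the class whose centroid is nearest in Euclidean distance."""
    return min(centroids, key=lambda label: math.dist(sample, centroids[label]))

# Hypothetical two-channel MOS-sensor readings per freshness class.
train = {
    "fresh":   [(0.10, 0.12), (0.12, 0.10), (0.09, 0.11)],
    "ripe":    [(0.40, 0.38), (0.42, 0.41), (0.39, 0.40)],
    "spoiled": [(0.85, 0.90), (0.88, 0.87), (0.90, 0.92)],
}
centroids = {label: centroid(samples) for label, samples in train.items()}

print(classify((0.11, 0.10), centroids))  # fresh
print(classify((0.87, 0.89), centroids))  # spoiled
```

An SVC differs by learning maximum-margin boundaries rather than class means, which matters when classes overlap; the surrounding pipeline (feature acquisition over IoT, prediction, alerting) is the same in either case.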

Share and Cite

Lee, D.; Jeon, H.; Kim, Y.; Lee, Y. Integrating Cutting-Edge Technologies in Food Sensory and Consumer Science: Applications and Future Directions. Foods 2025, 14, 4169. https://doi.org/10.3390/foods14244169