Review

A Review of AI Cloud and Edge Sensors, Methods, and Applications for the Recognition of Emotional, Affective and Physiological States

1 Department of Construction Management and Real Estate, Vilnius Gediminas Technical University, Sauletekio Ave. 11, LT-10223 Vilnius, Lithuania
2 Machine Intelligence Research Labs, Scientific Network for Innovation and Research Excellence, Auburn, WA 98071, USA
3 Institute of Sustainable Construction, Vilnius Gediminas Technical University, Sauletekio Ave. 11, LT-10223 Vilnius, Lithuania
4 Department of Applied Mechanics, Vilnius Gediminas Technical University, Sauletekio Ave. 11, LT-10223 Vilnius, Lithuania
* Author to whom correspondence should be addressed.
Sensors 2022, 22(20), 7824; https://doi.org/10.3390/s22207824
Submission received: 18 August 2022 / Revised: 28 September 2022 / Accepted: 12 October 2022 / Published: 14 October 2022
(This article belongs to the Special Issue Emotion Intelligence Based on Smart Sensing)

Abstract: Affective, emotional, and physiological states (AFFECT) detection and recognition by capturing human signals is a fast-growing area, which has been applied across numerous domains. The research aim is to review publications on how techniques that use brain and biometric sensors can be used for AFFECT recognition, consolidate the findings, provide a rationale for the current methods, compare the effectiveness of existing methods, and quantify how likely they are to address the issues/challenges in the field. In efforts to better achieve the key goals of Society 5.0, Industry 5.0, and human-centered design, the recognition of emotional, affective, and physiological states is progressively becoming an important matter and offers tremendous growth of knowledge and progress in these and other related fields. In this research, a review of AFFECT recognition brain and biometric sensors, methods, and applications was performed, based on Plutchik’s wheel of emotions. Due to the immense variety of existing sensors and sensing systems, this study aimed to provide an analysis of the available sensors that can be used to define human AFFECT, and to classify them based on the type of sensing area and their efficiency in real implementations. Based on statistical and multiple criteria analysis across 169 nations, our outcomes introduce a connection between a nation’s success, its number of Web of Science articles published, and its frequency of citation on AFFECT recognition. The principal conclusions present how this research contributes to the big picture in the field under analysis and explore forthcoming study trends.

1. Introduction

Global research in the field of neuroscience and biometrics is shifting toward the widespread adoption of technology for the detection, processing, recognition, interpretation and imitation of human emotions and affective attitudes. Due to their ability to capture and analyze a wide range of human gestures, affective attitudes, emotions and physiological changes, these innovative research models could play a vital role in areas such as Industry 5.0, Society 5.0, the Internet of Things (IoT), and affective computing, among others.
For hundreds of years, researchers have been interested in human emotions. Reviews on the applications of affective neuroscience include numerous related topics, such as the mirror mechanism and its role in action and emotion [1], the neuroscience of understanding emotions [2], consumer neuroscience [3], the role of positive emotions in education [4], mapping the brain as the basis of feelings and emotions [5], the neuroscience of positive emotions and affect [6], the cognitive neuroscience of music perception [7], and social cognition in schizophrenia [8]. Applications in neuroscience also include the analysis of cognitive neuroscience [9,10,11] and brain sensors [12,13], and works in the literature also discuss the recognition of basic emotions using brain sensors [14].
Studies of the applications of affective biometrics can be found in the literature in the fields of brain biometric analysis [15], predictive biometrics [16], keystroke dynamics [17], applications in education [18], consumer neuroscience [19], adaptive biometric systems [20], emotion recognition from gait analyses [21], ECG databases [22], and others. Several works on affective states have integrated multiple biometric and neuroscience methods, but none have included an integrated review of the application of neuroscience and biometrics and an analysis of all of the emotions and affective attitudes in Plutchik’s wheel of emotions.
Scientists analyzed various brain and biometric sensors in the reviews [23,24,25,26]. Curtin et al. [23], for instance, state that both fNIRS and rTMS sensors have changed significantly over the past decade and have been improved (their hardware, neuronavigated targeting, sensors, and signal processing), thus clinicians and researchers now have more granular control over the stimulation systems they use. Krugliak and Clarke [26], da Silva [24], and Gramann et al. [27] analyzed the use of EEG and MEG sensors to measure functional and effective connectivity in the brain. Khushaba et al. [25] used brain and biometric sensors to integrate EEG and eye tracking for assessing the brain response. Other scientists [28,29,30,31,32,33] used the following biometric sensors in their studies: heart rate, pulse rate variability, odor, pupil dilation and contraction, skin temperature, face recognition, voice, signature, gestures, and others.
Indeed, the biometrics and neuroscience field has been the focus of studies by many researchers who have achieved significant results. A number of neuroscience studies have analyzed the detection and recognition of human arousal [34], valence [35,36], affective attitudes [36,37], emotional [38,39,40,41], and physiological [42] states (AFFECT) by capturing human signals.
Though most neuroimaging approaches disregard context, the hypothesis behind situated models of emotion is that emotions are honed for the current context [43]. According to the theory of constructed emotion, the construction of emotions should be holistic, as a complete phenomenon of brain and body in the context of the moment [44]. Barrett [45] argues that rather than being universal, emotions differ across cultures. Emotions are not triggered—they are created by the person who experiences them. The combination of the body’s physical characteristics, the brain (which is flexible enough to adapt to whatever environment it is in), and the culture and upbringing that create that environment, is what causes emotions to surface [45]. Recently, there have been attempts in the academic community to supply contextual (from cultural and other circumstances) analysis [46,47].
Various theories and approaches (positive psychology [48,49,50], environmental psychology [51,52,53], ergonomics—human factors science [54,55,56], environment–behavior studies, environmental design [57,58,59], ecological psychology [60,61], person–environment–behavior [62], behavioral geography [63], and social ecology research [64]) also emphasize emotion context sensitivity.
The objective of this research is to provide an overview of the sensors and methods used in AFFECT (affective, emotional, and physiological states) recognition, in order to outline studies that discuss trends in brain and biometric sensors, and give an integrated review of AFFECT recognition analysis using Plutchik’s [65] wheel of emotions as the basis. Furthermore, the research aim is to review publications on how techniques that use brain and biometric sensors can be used for AFFECT recognition. In addition, this is a quantitative study to assess how the success of the 169 countries impacted the number of Web of Science articles on AFFECT recognition techniques that use brain and biometric sensors that were published in 2020 (or the latest figures available).
In this paper, we identify the critical changes in this field over the past 32 years by applying text analytics to 21,397 articles indexed by Web of Science from 1990 to 2022. For this review, we examined 634 publications in detail. We have analyzed the global gap in the area of neuroscience and affective biometric sensors and have aimed to update the current big picture. The aforementioned research findings are the result of this work.
When emotions as well as affective and physiological states are determined by recognition sensors and methods—and, later, when such studies are put to practice—a number of issues arise, and we have addressed these issues in this review. Moreover, our research has filled several research gaps and contributes to the big picture as outlined below:
  • A fairly large number of studies around the world apply biometric and neuroscience methods to determine and analyze AFFECT. However, there has been no integrated review of these studies.
  • Another missing piece is a review of AFFECT recognition, classification, and analysis based on Plutchik’s wheel of emotions theory. We have examined 30 emotions and affective states defined in the theory.
  • Information on diversity attitudes, socioeconomic status, demographic and cultural background, and context is missing from many studies. We have therefore identified real-time context data and integrated them with AFFECT data. The correct assessment of AFFECT and predictions of imminent behavior are becoming very important in a highly competitive market.
  • To demonstrate a few of the aforementioned new research areas in practice, we have developed our own metric, the Real-time Vilnius Happiness Index (Section 4), among other tools. These studies have used integrated methods of biometrics and neuroscience, which are widely applied in various fields of human activity.
  • In this research, we therefore examine a more complex problem than any prior studies.
The following sections present the results of this study, a discussion, the conclusions we can draw, and avenues for future research. The method is presented in Section 2. Section 3 summarizes the emotion models. Section 4 discusses brain and biometric AFFECT sensors and the classification of biometric and neuroscience methods and technologies, and explores the use of traditional, non-invasive neuroscience methods and of widely used and advanced physiological and behavioral biometrics. Section 4 also summarizes prior research and techniques for the recognition of arousal, valence, affective attitudes, and emotional and physiological states (AFFECT) in more detail. We summarize existing research on users’ demographic and cultural backgrounds, socioeconomic status, diversity attitudes, and context in Section 5. We present our research results in Section 6, an evaluation of biometric systems in Section 7, and, finally, a discussion and our conclusions in Section 8.

2. Method

The research method we used can be broken down as follows: (1) formulating the research problem; (2) examining the most popular emotion models, identifying the best option among them for our research (Section 3), and creating the Big Picture for the model; (3) carrying out a review of publications in the field (Section 4); (4) raising and confirming two hypotheses; (5) collecting data; (6) using the INVAR method for multiple criteria analysis of 169 countries; (7) determining correlations; (8) developing three maps to illustrate the way the success of the 169 countries impacts the number of Web of Science articles on AFFECT (emotional, affective, and physiological states) recognition and their citation rates; (9) developing three regression models; and (10) consolidating the findings, providing a rationale for the current methods, comparing the effectiveness of existing methods, and quantifying how likely they are to address the issues and challenges in the field. The following ten steps of the method describe the proposed algorithm and its experimental evaluation in detail.
Furthermore, the research aim is to review publications on how techniques that use brain and biometric sensors can be used for AFFECT recognition, consolidate the findings, provide a rationale for the current methods, compare the effectiveness of existing methods, and quantify how likely they are to address the issues/challenges in the field (Step 1). We have analyzed the global gap in the area of neuroscience and affective biometric sensors and have set the goal of updating the current big picture. The findings of the research above framed the problem.
Step 2 of the research was to examine the most popular emotion models (Section 3) and identify the best option among them for our research. We chose Plutchik’s wheel of emotions, one of the main reasons being that the model enables an integrated analysis of human emotional, affective, and physiological states.
Step 3 was to review sensors, methods, and applications that can be used in the recognition of emotional, affective, and physiological states (Section 4). We have identified the major changes in the field over the past 32 years through a text analysis of 21,397 articles indexed by Web of Science from 1990 to 2022. We searched for keywords in three databases (Web of Science, ScienceDirect, Google Scholar) to identify studies investigating the use of both neuroscience and affective biometric sensors. A total of 634 studies that used both neuroscience and affective biometric sensor techniques in the study methodology were included, and no restrictions were placed on the date of publication. Studies that investigated any population group, of any age or gender, were considered in this work.
A set of keywords related to biometric and neuroscience sensors was used for the above search of the three databases. Two main sets of keywords, “sensors + biometrics + emotions” and “sensors + neuroscience/brain + emotions”, were used in our main search. More specific search terms related to biometrics (i.e., eye tracking, blinking, iris, odor, heart rate), neuroscience/brain techniques (i.e., EEG, MEG, TMS, NIRS, SST) and their components (i.e., algorithms, functionality, performance) were also used to refine the search. For each candidate article, the full text was accessed and reviewed to determine its eligibility. The primary results and article conclusions were identified, and discrepancies were resolved by way of discussion. The studies differed significantly in terms of protocol design, signal processing, stimulation methods, the equipment used, the study population, and statistical methods.
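As a minimal illustration (not the actual search script used for this review), the two main keyword sets and the refinement terms named above can be expanded into Boolean query strings as follows; the topic labels and helper function are hypothetical.

```python
# Minimal sketch: expand the two main keyword sets and the refinement terms
# described above into Boolean query strings for manual database searches.
base_sets = {
    "biometrics": ["sensors", "biometrics", "emotions"],
    "neuroscience/brain": ["sensors", "(neuroscience OR brain)", "emotions"],
}
refinements = {
    "biometrics": ["eye tracking", "blinking", "iris", "odor", "heart rate"],
    "neuroscience/brain": ["EEG", "MEG", "TMS", "NIRS", "SST"],
}

def build_queries(base_sets, refinements):
    """Return the base queries plus one refined query per specific term."""
    queries = []
    for topic, base in base_sets.items():
        queries.append(" AND ".join(base))
        for term in refinements[topic]:
            queries.append(" AND ".join(base + [f'"{term}"']))
    return queries

for query in build_queries(base_sets, refinements):
    print(query)
```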
In Step 4, two central hypotheses were raised and confirmed:
Hypothesis 1.
There is an interconnection between a country’s success, its number of Web of Science articles published, and its citation frequency on AFFECT recognition. When there are changes in a country’s success, its number of Web of Science articles published, and its citation times on AFFECT recognition, the boundaries of the countries’ seven clusters remain roughly the same (Section 6).
Hypothesis 2.
Increases in a country’s success usually go hand in hand with a jump in its number of Web of Science articles published and its citation times on AFFECT recognition.
Next, in Step 5, we collected data. The determination of the success of 169 countries and the results obtained are described in detail in a study by Kaklauskas et al. [66]. This study used data [66] from the framework of variables taken from a number of databases and websites, such as the World Bank, Eurostat-OECD, the World Health Organization, Global Data, Global Finance, Transparency International, Freedom House, Knoema, Socioeconomic Data and Applications Center, Heritage, the Global Footprint Network, Climate Change Knowledge Portal (World Bank Group, Washington, DC, USA), the Institute for Economics and Peace, and Our World in Data; global and national statistics and publications were also used. We based our research calculations on publicly available data from 2020 (or the latest available).
We used the INVAR method [67] to conduct a multi-criteria examination of the 169 nations—the outcomes can be found in Section 6 (Step 6). This method determines a combined indicator of overall national success. This combined indicator is in direct proportion to the corresponding impact of the values and significances of the specified indicators on a nation’s success. The INVAR method has been used to conduct multiple criteria analyses of different groups of countries, such as the former Soviet Union [68] and Asian countries [69], and in global analyses of 169 [66] and 173 [70] countries.
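To illustrate the general idea of such a combined indicator (and only the idea: this is not the INVAR procedure, whose full algorithm is described in [67]), the sketch below aggregates hypothetical normalized indicator values weighted by hypothetical significances.

```python
# Minimal sketch of a generic weighted-sum aggregation of normalized country
# indicators; it only illustrates a combined indicator that grows with the
# indicator values and their significances. All values below are hypothetical.
indicators = {
    "gdp_per_capita": 0.72,    # already normalized to [0, 1]
    "life_expectancy": 0.85,
    "education_index": 0.64,
}
weights = {
    "gdp_per_capita": 0.40,    # significances, summing to 1
    "life_expectancy": 0.35,
    "education_index": 0.25,
}

combined = sum(indicators[name] * weights[name] for name in indicators)
print(f"combined success indicator: {combined:.3f}")
```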
In Step 7, we present the median values of the correlations between the success of the 169 countries, their publications, and their citations (Section 6). It was found that the median correlation of the dependent variable with the independent variables is higher in the Publications—Country Success model (0.6626) than in the Times Cited—Country Success model (0.5331). Therefore, it can be concluded that the independent variables in the Publications—Country Success model are more closely related to the dependent variable than those in the Times Cited—Country Success model.
In Step 8, we developed three maps that illustrate the way the success of the 169 countries impacts the number of Web of Science articles on AFFECT (emotional, affective, and physiological states) recognition and their citation rates. The Country’s Success and AFFECT Recognition Publications (CSP) Maps of the World are a convenient way to illustrate how the three predominant CSP dimensions (a country’s success, the numbers of publications, and the frequency of articles being cited) are interconnected for the 169 countries, while the CSP models allow for these connections to be statistically analyzed from various perspectives. It also allows for CSP dimensions to be forecast based on the country’s success criteria. In other words, the CSP models give us a more detailed analysis of the CSP dimensions through statistical and multi-criteria analysis, while the CSP maps (Section 6) are more of a way to present the results in a visual manner. The amount of data available is gradually increasing, as is the knowledge gained from research conducted around the world. As a result, the CSP models are becoming better and better, and providing a clearer reflection of the actual picture. This means that they can effectively facilitate research and innovation policy decisions.
In Step 9, we created two regression models (Section 6): one for the number of publications and one for their citation rates, together with the effect size indicators describing them. For the multiple linear regressions, we used IBM SPSS V.26 to model the three predominant CSP dimensions on 15 indicators of country success [66]. Two dependent variables and 15 independent variables were analyzed to construct these regression models. The process was as follows:
  • Construction of regression models for the numbers of publications and their citations.
  • Calculation of statistical (Pearson correlation coefficient (r), standardized beta coefficient (β), coefficient of determination (R2), standard deviation, p-values) and non-statistical (research context, practical benefit, indicators with low values) effect size indicators describing these regression models.
It was found that changes in the values of the Country Success variable explain the variance of the Publications variable by 89.5%, and the variance of the Times Cited variable by 54.0%. Additionally, when the value of the Country Success variable increases by 1%, the value of Publications increases by 1.962% and Times Cited by 2.101%. A reliability analysis of the compiled regression models allows us to conclude that the models are suitable for analysis (p < 0.05). The 15 country success indicators explained 69.4% and 51.18% of the number of Web of Science articles published and their citations, respectively.
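The authors built these models in IBM SPSS V.26; the sketch below, using entirely synthetic stand-in data, shows one way such a regression could be reproduced in Python. The log-log specification reflects the percentage (elasticity) interpretation of the reported coefficients; the variable names and data are assumptions, not the study’s dataset.

```python
# Minimal sketch: a log-log OLS regression whose slope can be read as an
# elasticity ("a 1% increase in country success -> x% more publications").
# The data below are synthetic placeholders, not the study's data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 169  # number of countries analyzed in the study

df = pd.DataFrame({"country_success": rng.uniform(20, 90, n)})
df["publications"] = np.exp(2.0 * np.log(df["country_success"]) + rng.normal(0, 0.3, n))

X = sm.add_constant(np.log(df[["country_success"]]))
y = np.log(df["publications"])

model = sm.OLS(y, X).fit()
print(model.summary())  # coefficients, p-values, R^2
print("estimated elasticity:", round(model.params["country_success"], 3))
```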
Step 10 was to assess the biometric systems under analysis: the rationale behind the available biometric and brain approaches was outlined, the efficacy of existing methods compared, and their ability to address issues and challenges present in the field determined (Section 7).

3. Emotion Models

First, this section discusses emotion models in more detail. Then, we choose the best option for our research and look at the Big Picture, i.e., the links between the selected emotion model and biometric and brain sensors, and the related trends.
Emotional responses are natural to humans, and evidence shows they influence thoughts, behavior, and actions. Emotions fall into different groups related to various affects, corresponding to the current situation that is being experienced [71]. People encounter complex interactions in real life, and respond to them with complex emotions that often can be blends [72]. Emotional responses are the way for our brain and body to deal with our environment, and that is why they are fluid and depend on the context around us [73].
Two fundamental viewpoints form the basis of approaches to the classification of emotions: (a) emotions are discrete constructs with fundamental differences between them, and (b) emotions can be grouped and characterized on a dimensional basis [74]. These classifications (emotions as discrete categories and dimensional models of emotion) are briefly analyzed next.
In word recognition, alternative models have so far received little interest, and one example is the discrete emotion theory [75]. This theory posits that there is a limited set of universal basic emotions hardwired through evolution, and that each of the wide variety of affective experiences can essentially be categorized into this limited set [76,77]. The discrete emotion theory states that many emotions can be distinguished on the basis of expressive, behavioral, physiological, and neural features [78]. The definition of emotions provided by Fox [79] states that they are consistent and discrete responding processes that can include verbal, physiological, behavioral, and neural mechanisms; they are triggered and changed by external or internal stimuli or events and respond to the environment. Russell and Barrett [80] argue that, unlike the discrete emotion theory, their alternative models can account for the rich context-sensitivity and diversity of emotions. Emotion blends can be of three kinds: (a) positive-blended emotions are blends of only positive emotions; (b) negative-blended emotions are blends of only negative emotions; and (c) mixed emotions are blends of both positive and negative emotions, as well as neutral ones. The way teachers have described blended emotions reflects that mathematics teaching involves many complex tasks, where the teacher has to continuously gauge the level of progress [81].
Emotional dimensions represent the classes of emotion. Categorized emotions can be characterized in a dimensional form, with each emotion occupying a different location in space, for example in 2D (the circumplex model, the “consensual” model of emotion, and the vector model) or 3D (the Lövheim cube, the pleasure–arousal–dominance (PAD) emotional state model, and Plutchik’s model) [82].
The circumplex model [83] proposes two independent neurophysiological systems: one related to valence (a pleasure–displeasure continuum) and the other to arousal (activation–deactivation) [84]. Each emotion can be understood as a linear combination of these two dimensions, i.e., as having varying degrees of valence and arousal [83,85]. We have already applied Russell’s circumplex model of emotions in an earlier review of human emotion recognition sensors and methods [85].
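For illustration only, the sketch below places a few emotions in the valence–arousal plane of the circumplex model; the coordinates are rough, hypothetical values and not taken from any of the cited studies.

```python
# Minimal sketch: emotions as points in the circumplex (valence-arousal) plane.
# Coordinates are illustrative, not empirically calibrated.
import math

circumplex = {
    "joy":     ( 0.8,  0.5),   # (valence, arousal), each in [-1, 1]
    "anger":   (-0.7,  0.7),
    "sadness": (-0.7, -0.4),
    "calm":    ( 0.5, -0.6),
}

for emotion, (valence, arousal) in circumplex.items():
    angle = math.degrees(math.atan2(arousal, valence)) % 360  # position on the circle
    intensity = math.hypot(valence, arousal)                  # distance from the neutral origin
    print(f"{emotion}: valence={valence:+.1f}, arousal={arousal:+.1f}, "
          f"angle={angle:.0f} deg, intensity={intensity:.2f}")
```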
The vector model comprises two vectors. The model holds that there is an underlying dimension of arousal, with a binary choice of valence that determines direction. This results in two vectors that both start at zero arousal and neutral valence and proceed as straight lines, one in the direction of positive valence and the other in the direction of negative valence. Typically, the vector model uses direct scaling of the dimensions of each individual stimulus [86,87].
The positive activation–negative activation (PANA) or “consensual” model of emotion assumes that there are two separate systems: positive affect and negative affect. In the PANA model, the vertical axis represents positive affect (low to high) and the horizontal axis represents negative affect (low to high) [88]. The two dimensions are independent and uncorrelated: Positive Affect (PA) represents the extent (from low to high) to which a person shows enthusiasm for life, while Negative Affect (NA) represents the extent to which a person feels upset or unpleasantly aroused [89].
The Pleasure–Arousal–Dominance (PAD) emotional state model offers a general three-dimensional approach to measuring emotions [90]. This 3D model captures emotional response and includes the three dimensions of pleasure–displeasure (P), arousal–nonarousal (A), and dominance–submissiveness (D) as the basic factors of emotional response [91]. For instance, pleasure can be happy/unhappy, hopeful/despairing, satisfied/unsatisfied, pleased/annoyed, content/melancholic, and relaxed/bored. Arousal can be excited/calm, stimulated/relaxed, wide-awake/sleepy, jittery/dull, frenzied/sluggish, and aroused/unaroused. Dominance can be important/awed, dominant/submissive, influential/influenced, controlling/controlled, in control/cared-for, and autonomous/guided [92]. The PAD emotional state model, the inverted U-curve theory, and neuro-decision and neuro-correlation tables are used to evaluate the impact of digital twin smart spaces (such as indoor air quality, lighting intensity and colors, learning materials, images, smells, music, pollution, and others) on users, to track their response dynamics in real time, and then to react to this response [93].
The PAD model is thus composed of three different subscales, reflecting pleasure, arousal, and dominance, each of which can represent a range of emotions, as listed above [92]. The affective space model makes it possible to visualize the distribution of emotions along the two axes of valence (V) and arousal (A). Using this model, different emotions can be identified, such as happiness, calmness, fear, and sadness [94].
Swedish neurophysiologist Lövheim proposed a cube of emotion that directly relates specific combinations of the levels of three signal substances (serotonin, noradrenaline, and dopamine) to eight basic emotions [95]. In this three-dimensional model, the Lövheim cube of emotion, each of the signal substances forms one axis of a coordinate system, and each of the eight corners of the resulting 3D space holds one of the eight basic emotions. In this model, for example, anger is produced by the combination of high noradrenaline, high dopamine, and low serotonin [96].
The eight main categories of emotions defined by Robert Plutchik in the 1980s form two equal, opposite groups: half are positive emotions and the other half are negative ones [97]. To visualize the eight primary emotion dimensions, which are fear, trust, surprise, anticipation, anger, joy, disgust, and sadness, eight sectors have been isolated [98]. The Emotion Wheel shows each of the eight basic emotions highlighted with a recognizable color [99]. When another dimension is added, the Wheel of Emotions becomes a cone whose vertical dimension represents intensity. Moving from the outside towards the wheel’s center, emotions intensify, and this is highlighted by the indicator color; towards the outer edge, the intensity of emotions decreases and the color, correspondingly, becomes less intense [98,99]. When feelings intensify, one feeling can turn into another: annoyance into rage, serenity into ecstasy, interest into vigilance, apprehension into terror, acceptance into admiration, pensiveness into grief, distraction into amazement, and, if left unchecked, boredom can become loathing [98]. Some emotions have no color marking; they are a mix of two primary emotions [98,99]. Joy and anticipation, for instance, combine to become optimism, and when anticipation combines with anger it becomes aggressiveness. The combination of trust and fear is submission, joy and trust combine to become love, surprise and fear become awe, the pair of disgust and anger becomes contempt, sadness and disgust combine to become remorse, and surprise and sadness become disapproval [100].
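Since the primary-emotion combinations listed above are an explicit mapping, they can be encoded directly; the sketch below does so, with a hypothetical helper function for looking up a dyad.

```python
# Minimal sketch: the primary-emotion combinations (dyads) listed above,
# e.g., joy + anticipation -> optimism [100].
DYADS = {
    frozenset({"joy", "anticipation"}): "optimism",
    frozenset({"anticipation", "anger"}): "aggressiveness",
    frozenset({"trust", "fear"}): "submission",
    frozenset({"joy", "trust"}): "love",
    frozenset({"surprise", "fear"}): "awe",
    frozenset({"disgust", "anger"}): "contempt",
    frozenset({"sadness", "disgust"}): "remorse",
    frozenset({"surprise", "sadness"}): "disapproval",
}

def combine(first: str, second: str) -> str:
    """Return the dyad formed by two primary emotions, if one is listed above."""
    return DYADS.get(frozenset({first, second}), "no dyad listed for this pair")

print(combine("joy", "anticipation"))  # optimism
print(combine("surprise", "sadness"))  # disapproval
```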
After the analysis of these emotion models, we decided to use Plutchik’s wheel of emotions for our research. The ability this model offers to analyze human emotional, affective, and physiological states in an integrated manner is one of the main reasons for our choice. The wheel is briefly discussed below.
Several ways to classify emotions have been proposed in the field of psychology. For that purpose, the basic emotions are first identified and then they allow clustering with any other more complex emotion [101]. Plutchik [65] proposed a classification scheme based on eight basic emotions arranged in a wheel of emotions, similar to a color wheel. Just like complementary colors, this setup allows the conceptualization of primary emotions by placing similar emotions next to each other and opposites 180 degrees apart. Plutchik’s wheel of emotions classifies these eight basic emotions based on their physiological aim [102]. Emotions are coordinated with the body’s physiological responses. For example, when you are scared, your heart rate typically increases and your palms become sweaty. There is ample empirical evidence that suggests that physiological responses accompany emotion [103]. Another parallel with colors is the fact that some emotions are primary emotions and other emotions are derived by combining these primary emotions. The two models share important similarities, and such modelling can also serve as an analytical tool to understand personality. In this case, a third dimension has been added to the circumplex model to represent the intensity of emotions. The structural model of emotions is, therefore, shaped like a cone [104]. Figure 1 demonstrates Plutchik’s wheel of emotions, biometric and brain sensors, and the trends and interdependencies in this Big Picture stage. At the center of the circles is Plutchik’s wheel of emotions, which also includes affective attitudes (interest, boredom). Plutchik [65] notes that the same instinctual source of energy is discharged as part of the emotion felt and the underlying peripheral physiological process. Emotions can be of various levels of arousal or degrees of intensity [105]. Looking at the intensity of Plutchik’s eight basic emotions, Kušen et al. [106] identified variations in emotional valence. The first circle, therefore, analyses, directly or indirectly, human arousal, valence, affective attitudes, and emotional and physiological states (AFFECT). Human AFFECT can be measured by means of neuroscience and biometric techniques. The market and global trends are a constant force affecting neuroscience and biometric technologies and their improvement. Based on the analysis of global sources [107,108,109,110] and our experience, Figure 1 presents brain and biometric sensor and technique trends. Sensors will be able to integrate more and more new technologies and collect a greater variety of data, as they become more accurate, more flexible, cheaper, smaller, greener, and more energy-efficient [108,109,110]. Network neuroscience, a new, explicitly integrative approach towards brain structure and function, seeks new ways to record, map, model, and analyze what constitutes neurobiological systems and what interactions happen inside them. The computational tools and theoretical framework of modern network science, as well as the availability of new empirical tools to extensively map and record the way shifting patterns link molecules, neurons, brain areas and social systems, are two trends enabling and driving this approach [107].
Figure 2 shows the numerous sciences and areas in which neuroscience and biometrics analyze AFFECT. According to Sebastian [111], neuroeconomics is the study of the effect that anticipating money decisions has on our brain. It has solidified as an academic and unifying field that seeks to describe the techniques of the decision-making process and to relate economic behavior and decision-making to economic disposition. The procedure of neuroeconomics involves the integration of behavioral experiments and brain imaging in order to more clearly appreciate the workings behind individual and collective decision-making [112]. Serra [113] reported that neuroeconomics researchers utilize neuroimaging devices such as functional magnetic resonance imaging (fMRI), magnetic resonance imaging (MRI), repetitive transcranial magnetic stimulation (rTMS), transcranial direct-current stimulation (tDCS), positron emission tomography (PET), and electroencephalography (EEG). The majority of the challenges probed by neuroeconomics researchers are basically similar to the problems a marketing researcher would acknowledge as aspects of their functional domain [114]. Kenning and Plassmann [115] have also defined neuroeconomics as the implementation of neuroscientific methods in the evaluation and appreciation of economically significant behavior.
According to Wirdayanti and Ghoni [116], neuromanagement draws on psychology and the biological aspects of humans to support decision-making in the management sciences. As stated by Teacu Parincu et al. [117], neuromanagement is targeted at investigating the acts of the human brain and mental performance whenever people are confronted with management challenges, using cognitive neuroscience, in addition to other scientific disciplines and technology, to evaluate economic and managerial problems. Its focal point is on neurological activities that are related to decision-making and on developing personal as well as organizational intelligence (team intelligence). It also centers on the planning and management of people (for example, selection, training, group interaction and leadership) [118].
Neuro-Information Science can be defined as the science that observes neurophysiological reactions that are connected with the peripheral nervous system and that are then connected to conventional cognitive activities. Michalczyk et al. [119] stated that neuro-information-systems research has developed into a conventional approach in the information systems (IS) discipline for evaluating and appreciating user behavior. Riedl et al. [120] and Michalczyk et al. [119] concluded that neuro-information systems comprise studies that are centered on all types of neurophysiological techniques, such as functional magnetic resonance imaging (fMRI), electroencephalography (EEG), functional near-infrared spectroscopy (fNIRS), electromyography (EMG), hormone studies, or skin conductance and heart rate evaluations, as well as magnetoencephalography (MEG) and eye tracking (ET).
Neuro-Industrial Engineering, brought about by the synergy between neuroscience and industrial engineering, has afforded resolutions centered on the physiological status of people. Ma et al. [121] reported that NeuroIE secures its objective and real data by analyzing human brain and physiological indexes with advanced brain AFFECT devices and biofeedback technology, evaluating the data, adding neural activities as well as physiological status to the evaluation process as new constituents of operations management, and, finally, achieving better human–machine integration by modifying the work environment and production system in line with people’s reactions to the system, preventing mishaps and enhancing efficiency and quality. According to Ma et al. [121], Neuro-Industrial Engineering is centered on humans and draws on human physiological status data (e.g., EEG, EMG, GSR and Temp). Zev Rymer [122] also stated that the application of Neuro-Industrial Engineering is multidisciplinary in that it cuts across the neurological sciences (particularly neurology and neurobiology) in addition to different fields of engineering disciplines such as simulation, systems modeling, robotics, signal processing, material sciences, and computer sciences. The area encompasses a range of topics and applications; for example, neurorobotics, neuroinformatics, neuroimaging, neural tissue engineering, and brain–computer interfaces.
As soon as a user contacts an insurer, a bank or any other call center, a version of Cogito’s software known as Dialog could be active in the background, assisting the client service agent in dealing with the client. Should the user become upset or angry, the client service agent can ensure that the necessary actions are taken to satisfy the client. According to Cogito, this service is known as “digital intuition”. Its usefulness in call centers cannot be overemphasized, as it can give feedback about real-time communications. The speed at which callers speak, as well as the dynamic range of their voices, can also be analyzed by the software. For example, significant variations in pitch and stresses in a caller’s tone could signify excitement or anger, whereas less significant dynamism, a monotonous flat tone, could imply a lack of interest or unconcern. Some companies use the software to help their employees engage new patients in healthcare projects that help control health challenges such as obesity or asthma. Cogito is among the recent profit-based research companies whose focus is on the evaluation of signals subconsciously given off by people, which expose their mindset. The evaluation of these kinds of social signals is beneficial beyond call centers and meeting rooms. According to Hodson [123], keeping track of conversations during surgeries or in plane cockpits could help surgeons and pilots to be aware of whether their colleagues are really attentive to their directives, possibly preserving lives.
Several areas where we can apply the technology of recognizing emotions from speech include human–computer interactions and call centers [124].

4. Brain and Biometric AFFECT Sensors

4.1. Classifications

Globally, several classifications of biometric and neuroscience methods and technologies are used. Our research focuses on neuroscience methods that are non-invasive. The use of non-invasive brain stimulation is widespread in studies of neuroscience [125]. The non-invasive neuroscience methods are: transcranial magnetic stimulation (TMS), electroencephalography (EEG), magnetoencephalography (MEG), positron emission tomography (PET), functional magnetic resonance imaging (fMRI), near infrared spectroscopy (NIRS), diffusion tensor imaging (DTI), steady-state topography (SST), and others [126,127,128,129,130,131,132,133,134]. These non-invasive neuroscience methods are described in detail in Section 4.2. In the future, the authors of this article plan to analyze invasive neuroscience methods, too.
Biometrics can be physical or behavioral. In the first case, emotions can be identified from physical features, including the face, and in the second case from behavioral characteristics, including gait, voice, signature, and typing patterns [135]. Various sensors can measure physiological signals, known as biometrics, capturing the response of bodily systems to things that are experienced through our senses, but also things imagined, by tracking sleep architecture, heart rate variability (HRV), respiratory rate (RR), and resting heart rate (RHR) [136].
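As a minimal, hypothetical illustration of how such heart-related signals are often summarized, the sketch below computes two common heart rate variability statistics from a short list of made-up inter-beat (RR) intervals; it is not tied to any specific sensor or study cited here.

```python
# Minimal sketch: SDNN and RMSSD, two common HRV summary measures, computed
# from hypothetical RR (inter-beat) intervals given in milliseconds.
import math

rr_ms = [812, 798, 830, 845, 801, 790, 815, 822]  # illustrative values only

mean_rr = sum(rr_ms) / len(rr_ms)
sdnn = math.sqrt(sum((x - mean_rr) ** 2 for x in rr_ms) / (len(rr_ms) - 1))                # overall variability
rmssd = math.sqrt(sum((b - a) ** 2 for a, b in zip(rr_ms, rr_ms[1:])) / (len(rr_ms) - 1))  # beat-to-beat variability

print(f"mean HR: {60000 / mean_rr:.1f} bpm, SDNN: {sdnn:.1f} ms, RMSSD: {rmssd:.1f} ms")
```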
The scientific literature classifies biometrics into certain types. Stephen and Reddy [137] and Banirostam et al. [138], for instance, classify biometrics into three categories: physiological, behavioral, and chemical/biological. Yang et al. [139] distinguish physiological and behavioral traits. Kodituwakku [140] believes biometric technology can be classified into two general categories: physiological biometric techniques and behavioral biometric techniques. Jain et al. [141] and Choudhary and Naik [142] also classify biometrics into two categories: physiological and behavioral. In the literature, not only signature, voice, and gait are considered behavioral biometric features, but also ECG, EMG, and EEG [143], while other authors distinguish cognitive biometrics [144,145], including electroencephalography (EEG), electrocardiography (ECG), electrodermal response (EDR), blood volume pulse (BVP), near-infrared spectroscopy (NIR), electromyography (EMG), eye trackers (pupillometry), hemoencephalography (HEG), and related technologies [145]. Some scientific sources claim that eye tracking is a behavioral biometric [146], while others claim that it is a measurement in physiological computing [147]. Physiological biometrics measures physiological signals to determine identity as well as to authenticate users and analyze their emotions. Respiration, perspiration, heartbeat, eye reactions to light, brain activity, emotions, and even body odor can be measured for numerous purposes, including physical and logical access control, payments, health monitoring, liveness detection, and neuromarketing [136].
Scientists identify the following AFFECT biometric types [139,140,141,142,148,149,150]:
  • Physiological features: facial patterns, odor, pupil dilation and contraction, skin conductance, heart rate, respiratory rate, temperature, blood volume pulse, and others.
  • Behavioral features: gait, keystroke, mouse tracking, signature, handwriting, speech/voice, and others.
The authors of this article have used the classification of biometrics proposed by the abovementioned authors (physiological and behavioral features).
Biometric technologies are usually divided into those of first and second generation [151]. First-generation biometrics can confirm a person’s identity in a quick and reliable way, or authenticate them in different contexts, and law enforcement is one of the areas where such solutions are employed in practice [152]. The primary purpose of first-generation biometrics is identity verification, such as facial recognition, and the technology is built around simple sensors that capture physical features and store them for later use [153]. Second-generation biometrics can also be used to detect emotions, with electro-physiologic and behavioral biometrics (e.g., based on ECG, EEG, and EMG) as examples of such technologies [154]. Second-generation biometrics measure individual patterns of learned behavior or physiological processes, rather than physical traits, and are also known as behavioral biometrics [155]. Second-generation biometrics usage has the ability to analyze/evaluate emotions and detect intentions [156]. The use of second-generation biometrics enables wireless data collection regarding the body. The data can then be used to infer an individual’s intent and emotions, as well as emotion tracking across spaces [151,157]. We examine only physiological effects affected by emotional reactions (i.e., second-generation biometrics), and the use of biometric patterns for the identification of individuals is not discussed in this study.
A diverse range of AI algorithms have been applied for AFFECT recognition, for example machine learning, artificial neural networks, search algorithms, expert systems, evolutionary computing, natural language processing, metaheuristics, fuzzy logic, genetic algorithms, and others. Some of the most important supervised (classification, regression), unsupervised (clustering), and reinforcement learning algorithms of machine learning are commonly used as tools in biometrics and neuroscience research to detect emotions and affective attitudes; they are listed below, and a brief illustrative sketch follows the list:
  • Among classification algorithms the most common choices are: naïve Bayes [158,159,160], Decision Tree [161,162,163], Random Forest [164,165,166], Support Vector Machines [167,168,169], and K Nearest Neighbors [170,171,172].
  • Among regression algorithms the usual choices are: linear regression [173,174,175], Lasso Regression [176,177], Logistic Regression [178,179,180], Multivariate Regression [181,182], and Multiple Regression Algorithm [183,184].
  • Among clustering algorithms the most common choices in biometrics or neuroscience research are: K-Means Clustering [185,186,187], Fuzzy C-means Algorithm [188,189], Expectation-Maximization (EM) Algorithm [190], and Hierarchical Clustering Algorithm [188,191,192].
  • Among reinforcement learning algorithms the most common choices are: deep reinforcement learning [193,194,195] and inverse reinforcement learning [196].
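As a minimal sketch of the supervised workflow behind many of the classification studies cited above, the example below trains a Support Vector Machine on entirely synthetic stand-in features; the feature meanings, class labels, and numbers are assumptions for illustration, not results from any reviewed paper.

```python
# Minimal sketch: an SVM classifier on synthetic stand-in "physiological"
# features (e.g., band powers, heart rate statistics) for four emotion classes.
import numpy as np
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 8))        # hypothetical feature vectors
y = rng.integers(0, 4, size=200)     # hypothetical labels for four emotion classes

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_train, y_train)

print("accuracy on held-out data:", accuracy_score(y_test, clf.predict(X_test)))
```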

4.2. Brain AFFECT Devices and Sensors

Neuroscience is associated with multiple fields of science, for example chemistry, computation, psychology, philosophy, and linguistics. The various research areas of neuroscience include behavioral, molecular, operative, evolutionary, cellular, and therapeutic features of the nervous system. The neuroscience market encompasses technology (electrophysiology, neuro-microscopy, whole-brain imaging, neuroproteomics analysis, animal behavior analysis, neuro-functional study, etc.), components (services, instruments, and software) and end-users (healthcare centers, research institutions and academia, diagnostic laboratories, etc.) [197]. Global Industry Analysts Inc. (San Jose, CA, USA) [197] has previously grouped the global neuroscience market into instruments, software, and services based on components.
Neuroscience provides valuable insights concerning the structural design of the brain and neurological, physical, and psychological activities. It helps neurologists to understand the various components of the brain, which can assist in the development of medications and techniques to handle and avoid many neurological anomalies. The rising death rate as a result of several neurological disorders, such as Parkinson’s disease, Alzheimer’s, schizophrenia, and other brain-related health challenges, represents the basic factor driving neuroscience market growth [198]. According to Neuroscience Market [198], the increasing demand for neuroimaging devices and the progressive brain mapping research and evaluation projects are other crucial growth-inducing factors.
Neuroscience covers a whole range of branches, such as neuroevolution, neuroanatomy, developmental neuroscience, neuroimmunology, cellular neuroscience, neuropharmacology, clinical neuroscience, cognitive neuroscience, nanoneuroscience, molecular neuroscience, neurogenetics, neuroethology, neurochemistry, neurophysics, paleoneurobiology, neurology, and neuro-ophthalmology.
Other branches of neuroscience analyze AFFECT in various related sciences and fields, such as affective neuroscience [199,200], neuroinformatics [201,202], neuroimaging [203,204], systems neuroscience [205,206], computational neuroscience [207,208], neurophysiology [51,209], behavioral neuroscience [210,211], neural engineering [212,213], neuroeconomics [214,215], neurolinguistics [216,217], neuropsychology [218,219,220], neurophilosophy [221,222,223], neuroaesthetics [224,225,226], neurotheology [227,228,229], neuropolitics [230,231,232], neurolaw [233,234,235], social neuroscience [236,237], cultural neuroscience [238,239], neuroliterature [240,241,242], neurocinema [243,244,245], neuromusicology [246,247,248], and neurogastronomy [249,250].
For example, Lim [251] identifies the following neuroscientific techniques for neuromarketing:
  • Electromagnetic methods, including magnetoencephalography (MEG), electroencephalography (EEG), and steady-state topography (SST). MEG involves the magnetic fields produced by the brain (its natural electrical currents) and is used to track the changes that occur when participants see or interact with various presentation outputs. EEG is related to the ways in which brainwaves change and is used to detect changes when participants see or interact with various promoting outputs (an electrode band or helmet is used for this purpose). SST measures a steady-state visually evoked potential and is used to determine how brain activities change depending on the task;
  • Metabolic methods, including positron emission tomography (PET) and functional magnetic resonance imaging (fMRI). PET is used to examine the metabolism of glucose within the brain with great accuracy by tracing radiation pulses, while fMRI is used to measure blood flow in the brain to determine changes in brain activity;
  • Electrocardiography (ECG), which uses external skin electrodes to measure electrical changes related to cardiac cycles;
  • Facial electromyography (fEMG), which amplifies tiny electrical impulses to record the physiological properties of the facial muscles;
  • Transcranial Magnetic Stimulation (TMS), which is used to observe the effects of promoting output on behavior by temporarily disrupting specific brain activities. TMS is a non-invasive, safe brain stimulation method. By means of a strong electromagnet, this technique momentarily generates a short-lived virtual lesion, i.e., it disrupts information processing in one of the brain regions. If the stimulation interferes with performing a certain task, the affected brain region is, then, necessary for the normal performance of that task [252].
Table 1 demonstrates traditional non-invasive neuroscience methods.
For clarity, several descriptions of traditional neuroscience methods are presented below.
Wearable healthcare devices store a lot of sensitive personal information, which makes the security of these devices essential. Sun et al. [272] proposed an acceleration-based gait recognition method to improve gait-based elderly recognition. Gait is also a good indicator in health assessment; Majumder et al. [273] created a simple wearable gait analyzer for the elderly to support healthcare needs.
Lim [251] states that neuroscientific methods and tools include those that track, chart, and record the activity of a person’s neural system and brain in relation to a certain behavior, and neurological representations of this activity can then be generated to shed light on how an individual’s brain and nervous system respond when the person is exposed to a stimulus. In this way, neuroscientists can observe the neural processes as they happen in real time. There are three main types of neuroscientific method: those that track what is happening inside the brain (metabolic and electromagnetic activity); those that track what is happening at the neural level outside the brain; and those that can influence neural activity (Table 1, Figure 1).
Non-invasive neuroscience technical information is provided in detail in various research literature about the origin of the measured signal and the engineering/physical principle of the sensors for EEG [274,275,276], MEG [277,278,279], TMS [280,281,282], etc.
Gannouni et al. [283] have proposed a new approach with EEG signals used in emotion recognition. To achieve better emotion recognition using brain signals, Gannouni et al. [283] applied a novel adaptive channel selection method. The basis of this method is the acknowledgment that different persons have unique brain activity, which also differs from one emotional state to another. Gannouni et al. [283] argue that emotion recognition using EEG signals needs a multi-disciplinary approach, encompassing areas such as psychology, engineering, neuroscience, and computer science. With the aim of improving the reproducibility of emotion measurement based on EEG, Apicella et al. [35] have proposed an emotional valence detection method for a system based on EEG, and their experiments proved an accuracy of 80.2% in cross-subject analysis and 96.1% in within-subject analysis. Dixson et al. [284] have pointed out that facial hair may interfere with the detection of emotional expressions in a visual search. However, facial hair may also interfere with the detection of happy expressions in the face-in-the-crowd paradigm, rather than facilitating an anger-superiority effect as a potential system for threat detection.
Wang et al. [285] introduced an EEG-based emotion recognition system to classify four emotion states (joy, sadness, fear, and relaxed). Their experiments used movie elicitation to acquire EEG signals from their subjects [285]. The way in which meditation influences emotional response was investigated via EEG functional connectivity of selected brain regions as the subjects experienced happiness, anger, sadness or were relaxed, before and after meditation.
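EEG-based systems such as those above typically reduce the raw signal to band-power features before classification. The sketch below, run on a synthetic single-channel signal, shows one common way to compute such features; it is not the pipeline of any of the cited works.

```python
# Minimal sketch: band-power features from a synthetic single-channel "EEG"
# signal via Welch's power spectral density estimate.
import numpy as np
from scipy.signal import welch

fs = 256                                   # sampling rate in Hz
t = np.arange(0, 10, 1 / fs)               # 10 s of synthetic signal
signal = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)  # 10 Hz alpha-like component + noise

freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)

bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
for name, (low, high) in bands.items():
    mask = (freqs >= low) & (freqs < high)
    band_power = np.sum(psd[mask]) * (freqs[1] - freqs[0])  # approximate integral of the PSD over the band
    print(f"{name}: {band_power:.3f}")
```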
Neurometrics is a quantitative EEG method. Looking at individual records, this method provides a reproducible, precise estimate of deviations from normal. Only a sufficient amount of good-quality raw data, transformed to Gaussian distributions, correlated with age, and corrected for intercorrelations among measures, ensures meaningful and reliable results [286]. Businesses, government agencies, and individuals use neurometric information when they need timely and profitable decisions. Techniques based on neurometric information are applied to make profitable business decisions. These techniques are based on biometric information, eye tracking, facial action coding and implicit response testing, and are used to understand and record human sentiments and other related feedback [161].
The fronto-striatal network is involved in a range of cognitive, emotional, and motor processes, such as decision-making, working memory, emotion regulation, and spatial attention. Practice shows that intermittent theta burst transcranial magnetic stimulation (iTBS) modulates the functional connectivity of brain networks. Treatments of mood disorders usually involve high stimulation intensities and long stimulation intervals in transcranial magnetic stimulation (TMS) (Figure 3) therapy [287].
One of the imaging techniques is FDG-PET/fMRI (simultaneous [18F]-fluorodeoxyglucose positron emission tomography and functional magnetic resonance imaging). This technique makes it possible to image the cerebrovascular hemodynamic response and cerebral glucose uptake. These two sources of energy dynamics in the brain can provide useful information. Another highly useful technique for characterizing interactions between distributed brain regions in humans has been resting-state fMRI connectivity, while metabolic connectivity can be a complementary measure for investigating the dynamics of the brain network. Functional PET (fPET), a new approach with high temporal resolution, can be used to measure fluoro-d-glucose (FDG) uptake and appears to be a promising method for assessing the dynamics of neural metabolism [288]. Figure 4 shows raw images of signal intensity variation across the brain for one individual subject.
Many biological tissues comprised of fibers, which are groups of cells aligned in a uniform direction, have anisotropic properties. In the human brain, for instance, within its white matter regions, axons usually form complex fiber tracts that enable anatomical communication and connectivity. Non-invasive tools can show the groups of axonal fibers visually. One of them is diffusion tensor magnetic resonance imaging (DTI), a particular method or application of the broader diffusion-weighted imaging (DWI). The basic principle behind this technique is that water diffuses more slowly as it moves perpendicular to the preferred direction, whereas in the direction aligned with the internal structure the diffusion is more rapid. The DTI outputs can be further used to compute diffusion anisotropy measures such as the fractional anisotropy (FA). The principal direction of the diffusion tensor can also be used to obtain estimates related to the white matter connectivity in the brain. Figure 5 shows an example of DTI tractography, or visualization of the white matter connectivity [289].
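As a minimal worked example of the fractional anisotropy (FA) measure mentioned above, the sketch below computes FA from illustrative eigenvalues of a single diffusion tensor; the numbers are hypothetical, chosen only to resemble a fiber-like (strongly anisotropic) voxel.

```python
# Minimal sketch: fractional anisotropy (FA) from the eigenvalues of one
# diffusion tensor. Eigenvalues below are illustrative, not measured data.
import numpy as np

eigenvalues = np.array([1.7e-3, 0.3e-3, 0.2e-3])  # mm^2/s, largest first

mean_diffusivity = eigenvalues.mean()
fa = np.sqrt(1.5 * np.sum((eigenvalues - mean_diffusivity) ** 2) / np.sum(eigenvalues ** 2))

print(f"MD = {mean_diffusivity:.2e} mm^2/s, FA = {fa:.2f}")  # FA close to 1 => strongly anisotropic
```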

4.3. Physiological and Behavioral Biometrics

Physiological biometrics (as opposed to behavioral biometrics) is a category of approaches based on physical measurements of the human body, including the face and pupil constriction and dilation [290]. When a recognition system is based on physiological characteristics, it can ensure comparatively high accuracy [291]. The ubiquity of electronics such as cell phones and computers, together with evolving sensor technology, offers people new possibilities to track their behavioral and physiological features and evaluate the associated biometric results. Thanks to these advances, mobile devices now contain many efficient and complex sensors. Biometric technology often contributes to mobile application growth, including online transaction efficiency, mobile banking, and voting. The global market for biometric systems is wide and comprises many different segments, such as healthcare, transportation and logistics, security, military and defense, government, consumer electronics, and banking and finance [292].
Table 2 presents widely used physiological and behavioral biometrics.
Most of today's eye tracking systems are video-based, with an eye video camera and infrared illumination. Eye tracking systems can be categorized as tower-mounted, mobile, or remote, based on how they interface with the environment and the user (Figure 6), and different video-based eye tracking systems are required depending on the experiment, the environment, and the type of activity to be studied [313]. Researchers have used eye tracking extensively in behavioral research.
The left image in Figure 7 shows the last frame of a surprise expression on a sample face from the Cohn–Kanade database and highlights the trajectories (the bright lines that change color from darker to brighter from start to end) followed by each tracked feature point.
Figure 7. The application of the dense flow method (right) and the result of applying the feature optical flow on the subset of 15 points (left) [317].
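For readers who want a concrete feel for the feature-point tracking described above, the sketch below follows a small set of points across a facial expression clip with pyramidal Lucas–Kanade optical flow in OpenCV. The video file name and the automatic corner detector are placeholders; the cited study tracked 15 manually selected facial landmarks instead.

```python
import cv2

# Minimal sparse optical-flow tracker (pyramidal Lucas-Kanade).
cap = cv2.VideoCapture("expression_clip.mp4")        # placeholder file name
ok, frame = cap.read()
prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
# Stand-in for manually chosen facial landmarks: pick 15 salient corners.
pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=15,
                              qualityLevel=0.01, minDistance=10)
trajectories = [[tuple(p.ravel())] for p in pts]

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    new_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
    for traj, p, s in zip(trajectories, new_pts, status.ravel()):
        if s:                                         # point successfully tracked
            traj.append(tuple(p.ravel()))
    prev_gray, pts = gray, new_pts
cap.release()
# Each entry of `trajectories` is the path of one tracked feature point,
# analogous to the bright trajectories drawn in Figure 7.
```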
A group of participants was tested to record facial EMG (fEMG) activity. Following the guidelines for fEMG placement recommended by Fridlund and Cacioppo, two 4-mm bipolar miniature silver/silver chloride (Ag/AgCl) skin electrodes were placed on the left corrugator supercilii and zygomaticus major muscle regions (Figure 8) [318]. To avoid bad signals or other unwanted influences, the BioTrace software (on NeXus-32) was used to visualize and, if necessary, correct the biosignals before each recording. Figure 8 shows the arrangement of the fEMG electrodes on the M. zygomaticus major and M. corrugator supercilii; an example of a filtered electromyography (EMG) signal is shown on the right side [319].
Humans have a range of biometric traits that can serve as the basis for various biometric recognition systems (Figure 9). These traits include the iris, face thermogram, gait, keystroke pattern, voice, face, and signature, and their significance varies. For example, an iris scan has high accuracy, medium long-term stability and a medium security level, while voice recognition has low accuracy, low long-term stability and a low security level [320]. The choice of biometric traits, however, invariably depends on the availability of the dataset's samples, the application, the value of tolerance accepted, and the level of complexity [150].
Biometric sensors are transducers that convert the biometric traits of a person, such as the face, voice, and other characteristics, into an electrical signal. These sensors read or measure speed, temperature, electrical capacity, light, and other types of energy, and different technologies are available, from digital cameras to sensor networks and complex combinations. Every biometric device requires at least one type of sensor, and biometric sensors are a key component of emotion recognition technology. Examples include a microphone for voice capture or a high-definition camera for facial recognition [321].
Jain et al. [141] state that enrolment and emotion recognition are the two main phases in biometric emotion recognition systems. In the enrolment phase, an individual's biometric data are acquired and stored in the database along with the emotion recognition details. In the recognition phase, the stored data are compared with the re-acquired biometric data of the same individual to determine emotions. A biometric system is, therefore, a pattern recognition system consisting of a database, sensors, a feature extractor, and a matcher.
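A minimal sketch of this enrolment-and-matching structure is shown below; the feature extractor (simple summary statistics) and the Euclidean-distance matcher are deliberately simplistic placeholders, not the components described by Jain et al.

```python
import numpy as np

class EmotionBiometricSystem:
    """Toy skeleton with the four components named above:
    sensor data in, feature extractor, database, matcher."""

    def __init__(self):
        self.database = {}                           # subject id -> (template, label)

    def extract_features(self, raw_signal):
        # Placeholder feature extractor: simple summary statistics.
        x = np.asarray(raw_signal, dtype=float)
        return np.array([x.mean(), x.std(), x.min(), x.max()])

    def enrol(self, subject_id, raw_signal, label):
        self.database[subject_id] = (self.extract_features(raw_signal), label)

    def recognise(self, subject_id, raw_signal):
        template, label = self.database[subject_id]
        query = self.extract_features(raw_signal)
        distance = float(np.linalg.norm(query - template))   # matcher
        return label, distance

system = EmotionBiometricSystem()
system.enrol("s01", np.random.randn(1000), label="neutral")
print(system.recognise("s01", np.random.randn(1000)))
```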
Loaiza [322] states that the overall physiological effects related to emotional reactions depend on three types of autonomic variables: (1) the cardiac system, including blood pressure, cardiac cycles, and heart rate variability; (2) respiration, including amplitude, respiration period, and respiratory cycles; and (3) electrodermal activity, including resistance, responses, and skin conductance levels. Ekman [77] reports that different emotions can be accompanied by very different autonomic variables. For instance, in contrast to someone in a happy state, an angry person had a higher heart rate and temperature, and the feeling of fear was also accompanied by a higher heart rate. Pace-Schott et al. [323] argue that the ability to regulate physiological state and the regulation of emotion are two inseparable features; physiological feelings contribute to emotion regulation, reproduction, and survival.
Many works have focused on emotion detection using different techniques [35,283,284,324,325,326,327]. Shared tasks (e.g., WASSA-2017, SemEval) have also included emotion detection tracks covering four categories of emotions (anger, fear, sadness, and joy) [320]. According to Saganowski et al. [326], the most common approach to the use of physiological signals in emotion recognition is to (1) collect and clean the data; (2) preprocess, synchronize, and integrate the signals; (3) extract and select features; and (4) train and validate machine learning models.
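To make the four-step workflow above concrete, the following sketch runs synthetic, pre-windowed signals through band-pass filtering, simple feature extraction, and cross-validated classifier training with scikit-learn. All signal parameters, features, and labels are invented for illustration; they are not taken from Saganowski et al.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def preprocess(windows, fs=128.0):
    """Step (2): band-pass filter each windowed channel (0.5-30 Hz here)."""
    b, a = butter(4, [0.5 / (fs / 2), 30 / (fs / 2)], btype="band")
    return filtfilt(b, a, windows, axis=-1)

def extract_features(windows):
    """Step (3): per-window summary statistics (real systems use richer features)."""
    return np.column_stack([windows.mean(axis=1), windows.std(axis=1),
                            np.ptp(windows, axis=1)])

# Step (1): stand-in for collected, cleaned, and windowed physiological data
rng = np.random.default_rng(0)
windows = rng.standard_normal((200, 512))       # 200 windows x 512 samples
labels = rng.integers(0, 2, 200)                # e.g., low vs. high arousal

X = extract_features(preprocess(windows))
# Step (4): train and validate a machine learning model
model = make_pipeline(StandardScaler(), SVC())
print(cross_val_score(model, X, labels, cv=5).mean())
```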
Signals are a natural expression of the human body and can be used with great success in the classification of emotional states. EEGs, temperature measurements, and electrocardiograms (ECGs) are examples of such physiological signals. They can help classify emotional states such as anger, sadness, or happiness, and can be captured by different sensors to identify individual differences. The goal of all of these physiological methods is to evaluate consumer attention and to get a particular message noticed, and their performance in this area is commendable. The advantages of these techniques include their creative and versatile placement, the stimulation of interest through novel means that capture attention, the ability to directly target and personalize messages, and lower implementation costs [328]. To study marketing trends, Singh et al. [328] recommend avoiding costly research methods such as fMRI and EEG, and instead using smaller and cheaper galvanic readings and eye tracking (ET) to investigate brain responses. These authors also propose a fuzzy rule-based algorithm to anticipate consumer behavior by detecting six facial expressions from still images.
Various organizations are contributing to the progress of biometric standards, such as international standards organizations (International Electrotechnical Commission, ISO-JTC1/SC37, London, UK), national standards bodies (American National Standards Institute, New York, NY, USA), standards-developing organizations (International Committee for Information Technology Standards, American National Institute of Standards and Technology, Information Technology Laboratory), and other related organizations (International Biometrics and Identification Association, International Biometric Group, Biometric Consortium, Biometric Center of Excellence) [329]. De Angel et al. [330] offer numerous recommendations for improving the generalizability of research and developing a more standardized approach to sensing in depression.
  • Sample recommendations include reporting on recruitment strategies, sampling frames and participation rates; increasing the diversity of the study population by enrolling participants of different ages and ethnicities; reporting basic demographic data such as age, gender, ethnicity, and comorbidities; and measuring and reporting participant engagement and acceptability in terms of attrition rates, missing data, and/or qualitative data.
  • For machine learning models, recommendations include describing the model selection strategy, performance metrics, and parameter estimates with confidence intervals or nonparametric equivalents.
  • Recommendations for data collection and analysis include using established and validated scales for depression assessment; presenting any available evidence on the validity and reliability of the sensor or device used; describing data processing and feature construction in sufficient detail to enable replication; and providing a definition and description of how missing data are handled.
  • Recommendations for data sharing include making the code used for feature extraction available within an open science framework and sharing anonymized datasets in data repositories.
  • The key recommendation is to recognize the need for consistent reporting in this area, since many studies, especially in the field of computer science, fail to report basic demographic information. A common framework should be developed that includes standardized assessment and analysis tools, reliable feature extraction, descriptions of missing data, and testing in more representative populations.
Studies in neuromarketing, neuroeconomics, neuromanagement, neuro-information systems, neuro-industrial engineering, products, services, and call centers use various instruments and techniques to measure users' psychological states. Some of these tools are more complex than others, and the results they produce can vary widely [331]. They fall into three major categories: the first two contain tools used for neuroimaging (medical devices offering in vivo information on the nervous system) and techniques that measure brain electrical activity and neuronal metabolism, while the third contains tools used to evaluate neurophysiological indicators of an individual's mental states. Leading neuroimaging tools such as fMRI and PET fall into the first category; EEG, MEG, and other less invasive and cheaper neuroimaging devices that measure electrical activity in the brain [332] fall into the second category; and tools that track and record individual signals of broader physiological reactions and response measurements (e.g., electrodermal activity, ET, etc.) fall into the third category.
Next, we overview the literature and examine the various types of arousal, valence, affective attitudes, and emotional and physiological states (AFFECT) recognition methods in more detail. A summary of the outcomes is provided in Table 3.
The combination of several different approaches to the recognition and classification of emotional states (also known as multimodal emotion recognition) is currently a research area of great interest, especially since the use of different physiological signals can provide huge amounts of data and each physiological signal can make a significant impact on the ability to classify emotions [333]. Table 3 presents an overview of studies related to the recognition of valence, arousal, emotional states, physiological states, and affective attitudes (AFFECT). A brief overview of some of these studies follows.
Many scientists and practitioners have earned acclaim for their research in areas such as diagnostics, large-scale screening, analysis, monitoring, and the categorization of people by COVID-19 symptoms. Their work relied on early warning systems, wearable technologies, the Internet of Medical Things, IoT-based systems, biometric monitoring technologies, and other tools that can assist in the COVID-19 pandemic. Javaid et al. [438] review how different Industry 4.0 technologies (e.g., AI, IoT, Big Data, Virtual Reality, etc.) can help reduce the spread of disease. Kalhori et al. [439] and Rahman et al. [440] discuss digital health tools to fight COVID-19. Various sensors and mobile devices to detect the disease, reduce its spread, and measure different symptoms are also widely discussed. Rajeesh Kumar et al. [441] propose a system to identify asymptomatic patients using IoT-based sensors that measure blood oxygen level, body temperature, blood pressure, and heartbeat. Stojanović et al. [442] propose a phone headset to collect information about respiratory rate and cough, while Xian et al. [443] present a portable biosensor to test saliva. Chamberlain et al. [444] presented distributed networks of smart thermometers that track COVID-19 transmission epicenters in real time.
Billions of neurotransmitter (NT) molecules are constantly needed to keep the human brain functioning. These chemical messengers carry, balance, and boost signals travelling between nerve cells (neurons) and other cells in the body, and can affect many different psychological and physical functions, including fear, appetite, mood, sleep, heart rate, breathing rate, concentration, and learning [445]. Lim [251] has also outlined new ways of exploiting neuromarketing research to achieve a better understanding of the brain and neural activity and hence advance marketing science. Lim [251] highlighted three main aspects: (i) antecedents (such as the product, physical evidence, the price of the product, the place where everything is happening, promotion, the process involved, and people); (ii) the process; and (iii) the consequences for the target market (behavioral outcomes before, during, and after the act of buying) and the marketing organization (visits, sales, awareness, equity). Agarwal and Xavier [253] described the most popular neuromarketing tools, including event-related potentials (ERP) (P300), EEG, and fMRI, and explained how these tools could be applied in marketing. A business and marketing article [256] lists the three categories of neuroscientific techniques that are applied in business and advertising research (Table 1 and Table 2, Figure 1) as follows:
  • Methods that monitor what is happening in the brain (i.e., the physiological activity of the CNS);
  • Methods that record what is happening elsewhere in the body (i.e., the physiological activity of the PNS);
  • Other techniques for tracking behavior and conduct.
Ganapathy [260] groups neuromarketing tools into three categories (Table 1 and Table 2). Farnsworth [258] provides information that can help stakeholders understand research methods relating to human behavior at a glance and decide on the best neuromarketing method or technique, while Saltini [264] gives a short list of neuromarketing tools (Table 1 and Table 2). A system developed by CoolTool [257] allows several neuromarketing tools to be used separately or in combination.
Although individual neuroscientific tools for neuromarketing, neuroeconomics, neuromanagement, neuro-information systems, neuro-industrial engineering, products, services, and call centers have been developed by many researchers (for example [111,251,253,254,255,256,257,258,259,260,261,262,263,264,265,266,267,268,269,270,293,298,299,300,303,309,311,312,328,446,447,448]), a review and analysis of the complete range of tools used in these fields has not yet been carried out. Thorough examinations of the range of research tool alternatives that are available for neuroscience are also often missing from research in this area. We have therefore compiled a complete list of neuroscience techniques for these fields. Humans experience emotions and their associated feelings (e.g., gratitude, curiosity, fear, sadness, disgust, happiness, and pride) on a daily basis. Yet, in the case of affective disorders such as depression and anxiety, emotions can become destructive, so the focus on understanding emotional responsiveness in neuroscience and psychological science is not surprising [449]. Neuroscience techniques thus analyze emotional, affective, and physiological states by tracking neural/electrical activity [335,336,337,338,339,340,450,451] or neural/metabolic activity [341,342,343,344,349,447,452,453] within the brain. This is also presented in Table 3.
For example, neuromarketing techniques can complement business decisions and make them more profitable, using the automated mining of opinions, attitudes, emotions, and expressions from speech, text, neural activity, and other database-fed sources. Advertisements that are adjusted based on such information can engage the target audience more effectively and make a better impact, and this may translate into better sales and higher margins. In an attempt to enhance corporate branding and advertising routines, various factors have been studied, such as emotional appeal and sensory branding, to ensure that companies deliver the right message and that customers perceive the right message [171].
Affect recognition is widely used in gaming to create affect-aware video games and other software. Alhargan et al. [454] present affect recognition in an interactive gaming environment using eye tracking. Szwoch and Szwoch [455] give a review of automatic multimodal affect recognition of facial expressions and emotions. Krol et al. [456] combined eye tracking and a brain–computer interface (BCI) to create a completely hands-free Tetris clone in which traditional actions (i.e., block manipulation) are performed using gaze control. Elor et al. [457] measured heart rate and galvanic skin response (GSR) with Immersive Virtual Reality (iVR) Head-Mounted Display (HMD) systems paired with exercise games to show how exercise games can positively affect physical rehabilitation.
Stress is a relevant health problem among students, so Tiwari and Agarwal [458] present a stress analysis system to detect stressful conditions in students, including the measurement of GSR and electrocardiogram (ECG) data. Nakayama et al. [459] suggest measuring heart rate variability as a method to evaluate nursing students' stress during simulation and thus provide a better way to learn.
A literature review can reveal the most popular types of traditional and non-traditional neuromarketing methods. According to Sebastian [111], focus groups are one of the more traditional marketing methods, while various neuroscience techniques have also been applied to record the metabolic activity of the body and the electrical activity of the brain (transcranial magnetic stimulation (TMS), electroencephalography (EEG), functional magnetic resonance imaging, magnetoencephalography (MEG), and positron-emission tomography (PET)).
Electronic platforms are not the only possibility for non-traditional marketing, and Tautchin and Dussome [460] believe that traditional media can also be reimagined in new forms, such as guerrilla marketing, local displays, vehicle wraps, scaffolding, and even bubble cloud ads or aerial banners. In addition to giving high-quality feedback data, non-traditional techniques can also help in the evaluation of business decisions and conclusions [328].
Based on factors such as skin texture, gender, and SC, wearable biometric GSR sensors could be used to identify whether a person is in a sad, neutral, or happy emotional state. To understand marketing strategies better and to improve ads, other biometric sensors such as pulse oximeters and health bands could be used in the future to make automated predictions of emotions [461]. The galvanic skin response (GSR) method has an important limitation—it does not provide information on valence. The usual way to address this issue is to use other emotion recognition methods. They provide additional details and thus enable detailed analysis. Table 3 lists studies where GSR is used to measure emotions.
Eye tracking (ET) is used to record the frequencies of choices; sensor features are extracted and matched with certain preference labels to determine mutual dependences and to discover which brain regions are active when a certain choice task is performed. High values for alpha, beta, and theta waves have been reported in the occipital and frontal brain regions, with a high degree of synchronization. A hidden Markov model is a popular tool for time-series data modeling, and researchers have successfully used this approach to build brain–computer interface tools with EEG signals, including mental task classification, medical applications, and eye movement tracking [462].
A classification model based on an SVM architecture, developed by Lakhan et al. [463], can predict the level of arousal and valence in recorded EEG data. Its core is a feature extraction algorithm based on power spectral density (PSD).
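A minimal illustration of PSD-based EEG feature extraction is sketched below: Welch's method estimates the spectrum of one channel, and band powers are integrated over conventional frequency bands. The sampling rate, band limits, and synthetic data are assumptions for illustration and are not taken from Lakhan et al.; feature vectors of this kind would then feed a classifier such as an SVM.

```python
import numpy as np
from scipy.signal import welch

def band_powers(eeg, fs=128.0, bands=None):
    """Welch PSD of one EEG channel, integrated over standard frequency bands."""
    bands = bands or {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
    freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))
    feats = {}
    for name, (lo, hi) in bands.items():
        mask = (freqs >= lo) & (freqs < hi)
        feats[name] = float(np.trapz(psd[mask], freqs[mask]))   # integrate band power
    return feats

rng = np.random.default_rng(3)
print(band_powers(rng.standard_normal(10 * 128)))   # 10 s of synthetic "EEG"
```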
Multimodal frameworks that combine several modalities to improve results have recently become popular in the domain of human–computer interaction. A combination of modalities can give a more efficient user experience, since the strengths of one modality can offset the weaknesses of another and usability can be increased. These systems recognize and combine different inputs, taking into account certain contextual and temporal constraints, thus facilitating interpretation. Kong et al. [464] created a way of using two different sensors and calibrating them to achieve simultaneous gesture recording; a hidden Markov model (HMM) was used for all single- and double-handed gesture recognition. Multimodality means that several unimodal solutions are combined into a system, so that multiple solutions can be merged into a single best solution using optimization algorithms [464].
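The sketch below shows a common way HMMs are used for gesture recognition, broadly in the spirit of the study above: one Gaussian HMM per gesture class is trained on sequences of per-frame feature vectors, and a new sequence is assigned to the class whose model gives the highest log-likelihood. The data are synthetic and the hmmlearn-based setup is an assumption for illustration, not the authors' implementation.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM   # third-party package: hmmlearn

rng = np.random.default_rng(0)

def make_sequences(offset, n=20, length=30, dim=6):
    """Synthetic stand-in for per-frame feature vectors from the sensors."""
    return [offset + rng.standard_normal((length, dim)) for _ in range(n)]

train = {"wave": make_sequences(0.0), "push": make_sequences(2.0)}

# Train one HMM per gesture class.
models = {}
for gesture, seqs in train.items():
    X = np.vstack(seqs)                  # concatenated training sequences
    lengths = [len(s) for s in seqs]     # sequence boundaries
    models[gesture] = GaussianHMM(n_components=3, covariance_type="diag",
                                  n_iter=50).fit(X, lengths)

# Classify an unseen sequence by maximum log-likelihood.
test = 2.0 + rng.standard_normal((30, 6))
print(max(models, key=lambda g: models[g].score(test)))   # expected: "push"
```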
The automatic emotion recognition system proposed by El-Amir et al. [465] uses a combination of four fractal dimensions and detrended fluctuation analysis, and is based on three bio-signals: GSR, EMG, and EEG. Using two emotional dimensions, the signals were passed to three supervised classifiers and assigned to three different emotional groups, with a maximum accuracy of 94.3% for the valence dimension and 94% for the arousal dimension. A contrasting class of approaches relies on external signals such as facial expressions and speech, which makes them simple to implement with no special equipment required; their limitations are that emotions can be faked and that such recognition methods fail with disabled people and people with certain diseases. Other approaches are based on electromyography, ECGs, SC, EEGs, and other physiological signals that are spontaneous and cannot be consciously controlled [465].
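Detrended fluctuation analysis, one of the nonlinear features mentioned above, can be sketched in a few lines: the signal profile is split into windows, locally detrended, and the scaling exponent is the slope of the log-log fluctuation curve. The window sizes below are arbitrary, and this generic implementation is not El-Amir et al.'s feature set.

```python
import numpy as np

def dfa_exponent(signal, scales=(16, 32, 64, 128, 256)):
    """Detrended fluctuation analysis: slope of log F(n) versus log n."""
    x = np.asarray(signal, dtype=float)
    profile = np.cumsum(x - x.mean())               # integrated (profile) series
    fluctuations = []
    for n in scales:
        n_windows = len(profile) // n
        f2 = []
        for i in range(n_windows):
            seg = profile[i * n:(i + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrend
            f2.append(np.mean((seg - trend) ** 2))
        fluctuations.append(np.sqrt(np.mean(f2)))
    alpha, _ = np.polyfit(np.log(scales), np.log(fluctuations), 1)
    return float(alpha)

rng = np.random.default_rng(1)
print(dfa_exponent(rng.standard_normal(4096)))      # ~0.5 for white noise
```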
Plassmann et al. [466] and Perrachione and Perrachione [467] carried out exciting studies in an attempt to determine how marketing stimuli lead to buying decisions. They applied neuroscience to marketing in order to create better models and to understand how a buyer's brain and emotions operate. Gruter [468] states that a wide range of techniques and tools are used to measure consumer responses and behavior, and that three approaches used in neuromarketing can give access to the brain: input and output models, internal reflexes, and external reflexes.
Leon et al. [469] present a real-time recognition and classification method based on physiological signals to track and detect changes in emotions from a neutral state to either a positive or negative (i.e., non-neutral) state. They used the residual values of auto-associative neural networks and the statistical probability ratio test in their approach; when the proposed methodology was implemented, a recognition level of 71.4% was achieved [469]. Monajati et al. [470] also investigated the recognition of negative emotional states, using three physiological signals: galvanic skin response, respiratory rate, and heart rate. Fuzzy-ART was applied to analyze the physiological responses and to recognize negative emotions, and an overall accuracy of 94% was achieved in distinguishing negative from neutral emotions [470].
Andrew et al. [471] described investigations of brain responses to modern outdoor advertising, focusing on memorability, visual attention, desirability, and emotional intensity. They also described ways in which the latest imaging tools and methods could be applied to monitor subconscious emotional responses to outdoor media in many forms, from multisensory advertising screens to simple paper posters, and explained the cognitive processes behind their success, not solely in the context of the advertising to which people are typically exposed outside their homes, but also in the broader digital world. Their findings have fundamental implications for media campaign planning, design, and development, identifying the possible role of outdoor advertising compared with other media, and possible ways of combining different media platforms and making them work for the benefit of advertisers.
Kaklauskas et al. [472] integrated Damasio's somatic marker hypothesis with biometric systems, multi-criteria analysis techniques, statistical investigation, a neuro-questionnaire, and intelligent systems to produce the INVAR neuromarketing system and method. INVAR can measure the efficiency of both a complete video advertisement and its separate frames. The system can also determine which frames make viewers interested, confused, disgusted, happy, scared, surprised, angry, sad, or bored; can identify the most positive or negative video advertisement; can measure the effect of a video advertisement on long-term and short-term memory; and can perform other functions.
Lajante and Ladhari [473] applied peripheral psychophysiology measures in their research, based on the assumption that measures of emotion and cognition such as SC responses and facial EMGs could make a significant contribution to new ideas about consumer decision making, judgments and behaviors. These authors believe that their approach can help in applying affective neuroscience to the field of consumer services and retailing.
Michael et al. [474] aimed to understand the ways in which unconscious and direct cognitive and emotional responses underlie preferences for particular travel destinations. A 3×5 factorial design was run in order to better understand the unconscious responses of consumers to possible travel destinations. The factors considered in this study were the type of stimulus (videos, printed names, and images) and the travel destination (New York, London, Hong Kong, Abu Dhabi, and Dubai). ET can provide reliable tracking of cognitive and emotional responses over time. The authors suggested that decisions on travel destinations have both a direct and an unconscious component, which may affect or drive overt preferences and actual choices.
Harris et al. [448] investigated ways of measuring the effectiveness of social ads of the emotion/action type, and then of making these ads more effective using consumer neuroscience. Their research offers insights into changes in behavioral intent brought about by effective ads and gives an improved understanding of ways of making good use of social messages regarding a certain action, challenge or emotion that may be needed to help save lives. It can also reduce spending on social marketing campaigns that end up being ineffectual.
Libert and Van Hulle [475] argue that the development of economically practicable solutions involving human–machine interactions (HMI) and mental state monitoring, and neuromarketing that can benefit severely disabled patients has put brain–computer interfacing (BCI) in the spotlight. The monitoring of a customer’s mental state in response to watching an ad is interesting, at least from the perspective of neuromarketing managers. The authors propose a method of monitoring EEGs and predicting whether a viewer will show interest in watching a video trailer or will show no interest, skipping it prematurely. They also trained a k-nearest neighbor (kNN), a support vector machine (SVM), and a random forest (RF) classifier to carry out the prediction task. The average single-subject classification accuracy of the model was as follows: 73.3% for viewer interest and 75.803% for skipping using SVM; 78.333% for viewer interest and 82.223% for skipping using kNN; and 75.555% for interest and 80.003% for skipping using RF.
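As a concrete illustration of the classifier comparison described above, the sketch below cross-validates kNN, SVM, and random forest models on placeholder feature matrices with scikit-learn. The feature dimensions, labels, and hyperparameters are assumptions for illustration only and do not reproduce Libert and Van Hulle's data or results.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Placeholder data: one row of EEG-derived features per trial,
# binary label (interested in the trailer vs. skipped it prematurely).
rng = np.random.default_rng(42)
X = rng.standard_normal((120, 20))
y = rng.integers(0, 2, 120)

classifiers = [("kNN", KNeighborsClassifier(n_neighbors=5)),
               ("SVM", SVC(kernel="rbf")),
               ("RF", RandomForestClassifier(n_estimators=200, random_state=0))]
for name, clf in classifiers:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {acc:.3f}")
```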
Jiménez-Marín et al. [476] showed that sensory marketing tends to accumulate user experiences and then exploit them to bring the users closer to the product they are evaluating, thus motivating the final purchase. However, several issues need to be considered when these techniques are applied to reach the desired outcomes, and it is important to be aware of recent advances in neuroscience. The authors explore the concept of sensory marketing, pointing out its possibilities for application and its various typologies.
Cherubino et al. [477] highlighted the new technological advances that have been achieved over the last decade, which mean that research settings are now not the only scenarios in which neurophysiological measures can be employed and that it is possible to study human behavior in everyday situations. Their review aimed to discover effective ways to employ neuroscience technologies to gain better insights into human behavior related to decision making in real-life situations, and to determine whether such applications are possible.
Monica et al. [478] explored the cognitive understanding and usability of banking web pages. They reviewed the theoretical literature on user experience in online banking services research, with a focus on ET as a research tool, and then selected two Romanian banking websites to study consumer attention while consumers were navigating the sites, and memory after their visits. The research findings showed that the layout and information display can make web pages more or less usable and can affect cognitive understanding.
Singh et al. [328] discussed various methods of feature extraction for facial emotion detection. The algorithm they proposed could detect a total of six facial emotions using a fuzzy rule-based system. During their experiment, neurometrics were recorded using a system comprising MegaMatcher software, a Grove-GSR Sensor V1.2, and a 12-megapixel Hikvision IP camera. The participants were asked to watch a set of video ads for a range of well-known cosmetic products, wore SC sensors, and sat in front of a camera that monitored their responses. Singh et al. [328] also analyzed the cognitive processes of university students in relation to advertising and compliance with the code of self-regulation, using a quantitative and qualitative methodology based on facial expressions, ET techniques, and focus groups. The results suggested that online game operators could be clearly identified; that the public interacted strongly with displays of the supposed skills of successful players and with welcome bonuses; and that there was a lack of knowledge of the visual elements of awareness, a trivialization of compulsive gambling, and sexist attitudes towards women being used to attract public attention. A positive public attitude towards gaming was also observed, with gaming seen as a healthy form of leisure compatible with family and social relationships [328].
Goyal and Singh [461] proposed the use of research-based approaches for the automatic recognition of human affective facial expressions. These authors created an intelligent neural network-based system for the classification of expressions from extracted facial images. Several basic and specialized neural networks for the detection of facial expressions were used for image extraction.
Electromyography measures and assesses electric potentials in muscle cells. In medical settings, this method is used to identify nerve and muscle lesions, while in emotion recognition it is used to look for correlations between emotions and physiological responses. Most EMG-based studies examine facial expressions, drawing on the hypothesis that facial expressions are part of emotional responses to various stimuli; the hypothesis was first proposed by Ekman and Friesen in 1978, who described the relationships between basic emotions, facial muscles, and the actions they trigger. Morillo et al. [479] used low-cost EEG headsets and applied discrete classification techniques to analyze scores given by subjects to individual TV ads, using artificial neural networks, the C4.5 algorithm, and the Ameva discretization algorithm. A sample of 1400 effective advertising campaigns was studied by Pringle et al. [480], who determined that promotions with exclusively emotional content achieved around double the success (31% vs. 16%) of those with only rational content, while compared with campaigns containing mixed emotional and rational content, the exclusively emotional campaigns performed only slightly better (31% vs. 26%).
According to Takahashi [481], some of the available emotion recognition systems based on facial expressions or speech consider several emotional states, such as fear, teasing, sadness, joy, surprise, anger, disgust, and neutral. Takahashi [481] investigated emotion recognition based on five emotional states (fear, anger, sadness, joy, and relaxed).
The authors of [353,355,356,357,359,360,371,372,373,374] carried out in-depth analyses of how blood pressure, SC, heart rate, and body temperature depend on stress and emotions. Figures suggest that work-related stress costs the EU countries at least EUR 20 billion annually. Stress experienced at work can cause anxiety, depression, heart disease, and increased chronic fatigue, which can have a considerable negative impact on creativity, competitiveness, and work productivity.
Research worldwide shows that people exposed to stress can experience higher blood pressure and heart rate. Light et al. [482] analyzed cases of daily elevated stress levels and looked at the effects on fluctuations in systolic and diastolic blood pressure. Gray et al. [483] investigated how systolic and diastolic blood pressure can be affected by psychological stress, while Adrogué and Madias [484] described the effects of chronic, emotional and psychological stress on blood pressure. The unanimous conclusion of research in this area is that diastolic and systolic blood pressure and heart rate depend on stress and can increase depending on the level of stress.
Blair et al. [485] analyzed the effect of stress on heart rate and concluded that heart rate rises sharply within three minutes of the onset of stress and starts to fall only after another five to six minutes. Gasperin et al. [486] concluded that high blood pressure was affected by chronic stress. A number of studies have shown that patients with heart rates higher than 70 beats per minute are more likely to develop cardiovascular diseases and to die from them; tests show that a rapid heartbeat increases the risk of heart attack by 46%, heart insufficiency by 56% and death by 34%.
Sun et al. [487] proposed an activity-aware detection scheme for mental stress. Twenty participants took part in their experiment, and galvanic skin response, ECG, and accelerometer data were recorded while they were sitting, standing, and walking. Baseline physiological measurements were first taken for each activity, and then the participants were exposed to mental stressors. The accelerometer was used to track activity, and the data gave a between-subject classification accuracy of 80.9%, while the 10-fold cross-validation accuracy for the classification of mental stress reached 92.4%. Another study focused on physiological signals such as photoplethysmography and galvanic skin response; neural network configurations (both recurrent and feed-forward) were examined, and a comprehensive performance analysis showed that the best option for stress level detection was layer-recurrent neural networks. For a sample of 19 automotive drivers, this evaluation achieved an average sensitivity of 88.83%, a precision of 89.23%, and a specificity of 94.92% [488].
Palacios et al. [489] applied a new process involving two databases containing utterances under stress by men and women. Four classification methods were used to identify these utterances and to organize them into groups. The methods were then compared in terms of their final scores and quality performance.
Fever occurs when the body’s thermoregulatory set point increases, and many findings suggest that the rise in core temperature induced by psychological stress can be seen as fever. A fever of psychological origin in humans might then be a result of this mechanism [490].
Wu and Liang [491] presented a training and testing procedure for emotion recognition based on semantic labels, acoustic prosodic information, and personality traits. A recognition process based on semantic labels was applied, using a speech recognizer to identify word sequences; HowNet, a Chinese knowledge base, was used as the source for deriving the semantic word sequence labels. The emotion association rules (EARs) of the word sequences were then mined by applying a text-based mining method, and the relationships between the EARs and emotional states were characterized using the MaxEnt model. In a second approach based on acoustic prosodic information, emotional salient segments (ESSs) were detected in utterances and their prosodic and acoustic features were extracted, including pitch-related, formant, and spectrum attributes. The next step was the construction of base-level classifiers using SVMs, Gaussian mixture models (GMMs), and MLPs, which were then combined (using MDT) by selecting the most promising option for emotion recognition based on acoustic prosodic information. The process ended when the final emotional state was determined. A weighted product fusion method was applied to combine the outputs produced by the two types of recognizers. The personality traits of the specific speaker, as determined from the Eysenck personality questionnaire, were then taken into consideration to examine their impact and personalize the emotion recognition scheme [491].
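The weighted product fusion step mentioned above can be illustrated with a small generic sketch that combines two recognizers' per-emotion probabilities; the emotion set, probability values, and weight are hypothetical, and the renormalized formulation below is a common generic variant rather than Wu and Liang's exact scheme.

```python
import numpy as np

def weighted_product_fusion(p_semantic, p_acoustic, w=0.5):
    """Fuse two recognizers' class probabilities by a weighted product.
    w weights the semantic-label recognizer; (1 - w) weights the acoustic one."""
    fused = (np.asarray(p_semantic) ** w) * (np.asarray(p_acoustic) ** (1 - w))
    return fused / fused.sum()                     # renormalize to probabilities

emotions = ["anger", "joy", "sadness", "neutral"]
p_text  = np.array([0.10, 0.55, 0.15, 0.20])       # hypothetical recognizer outputs
p_audio = np.array([0.20, 0.40, 0.10, 0.30])
fused = weighted_product_fusion(p_text, p_audio, w=0.6)
print(emotions[int(np.argmax(fused))], fused.round(3))
```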
A hybrid analysis method for online reviews proposed by Nilashi et al. [492] allows for the ranking of factors affecting travelers' decisions in their choice of green hotels with spa services. This method combined text mining, predictive learning techniques, and multiple criteria decision-making methods, and was proposed for the first time in the context of hospitality and tourism, with an emphasis on grouping green hotel customers based on online customer feedback. Nilashi et al. [492] used latent Dirichlet allocation to analyze textual reviews, a self-organizing map for cluster analysis, the neuro-fuzzy method to measure customer satisfaction, and the TOPSIS method to rank the features of hotels. The proposed method was tested by analyzing travelers' reviews of 152 Malaysian hotels. The findings of this research offer an important method of hotel selection by travelers, by means of user-generated content (UGC), while hotel managers can use this approach to improve their marketing strategies and service quality.
A neuromarketing method for green, energy-efficient and multisensory homes, proposed by Kaklauskas et al. [493], can be used to determine the conditions that are required. The multisensory dataset (physiological and emotional states) collected as part of this research contained about 200 million data points, and the analysis also included noise pollution and outdoor air pollution (volatile organic compounds, CO, NO2, and PM10). This article discussed specific case studies of energy-efficient and green buildings as a demonstration of the proposed method. The results matched findings from both current and previous studies, showing that the correlation between age and environmental responsiveness has an inverse U shape and that age is an important factor affecting interest in eco-friendly, energy-efficient homes.
The VINERS method and biometric techniques developed by Kaklauskas et al. [494] for the analysis of emotional states, physiological reactions and affective attitudes were used to determine which locations are the best choice and then to show neuro ads of available homes offered for sale. Homebuyers were grouped into rational segments, taking into account consumer psychographics and behavior (happy, angry or sad, and valence and heart rate) and their demographic profiles (age, gender, marital status, children or no children, education, main source of income). A rational video ad for the respective rational segment was then selected. This study aimed to combine the somatic marker hypothesis, neuromarketing, biometrics and the COPRAS method, and to develop the VINERS method for use with multi-criteria analysis and the neuromarketing of the best places to live. The case study presented in the article demonstrated the VINERS method in practice.
Etzold et al. [495] examined the case of users booking appointments online, and the ways in which they interacted with the webpage interface and visualizations. The main point was to determine whether a new interface for online booking was easy to navigate and successful in attracting user attention. In this study, the authors particularly wanted to determine whether a new, more expensive customer website was seen as more user-friendly and supportive than the older, cheaper alternative. An empirical study was carried out by tracking users' eye movements as they navigated the existing website of the car manufacturer Mercedes-Benz and then a new, updated version of the same company's website. A total of 20 people were observed, and evaluations of their ET data suggested that the new service appointment booking interface could be further improved; scan-paths and heatmaps demonstrated that the old website was superior [495].
In recent years, many different emotional values, such as the net emotional value (NEV) and the service encounter emotional value (SEEVal), have been analyzed, and attempts have also been made to put them into practice [496,497,498,499,500,501,502,503]. These studies are overviewed below. To calculate the NEV, the average score for negative emotions (stressed, dissatisfied, frustrated, unhappy, irritated, hurried, disappointed, neglected) is subtracted from the average score for positive emotions (cared for, stimulated, happy, pleased, trusting, valued, focused, safe, interested, indulgent, energetic, exploratory). The score obtained this way can be used to characterize a client's feelings about a service or a product [499]. A higher NEV indicates that the relationships forged by a business are more reliable. One advantage of the NEV is that it characterizes the total balance of a consumer's feelings related to products or services, and thus reveals the value drivers. The relationship between the NEV and client satisfaction is linear [500].
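The NEV calculation defined above reduces to a simple difference of means, as the minimal sketch below shows; the emotion labels follow the lists in the text, while the 0-10 ratings are invented for illustration.

```python
def net_emotional_value(positive_scores, negative_scores):
    """NEV = mean positive-emotion rating minus mean negative-emotion rating."""
    return (sum(positive_scores) / len(positive_scores)
            - sum(negative_scores) / len(negative_scores))

# Hypothetical ratings (0-10) from one respondent
positive = {"cared for": 7, "happy": 8, "trusting": 6, "valued": 7}
negative = {"stressed": 3, "frustrated": 2, "disappointed": 4}
print(net_emotional_value(list(positive.values()), list(negative.values())))  # 4.0
```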
The NEV can be used to highlight both aspects that need to be improved, and those that are positive. Since the NEV is calculated based on a subtraction, the result may be either a negative or a positive number. The overall score can indicate what is happening with the client at an emotional level, and suggest ways to use this to gain competitive advantage [501].
The SEEVal is another measure proposed by Bailey et al. [504], and is the sum of the NEV experienced by the client and the NEV experienced by the product or service provider's employee. The client's end results linked to SEEVal are typically loyalty, satisfaction, pleasure, and voluntary benevolence [504]. The IGI Global Dictionary defines emotional value as a set of positive moods (feeling good or being happy) resulting from products or services and contained in the value gained from the customers' emotional states or feelings when using the products or services. Emotional value acts as a moderator, and has significant effects on the roles of social, functional, epistemic, conditional, and environmental values [497].
Zavadskas et al. [505] examined data on potential buyers to analyze the hedonic value in one-to-one marketing situations. They used the neutrosophic PROMETHEE technique to examine arousal, valence, affective attitudes, emotional and physiological states (AFFECT), and argued that hedonic value is tied to several factors including customers’ social and psychological data, client satisfaction, criteria of attractiveness, aesthetics, and economy, the sales site rental price, emotional factors, and indicators of the purchasing process. Their research showed that an analysis of the aforementioned data on potential buyers can make an important contribution to more effective one-to-one marketing. The case study cited in this work concerned two sites in Vilnius and intended to calculate the hedonic value of these sites during the Kaziukas Fair.
The ROCK Video Neuroanalytics and associated e-infrastructure were established as part of the H2020 ROCK project. This project tracked passers-by at ten locations across Vilnius. One of our outputs is the real-time Vilnius Happiness Index (Figure 10 and https://api.vilnius.lt/happiness-index, accessed on 5 September 2022). The project also involved a number of additional actions (https://Vilnius.lt/en/category/rock-project/, accessed on 5 September 2022).
Valence is calculated by subtracting the intensity of the most intense negative emotion (scared, disgusted, sad, angry) from the intensity of happiness [430]. In this way, the single valence score combines both positive and negative emotions. Our pool of data comprised 208 million data points analyzed using SPSS Statistics, a statistical software suite. Figure 10b presents the average values of valence per hour on weekdays. Every hour, the changes in average valence among Vilnius passers-by were recorded. Valence was measured every second, and these values were accumulated by weekday (marked in the chart with specific colors) at 95% confidence intervals. The y-axis shows the average values of valence (which fluctuates between −1 and 1) for each full day, for seven days, and the x-axis shows the hour starting at midnight [348].
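For clarity, the per-observation valence formula used above can be written as a one-line function; the intensity values in the example are hypothetical.

```python
def valence(happiness, scared, disgusted, sad, angry):
    """Valence = happiness intensity minus the most intense negative emotion,
    with all intensities in [0, 1], giving a single score in [-1, 1]."""
    return happiness - max(scared, disgusted, sad, angry)

# Hypothetical per-second emotion intensities for one passer-by
print(valence(happiness=0.62, scared=0.05, disgusted=0.10, sad=0.30, angry=0.08))  # 0.32
```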

5. Users’ Demographic and Cultural Background, Socioeconomic Status, Diversity Attitudes, and Context

Emotions are a means to engage in a relationship with others: Anger means that the person refuses to accept a specific treatment from others and expresses that they feel entitled to something more. Anger is expressed with the aim of influencing, controlling, and fixing the behavior of others [506].
Through emotions, people can adaptively respond to opportunities and demands they face around them [507,508,509]. When people face everyday stressors, stressful transitions, ongoing challenges, and acute crises, the adaptive function of emotions is evident in all of these situations. Emotions also depend on context [510]. This means that emotions are most effective when people express them in the situational contexts for which the emotions most likely evolved. In addition, they are specifically most likely to promote adaptation in such scenarios. The experience of anger, for instance, is adaptive because it motivates the focus of energies and the mobilization of resources toward an effective response. When a person expresses anger, adaptive mechanisms are also at work because it shows the person’s willingness, and perhaps even ability, to defend themselves. Emotional responses are sensitive to contexts, and are therefore, an integral part of our ways to adapt to daily life and the environment [511].
The ability to modify emotion responses according to changing context may be an important element of psychological adjustment [510]. An individual’s capacity to modify emotion responses taking into account the demands of changing contexts (i.e., environmental or interpersonal) is particularly relevant. This mechanism is known as emotion context sensitivity [511].
Cultural and gender differences in emotional experiences have been identified in previous research [512]. For instance, these authors used the Granger causality test to establish how a person's cultural background and situation affect emotion. The conclusions drawn in [513] propose a top-down mechanism whereby gender and age can influence the brain mechanisms behind emotive imagery, either directly or by interacting with bottom-up stimuli.
Cultural neuroscientists are studying how cultural traits such as values, beliefs, and practices shape human affective, emotional, and physiological states (AFFECT) and behavior. Hampton and Varnum [514] have reviewed theoretical accounts on how culture impacts internal experiences and outward expressions of emotion, as well as how people opt to regulate them. They also analyze cultural neuroscience research that investigates how emotion regulation varies in different cultural groups.
Thus far, differences between nations have largely been the focus in studies of culture in social neuroscience. Culture impacts more than just our behavior—it also plays a role in how we see and interpret the world [515]. For instance, socioeconomic factors such as education, occupation, and income have a significant impact on how a person thinks. In one study, working-class Americans were shown to exhibit a more context-dependent thought process, similar to the collectivist patterns seen in other countries. Individuals of a lower social class in terms of their socio-economic status agreed with contextual explanations of economic trends, broad social outcomes, and emotions [516].
Gallo and Matthews [517] looked at the indirect evidence that socioeconomic status is associated with negative emotions and cognition, and that negative emotions and cognition are associated with target health status. They also proposed a general framework for understanding the roles of cognitive–emotional factors, arguing that low socioeconomic status causes stress, and impairs a person’s reserve capacity for managing it, thus heightening emotional and cognitive vulnerability.
Choudhury et al. [518] explore critical neuroscience, a field of inquiry that probes the social, cultural, political, and economic contexts and assumptions that form the basis for behavioral and brain science research.
Numerous studies have illustrated that depending on the specific demographic background, there are major differences between users’ emotions, behavior, and perceived usability. According to Goldfarb and Brown [519], scientific research is characterized by racial, cultural, and socioeconomic prejudices, which lead to demographic homogeneity in participation. This in turn spurs inaccurate representations of neurological normalcy and leads to poor replication and generalization.
According to Freud, the unconscious is a depository for socially unacceptable ideas, wishes or desires, traumatic memories, and painful emotions that psychological repression had pushed out of consciousness [520]. HireVue, which is a global front-runner in AI technologies, is one of the top emotional AI companies that is now turning to biosensors that read non-conscious data in lieu of facial coding methods to measure emotions [521].
The ideas of what it means to have good relationships and to be a good person differ in different cultural contexts [522]. People’s emotional lives are closely related to these different ideas of how people see themselves and their relationships: Emotions usually match the cultural model [523,524]. Therefore, rather than being random, cultural variation in emotions matches the cultural ideals of ways to be a good person and to maintain good relationships with other people [506].
Aside from being biologically driven, emotion is also influenced by the environment, as well as by cultural and social situations. Culture can constrain or enhance the way emotions are felt and expressed in different cultural contexts, and it can influence emotions in other ways. Studies have consistently shown cross-cultural differences in levels of emotional arousal: Eastern cultures, for instance, are associated with low-arousal emotions, whereas Western cultures are associated with high-arousal emotions [525]. Many findings in cross-cultural research suggest that decoding rules and cultural norms influence the perception of anger [526]. Scollon et al. [527] looked at five cultures (Asian American, European American, Hispanic, Indian, and Japanese) to assess the way emotions are experienced in these cultures; pride showed the greatest cultural differences [527]. As emotions are fundamentally genetically determined, different emotions are perceived in similar ways throughout most nations and cultures [528].

6. Results

The present article aims to bridge the affective biometrics and neuroscience gap in existing knowledge, in order to contribute to the overall knowledge in this area. We also aim to provide information on the knowledge gaps in this area and to chart directions for future research.
We conclude this review by discussing unanswered questions related to the next generation of AFFECT detection techniques that use brain and biometric sensors.
By performing text analytics of 21,397 articles that were indexed by Web of Science from 1990 to 2022, we examined the key changes in this area within the last 32 years. Scientific output relating to AFFECT detection techniques using brain and biometric sensors is steadily increasing. As this trend suggests, there has been continuous growth in the number of papers published in the field, with the total number of articles appearing between 2015 and 2021 nearing the total number of articles published over the previous 25 years (1990 to 2014). In light of the increasing commercial and political interest in brain and biometric sensor applications, this trend is likely to continue.
With ground-breaking emerging technologies and the growing spread of Industry 5.0 and Society 5.0, AFFECT should be analyzed by taking into account demographic and cultural background, socioeconomic status, diversity attitudes, and context. Advanced computational models will be needed for this approach.
Worldwide, quite a few biometric and neuroscience studies have been performed in which AFFECT detection takes into account demographic and cultural background (age, gender, ethnicity, race, major diagnoses, and major medical history); socioeconomic status (education, income, and occupation); diversity attitudes; and context. Yet, to the best of our knowledge, none of the technologies currently available offer AFFECT detection that incorporates political views, personality traits, gender, race, diversity attitudes, and cross-cultural differences in emotion.
Some research conflates the physiological effects of emotional reactions with biometric patterns used for individual identification. To avoid this confusion, in the part of the review discussing biometrics we analyze only physiological effects caused by emotional reactions (i.e., second-generation biometrics; Section 3). Biometric patterns for individual identification are not analyzed in this research.
Human emotions can be determined by physiological signals, facial expressions, speech, and physical clues, such as posture and gestures. However, social masking—when people either consciously or unconsciously hide their true emotions—often renders the latter three ineffective. Physiological signals are therefore often a more accurate and objective gauge of emotions [529]. For instance, researchers [530,531] performed many studies to analyze physiological signals and unconscious emotion recognition. Nonetheless, our years of research experience have proven that in public spaces, facial expressions, speech, and physical clues, such as posture and gestures, are much more convenient and effective.
Emotion recognition can be more accurate when human expressions are analyzed looking at multimodal sources such as texts, physiological signals, videos, or audio content [532]. Integrated information from signals such as gestures, body movements, speech, and facial expressions helps detect various emotion types [533]. Statistical methods, knowledge-based techniques, and hybrid approaches are three main emotion classification approaches in emotion recognition [534].
Dimensional models take a complementary approach to representing emotion classes: each emotion is placed at a distinct position in a space, either 2D (Circumplex model, “Consensual” Model of Emotion, Vector Model) or 3D (Lövheim Cube, Pleasure-Arousal-Dominance [PAD] Emotional-State Model, Plutchik’s model). Most dimensional models have valence and arousal (or intensity) dimensions: the valence dimension indicates how much and to what degree an emotion is pleasant or unpleasant, whereas the arousal dimension differentiates between states of activation and deactivation [82]. The objectives of our study were most in line with Plutchik’s ‘wheel of emotions’ model, which we used in this research.
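The 2D dimensional representation can be illustrated with a toy Python sketch that maps a (valence, arousal) reading to the nearest of a few prototype emotions; the prototype coordinates below are assumptions chosen only for demonstration and are not taken from any published model.

```python
# Toy illustration of a 2D (valence-arousal) dimensional representation:
# each emotion occupies a position in the plane, and a new reading is mapped
# to the nearest labelled prototype. Coordinates are illustrative only.
import math

PROTOTYPES = {  # hypothetical (valence, arousal) positions in [-1, 1] x [-1, 1]
    "joy":      ( 0.8,  0.5),
    "anger":    (-0.6,  0.7),
    "sadness":  (-0.7, -0.5),
    "serenity": ( 0.6, -0.4),
}

def nearest_emotion(valence: float, arousal: float) -> str:
    """Return the prototype emotion closest to the given (valence, arousal) point."""
    return min(PROTOTYPES, key=lambda e: math.dist((valence, arousal), PROTOTYPES[e]))

print(nearest_emotion(0.7, 0.3))  # -> "joy"
```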
The use of artificial intelligence to recognize emotions and affective attitudes is a comparatively promising field of investigation. To make the most of artificial intelligence, multiple modalities should generally be used in context. Artificial intelligence has enabled biometric recognition and the efficient unpacking of human emotions and affective and physiological responses, and has contributed considerably to advances in pattern recognition in biometrics, emotions, and affective attitudes. Many different AI algorithms are in use, such as machine learning, artificial neural networks [535,536,537], search algorithms [166,538,539], expert systems [540,541], evolutionary computing [542,543], natural language processing [544,545], metaheuristics, fuzzy logic [546,547,548], genetic algorithms [549,550,551], and others.
Based on our review, presented in Section 1, Section 2, Section 3, Section 4 and Section 5, we find that investigators should develop procedures to guarantee that AI models are appropriately used and that their specifications and results are reported consistently. There is a need to create innovative AI and machine learning techniques.
The existing emotion recognition approaches all need data, but the training of machine learning algorithms requires annotated data, and obtaining such data is usually a challenge [552]. The use of AI models may become less complex, and AI algorithms faster, when certain database techniques are applied. These techniques can also provide AI capability inside databases. Supporting AI training inside databases is a challenging task. One of the challenges is to store a model in a database so that it can be trained in parallel by multiple tenants involved in its training and use, while security and privacy issues are taken care of. Another challenge is to update a model, especially in the case of dynamic data updates [553]. The following datasets can help with the task of classifying different emotion types from multimodal sources such as physiological signals, audio content, or videos: BED [554], MuSe [555], MELD [544,556], UIT-VSMEC [411], HUMAINE [557], IEMOCAP [558], Belfast database [559], SEMAINE [560], DEAP [561], eNTERFACE [384], and DREAMER [562]. GitHub [563], for instance, provides a list of public EEG datasets, such as the High-Gamma Dataset (a 128-electrode dataset from 14 healthy subjects with about 1000 four-second trials of executed movements, 13 runs per subject), the Motor Movement/Imagery Dataset (2 baseline tasks, 64 electrodes, 109 volunteers), and Left/Right Hand MI (52 subjects).
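As a toy illustration of the model-in-database challenge mentioned above (and not a description of any production system), the sketch below serializes a trained classifier into an SQLite table and restores it; SQLite, pickle, and scikit-learn are used purely for convenience, and concurrency control, access control, and safe serialization are deliberately ignored.

```python
# Toy sketch: storing a trained model inside a database so it can be versioned
# and later restored. Not suitable for multi-tenant or security-critical use.
import pickle
import sqlite3
import numpy as np
from sklearn.linear_model import LogisticRegression

model = LogisticRegression().fit(np.random.rand(50, 4), np.random.randint(0, 2, 50))

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE models (name TEXT, version INTEGER, blob BLOB)")
con.execute("INSERT INTO models VALUES (?, ?, ?)", ("affect_clf", 1, pickle.dumps(model)))

blob, = con.execute(
    "SELECT blob FROM models WHERE name = 'affect_clf' AND version = 1").fetchone()
restored = pickle.loads(blob)
print(restored.predict(np.random.rand(3, 4)))
```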
The findings also suggest that the development of more powerful algorithms alone cannot address the perception, reading, and evaluation of the complexity of human emotions without an integrated analysis of users’ demographic and cultural background (age, gender, ethnicity, race, major diagnoses, and major medical history); socioeconomic status (education, income, and occupation); diversity attitudes; and context (weather conditions, pollution, etc.). We can only hope that future research will address this issue and help to develop more advanced AFFECT technologies that can better cope with these factors.
Worldwide research has yet to resolve several problems, and additional research areas have arisen, such as missing data analysis, potential bias reduction, a lack of stringent data collection and privacy laws, the application of elicitation techniques in practice, open data, and other data-related issues. Olivas et al. [564], for instance, analyze various methods for handling missing data (a minimal imputation sketch follows the list below):
  • Missing data imputation techniques: analysis of the variable containing missing data (Mean, Regression, Hot Deck, Multiple Imputation) and analysis of relationships between variables for a case containing missing data (imputation based on machine learning: Neural Network, Self-organizing map, K-NN, Multilayer perceptron);
  • Case deletion (Listwise Deletion (Complete-case), Pairwise Deletion);
  • Approaches that take into account data distributions (Bayesian methods, Model-based likelihood, Maximum Likelihood with EM).
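A minimal Python sketch of two of the imputation strategies listed above (mean imputation and K-NN imputation), applied with scikit-learn to a small synthetic table with gaps:

```python
# Mean imputation and K-NN imputation on a toy data matrix with missing values.
import numpy as np
from sklearn.impute import SimpleImputer, KNNImputer

X = np.array([[1.0, 2.0, np.nan],
              [3.0, np.nan, 6.0],
              [5.0, 4.0, 9.0],
              [np.nan, 8.0, 7.0]])

print(SimpleImputer(strategy="mean").fit_transform(X))  # mean imputation
print(KNNImputer(n_neighbors=2).fit_transform(X))       # K-NN imputation
```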
It was found that the median correlation of the dependent variable of the Publications—Country Success model with the independent variables (0.6626) is higher than in the Times Cited—Country Success model (0.5331). Therefore, it can be concluded that the independent variables in the Publications—Country Success model are more closely related to the dependent variable than in the Times Cited—Country Success model (Figure 11).
The CSP maps of the world compiled for this research visualize two aspects. In Figure 12 and Figure 13, a country’s success (x-axis) is one aspect, while the publications dimensions (CSPN and CSPC; y-axis) are the other. In Figure 14, publications (x-axis) are one aspect, while the publications times cited dimension (y-axis) is the other. The CSP maps group the countries into the same eight clusters as the Inglehart–Welzel 2020 Cultural Map of the World (English-speaking, Catholic Europe, Protestant Europe, Orthodox Europe, West and South Asia, African-Islamic, Confucian, and Latin America) [565]. Two clusters, English-speaking and Protestant Europe, have been merged into one because of their shared history, religion, cultures, and degree of economic development. The parallels between these two clusters have been confirmed by numerous studies [566]. The Inglehart–Welzel 2020 Cultural Map of the World includes many institutional, technological, psychological, and economic variables that demonstrate strong perceptible correlations [567]. The country success indicators in the CSP maps comprise a large set of variables within the criteria system, such as politics, human development and well-being, the environment, macroeconomics, quality of life, and values.
In addition, this is a quantitative study that assesses how the success of the 169 countries affected the number of Web of Science articles on AFFECT recognition techniques that use brain and biometric sensors, published in 2020 (or the latest figures available).
For the multiple linear regressions, we used IBM SPSS V.26 to build two regression models based on 15 indicators of country success and the two predominant CSP dimensions. Two CSP regression models were thus developed from 15 independent variables and two dependent variables. The 15 independent variables and the two regression models are summarized in Table 4, Table 5, Table 6, Table 7 and Table 8. Table 4 contains descriptive statistics for the two CSP models. The minimum and maximum values indicate the range of each variable. The mean is the arithmetic average of the variable’s values, and the standard deviation is a measure of the dispersion of the values around the mean. Kurtosis measures whether the values are heavy-tailed or light-tailed relative to a normal distribution, whereas skewness measures the asymmetry of the distribution of the values. Acceptable values are considered to be between −3 and +3 for skewness, and between −10 and +10 for kurtosis. When the skewness is close to zero and the kurtosis is close to three, the distribution of the variable’s values approximates a normal distribution.
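For readers who want to reproduce this type of summary, the following Python sketch computes the same descriptive statistics (minimum, maximum, mean, standard deviation, skewness, and kurtosis) for a synthetic variable; it does not use the actual country-level data behind Table 4.

```python
# Descriptive statistics of one synthetic variable, as in a Table 4-style summary.
import numpy as np
import pandas as pd
from scipy import stats

x = pd.Series(np.random.default_rng(1).lognormal(mean=1.0, sigma=0.5, size=169))

summary = {
    "min": x.min(), "max": x.max(), "mean": x.mean(), "std": x.std(),
    "skewness": stats.skew(x),
    "kurtosis": stats.kurtosis(x, fisher=False),  # Pearson definition: normal distribution = 3
}
print(summary)
```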
Step 9 entailed the construction of regression models for the number of publications and their citation rates, and the calculation of the ES indicators describing them. Two dependent variables and 15 independent variables were analyzed to construct these regression models. The process was as follows (a computational sketch on synthetic data is given after the list):
  • Construction of regression models for the numbers of publications and their citations.
  • Calculation of statistical effect size (ES) indicators describing these regression models. ES is a value used in statistics to measure the strength of the relationship between two variables in a population, or a sample-based estimate of that quantity [568]. An ES may reflect the regression coefficient in a regression, the correlation between two variables, the mean difference, or the risk of a specific event occurring [569]. Guidelines developed by Durlak [570] provide advice on the ESs to use in research, and how to calculate and interpret them. We used these guidelines and applied the following five measures of ES, as these indicators are crucial for meta-analysis and could be computed from our measurements:
    Pearson correlation coefficient (r): Beta weights and structure coefficients r are the two sets of coefficients that can provide a more perceptive stereoscopic view of the dynamics of the data [571]. Interpretation may be also improved through the use of other results (e.g., [572]).
    Standardized beta coefficient (β): Theoretically, the highest-ranking variable is the one with the largest total effect, since β is a measure of the total effect of the predictor variables [573].
    Coefficient of determination (R2): This is a measurement of the accuracy of a CSP model, whose outcome is represented by the model’s dependent variable. The closer the coefficient of determination is to one, the more variability the model explains. R2 can therefore be used to determine the proportion of the variation in the dependent variable that can be predicted from the independent variables [573].
    Standard deviation: If this is too high, it will render the measurement virtually meaningless [574].
    p-values: There is no direct relationship between the p-value and the effect size, and a small p-value may be associated with a small, medium, or large effect. There is also no direct relationship between the ES and its practical or clinical significance: a lower ES for one outcome may be more important than a higher ES for another outcome, depending on the circumstances [570].
  • Calculation of non-statistical ES measures, which may better indicate the significance of the relationships between pairs of variables in our two models:
    Research context: Durlak [570] argues that ESs must be interpreted in the context of other research.
    Practical benefit: As this is an intuitive measure, practical benefit can allow stakeholders to make more accurate assessments of whether the research findings published can significantly improve their ongoing projects [575].
    Indicators with low values: These are usually easier to improve than indicators with high values.
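A computational sketch of Step 9 on synthetic data is given below; it reports Pearson correlations, standardized beta coefficients, R2, and p-values using statsmodels. The variable names and values are illustrative assumptions, not the actual CSP model inputs.

```python
# Fit a multiple regression on synthetic data and report several effect-size
# indicators discussed above: Pearson r, standardized betas, R^2, and p-values.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
X = pd.DataFrame(rng.normal(size=(169, 3)), columns=["gdp_pc", "gdp_pc_ppp", "education"])
y = 1.5 * X["gdp_pc"] + 0.8 * X["gdp_pc_ppp"] + rng.normal(size=169)  # synthetic "Publications"

# Pearson correlation of the dependent variable with each predictor
print(X.apply(lambda col: np.corrcoef(col, y)[0, 1]))

# Standardized beta coefficients: z-score both sides, then fit ordinary least squares
Xz = (X - X.mean()) / X.std()
yz = (y - y.mean()) / y.std()
fit = sm.OLS(yz, sm.add_constant(Xz)).fit()
print(fit.params)    # intercept (~0) and standardized betas
print(fit.pvalues)   # p-values
print(fit.rsquared)  # coefficient of determination R^2
```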
Based on the results of the descriptive statistics, it can be concluded that the values of the dependent variables of the models used in the study are approximately normally distributed (skewness < 10 and kurtosis < 10), which allows parametric methods to be used in the analysis.
A correlation analysis found that the strongest relationship in the Publications—Country Success model is between the dependent variable Publications and the independent variable GDP per Capita. Meanwhile, in the Times Cited—Country Success model, the strongest relationship is between the variables of Times Cited and GDP per Capita in PPP. It was also found that in both models, the relationships between the dependent variables and the independent variables are statistically significant (p < 0.001), except for the relationships between the dependent variables and the Unemployment Rate variable.
A reliability analysis of the compiled regression models allows us to conclude that the models are suitable for analysis (p < 0.05). It was also found that the changes in the values of the independent variables used in the models explain the variance of the Publications variable by 69.4%, and the variance of the Times Cited variable by 51.1%.
An analysis of the standardized coefficients of the model allows us to conclude that changes in the GDP per Capita variable have the biggest impact on changes in the Publications variable. The GDP per Capita in PPP variable also has a significant impact. Meanwhile, the Times Cited variable is most affected by the GDP per Capita in PPP variable, which has a statistically significant effect on the dependent variable.
To confirm Hypothesis 1, we built two CSP models, which are formal representations of the CSP maps. These models demonstrate that on average, an increase of 1% in a country’s success leads to an average improvement by 0.203% in the country’s two CSPN and CSPC dimensions. As the success of a country increased by 1%, the numbers of Web of Science articles published and their citations grew by 1.962% and 2.101%, respectively. Figure 12 and Figure 13 also illustrate that an increase in a country’s success goes hand in hand with a jump in its CSPN and CSPC dimensions, thus confirming Hypothesis 1.
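For readers less familiar with this percentage-based reading, it corresponds to the usual elasticity interpretation of a regression coefficient. Assuming a log-log specification (an assumption made here only for illustration; the exact functional form of the CSP models is defined earlier in the article), the model can be written as ln(CSP) = α + β·ln(Success) + ε, so that β ≈ %ΔCSP / %ΔSuccess; that is, β is read directly as the percentage change in the publications (or citations) dimension associated with a 1% change in country success.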
Hypothesis 2 was based on the results of the analysis pertinent to the CSP models, as well as on the correlations found between the 169 countries and the 15 indicators [66]. A clear visual confirmation of Hypotheses 1 and 2 are also provided by Figure 12 and Figure 13, which show the specific groupings of countries in the seven clusters examined in this study. These models may be of major significance for policy makers, R&D legislators, businesses, and communities.

7. Evaluation of Biometric Systems

In this section, we outline the rationale behind the current biometric and brain-sensing approaches, compare the efficacy of existing methods, and determine whether they are capable of addressing the issues and challenges associated with the field. Biometric systems have several drawbacks in terms of their precision, acceptability, quality, and security. They are generally evaluated based on aspects such as (1) data quality; (2) usability; (3) security; (4) efficiency; (5) effectiveness; (6) user acceptance and satisfaction; (7) privacy; and (8) performance.
Data quality measures the quality of biometric raw data [576,577]. This type of assessment is generally used to quantify biometric sensors and can also be used to enhance the system performance. According to the International Organization for Standardization ISO 13407:1999 [578], usability is defined as “[t]he extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use” [579]:
  • In this context, efficiency means that users must be able to accomplish the tasks easily and in a timely manner. It is generally measured as task time;
  • Here, effectiveness means that users are able to complete the desired tasks without excessive effort. This is generally measured by common metrics such as the completion rate and number of errors, for example the failure-to-enroll rate (FTE) [580];
  • User satisfaction measures the user’s acceptance of and satisfaction with the system. It is generally measured by looking at a number of characteristics, such as ease of use and trust in the system. Even if one biometric system outperforms another, this does not necessarily mean that it will be more operational or acceptable.
Security measures the robustness of a biometric system (including algorithms, architectures, and devices) against attack. The International Organization for Standardization ISO/IEC FCD 19792 [581] specifically addresses processes for evaluating the security of such systems [579].
Unlike traditional methods, biometric systems do not provide a 100% reliable answer, and it is almost impossible to obtain such a response. In a secure biometric system, there is a trade-off between recognition performance and protection performance (security and privacy). This trade-off arises from the unclear concept of security, which requires a more standardized framework for evaluation purposes. If this gap can be closed, an algorithm could be developed that optimizes both jointly. ISO 19795 contains standards for performance metrics and evaluation methodologies for traditional biometric systems. In addition to performance testing, it provides metrics related to the storage and processing of biometric information [582]. ISO/IEC 24745 specifies that, unlike privacy, security is delivered at the system level. In general, the ability of a system to maintain the confidentiality of information with the provided countermeasures (such as access control, integrity of biometric references, renewability, and revocability) is referred to as its security factor. When seeking to bypass the security of a biometric system, an invader may impersonate a genuine user to gain access to and control over various services and sensitive data. Privacy refers to secrecy at the information level. The following criteria were proposed in ISO/IEC 24745 for the purpose of evaluating the privacy offered by biometric protection algorithms: irreversibility, unlinkability, and confidentiality [583].
The discriminating powers of all biometric technologies rely on the extent of entropy, with the following used as performance indicators for biometric systems [584,585,586,587]: False match rate (FMR); False non-match rate (FNMR); Relative operating characteristic or receiver operating characteristic (ROC); Crossover error rate or equal error rate (CER or EER); Failure to enroll rate (FER or FTE), and Failure to capture rate (FTC).
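The following Python sketch shows how three of these indicators (FMR, FNMR, and the equal error rate) can be estimated from matching scores; the genuine and impostor score distributions are synthetic assumptions used only to make the example runnable.

```python
# Estimate FMR, FNMR, and the equal error rate (EER) from synthetic match scores.
import numpy as np

rng = np.random.default_rng(3)
genuine = rng.normal(0.7, 0.1, 1000)   # scores when a user is matched against themselves
impostor = rng.normal(0.4, 0.1, 1000)  # scores when a user is matched against others

thresholds = np.linspace(0.0, 1.0, 501)
fmr = np.array([(impostor >= t).mean() for t in thresholds])  # false match rate
fnmr = np.array([(genuine < t).mean() for t in thresholds])   # false non-match rate

i = np.argmin(np.abs(fmr - fnmr))  # threshold where the two error rates cross
print(f"EER ~ {(fmr[i] + fnmr[i]) / 2:.3f} at threshold {thresholds[i]:.2f}")
```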
Specific advantages and disadvantages are characteristic of each biometric technology; Table 9 compares them.
Upon completing the literature analysis, we then compared biometric technologies on the following seven parameters: universality, distinctiveness/uniqueness, permanence, collectability, performance, acceptability, and circumvention (Table 10). Another set of comparisons covered the strengths and weaknesses characteristic of biometric technologies, relating to their ease of use, error incidence, accuracy, user acceptance, long-term stability, cost, template sizes, security, social acceptability, popularity, speed, and whether or not they have been socially introduced (Table 11). The working characteristics and accuracy of the various biometrics differ and depend on how they are designed to operate. The level of security and the kinds of possible errors also differ for each biometric approach; holders of biometric samples may be denied access because of various factors such as aging, colds, weather conditions, physical damage, and so on [600,601]. Other researchers also consider FAR, FRR, CER, and FTE in their comparisons of biometric technologies (Table 12).
Multimodal biometric systems take advantage of multiple sensors or biometrics to remove the restrictions of unimodal biometric systems [616]. While each unimodal biometric system is restricted by the integrity of its identifier, the chance of several unimodal systems sharing the same restrictions is low [617]. Multimodal biometric systems can fuse these unimodal systems sequentially, simultaneously, in a combination of both, or in series, corresponding to the sequential, parallel, hierarchical, and serial integration modes, respectively. For instance, in decision-level fusion the final results of multiple classifiers are joined using methods such as majority voting [616]. This multimodal analysis will assist in identifying the actual reasons for the issues with the current biometric and brain approaches, as well as the restrictions of the existing state-of-the-art approaches and technologies.
An efficient way to combine multiple classifiers is needed when an array of classifier outputs is available. Various architectures and schemes have been proposed for joining multiple classifiers. The most popular methods are the majority vote and the weighted majority vote. In majority voting, the right class is the one most often selected by the various classifiers. If all the classifiers output different classes, or in the event of a tie, the class with the highest overall output is chosen as the right class. The vote averaging method averages the confidences of the separate classifier outputs for every class over the entire ensemble; the class with the highest average value is selected as the right class [618]. The vote averaging method has been used to measure the efficacy of existing biometric methods (Table 10 and Table 11). In our case, High (Very High) was assigned 3 points, Medium was assigned 2, and Low was assigned 1. The calculations did not evaluate some qualitative indicators, such as error incidence and whether the technology has been socially introduced. Additionally, not all biometric technologies had data on the analyzed indicators; as a result, eye tracking was not evaluated due to a lack of data. The highest average number of points was collected by Skin temperature-thermogram (2.57), Iris/pupil (2.43), Face (2.30), and Signature (2.09). Many of the metrics for biometric technologies in Table 9, Table 10, Table 11 and Table 12 are analyzed in detail throughout the article.
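The two decision-level fusion rules described above can be sketched in a few lines of Python; the per-class confidences below are hypothetical numbers, not outputs of any system evaluated in this review.

```python
# Majority voting and vote averaging over the outputs of three classifiers
# for one sample and four emotion classes (hypothetical confidences).
import numpy as np

confidences = np.array([[0.1, 0.6, 0.2, 0.1],   # classifier 1
                        [0.2, 0.3, 0.4, 0.1],   # classifier 2
                        [0.1, 0.5, 0.3, 0.1]])  # classifier 3

# Majority vote: each classifier votes for its top class; the most frequent class wins
# (this simple version omits the tie-breaking rule described in the text).
votes = confidences.argmax(axis=1)
majority_class = np.bincount(votes, minlength=confidences.shape[1]).argmax()

# Vote averaging: average confidences per class across the ensemble, pick the maximum
average_class = confidences.mean(axis=0).argmax()

print(majority_class, average_class)  # both resolve to class 1 here
```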

8. Discussion and Conclusions

There are still unanswered questions that need to be addressed. We evaluated the available evidence on the relationship between brain and biometric sensor data and AFFECT in order to determine the primary digital signals for AFFECT. The multidisciplinary literature used was from the disciplines of engineering, computer science, neuroscience, physiology, psychology, mathematical modeling, and cognitive science. The distinct conventions of these disciplines resulted in certain variations, depending on the features and characteristics of the research results being focused on. The literature under analysis has small sample sizes, short follow-up times, and significant differences in the quality of the reports, which limits the interpretability of the pooled results. On average, the current AFFECT detection techniques that use brain and biometric sensors achieved a classification accuracy greater than 70%, which seems sufficient for practical applications. As part of this review, several issues that need to be addressed were identified, and numerous recommendations and directions for future AFFECT detection and recognition research are suggested. They are listed below:
  • Many studies fail to report information on demographic and cultural background, socioeconomic status, diversity attitudes, and context, and AFFECT papers often have limited descriptions of feature extraction and analysis. This has a significant impact on the interpretation of their findings. Sample recommendations include reporting on participant enrolment and selection approaches and analysis of demographic and cultural background (age, gender, ethnicity, race, major diagnoses, and major medical history); socioeconomic status (education, income and occupation), diversity attitudes, and context. In order to improve the ability of researchers to assess the strength of evidence, one of the first steps should be the development of this kind of consistent reporting.
  • Behavioral traits (e.g., gesture, keystroke, voice) change over time and are therefore less stable. Multiple interactions are typically required to set a reliable baseline. Injury, illness, age, and stress can also cause changes in behavioral traits. Many of the studies on AFFECT recognition examined brain and biometric data under different AFFECT states while overlooking the baseline (spontaneous) brain and biometric data.
  • The literature did not contain brain and biometric sensor-based AFFECT recognition of mixed emotions (the parallel involvement of negative and positive emotions). In this work, we study the 30 primary, secondary, and tertiary dyads of Plutchik’s wheel of emotions, which form such mixed emotions.
  • Researchers need a set of guidelines to ensure that AI models (artificial neural networks, evolutionary computing, natural language processing, metaheuristics, fuzzy logic, genetic algorithms) are correctly applied, and that their specifications and results are consistently reported (the model selection strategy, parameter estimates in the model with confidence intervals, performance metrics, etc.). There is also a need to further develop advanced AI and machine learning techniques (multi-modal learning, neuroscience-based deep learning, automated machine learning, self-supervised deep learning, Quantum ML, Tiny ML, System 2 deep learning).
  • More results are also needed to identify which of the elicitation techniques applied in practice are effective, and in which cases they work best, taking into account the type of information obtained, the stakeholders’ (developers, end-users, etc.) characteristics, the context, and other factors. More data sets need to be created that use active elicitation techniques, such as various games, as these are better at mimicking real-life experiences and bringing about emotions. Gamification is a current trend that uses game methods for real-life AFFECT elicitation.
  • Recommendations also state that the two sources of potential bias (AFFECT interpretation algorithmic biases, data sources and input) in multi-feature studies should be reduced, and a wider variety of multimodal samples should be used.
  • Missing data analysis has some gaps, for example in the descriptions of missing data and of how missing data are handled; the most appropriate methods should be applied in AFFECT recognition. As far as missing data go, the literature has major shortcomings.
  • As algorithms improve, accuracy is growing, but this significantly depends on the data sets used. Some gaps and a lack of discussion have also been noted concerning the question of whether the integrated brain and biometric sensors used in this research are reliable and appropriate for AFFECT detection.
  • A trend related to emotional AI businesses (Realeyes, Affectiva, etc.) that expand their global operations in regions with less stringent data collection and privacy laws has not been sufficiently examined globally.
  • The recommendations for open science include the proposal to share and reuse open multimodal AFFECT data, information, knowledge, and science practices (publications and software) by preparing a Data Management Plan that would address any important aspects of making data findable, accessible, interoperable, and reusable, or FAIR. Open data analysis should also include recognized and validated scales for AFFECT evaluation; any accessible confirmation on the reliability and validity of the AFFECT device and sensor applied should be presented. The open datasets have usually sought to obtain higher accuracy by using different sets of stimuli and groups of participants.
Emotional acculturation happens when people, on contact with a different culture, learn new ways to express their emotions [619], incorporate new cultural values into their existing set, and then adjust their emotions to suit these new values [620,621,622,623]. This may be a research area in affective computing that needs more studies and focus. With growing global integration, emotional acculturation will become increasingly important, and advanced computational models will be needed to simulate the related processes. M.-T. Ho et al. [624] believe that this may be a key thematic change in the decades to come. The findings also suggest that developing more powerful algorithms cannot solve the perception, reading, and evaluation of the complexity of human emotions. Instead, the complex modulators from which affective and emotional states stem need to be better understood by the scientific community. We can only hope that the future will bring further research that will remedy this and help develop more advanced technologies that can better cope with issues such as gender, race, diversity attitudes, and cross-cultural differences in emotion [624].
The substantial improvements in the development of affordable and simple to utilize sensors for recognizing AFFECT have resulted in numerous studies being conducted. For this review, we studied in detail 634 articles. We focused on recent state-of-the-art AFFECT detection techniques. We also took existing data sets into account. As this review illustrates, exploring the relationship between brain and biometric signals and AFFECT is a formidable undertaking, and novel approaches and implementations are continually being expanded.
The evaluation of the intensity of human AFFECT is a complex process which requires a multidirectional approach. The main difficulties include variations in human nature, social aspects, and similar factors: methods that fit the average evaluation of the majority of customers tend to show poor results in personalized cases, and vice versa. Moreover, the reliability of evaluations of human emotions strongly depends on the number of biometric parameters used and on the measurement methods and sensors applied. It is well known that higher recognition reliability can be achieved by increasing the number of parameters, but this also increases the need for equipment and slows down the evaluation process. The selection of measurement methods and sensors is no less important for the successful recognition of emotions. Contact measurement methods give the most reliable results, but their implementation is relatively complicated and may even be frightening for potential customers. The best solution in this case is non-contact measurement methods, that is, methods which do not require special preparation and allow measurements to be taken without the customer’s awareness.
Future research could focus on reacting to emotions while they are still developing, since sensing and evaluation can now be faster than a person’s own recognition of the emotion.
This research has addressed the various issues that emerge when affective and physiological states, as well as emotions, are determined by recognition methods and sensors and when such studies are later applied in practice. The manuscript presents the key results on the contribution of this research to the big picture. These results are summarized below:
  • Many studies around the world apply neuroscience and biometric methods to identify and analyze human valence, arousal, emotional and physiological states, and affective attitudes (AFFECT). An integrated review of these studies is, however, still missing.
  • In view of the fact that no reviews of AFFECT recognition, classification and analysis based on Plutchik’s wheel of emotions theory are available, our study has examined the full spectrum of thirty affective states and emotions defined in the theory.
  • We have demonstrated the identification and integration of contextual (pollution, weather conditions, economic, social, environmental, and cultural heritage) [342] and macro-environmental [568] data with data on AFFECT states.
  • The authors of the article have presented their own Real-time Vilnius Happiness Index (Figure 10a) and other systems and outputs to demonstrate several of the aforementioned new research areas in practice.
Information on diversity attitudes, socioeconomic status, demographic and cultural background, and context is missing in many studies. In this study, we have identified real-time context [347] data and have integrated them with AFFECT data. For example, the ROCK Video Neuroanalytics system and associated e-infrastructure were established as part of the H2020 ROCK project, in which passers-by were tracked at 10 locations across Vilnius [348]. One of the outputs was the real-time Vilnius Happiness Index (Figure 10 and https://api.vilnius.lt/happiness-index, accessed on 5 September 2022), and the project also involved a number of additional activities (https://Vilnius.lt/en/category/rock-project/, accessed on 5 September 2022) [625,626].
The analysis of the global gap in the area of affective biometric and brain sensors presented in this study and our aim of contributing to the current state of research in this area have led to the aforementioned research results.
Based on the evaluation of biometric systems performed in Section 7 and the conclusions presented in Section 8, future AFFECT biometrics and neuroscience development directions and guidelines become apparent. We performed the above analysis by extensively discussing biometric and neuroscience methods and domains throughout the article.
Additionally, Section 2 and Section 6 present a statistical and multiple criteria analysis across 169 nations; the outcomes demonstrate a connection between a nation’s success, its number of Web of Science articles published on AFFECT recognition, and their frequency of citation. This analysis shows which country success metrics significantly influence future AFFECT biometrics and neuroscience development.
Advancements in the development of biometric and neuroscience sensors and their applications are summarized in this review. Regardless of the encouraging progress and new applications, the lack of replicated work and the widely divergent methodological approaches suggest the need for further research. The interpretation of current research directions, the technical challenges of integrated neuroscience and affective biometric sensors, and recommendations for future works are discussed. The reviewed literature revealed a host of traditional and recent challenges in the field, which were examined in this article and are presented below.
Biometric research aims to provide computers with advanced intelligence so that they can automatically detect, capture, process, analyze, and identify digital biometric signals—in other words, so they can “see and hear”. In addition to being one of the basic functions of machine intelligence, this is also one of the most significant challenges that we face in theoretical and applied research [627].
There are still many challenging issues in terms of improving the accuracy, efficiency, and usability of EEG-based biometric systems. There are also problems concerning the design, development and deployment of new security-related BCI applications, such as personal authentication for mobile devices, augmented and virtual reality, headsets and the Internet [628]. Albuquerque et al. [628] have presented the recent advances of EEG-based biometrics and addressed the challenges in developing EEG-based biometry systems for various practical applications. They have also put forth new ideas and directions for future development, such as signal processing and machine learning techniques; data multimodal (EEG, EMG, ECG, and other biosignals) biometrics; pattern recognition techniques; preprocessing, feature extraction, recognition and matching; protocols, standards and interfaces; cancellable EEG biometrics; security and privacy; and information fusion for biometrics involving EEG data, virtual environment applications, stimuli sets and passive BCI technology.
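As a small, self-contained illustration of the feature-extraction step mentioned above (not of any specific system from [628]), the following Python sketch estimates EEG band power from a Welch power spectral density computed on a synthetic signal; the sampling rate and band boundaries are common but assumed values.

```python
# Band-power features from a Welch power spectral density of a synthetic "EEG" signal.
import numpy as np
from scipy.signal import welch

fs = 256  # assumed sampling rate in Hz
t = np.arange(0, 4, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.default_rng(4).normal(size=t.size)

freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)

def band_power(lo: float, hi: float) -> float:
    """Approximate the integral of the PSD over [lo, hi] Hz (rectangle rule)."""
    mask = (freqs >= lo) & (freqs <= hi)
    return float(psd[mask].sum() * (freqs[1] - freqs[0]))

bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
print({name: band_power(lo, hi) for name, (lo, hi) in bands.items()})
```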
Some of these challenges (accuracy, efficiency, usability, etc.) are analyzed in the article. Each of these features can be examined in more detail. For example, Fierrez et al. [629] analyzed five challenges in multiple classifiers in biometrics: design of robust algorithms from uncooperative users in unconstrained and varying scenarios; better understanding about the nature of biometrics; understanding and improving the security; integration with end applications; understanding and improving the usability. “Design of robust algorithms from uncooperative users in unconstrained and varying scenarios” is a challenge that has been a major focus of biometrics research for the past 50 years [2], but the performance level for many biometric applications in realistic scenarios is still not adequate [629].
Recently, new challenges in the field have been appearing; some of which are presented below as an example. Sivaraman [630] argues that in the age of AI and machine learning, cyberattacks are more powerful and are sometimes able to crack biometric systems. Additionally, these attacks will become more frequent. Multimodal biometrics are increasingly important, where a combination of biometrics is used for greater security. The pandemic has resulted in changes to the biometric algorithm of various modalities. Facial recognition algorithms have been improved to recognize people wearing masks and cosmetics. Updates like these may improve the accuracy of biometrics systems. Biometric devices will take web and cloud-based applications to the next level, as many organizations will continue to operate remotely [630].
Furthermore, a few problems remain unsolved, and additional research fields have emerged: biometric and neuroscience technologies lack privacy, are invasive, and people are reluctant to share their personal data or be identified; protection from hacking is lacking; accuracy is insufficient; the life cycle (brief, design, development, set-up, running, operation, etc.) is quite expensive; some human features cannot yet be read; customer satisfaction is not always guaranteed; and the recognition of human figure form and figure fragments, the examination of head vibrations, and the analysis of human electrical fields remain inefficient.

Author Contributions

Conceptualization and methodology, A.K.; investigation, A.K., A.A., I.U., R.K., V.L., A.B.-V., I.V. and L.K.; resources, writing—review, editing and visualization, A.K., A.A., I.U., R.K., V.L., A.B.-V., I.V. and L.K.; supervision, A.K. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported as part of the ‘Building information modeling-based tools and technologies toward fast and efficient RENovation of residential buildings—BIM4REN’ project, which received funding from the European Union’s Horizon 2020 research and innovation program under grant agreement No. 820773. This research was also supported via Project No. 2020-1-LT01-KA203-078100 “Minimizing the influence of coronavirus in a built environment” (MICROBE) from the European Union’s Erasmus+ program.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

All extracted data are included in the manuscript.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

AFFECT: arousal, valence, affective attitudes, emotional and physiological states
AI: Artificial Intelligence
AISs: artificial intelligence subsystems
AON: action observation network
BAEPs: brainstem evoked potentials
BCI: brain–computer interface
BIM4Ren: Building Information Modelling based tools and technologies toward fast and efficient RENovation of residential buildings
BNCI: brain/neuronal computer interaction
BR: binary relevance
BTL: below the line
CNCP model: Collective Neuromarketing Consumer Persuasion Model
CNN: convolutional neural network
DEAP: Dataset for Emotions Analysis using Physiological signals
DTI: diffusion tensor imaging
DWI: diffusion-weighted imaging
EARs: emotion association rules
ECG: electrocardiography
EDA: electrodermal activity
EEG: electroencephalography
EMG: electromyography
EMSs: engagement marketing subsystems
EOG: electrooculogram
ERP: event-related potential
ESSs: emotional salient segments
ET: eye tracking
FA: fractional anisotropy
FC: facial action coding
FDG: fluoro-D-glucose
FDG-PET/fMRI: simultaneous [18F]-fluorodeoxyglucose positron emission tomography and functional magnetic resonance imaging
fEMG: facial electromyography
fMRI: functional magnetic resonance imaging
fNIRS: functional near-infrared spectroscopy
fPET: functional positron emission tomography
GMM: Gaussian mixture models
GSR: galvanometer or galvanic skin response
HMI: human–machine interactions
HMM: hidden Markov model
HR: heart rate
HVAC: heating, ventilation, and air conditioning
ICCs: intra-class correlation coefficients
IoT: Internet of Things
IRT: implicit reaction time
IS: information systems
iTBS: intermittent theta burst transcranial magnetic stimulation
K-NN: K-nearest neighbor
LP: label powerset
LSTM: long short-term memory
MDS: multidimensional scaling
MEG: magnetoencephalography
MLP: multi-layer perceptron
MRI: magnetic resonance imaging
MT: mouse tracking
N5PSC: neuromarketing, neuroeconomics, neuromanagement, neuro-information systems, neuro-industrial engineering, products, services, call centers
NEV: net emotional value
NIRS: near infrared spectroscopy
NLP: natural language processing
NT: neurotransmitter
PET: positron emission tomography
PPG: photoplethysmogram
PSD: power spectral density
RAKEL: random k-label sets
RF: random forest
ROCK: Regeneration and Optimization of Cultural heritage in creative and Knowledge cities
RRA: respiratory rate assessment
RT: reaction times
rTMS: transcranial magnetic stimulation
SC: skin conductance
SD tests: SDS denaturation test
SEEVal: the service encounter emotional value
SST: steady-state topography
SVM: support vector machine
tDCS: transcranial direct-current stimulation
TMS: transcranial magnetic stimulation
UIT-VSMEC: standard Vietnamese social media emotion corpus
VAAQ: virtual agent’s acceptance questionnaire
VPA: voice pitch analysis
VR: virtual reality

References

  1. Rizzolatti, G.; Sinigaglia, C. The Mirror Mechanism: A Basic Principle of Brain Function. Nat. Rev. Neurosci. 2016, 17, 757–765. [Google Scholar] [CrossRef] [PubMed]
  2. Spunt, R.P.; Adolphs, R. The Neuroscience of Understanding the Emotions of Others. Neurosci. Lett. 2019, 693, 44–48. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  3. Berčík, J.; Neomániová, K.; Mravcová, A.; Gálová, J. Review of the Potential of Consumer Neuroscience for Aroma Marketing and Its Importance in Various Segments of Services. Appl. Sci. 2021, 11, 7636. [Google Scholar] [CrossRef]
  4. Li, L.; Gow, A.D.I.; Zhou, J. The Role of Positive Emotions in Education: A Neuroscience Perspective. Mind Brain Educ. 2020, 14, 220–234. [Google Scholar] [CrossRef]
  5. Cromwell, H.C.; Papadelis, C. Mapping the Brain Basis of Feelings, Emotions and Much More: A Special Issue Focused on ‘The Human Affectome’. Neurosci. Biobehav. Rev. 2022, 137, 104672. [Google Scholar] [CrossRef]
  6. Alexander, R.; Aragón, O.R.; Bookwala, J.; Cherbuin, N.; Gatt, J.M.; Kahrilas, I.J.; Kästner, N.; Lawrence, A.; Lowe, L.; Morrison, R.G.; et al. The Neuroscience of Positive Emotions and Affect: Implications for Cultivating Happiness and Wellbeing. Neurosci. Biobehav. Rev. 2021, 121, 220–249. [Google Scholar] [CrossRef]
  7. Vuust, P.; Heggli, O.A.; Friston, K.J.; Kringelbach, M.L. Music in the Brain. Nat. Rev. Neurosci. 2022, 23, 287–305. [Google Scholar] [CrossRef]
  8. Green, M.F.; Horan, W.P.; Lee, J. Social Cognition in Schizophrenia. Nat. Rev. Neurosci. 2015, 16, 620–631. [Google Scholar] [CrossRef]
  9. Bunge, S.A. How We Use Rules to Select Actions: A Review of Evidence from Cognitive Neuroscience. Cogn. Affect. Behav. Neurosci. 2004, 4, 564–579. [Google Scholar] [CrossRef] [Green Version]
  10. Lieberman, M.D. Social Cognitive Neuroscience: A Review of Core Processes. Annu. Rev. Psychol. 2007, 58, 259–289. [Google Scholar] [CrossRef]
  11. Sawyer, K. The Cognitive Neuroscience of Creativity: A Critical Review. Creat. Res. J. 2011, 23, 137–154. [Google Scholar] [CrossRef]
  12. Byrom, B.; McCarthy, M.; Schueler, P.; Muehlhausen, W. Brain Monitoring Devices in Neuroscience Clinical Research: The Potential of Remote Monitoring Using Sensors, Wearables, and Mobile Devices. Clin. Pharmacol. Ther. 2018, 104, 59–71. [Google Scholar] [CrossRef] [Green Version]
  13. Johnson, K.T.; Picard, R.W. Advancing Neuroscience through Wearable Devices. Neuron 2020, 108, 8–12. [Google Scholar] [CrossRef]
  14. Soroush, M.Z.; Maghooli, K.; Setarehdan, S.K.; Motie Nasrabadi, A. A Review on EEG Signals Based Emotion Recognition. Int. Clin. Neurosci. J. 2017, 4, 118–129. [Google Scholar] [CrossRef]
  15. Gui, Q.; Ruiz-Blondet, M.V.; Laszlo, S.; Jin, Z. A Survey on Brain Biometrics. ACM Comput. Surv. 2019, 51, 1–38. [Google Scholar] [CrossRef]
  16. Fairhurst, M.; Li, C.; Da Costa-Abreu, M. Predictive Biometrics: A Review and Analysis of Predicting Personal Characteristics from Biometric Data. IET Biom. 2017, 6, 369–378. [Google Scholar] [CrossRef]
  17. Zhong, Y.; Deng, Y. A Survey on Keystroke Dynamics Biometrics: Approaches, Advances, and Evaluations. In Gate to Computer Science and Research; Zhong, Y., Deng, Y., Eds.; Science Gate Publishing P.C.: Thrace, Greece, 2015; Volume 2, pp. 1–22. [Google Scholar] [CrossRef]
  18. Hernandez-de-Menendez, M.; Morales-Menendez, R.; Escobar, C.A.; Arinez, J. Biometric Applications in Education. Int. J. Interact. Des. Manuf. 2021, 15, 365–380. [Google Scholar] [CrossRef]
  19. Berčík, J.; Horská, E.; Gálová, J.; Margianti, E.S. Consumer neuroscience in practice: The impact of store atmosphere on consumer behavior. Period. Polytech. Soc. Manag. Sci. 2016, 24, 96–101. [Google Scholar] [CrossRef] [Green Version]
  20. Pisani, P.H.; Mhenni, A.; Giot, R.; Cherrier, E.; Poh, N.; Ferreira de Carvalho, A.C.P.d.L.; Rosenberger, C.; Amara, N.E.B. Adaptive Biometric Systems: Review and Perspectives. ACM Comput. Surv. 2020, 52, 1–38. [Google Scholar] [CrossRef]
  21. Xu, S.; Fang, J.; Hu, X.; Ngai, E.; Guo, Y.; Leung, V.C.M.; Cheng, J.; Hu, B. Emotion Recognition from Gait Analyses: Current Research and Future Directions. arXiv 2020, arXiv:2003.11461. [Google Scholar] [CrossRef]
  22. Merone, M.; Soda, P.; Sansone, M.; Sansone, C. ECG Databases for Biometric Systems: A Systematic Review. Expert Syst. Appl. 2017, 67, 189–202. [Google Scholar] [CrossRef]
  23. Curtin, A.; Tong, S.; Sun, J.; Wang, J.; Onaral, B.; Ayaz, H. A Systematic Review of Integrated Functional Near-Infrared Spectroscopy (FNIRS) and Transcranial Magnetic Stimulation (TMS) Studies. Front. Neurosci. 2019, 13, 84. [Google Scholar] [CrossRef] [Green Version]
  24. da Silva, F.L. EEG and MEG: Relevance to Neuroscience. Neuron 2013, 80, 1112–1128. [Google Scholar] [CrossRef] [Green Version]
  25. Khushaba, R.N.; Wise, C.; Kodagoda, S.; Louviere, J.; Kahn, B.E.; Townsend, C. Consumer Neuroscience: Assessing the Brain Response to Marketing Stimuli Using Electroencephalogram (EEG) and Eye Tracking. Expert Syst. Appl. 2013, 40, 3803–3812. [Google Scholar] [CrossRef]
  26. Krugliak, A.; Clarke, A. Towards Real-World Neuroscience Using Mobile EEG and Augmented Reality. Sci. Rep. 2022, 12, 2291. [Google Scholar] [CrossRef]
  27. Gramann, K.; Jung, T.-P.; Ferris, D.P.; Lin, C.-T.; Makeig, S. Toward a New Cognitive Neuroscience: Modeling Natural Brain Dynamics. Front. Hum. Neurosci. 2014, 8, 444. [Google Scholar] [CrossRef]
  28. An, B.W.; Heo, S.; Ji, S.; Bien, F.; Park, J.-U. Transparent and Flexible Fingerprint Sensor Array with Multiplexed Detection of Tactile Pressure and Skin Temperature. Nat. Commun. 2018, 9, 2458. [Google Scholar] [CrossRef] [Green Version]
  29. Gadaleta, M.; Radin, J.M.; Baca-Motes, K.; Ramos, E.; Kheterpal, V.; Topol, E.J.; Steinhubl, S.R.; Quer, G. Passive Detection of COVID-19 with Wearable Sensors and Explainable Machine Learning Algorithms. NPJ Digit. Med. 2021, 4, 166. [Google Scholar] [CrossRef]
  30. Hayano, J.; Tanabiki, T.; Iwata, S.; Abe, K.; Yuda, E. Estimation of Emotions by Wearable Biometric Sensors Under Daily Activities. In 2018 IEEE 7th Global Conference on Consumer Electronics (GCCE), Osaka, Tokyo, 18–21 October 2022; IEEE: Nara, Japan, 2018; pp. 240–241. [Google Scholar] [CrossRef]
  31. Oostdijk, M.; van Velzen, A.; van Dijk, J.; Terpstra, A. State-of-the-Art in Biometrics for Multi-Factor Authentication in a Federative Context. Identity 2016, 14, 15. [Google Scholar]
  32. Salman, A.S.; Salman, A.S.; Salman, O.S. Using Behavioral Biometrics of Fingerprint Authentication to Investigate Physical and Emotional User States. In Proceedings of the Future Technologies Conference (FTC) 2021, Volume 2; Arai, K., Ed.; Lecture Notes in Networks and Systems. Springer International Publishing: Cham, Switzerland, 2022; Volume 359, pp. 240–256. [Google Scholar] [CrossRef]
  33. Zhang, Y.-J. Biometric Recognition. In Handbook of Image Engineering; Springer: Singapore, 2021; pp. 1231–1256. [Google Scholar]
  34. Maffei, A.; Angrilli, A. E-MOVIE—Experimental MOVies for Induction of Emotions in Neuroscience: An Innovative Film Database with Normative Data and Sex Differences. PLoS ONE 2019, 14, e0223124. [Google Scholar] [CrossRef] [Green Version]
  35. Apicella, A.; Arpaia, P.; Mastrati, G.; Moccaldi, N. EEG-Based Detection of Emotional Valence towards a Reproducible Measurement of Emotions. Sci. Rep. 2021, 11, 21615. [Google Scholar] [CrossRef] [PubMed]
  36. Tost, H.; Reichert, M.; Braun, U.; Reinhard, I.; Peters, R.; Lautenbach, S.; Hoell, A.; Schwarz, E.; Ebner-Priemer, U.; Zipf, A.; et al. Neural Correlates of Individual Differences in Affective Benefit of Real-Life Urban Green Space Exposure. Nat. Neurosci. 2019, 22, 1389–1393. [Google Scholar] [CrossRef] [PubMed]
  37. Mashrur, F.R.; Rahman, K.M.; Miya, M.T.I.; Vaidyanathan, R.; Anwar, S.F.; Sarker, F.; Mamun, K.A. An Intelligent Neuromarketing System for Predicting Consumers’ Future Choice from Electroencephalography Signals. Physiol. Behav. 2022, 253, 113847. [Google Scholar] [CrossRef] [PubMed]
  38. Asadzadeh, S.; Yousefi Rezaii, T.; Beheshti, S.; Meshgini, S. Accurate Emotion Recognition Using Bayesian Model Based EEG Sources as Dynamic Graph Convolutional Neural Network Nodes. Sci. Rep. 2022, 12, 10282. [Google Scholar] [CrossRef] [PubMed]
  39. Čeko, M.; Kragel, P.A.; Woo, C.-W.; López-Solà, M.; Wager, T.D. Common and Stimulus-Type-Specific Brain Representations of Negative Affect. Nat. Neurosci. 2022, 25, 760–770. [Google Scholar] [CrossRef]
  40. Prete, G.; Croce, P.; Zappasodi, F.; Tommasi, L.; Capotosto, P. Exploring Brain Activity for Positive and Negative Emotions by Means of EEG Microstates. Sci. Rep. 2022, 12, 3404. [Google Scholar] [CrossRef]
  41. Sitaram, R.; Ros, T.; Stoeckel, L.; Haller, S.; Scharnowski, F.; Lewis-Peacock, J.; Weiskopf, N.; Blefari, M.L.; Rana, M.; Oblak, E.; et al. Closed-Loop Brain Training: The Science of Neurofeedback. Nat. Rev. Neurosci. 2017, 18, 86–100. [Google Scholar] [CrossRef] [Green Version]
  42. Del Negro, C.A.; Funk, G.D.; Feldman, J.L. Breathing Matters. Nat. Rev. Neurosci. 2018, 19, 351–367. [Google Scholar] [CrossRef]
  43. Pugh, Z.H.; Choo, S.; Leshin, J.C.; Lindquist, K.A.; Nam, C.S. Emotion Depends on Context, Culture and Their Interaction: Evidence from Effective Connectivity. Soc. Cogn. Affect. Neurosci. 2022, 17, 206–217. [Google Scholar] [CrossRef]
  44. Barrett, L.F. How Emotions Are Made: The Secret Life of the Brain; Houghton Mifflin Harcourt: Boston, MA, USA, 2017. [Google Scholar]
  45. Barrett, L.F. The Theory of Constructed Emotion: An Active Inference Account of Interoception and Categorization. Soc. Cogn. Affect. Neurosci. 2017, 12, 1–23. [Google Scholar] [CrossRef]
  46. Basiri, M.E.; Nemati, S.; Abdar, M.; Cambria, E.; Acharya, U.R. ABCDM: An Attention-Based Bidirectional CNN-RNN Deep Model for Sentiment Analysis. Future Gener. Comput. Syst. 2021, 115, 279–294. [Google Scholar] [CrossRef]
  47. Parry, G.; Vuong, Q. Deep Affect: Using Objects, Scenes and Facial Expressions in a Deep Neural Network to Predict Arousal and Valence Values of Images. arXiv preprint 2021. [Google Scholar] [CrossRef]
  48. Gendron, B.; Kouremenou, E.-S.; Rusu, C. Emotional Capital Development, Positive Psychology and Mindful Teaching: Which Links? Int. J. Emot. Educ. 2016, 8, 63–74. [Google Scholar]
  49. Houge Mackenzie, S.; Brymer, E. Conceptualizing Adventurous Nature Sport: A Positive Psychology Perspective. Ann. Leis. Res. 2020, 23, 79–91. [Google Scholar] [CrossRef] [Green Version]
  50. Li, C. A Positive Psychology Perspective on Chinese EFL Students’ Trait Emotional Intelligence, Foreign Language Enjoyment and EFL Learning Achievement. J. Multiling. Multicult. Dev. 2020, 41, 246–263. [Google Scholar] [CrossRef]
  51. Bower, I.; Tucker, R.; Enticott, P.G. Impact of Built Environment Design on Emotion Measured via Neurophysiological Correlates and Subjective Indicators: A Systematic Review. J. Environ. Psychol. 2019, 66, 101344. [Google Scholar] [CrossRef]
  52. Cassidy, T. Environmental Psychology: Behaviour and Experience in Context; Contemporary Psychology Series; Psychology Press: Hove, UK, 1997. [Google Scholar]
  53. Cho, H.; Li, C.; Wu, Y. Understanding Sport Event Volunteers’ Continuance Intention: An Environmental Psychology Approach. Sport Manag. Rev. 2020, 23, 615–625. [Google Scholar] [CrossRef]
  54. Lin, S.; Döngül, E.S.; Uygun, S.V.; Öztürk, M.B.; Huy, D.T.N.; Tuan, P.V. Exploring the Relationship between Abusive Management, Self-Efficacy and Organizational Performance in the Context of Human–Machine Interaction Technology and Artificial Intelligence with the Effect of Ergonomics. Sustainability 2022, 14, 1949. [Google Scholar] [CrossRef]
  55. Privitera, M.; Ferrari, K.D.; von Ziegler, L.M.; Sturman, O.; Duss, S.N.; Floriou-Servou, A.; Germain, P.-L.; Vermeiren, Y.; Wyss, M.T.; de Deyn, P.P.; et al. A Complete Pupillometry Toolbox for Real-Time Monitoring of Locus Coeruleus Activity in Rodents. Nat. Protoc. 2020, 15, 2301–2320. [Google Scholar] [CrossRef]
  56. Rebelo, F.; Noriega, P.; Vilar, E.; Filgueiras, E. Ergonomics and Human Factors Research Challenges: The ErgoUX Lab Case Study. In Advances in Ergonomics in Design; Rebelo, F., Ed.; Lecture Notes in Networks and Systems; Springer International Publishing: Cham, Switzerland, 2021; Volume 261, pp. 912–922. [Google Scholar] [CrossRef]
  57. Khan, F. Making Savings Count. Nat. Energy 2018, 3, 354. [Google Scholar] [CrossRef]
  58. Zhang, B.; Kang, J. Effect of Environmental Contexts Pertaining to Different Sound Sources on the Mood States. Build. Environ. 2022, 207, 108456. [Google Scholar] [CrossRef]
  59. Zhu, B.-W.; Xiao, Y.H.; Zheng, W.-Q.; Xiong, L.; He, X.Y.; Zheng, J.-Y.; Chuang, Y.-C. A Hybrid Multiple-Attribute Decision-Making Model for Evaluating the Esthetic Expression of Environmental Design Schemes. SAGE Open 2022, 12, 215824402210872. [Google Scholar] [CrossRef]
  60. Silva, P.L.; Kiefer, A.; Riley, M.A.; Chemero, A. Trading Perception and Action for Complex Cognition: Application of Theoretical Principles from Ecological Psychology to the Design of Interventions for Skill Learning. In Handbook of Embodied Cognition and Sport Psychology; MIT Press: Boston, MA, USA, 2019; pp. 47–74. [Google Scholar]
  61. Szokolszky, A. Perceiving Metaphors: An Approach from Developmental Ecological Psychology. Metaphor Symb. 2019, 34, 17–32. [Google Scholar] [CrossRef]
  62. Van den Berg, P.; Larosi, H.; Maussen, S.; Arentze, T. Sense of Place, Shopping Area Evaluation, and Shopping Behaviour. Geogr. Res. 2021, 59, 584–598. [Google Scholar] [CrossRef]
  63. Argent, N. Behavioral Geography. In International Encyclopedia of Geography: People, the Earth, Environment and Technology; Richardson, D., Castree, N., Goodchild, M.F., Kobayashi, A., Liu, W., Marston, R.A., Eds.; John Wiley & Sons, Ltd: Oxford, UK, 2017; pp. 1–11. [Google Scholar] [CrossRef]
  64. Schwarz, N.; Dressler, G.; Frank, K.; Jager, W.; Janssen, M.; Müller, B.; Schlüter, M.; Wijermans, N.; Groeneveld, J. Formalising Theories of Human Decision-Making for Agent-Based Modelling of Social-Ecological Systems: Practical Lessons Learned and Ways Forward. SESMO 2020, 2, 16340. [Google Scholar] [CrossRef]
  65. Plutchik, R. The Emotions, Rev. ed.; University Press of America: Lanham, MD, USA, 1991. [Google Scholar]
  66. Kaklauskas, A.; Milevicius, V.; Kaklauskiene, L. Effects of Country Success on COVID-19 Cumulative Cases and Excess Deaths in 169 Countries. Ecol. Indic. 2022, 137, 108703. [Google Scholar] [CrossRef]
  67. Kaklauskas, A. Degree of project utility and investment value assessments. Int. J. Comput. Commun. Control. 2016, 11, 666–683. [Google Scholar] [CrossRef]
  68. Kaklauskas, A.; Herrera-Viedma, E.; Echenique, V.; Zavadskas, E.K.; Ubarte, I.; Mostert, A.; Podvezko, V.; Binkyte, A.; Podviezko, A. Multiple Criteria Analysis of Environmental Sustainability and Quality of Life in Post-Soviet States. Ecol. Indic. 2018, 89, 781–807. [Google Scholar] [CrossRef]
  69. Kaklauskas, A.; Dias, W.P.S.; Binkyte-Veliene, A.; Abraham, A.; Ubarte, I.; Randil, O.P.C.; Siriwardana, C.S.A.; Lill, I.; Milevicius, V.; Podviezko, A.; et al. Are Environmental Sustainability and Happiness the Keys to Prosperity in Asian Nations? Ecol. Indic. 2020, 119, 106562. [Google Scholar] [CrossRef]
  70. Kaklauskas, A.; Kaklauskiene, L. Analysis of the impact of success on three dimensions of sustainability in 173 countries. Sci. Rep. 2022, 12, 14719. [Google Scholar] [CrossRef]
  71. Barrett, L.F. Solving the Emotion Paradox: Categorization and the Experience of Emotion. Pers. Soc. Psychol. Rev. 2006, 10, 20–46. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  72. Puce, A.; Latinus, M.; Rossi, A.; da Silva, E.; Parada, F.; Love, S.; Ashourvan, A.; Jayaraman, S. Neural Bases for Social Attention in Healthy Humans. In The Many Faces of Social Attention; Puce, A., Bertenthal, B.I., Eds.; Springer International Publishing: Cham, Switzerland, 2015; pp. 93–127. [Google Scholar] [CrossRef]
  73. Shablack, H.; Becker, M.; Lindquist, K.A. How Do Children Learn Novel Emotion Words? A Study of Emotion Concept Acquisition in Preschoolers. J. Exp. Psychol. Gen. 2020, 149, 1537–1553. [Google Scholar] [CrossRef] [PubMed]
  74. Izard, C.E. Basic Emotions, Natural Kinds, Emotion Schemas, and a New Paradigm. Perspect. Psychol. Sci. 2007, 2, 260–280. [Google Scholar] [CrossRef] [PubMed]
  75. Briesemeister, B.B.; Kuchinke, L.; Jacobs, A.M. Discrete Emotion Effects on Lexical Decision Response Times. PLoS ONE 2011, 6, e23743. [Google Scholar] [CrossRef] [PubMed]
  76. Ekman, P. An Argument for Basic Emotions. Cogn. Emot. 1992, 6, 169–200. [Google Scholar] [CrossRef]
  77. Ekman, P. Facial Expressions. In Handbook of Cognition and Emotion; Dalgleish, T., Power, M.J., Eds.; John Wiley & Sons, Ltd: Chichester, UK, 1999; pp. 301–320. [Google Scholar] [CrossRef]
  78. Colombetti, G. From Affect Programs to Dynamical Discrete Emotions. Philos. Psychol. 2009, 22, 407–425. [Google Scholar] [CrossRef] [Green Version]
  79. Fox, E. Emotion Science: Cognitive and Neuroscientific Approaches to Understanding Human Emotions; Palgrave Macmillan: Basingstoke, UK; New York, NY, USA, 2008. [Google Scholar]
  80. Russell, J.A.; Barrett, L.F. Core Affect, Prototypical Emotional Episodes, and Other Things Called Emotion: Dissecting the Elephant. J. Personal. Soc. Psychol. 1999, 76, 805–819. [Google Scholar] [CrossRef]
  81. Cross Francis, D.I.; Hong, J.; Liu, J.; Eker, A.; Lloyd, K.; Bharaj, P.K.; Jeon, M. The Dominance of Blended Emotions: A Qualitative Study of Elementary Teachers’ Emotions Related to Mathematics Teaching. Front. Psychol. 2020, 11, 1865. [Google Scholar] [CrossRef]
  82. Hakak, N.M.; Mohd, M.; Kirmani, M.; Mohd, M. Emotion Analysis: A Survey. In 2017 International Conference on Computer, Communications and Electronics (Comptelix), Jaipur, India, 1–2 July 2017; IEEE: Jaipur, India, 2017; pp. 397–402. [Google Scholar] [CrossRef]
  83. Posner, J.; Russell, J.A.; Peterson, B.S. The Circumplex Model of Affect: An Integrative Approach to Affective Neuroscience, Cognitive Development, and Psychopathology. Develop. Psychopathol. 2005, 17, 715–734. [Google Scholar] [CrossRef]
  84. Eerola, T.; Vuoskoski, J.K. A Comparison of the Discrete and Dimensional Models of Emotion in Music. Psychol. Music 2011, 39, 18–49. [Google Scholar] [CrossRef] [Green Version]
  85. Dzedzickis, A.; Kaklauskas, A.; Bucinskas, V. Human Emotion Recognition: Review of Sensors and Methods. Sensors 2020, 20, 592. [Google Scholar] [CrossRef] [Green Version]
  86. Bradley, M.M.; Greenwald, M.K.; Petry, M.C.; Lang, P.J. Remembering Pictures: Pleasure and Arousal in Memory. J. Exp. Psychol. Learn. Mem. Cogn. 1992, 18, 379–390. [Google Scholar] [CrossRef]
  87. Rubin, D.C.; Talarico, J.M. A Comparison of Dimensional Models of Emotion: Evidence from Emotions, Prototypical Events, Autobiographical Memories, and Words. Memory 2009, 17, 802–808. [Google Scholar] [CrossRef] [Green Version]
  88. Watson, D.; Tellegen, A. Toward a Consensual Structure of Mood. Psychol. Bull. 1985, 98, 219–235. [Google Scholar] [CrossRef]
  89. Karbauskaitė, R.; Sakalauskas, L.; Dzemyda, G. Kriging Predictor for Facial Emotion Recognition Using Numerical Proximities of Human Emotions. Informatica 2020, 31, 249–275. [Google Scholar] [CrossRef]
  90. Mehrabian, A. Framework for a Comprehensive Description and Measurement of Emotional States. Genet. Soc. Gen. Psychol. Monogr. 1995, 121, 339–361. [Google Scholar]
  91. Mehrabian, A. Correlations of the PAD Emotion Scales with Self-Reported Satisfaction in Marriage and Work. Genet. Soc. Gen. Psychol. Monogr. 1998, 124, 311–334. [Google Scholar]
  92. Detandt, S.; Leys, C.; Bazan, A. A French Translation of the Pleasure Arousal Dominance (PAD) Semantic Differential Scale for the Measure of Affect and Drive. Psychol. Belg. 2017, 57, 17. [Google Scholar] [CrossRef]
  93. Kaklauskas, A.; Bucinskas, V.; Dzedzickis, A.; Ubarte, I. Method for Controlling a Customized Microclimate in a Building and Realization System Thereof. European Patent Application. EP 4 020 134 A1, 7 February 2021. [Google Scholar]
94. Nor, N.M.; Wahab, A.; Majid, H.; Kamaruddin, N. Pre-Post Accident Analysis Relates to Pre-Cursor Emotion for Driver Behavior Understanding. In Proceedings of the 11th WSEAS International Conference on Applied Computer Science, Rovaniemi, Finland, 18–20 April 2012; World Scientific and Engineering Academy and Society (WSEAS): Stevens Point, WI, USA, 2012; pp. 152–157. [Google Scholar]
  95. Kolmogorova, A.; Kalinin, A.; Malikova, A. Non-Discrete Sentiment Dataset Annotation: Case Study for Lövheim Cube Emotional Model. In Digital Transformation and Global Society; Alexandrov, D.A., Boukhanovsky, A.V., Chugunov, A.V., Kabanov, Y., Koltsova, O., Musabirov, I., Eds.; Communications in Computer and Information Science; Springer International Publishing: Cham, Switzerland, 2020; Volume 1242, pp. 154–164. [Google Scholar] [CrossRef]
  96. Lövheim, H. A New Three-Dimensional Model for Emotions and Monoamine Neurotransmitters. Med. Hypotheses 2012, 78, 341–348. [Google Scholar] [CrossRef]
  97. Mohsin, M.A.; Beltiukov, A. Summarizing Emotions from Text Using Plutchik’s Wheel of Emotions. In Proceedings of the 7th Scientific Conference on Information Technologies for Intelligent Decision Making Support (ITIDS 2019); Atlantis Press: Ufa, Russia, 2019. [Google Scholar] [CrossRef] [Green Version]
  98. Donaldson, M. A Plutchik’s Wheel of Emotions—2017 Update. 2018. Available online: https://www.uvm.edu/~mjk/013%20Intro%20to%20Wildlife%20Tracking/Plutchik’s%20Wheel%20of%20Emotions%20-%202017%20Update%20_%20Six%20Seconds.pdf (accessed on 5 September 2022).
  99. Mulder, P. Robert Plutchik’s Wheel of Emotions. 2018. Available online: https://www.toolshero.com/psychology/wheel-of-emotions-plutchik/ (accessed on 5 September 2022).
100. Kołakowska, A.; Landowska, A.; Szwoch, M.; Szwoch, W.; Wróbel, M.R. Modeling Emotions for Affect-Aware Applications. In Information Systems Development and Applications; Faculty of Management, University of Gdańsk: Gdańsk, Poland, 2015; pp. 55–69. [Google Scholar]
  101. Suttles, J.; Ide, N. Distant Supervision for Emotion Classification with Discrete Binary Values. In Computational Linguistics and Intelligent Text Processing; Gelbukh, A., Hutchison, D., Kanade, T., Kittler, J., Kleinberg, J.M., Mattern, F., Mitchell, J.C., Naor, M., Eds.; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2013; Volume 7817, pp. 121–136. [Google Scholar] [CrossRef] [Green Version]
102. Six Seconds, The Emotional Intelligence Network. Plutchik’s Wheel of Emotions: Exploring the Emotion Wheel. Available online: https://www.6seconds.org/2022/03/13/plutchik-wheel-emotions/ (accessed on 5 September 2022).
  103. Karnilowicz, H.R. The Emotion Wheel: Purpose, Definition, and Uses. Available online: https://www.berkeleywellbeing.com/emotion-wheel.html (accessed on 17 August 2022).
  104. Cambria, E.; Livingstone, A.; Hussain, A. The Hourglass of Emotions. In Cognitive Behavioural Systems; Esposito, A., Esposito, A.M., Vinciarelli, A., Hoffmann, R., Müller, V.C., Hutchison, D., Kanade, T., Eds.; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2012; Volume 7403, pp. 144–157. [Google Scholar] [CrossRef]
  105. Plutchik, R.; Kellerman, H. Theories of Emotion; Academic Press: Cambridge, MA, USA, 2013. [Google Scholar]
  106. Kušen, E.; Strembeck, M.; Cascavilla, G.; Conti, M. On the Influence of Emotional Valence Shifts on the Spread of Information in Social Networks. In Proceedings of the 2017 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining 2017, Sydney, Australia, 31 July–3 August 2017; ACM: Sydney, Australia, 2017; pp. 321–324. [Google Scholar] [CrossRef]
  107. Bassett, D.S.; Sporns, O. Network Neuroscience. Nat. Neurosci. 2017, 20, 353–364. [Google Scholar] [CrossRef] [Green Version]
  108. Deion, A. 8 Top Trends of Future Sensors. 2021. Available online: https://community.hackernoon.com/t/8-top-trends-of-future-sensors/57483 (accessed on 5 September 2022).
109. Panetta, K. Gartner Top Strategic Technology Trends for 2021. Gartner, 2020. Available online: https://www.gartner.com/smarterwithgartner/gartner-top-strategic-technology-trends-for-2021 (accessed on 17 August 2022).
  110. Kobus, H. Future Sensor Technology: 21 Expected Trends. Available online: https://www.sentech.nl/en/rd-engineer/21-sensor-technology-future-trends/ (accessed on 5 September 2022).
  111. Sebastian, V. Neuromarketing and Evaluation of Cognitive and Emotional Responses of Consumers to Marketing Stimuli. Procedia-Soc. Behav. Sci. 2014, 127, 753–757. [Google Scholar] [CrossRef]
  112. Sawe, N.; Chawla, K. Environmental Neuroeconomics: How Neuroscience Can Inform Our Understanding of Human Responses to Climate Change. Curr. Opin. Behav. Sci. 2021, 42, 147–154. [Google Scholar] [CrossRef]
  113. Serra, D. Neuroeconomics: Reliable, Scientifically Legitimate and Useful Knowledge for Economists? 2020. Available online: https://hal.inrae.fr/hal-02956441 (accessed on 5 September 2022).
  114. Braeutigam, S. Neuroeconomics—From Neural Systems to Economic Behaviour. Brain Res. Bull. 2005, 67, 355–360. [Google Scholar] [CrossRef]
  115. Kenning, P.; Plassmann, H. NeuroEconomics: An Overview from an Economic Perspective. Brain Res. Bull. 2005, 67, 343–354. [Google Scholar] [CrossRef]
  116. Wirdayanti, Y.N.; Ghoni, M.A. Neuromanagement Under the Light of Maqasid Sharia. Al Tijarah 2020, 5, 63–71. [Google Scholar] [CrossRef]
  117. Teacu Parincu, A.M.; Capatina, A.; Varon, D.J.; Bennet, P.F.; Recuerda, A.M. Neuromanagement: The Scientific Approach to Contemporary Management. Proc. Int. Conf. Bus. Excell. 2020, 14, 1046–1056. [Google Scholar] [CrossRef]
118. Arce, A.L.; Cordero, J.M.B.; Mejía, E.T.; González, B.P. Tools of Neuromanagement, to Strengthen the Leadership Competencies of Executives in the Logistics Areas of the Auto Parts Industry. Strategy Technol. Soc. 2020, 10, 36–63. [Google Scholar]
  119. Michalczyk, S.; Jung, D.; Nadj, M.; Knierim, M.T.; Rissler, R. BrownieR: The R-Package for Neuro Information Systems Research. In Information Systems and Neuroscience; Davis, F.D., Riedl, R., vom Brocke, J., Léger, P.-M., Randolph, A.B., Eds.; Lecture Notes in Information Systems and Organisation; Springer International Publishing: Cham, Switzerland, 2019; Volume 29, pp. 101–109. [Google Scholar] [CrossRef]
  120. Riedl, R.; Léger, P. Neuro-Information-Systems (NeuroIS). In Association for Information Systems; Springer: Berlin/Heidelberg, Germany, 2016. [Google Scholar] [CrossRef]
  121. Ma, Q.; Ji, W.; Fu, H.; Bian, J. Neuro-Industrial Engineering: The New Stage of Modern IE—From the Human-Oriented Perspective. Int. J. Serv. Oper. Inform. 2012, 7, 150–166. [Google Scholar] [CrossRef]
  122. Rymer, W.Z. Neural Engineering. Encyclopedia Britannica. 2018. Available online: https://www.britannica.com/science/neural-engineering (accessed on 5 September 2022).
  123. Hodson, H. Hang on Your Every Word. New Sci. 2014, 222, 20. [Google Scholar]
  124. Tzirakis, P.; Zhang, J.; Schuller, B.W. End-to-End Speech Emotion Recognition Using Deep Neural Networks. In 2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Calgary, AB, Canada, 15–20 April 2018; IEEE: Calgary, AB, Canada, 2018; pp. 5089–5093. [Google Scholar] [CrossRef]
  125. Parkin, B.L.; Ekhtiari, H.; Walsh, V.F. Non-Invasive Human Brain Stimulation in Cognitive Neuroscience: A Primer. Neuron 2015, 87, 932–945. [Google Scholar] [CrossRef]
  126. Annavarapu, R.N.; Kathi, S.; Vadla, V.K. Non-Invasive Imaging Modalities to Study Neurodegenerative Diseases of Aging Brain. J. Chem. Neuroanat. 2019, 95, 54–69. [Google Scholar] [CrossRef] [PubMed]
  127. Bergmann, T.O.; Karabanov, A.; Hartwigsen, G.; Thielscher, A.; Siebner, H.R. Combining Non-Invasive Transcranial Brain Stimulation with Neuroimaging and Electrophysiology: Current Approaches and Future Perspectives. NeuroImage 2016, 140, 4–19. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  128. Cao, M.; Galvis, D.; Vogrin, S.J.; Woods, W.P.; Vogrin, S.; Wang, F.; Woldman, W.; Terry, J.R.; Peterson, A.; Plummer, C.; et al. Virtual Intracranial EEG Signals Reconstructed from MEG with Potential for Epilepsy Surgery. Nat. Commun. 2022, 13, 994. [Google Scholar] [CrossRef] [PubMed]
  129. Currà, A.; Gasbarrone, R.; Cardillo, A.; Trompetto, C.; Fattapposta, F.; Pierelli, F.; Missori, P.; Bonifazi, G.; Serranti, S. Near-Infrared Spectroscopy as a Tool for in Vivo Analysis of Human Muscles. Sci. Rep. 2019, 9, 8623. [Google Scholar] [CrossRef] [Green Version]
  130. De Camp, N.V.; Kalinka, G.; Bergeler, J. Light-Cured Polymer Electrodes for Non-Invasive EEG Recordings. Sci. Rep. 2018, 8, 14041. [Google Scholar] [CrossRef] [Green Version]
  131. Etchell, A.C.; Civier, O.; Ballard, K.J.; Sowman, P.F. A Systematic Literature Review of Neuroimaging Research on Developmental Stuttering between 1995 and 2016. J. Fluen. Disord. 2018, 55, 6–45. [Google Scholar] [CrossRef]
  132. Peters, J.C.; Reithler, J.; de Graaf, T.A.; Schuhmann, T.; Goebel, R.; Sack, A.T. Concurrent Human TMS-EEG-FMRI Enables Monitoring of Oscillatory Brain State-Dependent Gating of Cortico-Subcortical Network Activity. Commun. Biol. 2020, 3, 40. [Google Scholar] [CrossRef] [Green Version]
  133. Shibasaki, H. Human Brain Mapping: Hemodynamic Response and Electrophysiology. Clin. Neurophysiol. 2008, 119, 731–743. [Google Scholar] [CrossRef]
  134. Silberstein, R.B.; Nield, G.E. Brain Activity Correlates of Consumer Brand Choice Shift Associated with Television Advertising. Int. J. Advert. 2008, 27, 359–380. [Google Scholar] [CrossRef]
  135. Uludag, U.; Pankanti, S.; Prabhakar, S.; Jain, A.K. Biometric Cryptosystems: Issues and Challenges. Proc. IEEE 2004, 92, 948–960. [Google Scholar] [CrossRef]
  136. Presby, D.M.; Capodilupo, E.R. Biometrics from a Wearable Device Reveal Temporary Effects of COVID-19 Vaccines on Cardiovascular, Respiratory, and Sleep Physiology. J. Appl. Physiol. 2022, 132, 448–458. [Google Scholar] [CrossRef]
  137. Stephen, M.J.; Reddy, P. Implementation of Easy Fingerprint Image Authentication with Traditional Euclidean and Singular Value Decomposition Algorithms. Int. J. Adv. Soft Comput. Its Appl. 2011, 3, 1–19. [Google Scholar]
  138. Banirostam, H.; Shamsinezhad, E.; Banirostam, T. Functional Control of Users by Biometric Behavior Features in Cloud Computing. In 2013 4th International Conference on Intelligent Systems, Modelling and Simulation, Bangkok, Thailand, 29–30 January 2013; IEEE: Bangkok, Thailand, 2013; pp. 94–98. [Google Scholar] [CrossRef]
  139. Yang, W.; Wang, S.; Hu, J.; Zheng, G.; Chaudhry, J.; Adi, E.; Valli, C. Securing Mobile Healthcare Data: A Smart Card Based Cancelable Finger-Vein Bio-Cryptosystem. IEEE Access 2018, 6, 36939–36947. [Google Scholar] [CrossRef]
  140. Kodituwakku, S.R. Biometric Authentication: A Review. Int. J. Trend Res. Dev. 2015, 2, 113–123. [Google Scholar]
  141. Jain, A.; Hong, L.; Pankanti, S. Biometric Identification. Commun. ACM 2000, 43, 90–98. [Google Scholar] [CrossRef]
  142. Choudhary, S.K.; Naik, A.K. Multimodal Biometric Authentication with Secured Templates—A Review. In 2019 3rd International Conference on Trends in Electronics and Informatics (ICOEI), Tirunelveli, India, 23–25 April 2019; IEEE: Tirunelveli, India, 2019; pp. 1062–1069. [Google Scholar] [CrossRef]
  143. Kim, J.S.; Pan, S.B. A Study on EMG-Based Biometrics. Internet Serv. Inf. Secur. (JISIS) 2017, 7, 19–31. [Google Scholar]
  144. Maiorana, E. Deep Learning for EEG-Based Biometric Recognition. Neurocomputing 2020, 410, 374–386. [Google Scholar] [CrossRef]
  145. Revett, K. Cognitive Biometrics: A Novel Approach to Person Authentication. IJCB 2012, 1, 1–9. [Google Scholar] [CrossRef]
  146. Prasse, P.; Jäger, L.A.; Makowski, S.; Feuerpfeil, M.; Scheffer, T. On the Relationship between Eye Tracking Resolution and Performance of Oculomotoric Biometric Identification. Procedia Comput. Sci. 2020, 176, 2088–2097. [Google Scholar] [CrossRef]
  147. Cho, Y. Rethinking Eye-Blink: Assessing Task Difficulty through Physiological Representation of Spontaneous Blinking. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, Yokohama, Japan, 8–13 May 2021; ACM: Yokohama, Japan, 2021; pp. 1–12. [Google Scholar] [CrossRef]
148. Abdulrahman, S.A.; Alhayani, B. A Comprehensive Survey on the Biometric Systems Based on Physiological and Behavioural Characteristics. Mater. Today Proc. 2021, in press, corrected proof. [Google Scholar] [CrossRef]
  149. Allado, E.; Poussel, M.; Moussu, A.; Saunier, V.; Bernard, Y.; Albuisson, E.; Chenuel, B. Innovative Measurement of Routine Physiological Variables (Heart Rate, Respiratory Rate and Oxygen Saturation) Using a Remote Photoplethysmography Imaging System: A Prospective Comparative Trial Protocol. BMJ Open 2021, 11, e047896. [Google Scholar] [CrossRef] [PubMed]
  150. Dargan, S.; Kumar, M. A Comprehensive Survey on the Biometric Recognition Systems Based on Physiological and Behavioral Modalities. Expert Syst. Appl. 2020, 143, 113114. [Google Scholar] [CrossRef]
  151. Mordini, E.; Tzovaras, D.; Ashton, H. Introduction. In Second Generation Biometrics: The Ethical, Legal and Social Context; Mordini, E., Tzovaras, D., Eds.; The International Library of Ethics, Law and Technology; Springer: Dordrecht, The Netherlands, 2012; Volume 11, pp. 1–19. [Google Scholar] [CrossRef]
  152. Fuster, G.G. Artificial Intelligence and Law Enforcement: Impact on Fundamental Rights (European Parliament 2020). 2020. Available online: http://www.europarl.europa.eu/supporting-analyses (accessed on 5 September 2022).
  153. Ghilardi, G.; Keller, F. Epistemological Foundation of Biometrics. In Second Generation Biometrics: The Ethical, Legal and Social Context; Mordini, E., Tzovaras, D., Eds.; The International Library of Ethics, Law and Technology; Springer: Dordrecht, The Netherlands, 2012; Volume 11, pp. 23–47. [Google Scholar] [CrossRef]
  154. Riera, A.; Dunne, S.; Cester, I.; Ruffini, G. Electrophysiological biometrics: Opportunities and risks. In Second Generation Biometrics: The Ethical, Legal and Social Context; Mordini, E., Tzovaras, D., Eds.; The International Library of Ethics, Law and Technology; Springer: Dordrecht, The Netherlands, 2012; Volume 11, pp. 149–176. [Google Scholar] [CrossRef]
155. Smith, M.; Mann, M.; Urbas, G. Biometrics, Crime and Security; Law, Science and Society; Routledge: New York, NY, USA, 2018. [Google Scholar]
  156. Simó, F.Z. Then and Now. Profuturo 2019, 9, 78–90. [Google Scholar] [CrossRef]
157. U.S. Department of Homeland Security. Future Attribute Screening Technology. 2014. Available online: https://www.dhs.gov/sites/default/files/publications/Future%20Attribute%20Screening%20Technology-FAST.pdf (accessed on 5 September 2022).
  158. Alhalaseh, R.; Alasasfeh, S. Machine-Learning-Based Emotion Recognition System Using EEG Signals. Computers 2020, 9, 95. [Google Scholar] [CrossRef]
  159. Ma, X.; Jiang, X.; Jiang, Y. Increased Spontaneous Fronto-Central Oscillatory Power during Eye Closing in Patients with Multiple Somatic Symptoms. Psychiatry Res. Neuroimaging 2022, 324. [Google Scholar] [CrossRef]
  160. Ramesh, S.; Gomathi, S.; Sasikala, S.; Saravanan, T.R. Automatic Speech Emotion Detection Using Hybrid of Gray Wolf Optimizer and Naïve Bayes. Int. J. Speech Technol. 2021, 1–8. [Google Scholar] [CrossRef]
  161. Moses, E.; Clark, K.R.; Jacknis, N.J. The Future of Advertising: Influencing and Predicting Response Through Artificial Intelligence, Machine Learning, and Neuroscience. In Advances in Business Information Systems and Analytics; Chkoniya, V., Ed.; IGI Global: Hershey, PA, USA, 2021; pp. 151–166. [Google Scholar] [CrossRef]
  162. Sun, L.; Fu, S.; Wang, F. Decision Tree SVM Model with Fisher Feature Selection for Speech Emotion Recognition. J. Audio Speech Music Proc. 2019, 2019, 2. [Google Scholar] [CrossRef] [Green Version]
  163. Sun, L.; Zou, B.; Fu, S.; Chen, J.; Wang, F. Speech Emotion Recognition Based on DNN-Decision Tree SVM Model. Speech Commun. 2019, 115, 29–37. [Google Scholar] [CrossRef]
  164. Chen, L.; Su, W.; Feng, Y.; Wu, M.; She, J.; Hirota, K. Two-Layer Fuzzy Multiple Random Forest for Speech Emotion Recognition in Human-Robot Interaction. Inf. Sci. 2020, 509, 150–163. [Google Scholar] [CrossRef]
  165. Rai, M.; Husain, A.A.; Sharma, R.; Maity, T.; Yadav, R. Facial Feature-Based Human Emotion Detection Using Machine Learning: An Overview. In Artificial Intelligence and Cybersecurity; CRC Press: Boca Raton, FL, USA, 2022; pp. 107–120. [Google Scholar]
  166. Zhang, J.; Yin, Z.; Chen, P.; Nichele, S. Emotion Recognition Using Multi-Modal Data and Machine Learning Techniques: A Tutorial and Review. Inf. Fusion 2020, 59, 103–126. [Google Scholar] [CrossRef]
  167. Aouani, H.; Ben Ayed, Y. Deep Support Vector Machines for Speech Emotion Recognition. In Intelligent Systems Design and Applications; Abraham, A., Siarry, P., Ma, K., Kaklauskas, A., Eds.; Advances in Intelligent Systems and Computing; Springer International Publishing: Cham, Switzerland, 2021; Volume 1181, pp. 406–415. [Google Scholar] [CrossRef]
  168. Bhavan, A.; Chauhan, P.; Hitkul; Shah, R.R. Bagged Support Vector Machines for Emotion Recognition from Speech. Knowl.-Based Syst. 2019, 184, 104886. [Google Scholar] [CrossRef]
  169. Miller, C.H.; Sacchet, M.D.; Gotlib, I.H. Support Vector Machines and Affective Science. Emot. Rev. 2020, 12, 297–308. [Google Scholar] [CrossRef]
  170. Abo, M.E.M.; Idris, N.; Mahmud, R.; Qazi, A.; Hashem, I.A.T.; Maitama, J.Z.; Naseem, U.; Khan, S.K.; Yang, S. A Multi-Criteria Approach for Arabic Dialect Sentiment Analysis for Online Reviews: Exploiting Optimal Machine Learning Algorithm Selection. Sustainability 2021, 13, 10018. [Google Scholar] [CrossRef]
  171. Singh, B.K.; Khare, A.; Soni, A.K.; Kumar, A. Electroencephalography-Based Classification of Human Emotion: A Hybrid Strategy in Machine Learning Paradigm. Int. J. Comput. Vis. Robot. 2019, 9, 583–598. [Google Scholar] [CrossRef]
  172. Yudhana, A.; Muslim, A.; Wati, D.E.; Puspitasari, I.; Azhari, A.; Mardhia, M.M. Human Emotion Recognition Based on EEG Signal Using Fast Fourier Transform and K-Nearest Neighbor. Adv. Sci. Technol. Eng. Syst. J. 2020, 5, 1082–1088. [Google Scholar] [CrossRef]
  173. Assielou, K.A.; Haba, C.T.; Gooré, B.T.; Kadjo, T.L.; Yao, K.D. Emotional Impact for Predicting Student Performance in Intelligent Tutoring Systems (ITS). Int. J. Adv. Comput. Sci. Appl. 2020, 11, 219–225. [Google Scholar] [CrossRef]
  174. Lenzoni, S.; Bozzoni, V.; Burgio, F.; de Gelder, B.; Wennberg, A.; Botta, A.; Pegoraro, E.; Semenza, C. Recognition of Emotions Conveyed by Facial Expression and Body Postures in Myotonic Dystrophy (DM). Cortex 2020, 127, 58–66. [Google Scholar] [CrossRef]
  175. Li, Y.; Zheng, W.; Cui, Z.; Zong, Y.; Ge, S. EEG Emotion Recognition Based on Graph Regularized Sparse Linear Regression. Neural Process Lett. 2019, 49, 555–571. [Google Scholar] [CrossRef]
  176. Loos, E.; Egli, T.; Coynel, D.; Fastenrath, M.; Freytag, V.; Papassotiropoulos, A.; de Quervain, D.J.-F.; Milnik, A. Predicting Emotional Arousal and Emotional Memory Performance from an Identical Brain Network. NeuroImage 2019, 189, 459–467. [Google Scholar] [CrossRef]
  177. Tottenham, N.; Weissman, M.M.; Wang, Z.; Warner, V.; Gameroff, M.J.; Semanek, D.P.; Hao, X.; Gingrich, J.A.; Peterson, B.S.; Posner, J.; et al. Depression Risk Is Associated with Weakened Synchrony Between the Amygdala and Experienced Emotion. Biol. Psychiatry Cogn. Neurosci. Neuroimaging 2021, 6, 343–351. [Google Scholar] [CrossRef]
  178. Doma, V.; Pirouz, M. A Comparative Analysis of Machine Learning Methods for Emotion Recognition Using EEG and Peripheral Physiological Signals. J. Big Data 2020, 7, 18. [Google Scholar] [CrossRef] [Green Version]
  179. Pan, C.; Shi, C.; Mu, H.; Li, J.; Gao, X. EEG-Based Emotion Recognition Using Logistic Regression with Gaussian Kernel and Laplacian Prior and Investigation of Critical Frequency Bands. Appl. Sci. 2020, 10, 1619. [Google Scholar] [CrossRef] [Green Version]
180. Rafi, T.H.; Farhan, F.; Hoque, M.Z.; Quayyum, F.M. Electroencephalogram (EEG) Brainwave Signal-Based Emotion Recognition Using Extreme Gradient Boosting Algorithm. Ann. Eng. 2020, 1, 1–19. [Google Scholar]
  181. Jackson-Koku, G.; Grime, P. Emotion Regulation and Burnout in Doctors: A Systematic Review. Occup. Med. 2019, 69, 9–21. [Google Scholar] [CrossRef]
  182. Shams, S. Predicting Coronavirus Anxiety Based on Cognitive Emotion Regulation Strategies, Anxiety Sensitivity, and Psychological Hardiness in Nurses. Q. J. Nurs. Manag. 2021, 10, 25–36. [Google Scholar]
  183. Scribner, D.R. Predictors of Shoot–Don’t Shoot Decision-Making Performance: An Examination of Cognitive and Emotional Factors. J. Cogn. Eng. Decis. Mak. 2016, 10, 3–13. [Google Scholar] [CrossRef]
  184. Smith, G. Be Wary of Black-Box Trading Algorithms. JOI 2019, 28, 7–15. [Google Scholar] [CrossRef]
  185. Hajarolasvadi, N.; Demirel, H. 3D CNN-Based Speech Emotion Recognition Using K-Means Clustering and Spectrograms. Entropy 2019, 21, 479. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  186. Morawetz, C.; Riedel, M.C.; Salo, T.; Berboth, S.; Eickhoff, S.B.; Laird, A.R.; Kohn, N. Multiple Large-Scale Neural Networks Underlying Emotion Regulation. Neurosci. Biobehav. Rev. 2020, 116, 382–395. [Google Scholar] [CrossRef] [PubMed]
  187. Zou, L.; Guo, Q.; Xu, Y.; Yang, B.; Jiao, Z.; Xiang, J. Functional Connectivity Analysis of the Neural Bases of Emotion Regulation: A Comparison of Independent Component Method with Density-Based k-Means Clustering Method. Technol. Health Care 2016, 24, S817–S825. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  188. Mohammed, N.S.; Abdul Hassan, K.A. The Effect of the Number of Key-Frames on the Facial Emotion Recognition Accuracy. Eng. Technol. J. 2021, 39, 89–100. [Google Scholar] [CrossRef]
  189. Shi, F.; Dey, N.; Ashour, A.S.; Sifaki-Pistolla, D.; Sherratt, R.S. Meta-KANSEI Modeling with Valence-Arousal FMRI Dataset of Brain. Cogn. Comput. 2019, 11, 227–240. [Google Scholar] [CrossRef] [Green Version]
  190. Kaunhoven, R.J.; Dorjee, D. Mindfulness Versus Cognitive Reappraisal: The Impact of Mindfulness-Based Stress Reduction (MBSR) on the Early and Late Brain Potential Markers of Emotion Regulation. Mindfulness 2021, 12, 2266–2280. [Google Scholar] [CrossRef]
  191. Li, G.; Zhang, W.; Hu, Y.; Wang, J.; Li, J.; Jia, Z.; Zhang, L.; Sun, L.; von Deneen, K.M.; Duan, S.; et al. Distinct Basal Brain Functional Activity and Connectivity in the Emotional-Arousal Network and Thalamus in Patients With Functional Constipation Associated With Anxiety and/or Depressive Disorders. Psychosom. Med. 2021, 83, 707–714. [Google Scholar] [CrossRef]
192. Xiao, G.; Ma, Y.; Liu, C.; Jiang, D. A Machine Emotion Transfer Model for Intelligent Human-Machine Interaction Based on Group Division. Mech. Syst. Signal Process. 2020, 142, 106736. [Google Scholar] [CrossRef]
  193. Li, H.; Xu, H. Deep Reinforcement Learning for Robust Emotional Classification in Facial Expression Recognition. Knowl.-Based Syst. 2020, 204, 106172. [Google Scholar] [CrossRef]
  194. Li, Y.; Chen, Y. Research on Chorus Emotion Recognition and Intelligent Medical Application Based on Health Big Data. J. Healthc. Eng. 2022, 2022, 1363690. [Google Scholar] [CrossRef]
  195. Yakovyna, V.; Khavalko, V.; Sherega, V.; Boichuk, A.; Barna, A. Biosignal and Image Processing System for Emotion Recognition Applications. In Proceedings of the IT&AS’2021: Symposium on Information Technologies & Applied Sciences, Bratislava, Slovakia, 5 March 2021; pp. 181–191. [Google Scholar]
  196. Chan, J.C.P.; Ho, E.S.L. Emotion Transfer for 3D Hand and Full Body Motion Using StarGAN. Computers 2021, 10, 38. [Google Scholar] [CrossRef]
  197. Global Industry Analysts Inc. Neuroscience—Global Market Trajectory & Analytics. 2021. Available online: https://www.prnewswire.com/news-releases/new-analysis-from-global-industry-analysts-reveals-steady-growth-for-neuroscience-with-the-market-to-reach-36-2-billion-worldwide-by-2026--301404252.html (accessed on 5 September 2022).
  198. Neuroscience Market. Global Industry Analysis, Size, Share, Growth, Trends, and Forecast, 2021–2031. Available online: https://www.transparencymarketresearch.com/neuroscience-market.html (accessed on 17 August 2022).
  199. Celeghin, A.; Diano, M.; Bagnis, A.; Viola, M.; Tamietto, M. Basic Emotions in Human Neuroscience: Neuroimaging and Beyond. Front. Psychol. 2017, 8, 1432. [Google Scholar] [CrossRef] [Green Version]
  200. Sander, D.; Nummenmaa, L. Reward and Emotion: An Affective Neuroscience Approach. Curr. Opin. Behav. Sci. 2021, 39, 161–167. [Google Scholar] [CrossRef]
  201. Podladchikova, L.N.; Shaposhnikov, D.G.; Kozubenko, E.A. Towards Neuroinformatic Approach for Second-Person Neuroscience. In Advances in Neural Computation, Machine Learning, and Cognitive Research IV.; Kryzhanovsky, B., Dunin-Barkowski, W., Redko, V., Tiumentsev, Y., Eds.; Studies in Computational Intelligence; Springer International Publishing: Cham, Switzerland, 2021; Volume 925, pp. 143–148. [Google Scholar] [CrossRef]
  202. Tan, C.; Liu, X.; Zhang, G. Inferring Brain State Dynamics Underlying Naturalistic Stimuli Evoked Emotion Changes with DHA-HMM. Neuroinform 2022, 20, 737–753. [Google Scholar] [CrossRef]
  203. Blair, R.J.R.; Meffert, H.; White, S.F. Psychopathy and Brain Function: Insights from Neuroimaging Research. In Handbook of Psychopathy; The Guilford Press: New York, NY, USA, 2018; pp. 401–421. [Google Scholar]
  204. Blair, R.J.R.; Mathur, A.; Haines, N.; Bajaj, S. Future Directions for Cognitive Neuroscience in Psychiatry: Recommendations for Biomarker Design Based on Recent Test Re-Test Reliability Work. Curr. Opin. Behav. Sci. 2022, 44, 101102. [Google Scholar] [CrossRef]
  205. Hamann, S. Integrating Perspectives on Affective Neuroscience: Introduction to the Special Section on the Brain and Emotion. Emot. Rev. 2018, 10, 187–190. [Google Scholar] [CrossRef]
  206. Shaffer, C.; Westlin, C.; Quigley, K.S.; Whitfield-Gabrieli, S.; Barrett, L.F. Allostasis, Action, and Affect in Depression: Insights from the Theory of Constructed Emotion. Annu. Rev. Clin. Psychol. 2022, 18, 553–580. [Google Scholar] [CrossRef]
  207. Hackel, L.M.; Amodio, D.M. Computational Neuroscience Approaches to Social Cognition. Curr. Opin. Psychol. 2018, 24, 92–97. [Google Scholar] [CrossRef]
  208. Smith, R.; Lane, R.D.; Nadel, L.; Moutoussis, M. A Computational Neuroscience Perspective on the Change Process in Psychotherapy. In Neuroscience of Enduring Change; Oxford University Press: New York, NY, USA, 2020; pp. 395–432. [Google Scholar] [CrossRef]
  209. Hill, K.E.; South, S.C.; Egan, R.P.; Foti, D. Abnormal Emotional Reactivity in Depression: Contrasting Theoretical Models Using Neurophysiological Data. Biol. Psychol. 2019, 141, 35–43. [Google Scholar] [CrossRef]
  210. Kontaris, I.; East, B.S.; Wilson, D.A. Behavioral and Neurobiological Convergence of Odor, Mood and Emotion: A Review. Front. Behav. Neurosci. 2020, 14, 35. [Google Scholar] [CrossRef]
  211. Kyrios, M.; Trotzke, P.; Lawrence, L.; Fassnacht, D.B.; Ali, K.; Laskowski, N.M.; Müller, A. Behavioral Neuroscience of Buying-Shopping Disorder: A Review. Curr. Behav. Neurosci. Rep. 2018, 5, 263–270. [Google Scholar] [CrossRef]
  212. Wang, J.; Cheng, R.; Liao, P.-C. Trends of Multimodal Neural Engineering Study: A Bibliometric Review. Arch. Comput. Methods Eng. 2021, 28, 4487–4501. [Google Scholar] [CrossRef]
  213. Wu, X.; Zheng, W.-L.; Li, Z.; Lu, B.-L. Investigating EEG-Based Functional Connectivity Patterns for Multimodal Emotion Recognition. J. Neural Eng. 2022, 19, 016012. [Google Scholar] [CrossRef]
  214. Balconi, M.; Sansone, M. Neuroscience and Consumer Behavior: Where to Now? Front. Psychol. 2021, 12, 705850. [Google Scholar] [CrossRef] [PubMed]
  215. Serra, D. Decision-Making: From Neuroscience to Neuroeconomics—An Overview. Theory Decis. 2021, 91, 1–80. [Google Scholar] [CrossRef]
  216. Hinojosa, J.A.; Moreno, E.M.; Ferré, P. Affective Neurolinguistics: Towards a Framework for Reconciling Language and Emotion. Lang. Cogn. Neurosci. 2020, 35, 813–839. [Google Scholar] [CrossRef]
  217. Wu, C.; Zhang, J. Emotion Word Type Should Be Incorporated in Affective Neurolinguistics: A Commentary on Hinojosa, Moreno and Ferré (2019). Lang. Cogn. Neurosci. 2020, 35, 840–843. [Google Scholar] [CrossRef]
  218. Burkitt, I. Emotions, Social Activity and Neuroscience: The Cultural-Historical Formation of Emotion. New Ideas Psychol. 2019, 54, 1–7. [Google Scholar] [CrossRef]
  219. Gluck, M.A.; Mercado, E.; Myers, C.E. Learning and Memory: From Brain to Behavior; Worth Publishers: New York, NY, USA, 2008. [Google Scholar]
  220. Shaw, S.D.; Bagozzi, R.P. The Neuropsychology of Consumer Behavior and Marketing. Soc. Consum. Psychol. 2018, 1, 22–40. [Google Scholar] [CrossRef]
  221. Al-Rodhan, N.R.F. Emotional Amoral Egoism: A Neurophilosophy of Human Nature and Motivations, 1st ed.; The Lutterworth Press: Cambridge, UK, 2021. [Google Scholar] [CrossRef]
  222. Carrozzo, C. Scientific Practice and the Moral Task of Neurophilosophy. AJOB Neurosci. 2019, 10, 115–117. [Google Scholar] [CrossRef]
  223. Northoff, G. Neurophilosophy and Neuroethics: Template for Neuropsychoanalysis? In Neuropsychodynamic Psychiatry; Boeker, H., Hartwich, P., Northoff, G., Eds.; Springer International Publishing: Cham, Switzerland, 2018; pp. 599–615. [Google Scholar] [CrossRef]
  224. Chatterjee, A.; Coburn, A.; Weinberger, A. The Neuroaesthetics of Architectural Spaces. Cogn. Process. 2021, 22, 115–120. [Google Scholar] [CrossRef]
  225. Li, R.; Zhang, J. Review of Computational Neuroaesthetics: Bridging the Gap between Neuroaesthetics and Computer Science. Brain Inf. 2020, 7, 16. [Google Scholar] [CrossRef]
  226. Nadal, M.; Chatterjee, A. Neuroaesthetics and Art’s Diversity and Universality. WIREs Cogn. Sci. 2019, 10, e1487. [Google Scholar] [CrossRef]
  227. Klemm, W. Expanding the Vision of Neurotheology: Make Neuroscience Religion’s Ally. J. Spiritual. Ment. Health 2020, 24, 1–16. [Google Scholar] [CrossRef]
  228. Klemm, W.R. Whither Neurotheology? Religions 2019, 10, 634. [Google Scholar] [CrossRef] [Green Version]
  229. Newberg, A. Chapter Three. Neuroscience and Neurotheology. In Neurotheology; Columbia University Press: New York, NY, USA, 2018; pp. 46–66. [Google Scholar] [CrossRef]
  230. Haas, I.J.; Warren, C.; Lauf, S.J. Political Neuroscience: Understanding How the Brain Makes Political Decisions. In Oxford Research Encyclopedia of Politics; Redlawsk, D., Ed.; Oxford University Press: Oxford, UK, 2020. [Google Scholar] [CrossRef]
  231. Murphy, E. Anarchism and Science. In The Palgrave Handbook of Anarchism; Levy, C., Adams, M.S., Eds.; Springer International Publishing: Cham, Switzerland, 2019; pp. 193–209. [Google Scholar] [CrossRef]
  232. Yun, J.H.; Kim, Y.; Lee, E.-J. ERP Study of Liberals’ and Conservatives’ Moral Reasoning Processes: Evidence from South Korea. J. Bus. Ethics 2022, 176, 723–739. [Google Scholar] [CrossRef]
  233. Bush, S.S.; Tussey, C.M. Neuroscience and Neurolaw: Special Issue of Psychological Injury and Law. Psychol. Inj. Law 2013, 6, 1–2. [Google Scholar] [CrossRef]
  234. Schleim, S. Real Neurolaw in the Netherlands: The Role of the Developing Brain in the New Adolescent Criminal Law. Front. Psychol. 2020, 11, 1762. [Google Scholar] [CrossRef]
  235. Shen, F.X. The Law and Neuroscience Bibliography: Navigating the Emerging Field of Neurolaw. Int. J. Leg. Inf. 2010, 38, 352–399. [Google Scholar] [CrossRef]
  236. Long, M.; Verbeke, W.; Ein-Dor, T.; Vrtička, P. A Functional Neuro-Anatomical Model of Human Attachment (NAMA): Insights from First- and Second-Person Social Neuroscience. Cortex 2020, 126, 281–321. [Google Scholar] [CrossRef]
  237. Weisz, E.; Zaki, J. Motivated Empathy: A Social Neuroscience Perspective. Curr. Opin. Psychol. 2018, 24, 67–71. [Google Scholar] [CrossRef]
  238. Chiao, J.Y. Developmental Aspects in Cultural Neuroscience. Dev. Rev. 2018, 50, 77–89. [Google Scholar] [CrossRef]
239. Chiao, J.Y. Cultural Neuroscience: A Once and Future Discipline. Prog. Brain Res. 2009, 178, 287–304. [Google Scholar] [CrossRef]
  240. Antolin, P. “I Am a Freak of Nature”: Tourette’s and the Grotesque in Jonathan Lethem’s Motherless Brooklyn. Transatlantica 2019, 1, 1–20. [Google Scholar] [CrossRef]
  241. Burn, S.J. The Gender of the Neuronovel: Joyce Carol Oates and the Double Brain. Eur. J. Am. Stud. 2021, 16, 1–17. [Google Scholar] [CrossRef]
  242. Rahaman, V.; Sharma, S. Reading an Extremist Mind through Literary Language: Approaching Cognitive Literary Hermeneutics to R.N. Tagore’s Play the Post Office for Neuro-Computational Predictions. In Cognitive Informatics, Computer Modelling, and Cognitive Science; Elsevier: Amsterdam, The Netherlands, 2020; pp. 197–210. [Google Scholar] [CrossRef]
  243. Ceciu, R.L. Neurocinematics, the (Brain) Child of Film and Neuroscience. J. Commun. Behav. Sci. 2020, 1, 46–62. [Google Scholar]
  244. Moghadasi, A.N. Evaluation of Neurocinema as An Introduction to an Interdisciplinary Science. CINEJ 2020, 8, 307–323. [Google Scholar] [CrossRef]
  245. Olenina, A.H. Sergei Eisenstein, Neurocinematics, and Embodied Cognition: A Reassessment. Discourse 2021, 43, 351–382. [Google Scholar] [CrossRef]
  246. Bearman, H. Music & The Brain–How Music Affects Mood, Cognition, and Mental Health. 2018. Available online: https://www.naturalnootropic.com/music-and-the-brain/ (accessed on 15 August 2022).
  247. Garg, A.; Chaturvedi, V.; Kaur, A.B.; Varshney, V.; Parashar, A. Machine Learning Model for Mapping of Music Mood and Human Emotion Based on Physiological Signals. Multimed. Tools Appl. 2022, 81, 5137–5177. [Google Scholar] [CrossRef]
  248. Liu, Y. Research on the Characteristics and Functions of Brain Activity in Musical Performance. Acad. J. Humanit. Soc. Sci. 2020, 3, 71–79. [Google Scholar]
  249. Berčík, J.; Paluchová, J.; Neomániová, K. Neurogastronomy as a Tool for Evaluating Emotions and Visual Preferences of Selected Food Served in Different Ways. Foods 2021, 10, 354. [Google Scholar] [CrossRef]
  250. Girona-Ruíz, D.; Cano-Lamadrid, M.; Carbonell-Barrachina, Á.A.; López-Lluch, D.; Esther, S. Aromachology Related to Foods, Scientific Lines of Evidence: A Review. Appl. Sci. 2021, 11, 6095. [Google Scholar] [CrossRef]
  251. Lim, W.M. Demystifying Neuromarketing. J. Bus. Res. 2018, 91, 205–220. [Google Scholar] [CrossRef]
  252. Sliwinska, M.W.; Vitello, S.; Devlin, J.T. Transcranial Magnetic Stimulation for Investigating Causal Brain-Behavioral Relationships and Their Time Course. J. Vis. Exp. 2014, 89, e51735. [Google Scholar] [CrossRef] [PubMed]
  253. Agarwal, S.; Xavier, M.J. Innovations in Consumer Science: Applications of Neuro-Scientific Research Tools. In Adoption of Innovation; Brem, A., Viardot, É., Eds.; Springer International Publishing: Cham, Switzerland, 2015; pp. 25–42. [Google Scholar] [CrossRef]
  254. Bakardjieva, E.; Kimmel, A.J. Neuromarketing Research Practices: Attitudes, Ethics, and Behavioral Intentions. Ethics Behav. 2017, 27, 179–200. [Google Scholar] [CrossRef]
  255. Bercea, M.D. Anatomy of Methodologies for Measuring Consumer Behavior in Neuromarketing Research. In Proceedings of the Lupcon Center for Business Research (LCBR) European Marketing Conference, Ebermannstadt, Germany, 9 August 2012. [Google Scholar]
  256. Bitbrain. Business & Marketing. The 7 Most Common Neuromarketing Research Techniques and Tools. 2019. Available online: https://www.bitbrain.com/blog/neuromarketing-research-techniques-tools (accessed on 15 August 2022).
  257. CoolTool. How To Choose the Most Suitable NeuroLab Technology. Available online: https://cooltool.com/blog/-infographics-how-to-choose-the-most-suitable-neurolab-technology (accessed on 17 August 2022).
  258. Farnsworth, B. Neuromarketing Methods [Cheat Sheet]. 2020. Available online: https://imotions.com/blog/neuromarketing-methods/ (accessed on 17 August 2022).
  259. Fortunato, V.C.R.; Giraldi, J.D.M.E.; de Oliveira, J.H.C. A Review of Studies on Neuromarketing: Practical Results, Techniques, Contributions and Limitations. J. Manag. Res. 2014, 6, 201–220. [Google Scholar] [CrossRef] [Green Version]
  260. Ganapathy, K. Neuromarketing: An Overview. Asian Hosp. Healthc. Manag. 2019. Available online: https://www.asianhhm.com/healthcare-management/current-concepts-on-neuromarketing (accessed on 15 August 2022).
  261. Gill, G. Innerscope Research Inc. JITE DC 2012, 1, 5. [Google Scholar] [CrossRef]
  262. Ohme, R.; Matukin, M.; Pacula-Lesniak, B. Biometric Measures for Interactive Advertising Research. J. Interact. Advert. 2011, 11, 60–72. [Google Scholar] [CrossRef]
  263. Nazarova, R.; Lazizovich, T.K. Neuromarketing—A Tool for Influencing Consumer Behavior. Int. J. Innov. Technol. Econ. 2019, 5, 11–14. [Google Scholar] [CrossRef]
  264. Saltini, T. Some Neuromarketing Tools. 2015. Available online: https://tiphainesaltini.wordpress.com/2015/03/10/some-neuromarketing-tools/ (accessed on 15 August 2022).
  265. Stasi, A.; Songa, G.; Mauri, M.; Ciceri, A.; Diotallevi, F.; Nardone, G.; Russo, V. Neuromarketing Empirical Approaches and Food Choice: A Systematic Review. Food Res. Int. 2018, 108, 650–664. [Google Scholar] [CrossRef]
  266. Yağci, M.I.; Kuhzady, S.; Balik, Z.S.; Öztürk, L. In Search of Consumer’s Black Box: A Bibliometric Analysis of Neuromarketing Research. J. Consum. Consum. Res. 2018, 10, 101–134. [Google Scholar]
  267. Klinčeková, S. Neuromarketing—Research and Prediction of the Future. Int. J. Manag. Sci. Bus. Adm. 2016, 2, 54–58. [Google Scholar] [CrossRef]
  268. Malvern Panalytical. Near-Infrared (NIR) Spectroscopy. Available online: https://www.malvernpanalytical.com/en/products/technology/spectroscopy/near-infrared-spectroscopy/ (accessed on 15 August 2022).
  269. Villringer, A.; Planck, J.; Hock, C.; Schleinkofer, L.; Dirnagl, U. Near Infrared Spectroscopy (NIRS): A New Tool to Study Hemodynamic Changes during Activation of Brain Function in Human Adults. Neurosci. Lett. 1993, 154, 101–104. [Google Scholar] [CrossRef]
  270. Assaf, Y.; Pasternak, O. Diffusion Tensor Imaging (DTI)-Based White Matter Mapping in Brain Research: A Review. J. Mol. Neurosci. 2008, 34, 51–61. [Google Scholar] [CrossRef]
  271. Imagilys. Diffusion Tensor Imaging. Available online: https://www.imagilys.com/diffusion-tensor-imaging-dti/ (accessed on 15 August 2022).
  272. Sun, F.; Zang, W.; Gravina, R.; Fortino, G.; Li, Y. Gait-Based Identification for Elderly Users in Wearable Healthcare Systems. Inf. Fusion 2020, 53, 134–144. [Google Scholar] [CrossRef]
  273. Majumder, S.; Mondal, T.; Deen, M.J. A Simple, Low-Cost and Efficient Gait Analyzer for Wearable Healthcare Applications. IEEE Sens. J. 2019, 19, 2320–2329. [Google Scholar] [CrossRef]
274. Arvaneh, M.; Tanaka, T. Brain–Computer Interfaces and Electroencephalogram: Basics and Practical Issues. In Signal Processing and Machine Learning for Brain–Machine Interfaces; 2018; Available online: http://dl.konkur.in/post/Book/Bargh/Signal-Processing-and-Machine-Learning-for-Brain-Machine-Interfaces-%5Bkonkur.in%5D.pdf#page=16 (accessed on 15 August 2022).
  275. Hantus, S. Continuous EEG Monitoring: Principles and Practice. J. Clin. Neurophysiol. 2019, 37, 1. [Google Scholar] [CrossRef]
276. Tyagi, A.; Semwal, S.; Shah, G. A Review of EEG Sensors Used for Data Acquisition. Int. J. Comput. Appl. 2012, 1, 13–18. [Google Scholar]
  277. Burgess, R.C. MEG Reporting. J. Clin. Neurophysiol. 2020, 37, 545–553. [Google Scholar] [CrossRef]
  278. Harmsen, I.E.; Rowland, N.C.; Wennberg, R.A.; Lozano, A.M. Characterizing the Effects of Deep Brain Stimulation with Magnetoencephalography: A Review. Brain Stimul. 2018, 11, 481–491. [Google Scholar] [CrossRef]
  279. Seymour, R.A.; Alexander, N.; Mellor, S.; O’Neill, G.C.; Tierney, T.M.; Barnes, G.R.; Maguire, E.A. Interference Suppression Techniques for OPM-Based MEG: Opportunities and Challenges. NeuroImage 2021, 247, 118834. [Google Scholar] [CrossRef]
  280. Shirinpour, S. Tools for Improving and Understanding Transcranial Magnetic Stimulation. 2020. Available online: https://hdl.handle.net/11299/217801 (accessed on 15 August 2022).
  281. Shirinpour, S.; Hananeia, N.; Rosado, J.; Tran, H.; Galanis, C.; Vlachos, A.; Jedlicka, P.; Queisser, G.; Opitz, A. Multi-Scale Modeling Toolbox for Single Neuron and Subcellular Activity under Transcranial Magnetic Stimulation. Brain Stimul. 2021, 14, 1470–1482. [Google Scholar] [CrossRef]
  282. Widhalm, M.L.; Rose, N.S. How Can Transcranial Magnetic Stimulation Be Used to Causally Manipulate Memory Representations in the Human Brain? WIREs Cogn. Sci. 2019, 10, e1469. [Google Scholar] [CrossRef] [Green Version]
  283. Gannouni, S.; Aledaily, A.; Belwafi, K.; Aboalsamh, H. Emotion Detection Using Electroencephalography Signals and a Zero-Time Windowing-Based Epoch Estimation and Relevant Electrode Identification. Sci. Rep. 2021, 11, 7071. [Google Scholar] [CrossRef]
  284. Dixson, B.J.W.; Spiers, T.; Miller, P.A.; Sidari, M.J.; Nelson, N.L.; Craig, B.M. Facial Hair May Slow Detection of Happy Facial Expressions in the Face in the Crowd Paradigm. Sci. Rep. 2022, 12, 5911. [Google Scholar] [CrossRef]
285. Wang, X.-W.; Nie, D.; Lu, B.-L. EEG-Based Emotion Recognition Using Frequency Domain Features and Support Vector Machines. In Neural Information Processing; Lu, B.-L., Zhang, L., Kwok, J., Eds.; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2011; Volume 7062, pp. 734–743. [Google Scholar] [CrossRef] [Green Version]
286. John, E.R. Principles of Neurometrics. Am. J. EEG Technol. 1990, 30, 251–266. [Google Scholar] [CrossRef]
  287. Alkhasli, I.; Sakreida, K.; Mottaghy, F.M.; Binkofski, F. Modulation of Fronto-Striatal Functional Connectivity Using Transcranial Magnetic Stimulation. Front. Hum. Neurosci. 2019, 13, 190. [Google Scholar] [CrossRef] [Green Version]
  288. Jamadar, S.D.; Ward, P.G.D.; Close, T.G.; Fornito, A.; Premaratne, M.; O’Brien, K.; Stäb, D.; Chen, Z.; Shah, N.J.; Egan, G.F. Simultaneous BOLD-FMRI and Constant Infusion FDG-PET Data of the Resting Human Brain. Sci. Data 2020, 7, 363. [Google Scholar] [CrossRef]
  289. Kraft, R.H.; Dagro, A.M. Design and Implementation of a Numerical Technique to Inform Anisotropic Hyperelastic Finite Element Models Using Diffusion-Weighted Imaging. 2011. Available online: https://apps.dtic.mil/sti/pdfs/ADA565877.pdf (accessed on 15 August 2022).
  290. Koong, C.-S.; Yang, T.-I.; Tseng, C.-C. A User Authentication Scheme Using Physiological and Behavioral Biometrics for Multitouch Devices. Sci. World J. 2014, 2014, 781234. [Google Scholar] [CrossRef] [Green Version]
  291. Heydarzadegan, A.; Moradi, M.; Toorani, A. Biometric Recognition Systems: A Survey. Int. Res. J. Appl. Basic Sci. 2013, 6, 1609–1618. [Google Scholar]
  292. Shingetsu. Global Biometric Systems Market. 2021. Available online: https://www.shingetsuresearch.com/biometric-systems-market/?gclid=Cj0KCQiAybaRBhDtARIsAIEG3kkQZsv-1LwHknyBvnAfURBeXvBbB-uk9YGdpwf22Uw6waMmssmt1ycaAr9hEALw_wcB (accessed on 29 July 2022).
  293. Abo-Zahhad, M.; Ahmed, S.M.; Abbas, S.N. A Novel Biometric Approach for Human Identification and Verification Using Eye Blinking Signal. IEEE Signal Process. Lett. 2015, 22, 876–880. [Google Scholar] [CrossRef]
  294. Larsson, M.; Pedersen, N.L.; Stattin, H. Associations between Iris Characteristics and Personality in Adulthood. Biol. Psychol. 2007, 75, 165–175. [Google Scholar] [CrossRef]
  295. Gentry, T.A.; Polzine, K.M.; Wakefield, J.A. Human Genetic Markers Associated with Variation in Intellectual Abilities and Personality. Personal. Individ. Differ. 1985, 6, 111–113. [Google Scholar] [CrossRef]
  296. Gary, A.L.; Glover, J.A. Eye Color, Sex, and Children’s Behavior; Nelson-Hall Publishers: Chicago, IL, USA, 1976. [Google Scholar]
  297. Markle, A. Eye Color and Responsiveness to Arousing Stimuli. Percept. Mot. Ski. 1976, 43, 127–133. [Google Scholar] [CrossRef] [PubMed]
  298. Bailador, G.; Sanchez-Avila, C.; Guerra-Casanova, J.; de Santos Sierra, A. Analysis of Pattern Recognition Techniques for In-Air Signature Biometrics. Pattern Recognit. 2011, 44, 2468–2478. [Google Scholar] [CrossRef]
  299. Miller, W. Different Types of Biometrics. 2019. Available online: https://www.ibeta.com/different-types-of-biometrics/ (accessed on 29 July 2022).
  300. Biometrics Institute. Types of Biometrics. Available online: https://www.biometricsinstitute.org/what-is-biometrics/types-of-biometrics/ (accessed on 29 July 2022).
  301. Chen, D.; Haviland-Jones, J. Human Olfactory Communication of Emotion. Percept. Mot. Ski. 2000, 91, 771–781. [Google Scholar] [CrossRef] [PubMed]
  302. Kaklauskas, A.; Zavadskas, E.K.; Seniut, M.; Dzemyda, G.; Stankevic, V.; Simkevičius, C.; Stankevic, T.; Paliskiene, R.; Matuliauskaite, A.; Kildiene, S.; et al. Web-Based Biometric Computer Mouse Advisory System to Analyze a User’s Emotions and Work Productivity. Eng. Appl. Artif. Intell. 2011, 24, 928–945. [Google Scholar] [CrossRef]
  303. American Heart Association. Electrocardiogram (ECG or EKG). 2015. Available online: https://www.heart.org/en/health-topics/heart-attack/diagnosing-a-heart-attack/electrocardiogram-ecg-or-ekg (accessed on 29 July 2022).
  304. Nicolò, A.; Massaroni, C.; Schena, E.; Sacchetti, M. The Importance of Respiratory Rate Monitoring: From Healthcare to Sport and Exercise. Sensors 2020, 20, 6396. [Google Scholar] [CrossRef]
  305. Wang, B.; Zhou, H.; Yang, G.; Li, X.; Yang, H. Human Digital Twin (HDT) Driven Human-Cyber-Physical Systems: Key Technologies and Applications. Chin. J. Mech. Eng. 2022, 35, 11. [Google Scholar] [CrossRef]
  306. Nahavandi, S. Industry 5.0—A Human-Centric Solution. Sustainability 2019, 11, 4371. [Google Scholar] [CrossRef] [Green Version]
  307. Lugovic, S.; Dunder, I.; Horvat, M. Techniques and Applications of Emotion Recognition in Speech. In 2016 39th International Convention on Information and Communication Technology, Electronics and Microelectronics (MIPRO), Opatija, Croatia, 30 May–3 June 2016; IEEE: Opatija, Croatia, 2016; pp. 1278–1283. [Google Scholar] [CrossRef]
  308. Ge, Y.; Liu, J. Psychometric Analysis on Neurotransmitter Deficiency of Internet Addicted Urban Left-behind Children. J. Alcohol Drug Depend. 2015, 3, 1–6. [Google Scholar] [CrossRef]
309. Lafta, H.A.; Abbas, S.S. Effectiveness of Extended Invariant Moments in Fingerprint Analysis. Asian J. Comput. Inf. Syst. 2013, 1, 78–89. [Google Scholar]
  310. Singh, J.; Goyal, G.; Gill, R. Use of Neurometrics to Choose Optimal Advertisement Method for Omnichannel Business. Enterp. Inf. Syst. 2020, 14, 243–265. [Google Scholar] [CrossRef]
  311. Fiedler, K.; Bluemke, M. Faking the IAT: Aided and Unaided Response Control on the Implicit Association Tests. Basic Appl. Soc. Psychol. 2005, 27, 307–316. [Google Scholar] [CrossRef]
  312. Simons, S.; Zhou, J.; Liao, Y.; Bradway, L.; Aguilar, M.; Connolly, P.M. Cognitive Biometrics Using Mouse Perturbation. US Patent Application US14/011,351, 20 March 2014. [Google Scholar]
  313. Martinez-Marquez, D.; Pingali, S.; Panuwatwanich, K.; Stewart, R.A.; Mohamed, S. Application of Eye Tracking Technology in Aviation, Maritime, and Construction Industries: A Systematic Review. Sensors 2021, 21, 4289. [Google Scholar] [CrossRef]
  314. Skvarekova, I.; Skultety, F. Objective Measurement of Pilot’s Attention Using Eye Track Technology during IFR Flights. Transp. Res. Procedia 2019, 40, 1555–1562. [Google Scholar] [CrossRef]
  315. Eachus, P. The Use of Eye Tracking Technology in the Evaluation of E-Learning: A Feasibility Study; University of Salford: Manchester, UK, 2008; pp. 12–14. [Google Scholar]
  316. Sharafi, Z.; Soh, Z.; Guéhéneuc, Y.-G. A Systematic Literature Review on the Usage of Eye-Tracking in Software Engineering. Inf. Softw. Technol. 2015, 67, 79–107. [Google Scholar] [CrossRef]
317. Gonzalez-Sanchez, J.; Chavez-Echeagaray, M.E.; Atkinson, R.; Burleson, W. ABE: An Agent-Based Software Architecture for a Multimodal Emotion Recognition Framework. In 2011 Ninth Working IEEE/IFIP Conference on Software Architecture, Boulder, CO, USA, 20–24 June 2011; IEEE: Boulder, CO, USA, 2011; pp. 187–193. [Google Scholar] [CrossRef]
  318. Borkhataria, C. The Algorithm That Could End Office Thermostat Wars: Researchers Claim New Software Can Find the Best Temperature for Everyone. 2017. Available online: https://www.dailymail.co.uk/sciencetech/article-4979148/The-algorithm-end-office-thermostat-war.html (accessed on 29 July 2022).
  319. Rukavina, S.; Gruss, S.; Hoffmann, H.; Tan, J.-W.; Walter, S.; Traue, H.C. Affective Computing and the Impact of Gender and Age. PLoS ONE 2016, 11, e0150584. [Google Scholar] [CrossRef] [Green Version]
  320. Saini, R.; Rana, N. Comparison of Various Biometric Methods. Int. J. Adv. Sci. Technol. 2014, 2, 24–30. [Google Scholar]
  321. Elprocus. Biometric Sensors—Types and Its Working. 2022. Available online: https://www.elprocus.com/different-types-biometric-sensors/ (accessed on 29 July 2022).
  322. Loaiza, J.R. Emotions and the Problem of Variability. Rev. Phil. Psych. 2021, 12, 329–351. [Google Scholar] [CrossRef]
  323. Pace-Schott, E.F.; Amole, M.C.; Aue, T.; Balconi, M.; Bylsma, L.M.; Critchley, H.; Demaree, H.A.; Friedman, B.H.; Gooding, A.E.K.; Gosseries, O.; et al. Physiological Feelings. Neurosci. Biobehav. Rev. 2019, 103, 267–304. [Google Scholar] [CrossRef]
  324. Dolensek, N.; Gehrlach, D.A.; Klein, A.S.; Gogolla, N. Facial Expressions of Emotion States and Their Neuronal Correlates in Mice. Science 2020, 368, 89–94. [Google Scholar] [CrossRef]
  325. Kamila, S.; Hasanuzzaman, M.; Ekbal, A.; Bhattacharyya, P. Investigating the Impact of Emotion on Temporal Orientation in a Deep Multitask Setting. Sci. Rep. 2022, 12, 493. [Google Scholar] [CrossRef]
  326. Saganowski, S.; Komoszyńska, J.; Behnke, M.; Perz, B.; Kunc, D.; Klich, B.; Kaczmarek, Ł.D.; Kazienko, P. Emognition Dataset: Emotion Recognition with Self-Reports, Facial Expressions, and Physiology Using Wearables. Sci. Data 2022, 9, 158. [Google Scholar] [CrossRef]
  327. Swanborough, H.; Staib, M.; Frühholz, S. Neurocognitive Dynamics of Near-Threshold Voice Signal Detection and Affective Voice Evaluation. Sci. Adv. 2020, 6, eabb3884. [Google Scholar] [CrossRef]
  328. Singh, R.; Baby, B.; Suri, A. A Virtual Repository of Neurosurgical Instrumentation for Neuroengineering Research and Collaboration. World Neurosurg. 2019, 126, e84–e93. [Google Scholar] [CrossRef]
329. Alonso-Fernandez, F.; Fierrez, J.; Ortega-Garcia, J. Quality Measures in Biometric Systems. IEEE Secur. Priv. 2011, 10, 52–62. [Google Scholar] [CrossRef] [Green Version]
330. De Angel, V.; Lewis, S.; White, K.; Oetzmann, C.; Leightley, D.; Oprea, E.; Lavelle, G.; Matcham, F.; Pace, A.; Mohr, D.C.; et al. Digital Health Tools for the Passive Monitoring of Depression: A Systematic Review of Methods. NPJ Digit. Med. 2022, 5, 3. [Google Scholar] [CrossRef]
  331. Kable, J.W. The Cognitive Neuroscience Toolkit for the Neuroeconomist: A Functional Overview. J. Neurosci. Psychol. Econ. 2011, 4, 63–84. [Google Scholar] [CrossRef] [Green Version]
  332. Zurawicki, L. Neuromarketing: Exploring the Brain of the Consumer; Springer: Berlin/Heidelberg, Germany, 2010. [Google Scholar] [CrossRef]
333. Magdin, M.; Prikler, F. Are Instructed Emotional States Suitable for Classification? Demonstration of How They Can Significantly Influence the Classification Result in an Automated Recognition System. IJIMAI 2019, 5, 141–147. [Google Scholar] [CrossRef]
  334. Camurri, A.; Lagerlöf, I.; Volpe, G. Recognizing Emotion from Dance Movement: Comparison of Spectator Recognition and Automated Techniques. Int. J. Hum.-Comput. Stud. 2003, 59, 213–225. [Google Scholar] [CrossRef]
  335. Alarcao, S.M.; Fonseca, M.J. Emotions Recognition Using EEG Signals: A Survey. IEEE Trans. Affect. Comput. 2017, 10, 374–393. [Google Scholar] [CrossRef]
  336. Kim, M.-K.; Kim, M.; Oh, E.; Kim, S.-P. A Review on the Computational Methods for Emotional State Estimation from the Human EEG. Comput. Math. Methods Med. 2013, 2013, 573734. [Google Scholar] [CrossRef] [Green Version]
  337. Xu, Q.; Ruohonen, E.M.; Ye, C.; Li, X.; Kreegipuu, K.; Stefanics, G.; Luo, W.; Astikainen, P. Automatic Processing of Changes in Facial Emotions in Dysphoria: A Magnetoencephalography Study. Front. Hum. Neurosci. 2018, 12, 186. [Google Scholar] [CrossRef] [PubMed]
  338. Bublatzky, F.; Kavcıoğlu, F.; Guerra, P.; Doll, S.; Junghöfer, M. Contextual Information Resolves Uncertainty about Ambiguous Facial Emotions: Behavioral and Magnetoencephalographic Correlates. NeuroImage 2020, 215, 116814. [Google Scholar] [CrossRef] [PubMed]
  339. Van Loon, A.M.; van den Wildenberg, W.P.M.; van Stegeren, A.H.; Ridderinkhof, K.R.; Hajcak, G. Emotional Stimuli Modulate Readiness for Action: A Transcranial Magnetic Stimulation Study. Cogn. Affect. Behav. Neurosci. 2010, 10, 174–181. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  340. Bandara, D.; Velipasalar, S.; Bratt, S.; Hirshfield, L. Building Predictive Models of Emotion with Functional Near-Infrared Spectroscopy. Int. J. Hum.-Comput. Stud. 2018, 110, 75–85. [Google Scholar] [CrossRef]
  341. Bae, S.; Kang, K.D.; Kim, S.W.; Shin, Y.J.; Nam, J.J.; Han, D.H. Investigation of an Emotion Perception Test Using Functional Magnetic Resonance Imaging. Comput. Methods Programs Biomed. 2019, 179, 104994. [Google Scholar] [CrossRef]
  342. Dweck, M.R. Multisystem Positron Emission Tomography: Interrogating Vascular Inflammation, Emotional Stress, and Bone Marrow Activity in a Single Scan. Eur. Heart J. 2021, 42, 1896–1897. [Google Scholar] [CrossRef]
  343. Reiman, E.M. The Application of Positron Emission Tomography to the Study of Normal and Pathologic Emotions. J. Clin. Psychiatry 1997, 58 (Suppl. S16), 4–12. [Google Scholar]
  344. Takahashi, M.; Kitamura, S.; Matsuoka, K.; Yoshikawa, H.; Yasuno, F.; Makinodan, M.; Kimoto, S.; Miyasaka, T.; Kichikawa, K.; Kishimoto, T. Uncinate Fasciculus Disruption Relates to Poor Recognition of Negative Facial Emotions in Alzheimer’s Disease: A Cross-sectional Diffusion Tensor Imaging Study. Psychogeriatrics 2020, 20, 296–303. [Google Scholar] [CrossRef]
  345. Kaklauskas, A.; Abraham, A.; Dzemyda, G.; Raslanas, S.; Seniut, M.; Ubarte, I.; Kurasova, O.; Binkyte-Veliene, A.; Cerkauskas, J. Emotional, Affective and Biometrical States Analytics of a Built Environment. Eng. Appl. Artif. Intell. 2020, 91, 103621. [Google Scholar] [CrossRef]
  346. Kaklauskas, A.; Jokubauskas, D.; Cerkauskas, J.; Dzemyda, G.; Ubarte, I.; Skirmantas, D.; Podviezko, A.; Simkute, I. Affective Analytics of Demonstration Sites. Eng. Appl. Artif. Intell. 2019, 81, 346–372. [Google Scholar] [CrossRef]
  347. Kaklauskas, A.; Zavadskas, E.K.; Bardauskiene, D.; Cerkauskas, J.; Ubarte, I.; Seniut, M.; Dzemyda, G.; Kaklauskaite, M.; Vinogradova, I.; Velykorusova, A. An Affect-Based Built Environment Video Analytics. Autom. Constr. 2019, 106, 102888. [Google Scholar] [CrossRef]
  348. Kaklauskas, A.; Bardauskiene, D.; Cerkauskiene, R.; Ubarte, I.; Raslanas, S.; Radvile, E.; Kaklauskaite, U.; Kaklauskiene, L. Emotions Analysis in Public Spaces for Urban Planning. Land Use Policy 2021, 107, 105458. [Google Scholar] [CrossRef]
  349. Porcherot, C.; Raviot-Derrien, S.; Beague, M.-P.; Henneberg, S.; Niedziela, M.; Ambroze, K.; McEwan, J.A. Effect of Context on Fine Fragrance-Elicited Emotions: Comparison of Three Experimental Methodologies. Food Qual. Prefer. 2022, 95, 104342. [Google Scholar] [CrossRef]
  350. Child, S.; Oakhill, J.; Garnham, A. Tracking Your Emotions: An Eye-Tracking Study on Reader’s Engagement with Perspective during Text Comprehension. Q. J. Exp. Psychol. 2020, 73, 929–940. [Google Scholar] [CrossRef]
  351. Tarnowski, P.; Kołodziej, M.; Majkowski, A.; Rak, R.J. Eye-Tracking Analysis for Emotion Recognition. Comput. Intell. Neurosci. 2020, 2020, 2909267. [Google Scholar] [CrossRef]
352. Coutinho, E.; Miranda, E.R.; Cangelosi, A. Towards a Model for Embodied Emotions. In 2005 Portuguese Conference on Artificial Intelligence, Covilha, Portugal, 5–8 December 2005; IEEE: Covilha, Portugal, 2005; pp. 54–63. [Google Scholar] [CrossRef] [Green Version]
  353. Kim, M.; Lee, H.S.; Park, J.W.; Jo, S.H.; Chung, M.J. Determining Color and Blinking to Support Facial Expression of a Robot for Conveying Emotional Intensity. In RO-MAN 2008—The 17th IEEE International Symposium on Robot and Human Interactive Communication, Munich, Germany, 1–3 August 2008; IEEE: Munich, Germany, 2008; pp. 219–224. [Google Scholar] [CrossRef]
  354. Terada, K.; Yamauchi, A.; Ito, A. Artificial Emotion Expression for a Robot by Dynamic Color Change. In 2012 IEEE RO-MAN: The 21st IEEE International Symposium on Robot and Human Interactive Communication, Paris, France, 9–13 September 2012; IEEE: Paris, France, 2012; pp. 314–321. [Google Scholar] [CrossRef]
  355. Li, S.; Walters, G.; Packer, J.; Scott, N. Using Skin Conductance and Facial Electromyography to Measure Emotional Responses to Tourism Advertising. Curr. Issues Tour. 2018, 21, 1761–1783. [Google Scholar] [CrossRef] [Green Version]
  356. Nakasone, A.; Prendinger, H.; Ishizuka, M. Emotion Recognition from Electromyography and Skin Conductance. In Proceedings of the 5th International Workshop on Biosignal Interpretation; 2005; pp. 219–222. Available online: https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.64.7269&rep=rep1&type=pdf (accessed on 29 July 2022).
  357. Val-Calvo, M.; Álvarez-Sánchez, J.R.; Ferrández-Vicente, J.M.; Díaz-Morcillo, A.; Fernández-Jover, E. Real-Time Multi-Modal Estimation of Dynamically Evoked Emotions Using EEG, Heart Rate and Galvanic Skin Response. Int. J. Neur. Syst. 2020, 30, 2050013. [Google Scholar] [CrossRef] [Green Version]
  358. Minhad, K.N.; Ali, S.H.M.; Reaz, M.B.I. Happy-Anger Emotions Classifications from Electrocardiogram Signal for Automobile Driving Safety and Awareness. J. Transp. Health 2017, 7, 75–89. [Google Scholar] [CrossRef]
  359. Orini, M.; Bailón, R.; Enk, R.; Koelsch, S.; Mainardi, L.; Laguna, P. A Method for Continuously Assessing the Autonomic Response to Music-Induced Emotions through HRV Analysis. Med. Biol. Eng. Comput. 2010, 48, 423–433. [Google Scholar] [CrossRef]
  360. Hernando, A.; Lazaro, J.; Gil, E.; Arza, A.; Garzon, J.M.; Lopez-Anton, R.; de la Camara, C.; Laguna, P.; Aguilo, J.; Bailon, R. Inclusion of Respiratory Frequency Information in Heart Rate Variability Analysis for Stress Assessment. IEEE J. Biomed. Health Inform. 2016, 20, 1016–1025. [Google Scholar] [CrossRef]
  361. Dasgupta, P.B. Detection and Analysis of Human Emotions through Voice and Speech Pattern Processing. Int. J. Comput. Trends Technol. 2017, 52, 1–3. [Google Scholar] [CrossRef]
  362. Rüsch, N.; Corrigan, P.W.; Bohus, M.; Kühler, T.; Jacob, G.A.; Lieb, K. The Impact of Posttraumatic Stress Disorder on Dysfunctional Implicit and Explicit Emotions Among Women with Borderline Personality Disorder. J. Nerv. Ment. Dis. 2007, 195, 537–539. [Google Scholar] [CrossRef]
  363. Yi, Q.; Xiong, S.; Wang, B.; Yi, S. Identification of Trusted Interactive Behavior Based on Mouse Behavior Considering Web User’s Emotions. Int. J. Ind. Ergon. 2020, 76, 102903. [Google Scholar] [CrossRef]
  364. Lozano-Goupil, J.; Bardy, B.G.; Marin, L. Toward an Emotional Individual Motor Signature. Front. Psychol. 2021, 12, 647704. [Google Scholar] [CrossRef]
  365. Venture, G.; Kadone, H.; Zhang, T.; Grèzes, J.; Berthoz, A.; Hicheur, H. Recognizing Emotions Conveyed by Human Gait. Int. J. Soc. Robot. 2014, 6, 621–632. [Google Scholar] [CrossRef]
  366. Bevacqua, E.; Mancini, M. Speaking with Emotions. In Proceedings of the AISB Symposium on Motion, Emotion and Cognition, Leeds, UK, 29 March–1 April 2004; pp. 58–65. [Google Scholar]
  367. Maalej, A.; Kallel, I. Does Keystroke Dynamics Tell Us about Emotions? A Systematic Literature Review and Dataset Construction. In 2020 16th International Conference on Intelligent Environments (IE), Madrid, Spain, 20–23 July 2020; IEEE: Madrid, Spain, 2020; pp. 60–67. [Google Scholar] [CrossRef]
  368. Chanel, G.; Kierkels, J.J.M.; Soleymani, M.; Pun, T. Short-Term Emotion Assessment in a Recall Paradigm. Int. J. Hum.-Comput. Stud. 2009, 67, 607–627. [Google Scholar] [CrossRef]
  369. Chanel, G.; Kronegg, J.; Grandjean, D.; Pun, T. Emotion Assessment: Arousal Evaluation Using EEG’s and Peripheral Physiological Signals. In Multimedia Content Representation, Classification and Security; Gunsel, B., Jain, A.K., Tekalp, A.M., Sankur, B., Hutchison, D., Kanade, T., Eds.; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2006; Volume 4105, pp. 530–537. [Google Scholar] [CrossRef] [Green Version]
  370. Peter, C.; Ebert, E.; Beikirch, H. A Wearable Multi-Sensor System for Mobile Acquisition of Emotion-Related Physiological Data. In Affective Computing and Intelligent Interaction; Tao, J., Tan, T., Picard, R.W., Hutchison, D., Kanade, T., Kittler, J., Kleinberg, J.M., Eds.; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2005; Volume 3784, pp. 691–698. [Google Scholar] [CrossRef]
  371. Villon, O.; Lisetti, C. A User-Modeling Approach to Build User’s Psycho-Physiological Maps of Emotions Using Bio-Sensors. In ROMAN 2006—The 15th IEEE International Symposium on Robot and Human Interactive Communication, Hatfield, UK, 6–8 September 2006; IEEE: Hatfield, UK, 2006; pp. 269–276. [Google Scholar] [CrossRef]
  372. Lee, S.; Hong, C.-s.; Lee, Y.K.; Shin, H.-s. Experimental Emotion Recognition System and Services for Mobile Network Environments. In Proceedings of the 2010 IEEE Sensors; IEEE: Kona, HI, USA, 2010; pp. 136–140. Available online: https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=5690670 (accessed on 29 July 2022). [CrossRef]
  373. De Santos Sierra, A.; Ávila, C.S.; Casanova, J.G.; del Pozo, G.B. Real-Time Stress Detection by Means of Physiological Signals. In Advanced Biometric Technologies; IntechOpen: London, UK, 2011; pp. 23–44. [Google Scholar]
  374. Hsieh, P.-Y.; Chin, C.-L. The Emotion Recognition System with Heart Rate Variability and Facial Image Features. In Proceedings of the 2011 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE 2011); IEEE: Taipei, Taiwan, 2011; pp. 1933–1940. [Google Scholar] [CrossRef]
  375. Zhang, J.; Chen, M.; Hu, S.; Cao, Y.; Kozma, R. PNN for EEG-Based Emotion Recognition. In Proceedings of the 2016 IEEE International Conference on Systems, Man, and Cybernetics (SMC); IEEE: Budapest, Hungary, 2016; pp. 2319–2323. [Google Scholar] [CrossRef]
  376. Mehmood, R.; Lee, H. Towards Building a Computer Aided Education System for Special Students Using Wearable Sensor Technologies. Sensors 2017, 17, 317. [Google Scholar] [CrossRef]
  377. Purnamasari, P.; Ratna, A.; Kusumoputro, B. Development of Filtered Bispectrum for EEG Signal Feature Extraction in Automatic Emotion Recognition Using Artificial Neural Networks. Algorithms 2017, 10, 63. [Google Scholar] [CrossRef]
  378. Li, Y.; Huang, J.; Zhou, H.; Zhong, N. Human Emotion Recognition with Electroencephalographic Multidimensional Features by Hybrid Deep Neural Networks. Appl. Sci. 2017, 7, 1060. [Google Scholar] [CrossRef] [Green Version]
  379. Hu, X.; Yu, J.; Song, M.; Yu, C.; Wang, F.; Sun, P.; Wang, D.; Zhang, D. EEG Correlates of Ten Positive Emotions. Front. Hum. Neurosci. 2017, 11, 26. [Google Scholar] [CrossRef] [Green Version]
  380. Taneli, B.; Krahne, W. EEG Changes of Transcendental Meditation Practitioners. In Advances in Biological Psychiatry; Taneli, B., Perris, C., Kemali, D., Eds.; S. Karger AG: Basel, Switzerland, 1987; Volume 16, pp. 41–71. [Google Scholar] [CrossRef]
  381. Si, Y.; Jiang, L.; Tao, Q.; Chen, C.; Li, F.; Jiang, Y.; Zhang, T.; Cao, X.; Wan, F.; Yao, D.; et al. Predicting Individual Decision-Making Responses Based on the Functional Connectivity of Resting-State EEG. J. Neural Eng. 2019, 16, 066025. [Google Scholar] [CrossRef] [PubMed]
  382. Akash, K.; Hu, W.-L.; Jain, N.; Reid, T. A Classification Model for Sensing Human Trust in Machines Using EEG and GSR. ACM Trans. Interact. Intell. Syst. 2018, 8, 1–20. [Google Scholar] [CrossRef] [Green Version]
  383. Tsao, Y.-C.; Huang, C.-M.; Miou, Y.-C. The Role of Opposing Emotions in Design Satisfaction and Perceived Innovation. J. Sci. Des. 2021, 5, 111–120. [Google Scholar]
  384. Martin, O.; Kotsia, I.; Macq, B.; Pitas, I. The eNTERFACE’ 05 Audio-Visual Emotion Database. In Proceedings of the 22nd International Conference on Data Engineering Workshops (ICDEW’06), Atlanta, GA, USA, 3–7 April 2006; IEEE: Atlanta, GA, USA, 2006; p. 8. [Google Scholar] [CrossRef]
  385. McDermott, O.D.; Prigerson, H.G.; Reynolds, C.F.; Houck, P.R.; Dew, M.A.; Hall, M.; Mazumdar, S.; Buysse, D.J.; Hoch, C.C.; Kupfer, D.J. Sleep in the Wake of Complicated Grief Symptoms: An Exploratory Study. Biol. Psychiatry 1997, 41, 710–716. [Google Scholar] [CrossRef]
  386. Rusalova, M.N.; Kostyunina, M.B.; Kulikov, M.A. Spatial Distribution of Coefficients of Asymmetry of Brain Bioelectrical Activity during the Experiencing of Negative Emotions. Neurosci. Behav. Physiol. 2003, 33, 703–706. [Google Scholar] [CrossRef] [PubMed]
  387. Uyan, U. EEG-Based Assessment of Cybersickness in a VR Environment and Adjusting Stereoscopic Parameters According to Level of Sickness to Present a Comfortable Vision; Hacettepe University: Ankara, Turkey, 2020. [Google Scholar]
  388. Yankovsky, A.E.; Veilleux, M.; Dubeau, F.; Andermann, F. Post-Ictal Rage and Aggression: A Video-EEG Study. Epileptic Disord. 2005, 7, 143–147. [Google Scholar]
  389. Kim, S.-H.; Nguyen Thi, N.A. Feature Extraction of Emotional States for EEG-Based Rage Control. In Proceedings of the 2016 39th International Conference on Telecommunications and Signal Processing (TSP), Vienna, Austria, 27–29 June 2016; IEEE: Vienna, Austria, 2016; pp. 361–364. [Google Scholar] [CrossRef]
  390. Cannon, P.A.; Drake, M.E. EEG and Brainstem Auditory Evoked Potentials in Brain-Injured Patients with Rage Attacks and Self-Injurious Behavior. Clin. Electroencephalogr. 1986, 17, 169–172. [Google Scholar]
  391. Chen, X.; Lin, J.; Jin, H.; Huang, Y.; Liu, Z. The Psychoacoustics Annoyance Research Based on EEG Rhythms for Passengers in High-Speed Railway. Appl. Acoust. 2021, 171, 107575. [Google Scholar] [CrossRef]
  392. Li, Z.-G.; Di, G.-Q.; Jia, L. Relationship between Electroencephalogram Variation and Subjective Annoyance under Noise Exposure. Appl. Acoust. 2014, 75, 37–42. [Google Scholar] [CrossRef]
  393. Benlamine, M.S.; Chaouachi, M.; Frasson, C.; Dufresne, A. Physiology-Based Recognition of Facial Micro-Expressions Using EEG and Identification of the Relevant Sensors by Emotion. In Proceedings of the 3rd International Conference on Physiological Computing Systems; SCITEPRESS—Science and Technology Publications: Lisbon, Portugal, 2016; pp. 130–137. Available online: https://www.scitepress.org/Papers/2016/60027/60027.pdf (accessed on 9 July 2022). [CrossRef] [Green Version]
  394. Aftanas, L.I.; Pavlov, S.V. Trait Anxiety Impact on Posterior Activation Asymmetries at Rest and during Evoked Negative Emotions: EEG Investigation. Int. J. Psychophysiol. 2005, 55, 85–94. [Google Scholar] [CrossRef]
  395. Ragozinskaya, V.G. Features of Psychosomatic Patient’s Aggressiveness. Procedia-Soc. Behav. Sci. 2013, 86, 232–235. [Google Scholar] [CrossRef]
  396. Konareva, I.N. Correlation between Level of Aggressiveness of Personality and Characteristics of EEG Frequency Components. Neurophysiology 2006, 38, 380–388. [Google Scholar] [CrossRef]
397. Munian, L.; Wan Ahmad, W.K.; Xu, T.K.; Mustafa, W.A.; Rahim, M.A. An Aggressiveness Level Analysis Based on Buss Perry Questionnaire (BPQ) and Brain Signal (EEG). J. Phys.: Conf. Ser. 2021, 2107, 012045. [Google Scholar] [CrossRef]
  398. Flores, A.; Münte, T.F.; Doñamayor, N. Event-Related EEG Responses to Anticipation and Delivery of Monetary and Social Reward. Biol. Psychol. 2015, 109, 10–19. [Google Scholar] [CrossRef]
  399. Gorka, S.M.; Phan, K.L.; Shankman, S.A. Convergence of EEG and FMRI Measures of Reward Anticipation. Biol. Psychol. 2015, 112, 12–19. [Google Scholar] [CrossRef] [Green Version]
  400. Alazrai, R.; Homoud, R.; Alwanni, H.; Daoud, M. EEG-Based Emotion Recognition Using Quadratic Time-Frequency Distribution. Sensors 2018, 18, 2739. [Google Scholar] [CrossRef] [Green Version]
  401. Cai, J.; Chen, W.; Yin, Z. Multiple Transferable Recursive Feature Elimination Technique for Emotion Recognition Based on EEG Signals. Symmetry 2019, 11, 683. [Google Scholar] [CrossRef] [Green Version]
  402. Chao, H.; Dong, L.; Liu, Y.; Lu, B. Emotion Recognition from Multiband EEG Signals Using CapsNet. Sensors 2019, 19, 2212. [Google Scholar] [CrossRef] [Green Version]
  403. Gao, Z.; Cui, X.; Wan, W.; Gu, Z. Recognition of Emotional States Using Multiscale Information Analysis of High Frequency EEG Oscillations. Entropy 2019, 21, 609. [Google Scholar] [CrossRef] [Green Version]
404. Garg, D.; Verma, G.K. Emotion Recognition in Valence-Arousal Space from Multi-Channel EEG Data and Wavelet Based Deep Learning Framework. Procedia Comput. Sci. 2020, 171, 857–867. [Google Scholar] [CrossRef]
  405. Soleymani, M.; Lichtenauer, J.; Pun, T.; Pantic, M. A Multimodal Database for Affect Recognition and Implicit Tagging. IEEE Trans. Affect. Comput. 2012, 3, 42–55. [Google Scholar] [CrossRef]
  406. Yogeeswaran, K.; Nash, K.; Jia, H.; Adelman, L.; Verkuyten, M. Intolerant of Being Tolerant? Examining the Impact of Intergroup Toleration on Relative Left Frontal Activity and Outgroup Attitudes. Curr. Psychol. 2021, 41, 7228–7239. [Google Scholar] [CrossRef]
  407. Fan, C.; Peng, Y.; Peng, S.; Zhang, H.; Wu, Y.; Kwong, S. Detection of Train Driver Fatigue and Distraction Based on Forehead EEG: A Time-Series Ensemble Learning Method. IEEE Trans. Intell. Transport. Syst. 2021, 23, 13559–13569. [Google Scholar] [CrossRef]
  408. Mück, M.; Ohmann, K.; Dummel, S.; Mattes, A.; Thesing, U.; Stahl, J. Face Perception and Narcissism: Variations of Event-Related Potential Components (P1 & N170) with Admiration and Rivalry. Cogn. Affect. Behav. Neurosci. 2020, 20, 1041–1055. [Google Scholar] [CrossRef]
  409. Tolgay, B.; Dell’Orco, S.; Maldonato, M.N.; Vogel, C.; Trojano, L.; Esposito, A. EEGs as Potential Predictors of Virtual Agents’ Acceptance. In 2019 10th IEEE International Conference on Cognitive Infocommunications (CogInfoCom), Naples, Italy, 23–25 October 2019; IEEE: Naples, Italy, 2019; pp. 433–438. [Google Scholar] [CrossRef]
  410. Tarai, S.; Mukherjee, R.; Qurratul, Q.A.; Singh, B.K.; Bit, A. Use of Prosocial Word Enhances the Processing of Language: Frequency Domain Analysis of Human EEG. J. Psycholinguist Res. 2019, 48, 145–161. [Google Scholar] [CrossRef]
  411. Ho, V.A.; Nguyen, D.H.-C.; Nguyen, D.H.; Pham, L.T.-V.; Nguyen, D.-V.; Nguyen, K.V.; Nguyen, N.L.-T. Emotion Recognition for Vietnamese Social Media Text. In Computational Linguistics; Nguyen, L.-M., Phan, X.-H., Hasida, K., Eds.; Communications in Computer and Information Science; Springer: Singapore, 2020; Volume 1215, pp. 319–333. [Google Scholar] [CrossRef]
  412. Hu, X.; Zhuang, C.; Wang, F.; Liu, Y.-J.; Im, C.-H.; Zhang, D. FNIRS Evidence for Recognizably Different Positive Emotions. Front. Hum. Neurosci. 2019, 13, 120. [Google Scholar] [CrossRef] [Green Version]
  413. Khazankin, G.R.; Shmakov, I.S.; Malinin, A.N. Remote Facial Emotion Recognition System. In Proceedings of the 2019 International Multi-Conference on Engineering, Computer and Information Sciences (SIBIRCON), Novosibirsk, Russia, 21–27 October 2019; IEEE: Novosibirsk, Russia, 2019; pp. 0975–0979. [Google Scholar] [CrossRef]
  414. Guo, J.; Lei, Z.; Wan, J.; Avots, E.; Hajarolasvadi, N.; Knyazev, B.; Kuharenko, A.; Jacques Junior, J.C.S.; Baro, X.; Demirel, H.; et al. Dominant and Complementary Emotion Recognition from Still Images of Faces. IEEE Access 2018, 6, 26391–26403. [Google Scholar] [CrossRef]
  415. Mumenthaler, C.; Sander, D.; Manstead, A. Emotion Recognition in Simulated Social Interactions. IEEE Trans. Affect. Comput. 2018, 11, 308–312. [Google Scholar] [CrossRef]
  416. Zheng, W.-L.; Lu, B.-L. A Multimodal Approach to Estimating Vigilance Using EEG and Forehead EOG. J. Neural Eng. 2017, 14, 026017. [Google Scholar] [CrossRef] [Green Version]
  417. Tomar, D.; Agarwal, S. Multi-Label Classifier for Emotion Recognition from Music. In Proceedings of 3rd International Conference on Advanced Computing, Networking and Informatics; Nagar, A., Mohapatra, D.P., Chaki, N., Eds.; Smart Innovation, Systems and Technologies; Springer: New Delhi, India, 2016; Volume 43, pp. 111–123. [Google Scholar] [CrossRef]
  418. Bhatti, A.M.; Majid, M.; Anwar, S.M.; Khan, B. Human Emotion Recognition and Analysis in Response to Audio Music Using Brain Signals. Comput. Hum. Behav. 2016, 65, 267–275. [Google Scholar] [CrossRef]
  419. Shih, Y.-L.; Lin, C.-Y. The Relationship between Action Anticipation and Emotion Recognition in Athletes of Open Skill Sports. Cogn. Process 2016, 17, 259–268. [Google Scholar] [CrossRef]
420. Patwardhan, A.; Knapp, G. Aggressive Actions and Anger Detection from Multiple Modalities Using Kinect. arXiv 2016, arXiv:1607.01076. [Google Scholar]
  421. Fernández-Alcántara, M.; Cruz-Quintana, F.; Pérez-Marfil, M.N.; Catena-Martínez, A.; Pérez-García, M.; Turnbull, O.H. Assessment of Emotional Experience and Emotional Recognition in Complicated Grief. Front. Psychol. 2016, 7, 126. [Google Scholar] [CrossRef] [Green Version]
  422. Naji, M.; Firoozabadi, M.; Azadfallah, P. Classification of Music-Induced Emotions Based on Information Fusion of Forehead Biosignals and Electrocardiogram. Cogn. Comput. 2014, 6, 241–252. [Google Scholar] [CrossRef]
  423. Wen, W.; Liu, G.; Cheng, N.; Wei, J.; Shangguan, P.; Huang, W. Emotion Recognition Based on Multi-Variant Correlation of Physiological Signals. IEEE Trans. Affect. Comput. 2014, 5, 126–140. [Google Scholar] [CrossRef]
  424. Kamińska, D.; Pelikant, A. Recognition of Human Emotion from a Speech Signal Based on Plutchik’s Model. Int. J. Electron. Telecommun. 2012, 58, 165–170. [Google Scholar] [CrossRef]
  425. Furley, P.; Dicks, M.; Memmert, D. Nonverbal Behavior in Soccer: The Influence of Dominant and Submissive Body Language on the Impression Formation and Expectancy of Success of Soccer Players. J. Sport Exerc. Psychol. 2012, 34, 61–82. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  426. Wagner, J.; Kim, J.; Andre, E. From Physiological Signals to Emotions: Implementing and Comparing Selected Methods for Feature Extraction and Classification. In Proceedings of the 2005 IEEE International Conference on Multimedia and Expo, Amsterdam, The Netherlands, 6–8 July 2005; IEEE: Amsterdam, The Netherlands, 2005; pp. 940–943. [Google Scholar] [CrossRef]
  427. Furman, J.M.; Wuyts, F.L. Vestibular Laboratory Testing. In Aminoff’s Electrodiagnosis in Clinical Neurology; Elsevier: Philadelphia, PA, USA, 2012; pp. 699–723. [Google Scholar] [CrossRef]
428. Lord, M.P.; Wright, W.D. The Investigation of Eye Movements. Rep. Prog. Phys. 1950, 13, 1–23. [Google Scholar] [CrossRef]
  429. Landowska, A. Emotion Monitoring—Verification of Physiological Characteristics Measurement Procedures. Metrol. Meas. Syst. 2014, 21, 719–732. [Google Scholar] [CrossRef]
  430. Skiendziel, T.; Rösch, A.G.; Schultheiss, O.C. Assessing the Convergent Validity between the Automated Emotion Recognition Software Noldus FaceReader 7 and Facial Action Coding System Scoring. PLoS ONE 2019, 14, e0223905. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  431. Frescura, A.; Lee, P.J. Emotions and Physiological Responses Elicited by Neighbours Sounds in Wooden Residential Buildings. Build. Environ. 2022, 210, 108729. [Google Scholar] [CrossRef]
  432. Nikolova, D.; Petkova, P.; Manolova, A.; Georgieva, P. ECG-Based Emotion Recognition: Overview of Methods and Applications. In ANNA ’18; Advances in Neural Networks and Applications 2018; VDE: Varna, Bulgaria, 2018; pp. 118–122. [Google Scholar]
  433. Nakanishi, R.; Imai-Matsumura, K. Facial Skin Temperature Decreases in Infants with Joyful Expression. Infant Behav. Dev. 2008, 31, 137–144. [Google Scholar] [CrossRef]
  434. Park, M.W.; Kim, C.J.; Hwang, M.; Lee, E.C. Individual Emotion Classification between Happiness and Sadness by Analyzing Photoplethysmography and Skin Temperature. In 2013 Fourth World Congress on Software Engineering; IEEE: Hong Kong, China, 2013; pp. 190–194. [Google Scholar] [CrossRef]
  435. Gouizi, K.; Bereksi Reguig, F.; Maaoui, C. Emotion Recognition from Physiological Signals. J. Med. Eng. Technol. 2011, 35, 300–307. [Google Scholar] [CrossRef]
  436. Abadi, M.K.; Kia, S.M.; Subramanian, R.; Avesani, P.; Sebe, N. User-Centric Affective Video Tagging from MEG and Peripheral Physiological Responses. In Proceedings of the 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction, Geneva, Switzerland, 2–5 September 2013; IEEE: Geneva, Switzerland, 2013; pp. 582–587. [Google Scholar] [CrossRef] [Green Version]
437. Aguiñaga, A.R.; Lopez Ramirez, M.; Alanis Garza, A.; Baltazar, R.; Zamudio, V.M. Emotion Analysis through Physiological Measurements; IOS Press: Amsterdam, The Netherlands, 2013; pp. 97–106. [Google Scholar]
  438. Javaid, M.; Haleem, A.; Vaishya, R.; Bahl, S.; Suman, R.; Vaish, A. Industry 4.0 Technologies and Their Applications in Fighting COVID-19 Pandemic. Diabetes Metab. Syndr. Clin. Res. Rev. 2020, 14, 419–422. [Google Scholar] [CrossRef]
  439. Kalhori, S.R.N.; Bahaadinbeigy, K.; Deldar, K.; Gholamzadeh, M.; Hajesmaeel-Gohari, S.; Ayyoubzadeh, S.M. Digital Health Solutions to Control the COVID-19 Pandemic in Countries with High Disease Prevalence: Literature Review. J. Med. Internet Res. 2021, 23, e19473. [Google Scholar] [CrossRef]
  440. Rahman, M.S.; Peeri, N.C.; Shrestha, N.; Zaki, R.; Haque, U.; Hamid, S.H.A. Defending against the Novel Coronavirus (COVID-19) Outbreak: How Can the Internet of Things (IoT) Help to Save the World? Health Policy Technol. 2020, 9, 136–138. [Google Scholar] [CrossRef]
441. Rajeesh Kumar, N.V.; Arun, M.; Baraneetharan, E.; Stanly Jaya Prakash, J.; Kanchana, A.; Prabu, S. Detection and Monitoring of the Asymptotic COVID-19 Patients Using IoT Devices and Sensors. Int. J. Pervasive Comput. Commun. 2020, 18, 407–418. [Google Scholar] [CrossRef]
  442. Stojanovic, R.; Skraba, A.; Lutovac, B. A Headset Like Wearable Device to Track COVID-19 Symptoms. In 2020 9th Mediterranean Conference on Embedded Computing (MECO), Budva, Montenegro, 8–11 June 2020; IEEE: Budva, Montenegro, 2020; pp. 8–11. [Google Scholar] [CrossRef]
  443. Xian, M.; Luo, H.; Xia, X.; Fares, C.; Carey, P.H.; Chiu, C.-W.; Ren, F.; Shan, S.-S.; Liao, Y.-T.; Hsu, S.-M.; et al. Fast SARS-CoV-2 Virus Detection Using Disposable Cartridge Strips and a Semiconductor-Based Biosensor Platform. J. Vac. Sci. Technol. B 2021, 39, 033202. [Google Scholar] [CrossRef]
  444. Chamberlain, S.D.; Singh, I.; Ariza, C.; Daitch, A.; Philips, P.; Dalziel, B.D. Real-Time Detection of COVID-19 Epicenters within the United States Using a Network of Smart Thermometers. Epidemiology 2020, 1–15. [Google Scholar] [CrossRef]
  445. Cherry, K. The Role of Neurotransmitters. 2021. Available online: https://www.verywellmind.com/what-is-a-neurotransmitter-2795394 (accessed on 14 June 2022).
  446. Ali Fahmi, P.N.; Kodirov, E.; Choi, D.-J.; Lee, G.-S.; Mohd Fikri Azli, A.; Sayeed, S. Implicit Authentication Based on Ear Shape Biometrics Using Smartphone Camera during a Call. In 2012 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Seoul, Korea, 14–17 October 2012; IEEE: Seoul, Korea, 2012; pp. 2272–2276. [Google Scholar] [CrossRef]
  447. Calvert, G. Everything You Need to Know about Implicit Reaction Time (IRTs). 2015. Available online: http://gemmacalvert.com/everything-you-need-to-know-about-implicit-reaction-time/ (accessed on 17 August 2022).
  448. Harris, J.M.; Ciorciari, J.; Gountas, J. Consumer Neuroscience for Marketing Researchers. J. Consum. Behav. 2018, 17, 239–252. [Google Scholar] [CrossRef]
  449. Fox, E. Perspectives from Affective Science on Understanding the Nature of Emotion. Brain Neurosci. Adv. 2018, 2, 239821281881262. [Google Scholar] [CrossRef] [Green Version]
450. Casado-Aranda, L.A.; Sanchez-Fernandez, J. Advances in Neuroscience and Marketing: Analyzing Tool Possibilities and Research Opportunities. Span. J. Mark. – ESIC 2022, 26, 3–22. [Google Scholar] [CrossRef]
  451. Lantrip, C.; Gunning, F.M.; Flashman, L.; Roth, R.M.; Holtzheimer, P.E. Effects of Transcranial Magnetic Stimulation on the Cognitive Control of Emotion: Potential Antidepressant Mechanisms. J. ECT 2017, 33, 73–80. [Google Scholar] [CrossRef]
  452. Catalino, M.P.; Yao, S.; Green, D.L.; Laws, E.R.; Golby, A.J.; Tie, Y. Mapping Cognitive and Emotional Networks in Neurosurgical Patients Using Resting-State Functional Magnetic Resonance Imaging. Neurosurg. Focus 2020, 48, E9. [Google Scholar] [CrossRef] [Green Version]
  453. Grèzes, J.; Valabrègue, R.; Gholipour, B.; Chevallier, C. A Direct Amygdala-Motor Pathway for Emotional Displays to Influence Action: A Diffusion Tensor Imaging Study: A Direct Limbic Motor Anatomical Pathway. Hum. Brain Mapp. 2014, 35, 5974–5983. [Google Scholar] [CrossRef]
  454. Alhargan, A.; Cooke, N.; Binjammaz, T. Affect Recognition in an Interactive Gaming Environment Using Eye Tracking. In 2017 Seventh International Conference on Affective Computing and Intelligent Interaction (ACII), San Antonio, TX, USA, 23–26 October 2017; IEEE: San Antonio, TX, USA, 2017; pp. 285–291. [Google Scholar] [CrossRef]
  455. Szwoch, M.; Szwoch, W. Emotion Recognition for Affect Aware Video Games. In Image Processing & Communications Challenges 6; Choraś, R.S., Ed.; Advances in Intelligent Systems and Computing; Springer International Publishing: Cham, Switzerland, 2015; Volume 313, pp. 227–236. [Google Scholar] [CrossRef]
  456. Krol, L.R.; Freytag, S.-C.; Zander, T.O. Meyendtris: A Hands-Free, Multimodal Tetris Clone Using Eye Tracking and Passive BCI for Intuitive Neuroadaptive Gaming. In Proceedings of the 19th ACM International Conference on Multimodal Interaction, Glasgow, UK, 13–17 November 2017; ACM: Glasgow, UK, 2017; pp. 433–437. [Google Scholar] [CrossRef]
  457. Elor, A.; Powell, M.; Mahmoodi, E.; Teodorescu, M.; Kurniawan, S. Gaming Beyond the Novelty Effect of Immersive Virtual Reality for Physical Rehabilitation. IEEE Trans. Games 2022, 14, 107–115. [Google Scholar] [CrossRef]
  458. Tiwari, S.; Agarwal, S. A Shrewd Artificial Neural Network-Based Hybrid Model for Pervasive Stress Detection of Students Using Galvanic Skin Response and Electrocardiogram Signals. Big Data 2021, 9, 427–442. [Google Scholar] [CrossRef]
  459. Nakayama, N.; Arakawa, N.; Ejiri, H.; Matsuda, R.; Makino, T. Heart Rate Variability Can Clarify Students’ Level of Stress during Nursing Simulation. PLoS ONE 2018, 13, e0195280. [Google Scholar] [CrossRef] [Green Version]
  460. Tautchin, L.; Dussome, W. The Expanding Reach of Non-Traditional Marketing: A Discussion on the Application of Neuromarketing and Big Data Analytics in the Marketplace. Available online: https://lowelltautchin.ca/wp-content/uploads/2016/08/Neuromarketing-and-Big-Data-Analytics-Project.pdf (accessed on 14 June 2022).
  461. Goyal, G.; Singh, J. Minimum Annotation Identification of Facial Affects for Video Advertisement. In Proceedings of the 2018 International Conference on Intelligent Circuits and Systems (ICICS), Phagwara, India, 20–21 April 2018; IEEE: Phagwara, India, 2018; pp. 300–305. [Google Scholar] [CrossRef]
  462. Yadava, M.; Kumar, P.; Saini, R.; Roy, P.P.; Prosad Dogra, D. Analysis of EEG Signals and Its Application to Neuromarketing. Multimed. Tools Appl. 2017, 76, 19087–19111. [Google Scholar] [CrossRef]
  463. Lakhan, P.; Banluesombatkul, N.; Changniam, V.; Dhithijaiyratn, R.; Leelaarporn, P.; Boonchieng, E.; Hompoonsup, S.; Wilaiprasitporn, T. Consumer Grade Brain Sensing for Emotion Recognition. IEEE Sens. J. 2019, 19, 9896–9907. [Google Scholar] [CrossRef] [Green Version]
  464. Kong, W.; Wang, L.; Xu, S.; Babiloni, F.; Chen, H. EEG Fingerprints: Phase Synchronization of EEG Signals as Biomarker for Subject Identification. IEEE Access 2019, 7, 121165–121173. [Google Scholar] [CrossRef]
  465. El-Amir, M.M.; Al-Atabany, W.; Eldosoky, M.A. Emotion Recognition via Detrended Fluctuation Analysis and Fractal Dimensions. In Proceedings of the 2019 36th National Radio Science Conference (NRSC), Port Said, Egypt, 16–18 April 2019; IEEE: Port Said, Egypt, 2019; pp. 200–208. [Google Scholar] [CrossRef]
  466. Plassmann, H.; Kenning, P.; Deppe, M.; Kugel, H.; Schwindt, W. How Choice Ambiguity Modulates Activity in Brain Areas Representing Brand Preference: Evidence from Consumer Neuroscience. J. Consum. Behav. 2008, 7, 360–367. [Google Scholar] [CrossRef]
  467. Perrachione, T.K.; Perrachione, J.R. Brains and Brands: Developing Mutually Informative Research in Neuroscience and Marketing. J. Consum. Behav. 2008, 7, 303–318. [Google Scholar] [CrossRef]
  468. Gruter, D. Neuromarketing—New Science of Consumer Behavior. Available online: http://emarketingblog.nl/2014/12/neuromarketing-new-science-of-consumer-behavior/ (accessed on 14 June 2022).
  469. Leon, E.; Clarke, G.; Callaghan, V.; Sepulveda, F. A User-Independent Real-Time Emotion Recognition System for Software Agents in Domestic Environments. Eng. Appl. Artif. Intell. 2007, 20, 337–345. [Google Scholar] [CrossRef]
  470. Monajati, M.; Abbasi, S.H.; Shabaninia, F.; Shamekhi, S. Emotions States Recognition Based on Physiological Parameters by Employing of Fuzzy-Adaptive Resonance Theory. Int. J. Intell. Sci. 2012, 02, 166–175. [Google Scholar] [CrossRef] [Green Version]
  471. Andrew, H.; Haines, H.; Seixas, S. Using Neuroscience to Understand the Impact of Premium Digital Out-of-Home Media. Int. J. Mark. Res. 2019, 61, 588–600. [Google Scholar] [CrossRef]
  472. Kaklauskas, A.; Bucinskas, V.; Dzedzickis, A. Computer Implemented Neuromarketing Research Method. European Patent Application EP4016431, 7 February 2021. [Google Scholar]
  473. Lajante, M.; Ladhari, R. The Promise and Perils of the Peripheral Psychophysiology of Emotion in Retailing and Consumer Services. J. Retail. Consum. Serv. 2019, 50, 305–313. [Google Scholar] [CrossRef]
  474. Michael, I.; Ramsoy, T.; Stephens, M.; Kotsi, F. A Study of Unconscious Emotional and Cognitive Responses to Tourism Images Using a Neuroscience Method. J. Islamic Mark. 2019, 10, 543–564. [Google Scholar] [CrossRef]
  475. Libert, A.; van Hulle, M.M. Predicting Premature Video Skipping and Viewer Interest from EEG Recordings. Entropy 2019, 21, 1014. [Google Scholar] [CrossRef] [Green Version]
476. Jiménez-Marín, G.; Bellido-Pérez, E.; López-Cortés, Á. Marketing Sensorial: El Concepto, Sus Técnicas y Su Aplicación En El Punto de Venta [Sensory Marketing: The Concept, Its Techniques and Its Application at the Point of Sale]. Vivat Acad. 2019, 148, 121–147. [Google Scholar] [CrossRef] [Green Version]
  477. Cherubino, P.; Martinez-Levy, A.C.; Caratù, M.; Cartocci, G.; Di Flumeri, G.; Modica, E.; Rossi, D.; Mancini, M.; Trettel, A. Consumer Behaviour through the Eyes of Neurophysiological Measures: State-of-the-Art and Future Trends. Comput. Intell. Neurosci. 2019, 2019, 1976847. [Google Scholar] [CrossRef] [Green Version]
  478. Țichindelean, M.B.; Iuliana, C.; Țichindelean, M. Studying the User Experience in Online Banking Services: An Eye-Tracking Application. Stud. Bus. Econ. 2019, 14, 193–208. [Google Scholar] [CrossRef] [Green Version]
  479. Soria Morillo, L.M.; Alvarez-Garcia, J.A.; Gonzalez-Abril, L.; Ortega Ramírez, J.A. Discrete Classification Technique Applied to TV Advertisements Liking Recognition System Based on Low-Cost EEG Headsets. BioMed. Eng. OnLine 2016, 15, 75. [Google Scholar] [CrossRef] [Green Version]
  480. Pringle, H.; Field, P. Institute of Practitioners in Advertising. In Brand Immortality: How Brands Can Live Long and Prosper; Kogan Page: London, UK, 2008. [Google Scholar]
  481. Takahashi, K. Remarks on Emotion Recognition from Bio-Potential Signals. In Proceedings of the 2nd International Conference on Autonomous Robots and Agents, Palmerston North, New Zealand, 13–15 December 2004. [Google Scholar]
  482. Light, K.C.; Girdler, S.S.; Sherwood, A.; Bragdon, E.E.; Brownley, K.A.; West, S.G.; Hinderliter, A.L. High Stress Responsivity Predicts Later Blood Pressure Only in Combination with Positive Family History and High Life Stress. Hypertension 1999, 33, 1458–1464. [Google Scholar] [CrossRef] [Green Version]
  483. Gray, M.A.; Taggart, P.; Sutton, P.M.; Groves, D.; Holdright, D.R.; Bradbury, D.; Brull, D.; Critchley, H.D. A Cortical Potential Reflecting Cardiac Function. Proc. Natl. Acad. Sci. USA 2007, 104, 6818–6823. [Google Scholar] [CrossRef] [Green Version]
  484. Adrogué, H.J.; Madias, N.E. Sodium and Potassium in the Pathogenesis of Hypertension. N. Engl. J. Med. 2007, 356, 1966–1978. [Google Scholar] [CrossRef] [Green Version]
  485. Blair, D.A.; Glover, W.E.; Greenfield, A.D.M.; Roddie, I.C. Excitation of Cholinergic Vasodilator Nerves to Human Skeletal Muscles during Emotional Stress. J. Physiol. 1959, 148, 633–647. [Google Scholar] [CrossRef]
  486. Gasperin, D.; Netuveli, G.; Dias-da-Costa, J.S.; Pattussi, M.P. Effect of Psychological Stress on Blood Pressure Increase: A Meta-Analysis of Cohort Studies. Cad. Saúde Pública 2009, 25, 715–726. [Google Scholar] [CrossRef] [Green Version]
  487. Sun, F.-T.; Kuo, C.; Cheng, H.-T.; Buthpitiya, S.; Collins, P.; Griss, M. Activity-Aware Mental Stress Detection Using Physiological Sensors. In Mobile Computing, Applications, and Services; Gris, M., Yang, G., Eds.; Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering; Springer: Berlin/Heidelberg, Germany, 2012; Volume 76, pp. 211–230. [Google Scholar] [CrossRef]
  488. Singh, R.R.; Conjeti, S.; Banerjee, R. A Comparative Evaluation of Neural Network Classifiers for Stress Level Analysis of Automotive Drivers Using Physiological Signals. Biomed. Signal Processing Control 2013, 8, 740–754. [Google Scholar] [CrossRef]
  489. Palacios, D.; Rodellar, V.; Lázaro, C.; Gómez, A.; Gómez, P. An ICA-Based Method for Stress Classification from Voice Samples. Neural Comput. Applic. 2020, 32, 17887–17897. [Google Scholar] [CrossRef]
  490. Oka, T.; Oka, K.; Hori, T. Mechanisms and Mediators of Psychological Stress-Induced Rise in Core Temperature. Psychosom. Med. 2001, 63, 476–486. [Google Scholar] [CrossRef] [PubMed]
  491. Wu, C.-H.; Liang, W.-B. Emotion Recognition of Affective Speech Based on Multiple Classifiers Using Acoustic-Prosodic Information and Semantic Labels. IEEE Trans. Affect. Comput. 2011, 2, 10–21. [Google Scholar] [CrossRef]
  492. Nilashi, M.; Mardani, A.; Liao, H.; Ahmadi, H.; Manaf, A.A.; Almukadi, W. A Hybrid Method with TOPSIS and Machine Learning Techniques for Sustainable Development of Green Hotels Considering Online Reviews. Sustainability 2019, 11, 6013. [Google Scholar] [CrossRef] [Green Version]
  493. Kaklauskas, A.; Ubarte, I.; Kalibatas, D.; Lill, I.; Velykorusova, A.; Volginas, P.; Vinogradova, I.; Milevicius, V.; Vetloviene, I.; Grubliauskas, I.; et al. A Multisensory, Green, and Energy Efficient Housing Neuromarketing Method. Energies 2019, 12, 3836. [Google Scholar] [CrossRef] [Green Version]
  494. Kaklauskas, A.; Dzitac, D.; Sliogeriene, J.; Lepkova, N.; Vetloviene, I. VINERS Method for the Multiple Criteria Analysis and Neuromarketing of Best Places to Live. Int. J. Comput. Commun. Control 2019, 14, 629–646. [Google Scholar] [CrossRef]
  495. Etzold, V.; Braun, A.; Wanner, T. Eye Tracking as a Method of Neuromarketing for Attention Research—An Empirical Analysis Using the Online Appointment Booking Platform from Mercedes-Benz. In Intelligent Decision Technologies 2019; Czarnowski, I., Howlett, R.J., Jain, L.C., Eds.; Smart Innovation, Systems and Technologies; Springer: Singapore, 2019; Volume 143, pp. 167–182. [Google Scholar] [CrossRef]
  496. Dedeoglu, B.B.; Bilgihan, A.; Ye, B.H.; Buonincontri, P.; Okumus, F. The Impact of Servicescape on Hedonic Value and Behavioral Intentions: The Importance of Previous Experience. Int. J. Hosp. Manag. 2018, 72, 10–20. [Google Scholar] [CrossRef]
  497. Khan, S.N.; Mohsin, M. The Power of Emotional Value: Exploring the Effects of Values on Green Product Consumer Choice Behavior. J. Clean. Prod. 2017, 150, 65–74. [Google Scholar] [CrossRef]
  498. Puustinen, P.; Maas, P.; Karjaluoto, H. Development and Validation of the Perceived Investment Value (PIV) Scale. J. Econ. Psychol. 2013, 36, 41–54. [Google Scholar] [CrossRef]
  499. Shaw, C. What’s Your Companies Emotion Score? Introducing Net Emotional Value (Nev) and Its Relationship to NPS and CSAT. 2012. Available online: https://beyondphilosophy.com/whats-your-companies-emotion-score-introducing-net-emotional-value-nev-and-its-relationship-to-nps-and-csat/ (accessed on 14 June 2022).
  500. Shaw, C. New CX Measure to Compliment NPS: Net Emotional Value. 2016. Available online: https://customerthink.com/new-cx-measure-to-compliment-nps-net-emotional-value/ (accessed on 14 June 2022).
  501. Shaw, C. How to Measure Customer Emotions. 2018. Available online: https://beyondphilosophy.com/measurecustomer-emotions/ (accessed on 14 June 2022).
  502. Situmorang, S.H. Gen C and Gen Y: Experience, Net Emotional Value and Net Promoter Score. In Proceedings of the 1st International Conference on Social and Political Development (ICOSOP 2016), Medan, Indonesia, 21–22 November 2016; Atlantis Press: Medan, Indonesia, 2017; pp. 259–265. [Google Scholar] [CrossRef] [Green Version]
  503. Williams, P.; Soutar, G.N. Value, Satisfaction and Behavioral Intentions in an Adventure Tourism Context. Ann. Tour. Res. 2009, 36, 413–438. [Google Scholar] [CrossRef]
  504. Bailey, J.J.; Gremler, D.D.; McCollough, M.A. Service Encounter Emotional Value: The Dyadic Influence of Customer and Employee Emotions. Serv. Mark. Q. 2001, 23, 1–24. [Google Scholar] [CrossRef]
  505. Zavadskas, E.K.; Bausys, R.; Kaklauskas, A.; Raslanas, S. Hedonic Shopping Rent Valuation by One-to-One Neuromarketing and Neutrosophic PROMETHEE Method. Appl. Soft Comput. 2019, 85, 105832. [Google Scholar] [CrossRef]
506. De Leersnyder, J.; Mesquita, B.; Boiger, M. What Has Culture Got to Do with Emotions? (A Lot). In Handbook of Advances in Culture and Psychology; Oxford University Press: Oxford, UK, 2021; Volume 8, pp. 62–119. [Google Scholar] [CrossRef]
  507. Frijda, N.H. The Laws of Emotion, 1st ed.; Psychology Press: New York, NY, USA, 2017. [Google Scholar] [CrossRef]
  508. Levenson, R.W. Human Emotions: A Functional View. In The Nature of Emotion: Fundamental Questions; Oxford University Press: New York, NY, USA, 1994; pp. 123–126. [Google Scholar]
  509. Nesse, R.M. Evolutionary Explanations of Emotions. Hum. Nat. 1990, 1, 261–289. [Google Scholar] [CrossRef]
  510. Bonanno, G.A.; Colak, D.M.; Keltner, D.; Shiota, M.N.; Papa, A.; Noll, J.G.; Putnam, F.W.; Trickett, P.K. Context Matters: The Benefits and Costs of Expressing Positive Emotion among Survivors of Childhood Sexual Abuse. Emotion 2007, 7, 824–837. [Google Scholar] [CrossRef]
  511. Coifman, K.G.; Bonanno, G.A. Emotion Context Sensitivity in Adaptation and Recovery. In Emotion Regulation and Psychopathology: A Transdiagnostic Approach to Etiology and Treatment; The Guilford Press: New York, NY, USA, 2010; pp. 157–173. [Google Scholar]
  512. Pugh, Z.H.; Huang, J.; Leshin, J.; Lindquist, K.A.; Nam, C.S. Culture and Gender Modulate DlPFC Integration in the Emotional Brain: Evidence from Dynamic Causal Modeling. Cogn. Neurodyn. 2022. Available online: https://link.springer.com/content/pdf/10.1007/s11571-022-09805-2.pdf (accessed on 14 June 2022). [CrossRef]
  513. Tomasino, B.; Maggioni, E.; Bonivento, C.; Nobile, M.; D’Agostini, S.; Arrigoni, F.; Fabbro, F.; Brambilla, P. Effects of Age and Gender on Neural Correlates of Emotion Imagery. Hum. Brain Mapp. 2022. [Google Scholar] [CrossRef]
  514. Hampton, R.S.; Varnum, M.E.W. The Cultural Neuroscience of Emotion Regulation. Cult. Brain 2018, 6, 130–150. [Google Scholar] [CrossRef]
  515. Rule, N.O.; Freeman, J.B.; Ambady, N. Culture in Social Neuroscience: A Review. Soc. Neurosci. 2013, 8, 3–10. [Google Scholar] [CrossRef]
  516. Kraus, M.W.; Piff, P.K.; Keltner, D. Social Class, Sense of Control, and Social Explanation. J. Personal. Soc. Psychol. 2009, 97, 992–1004. [Google Scholar] [CrossRef] [Green Version]
  517. Gallo, L.C.; Matthews, K.A. Understanding the Association between Socioeconomic Status and Physical Health: Do Negative Emotions Play a Role? Psychol. Bull. 2003, 129, 10–51. [Google Scholar] [CrossRef]
  518. Choudhury, S.; Nagel, S.K.; Slaby, J. Critical Neuroscience: Linking Neuroscience and Society through Critical Practice. BioSocieties 2009, 4, 61–77. [Google Scholar] [CrossRef]
  519. Goldfarb, M.G.; Brown, D.R. Diversifying Participation: The Rarity of Reporting Racial Demographics in Neuroimaging Research. NeuroImage 2022, 254, 119122. [Google Scholar] [CrossRef]
  520. Lane, R.D. From Reconstruction to Construction: The Power of Corrective Emotional Experiences in Memory Reconsolidation and Enduring Change. J. Am. Psychoanal. Assoc. 2018, 66, 507–516. [Google Scholar] [CrossRef]
  521. Nakamura, F. Creating or Performing Words? Observations on Contemporary Japanese Calligraphy. In Creativity and Cultural Improvisation; Routledge: Oxfordshire, UK, 2021; pp. 79–98. [Google Scholar]
  522. Markus, H.R.; Kitayama, S. Cultural Variation in the Self-Concept. In The Self: Interdisciplinary Approaches; Strauss, J., Goethals, G.R., Eds.; Springer: New York, NY, USA, 1991; pp. 18–48. [Google Scholar] [CrossRef]
  523. Mesquita, B.; Frijda, N.H. Cultural Variations in Emotions: A Review. Psychol. Bull. 1992, 112, 179–204. [Google Scholar] [CrossRef]
  524. Mesquita, B.; Leu, J. The Cultural Psychology of Emotion. In Handbook of Cultural Psychology; The Guilford Press: New York, NY, USA, 2007; pp. 734–759. [Google Scholar]
  525. Lim, N. Cultural Differences in Emotion: Differences in Emotional Arousal Level between the East and the West. Integr. Med. Res. 2016, 5, 105–109. [Google Scholar] [CrossRef] [Green Version]
  526. Hareli, S.; Kafetsios, K.; Hess, U. A Cross-Cultural Study on Emotion Expression and the Learning of Social Norms. Front. Psychol. 2015, 6, 1501. [Google Scholar] [CrossRef] [Green Version]
  527. Scollon, C.N.; Diener, E.; Oishi, S.; Biswas-Diener, R. Emotions Across Cultures and Methods. J. Cross-Cult. Psychol. 2004, 35, 304–326. [Google Scholar] [CrossRef]
  528. Siddiqui, H.U.R.; Shahzad, H.F.; Saleem, A.A.; Khan Khakwani, A.B.; Rustam, F.; Lee, E.; Ashraf, I.; Dudley, S. Respiration Based Non-Invasive Approach for Emotion Recognition Using Impulse Radio Ultra Wide Band Radar and Machine Learning. Sensors 2021, 21, 8336. [Google Scholar] [CrossRef]
  529. Houssein, E.H.; Hammad, A.; Ali, A.A. Human Emotion Recognition from EEG-Based Brain–Computer Interface Using Machine Learning: A Comprehensive Review. Neural Comput Applic 2022, 34, 12527–12557. [Google Scholar] [CrossRef]
  530. Shi, Y.; Zheng, X.; Li, T. Unconscious Emotion Recognition Based on Multi-Scale Sample Entropy. In Proceedings of the 2018 IEEE International Conference on Bioinformatics and Biomedicine (BIBM), Madrid, Spain, 3–6 December 2018; IEEE: Madrid, Spain, 2018; pp. 1221–1226. [Google Scholar] [CrossRef]
  531. Thomson, D.M.H.; Coates, T. Are Unconscious Emotions Important in Product Assessment? How Can We Access Them? Food Qual. Prefer. 2021, 92, 104123. [Google Scholar] [CrossRef]
  532. Poria, S.; Cambria, E.; Bajpai, R.; Hussain, A. A Review of Affective Computing: From Unimodal Analysis to Multimodal Fusion. Inf. Fusion 2017, 37, 98–125. [Google Scholar] [CrossRef]
  533. Caridakis, G.; Castellano, G.; Kessous, L.; Raouzaiou, A.; Malatesta, L.; Asteriadis, S.; Karpouzis, K. Multimodal Emotion Recognition from Expressive Faces, Body Gestures and Speech. In Artificial Intelligence and Innovations 2007: From Theory to Applications; Boukis, C., Pnevmatikakis, A., Polymenakos, L., Eds.; IFIP The International Federation for Information Processing; Springer: Boston, MA, USA, 2007; Volume 247, pp. 375–388. [Google Scholar] [CrossRef] [Green Version]
  534. Cambria, E.; Das, D.; Bandyopadhyay, S.; Feraco, A. Affective Computing and Sentiment Analysis. In A Practical Guide to Sentiment Analysis; Cambria, E., Das, D., Bandyopadhyay, S., Feraco, A., Eds.; Socio-Affective Computing; Springer International Publishing: Cham, Switzerland, 2017; Volume 5, pp. 102–107. [Google Scholar] [CrossRef]
535. Dhanapal, R.; Bhanu, D. Electroencephalogram Classification Using Various Artificial Neural Networks. J. Crit. Rev. 2020, 7, 891–894. [Google Scholar] [CrossRef]
  536. Gunawan, T.S.; Alghifari, M.F.; Morshidi, M.A.; Kartiwi, M. A Review on Emotion Recognition Algorithms Using Speech Analysis. Indones. J. Electr. Eng. Inform. 2018, 6, 12–20. [Google Scholar] [CrossRef] [Green Version]
  537. Sánchez-Reolid, R.; García, A.; Vicente-Querol, M.; Fernández-Aguilar, L.; López, M.; González, A. Artificial Neural Networks to Assess Emotional States from Brain-Computer Interface. Electronics 2018, 7, 384. [Google Scholar] [CrossRef] [Green Version]
  538. Nakisa, B.; Rastgoo, M.N.; Tjondronegoro, D.; Chandran, V. Evolutionary Computation Algorithms for Feature Selection of EEG-Based Emotion Recognition Using Mobile Sensors. Expert Syst. Appl. 2018, 93, 143–155. [Google Scholar] [CrossRef] [Green Version]
  539. Saxena, A.; Khanna, A.; Gupta, D. Emotion Recognition and Detection Methods: A Comprehensive Survey. J. Artif. Intell. Syst. 2020, 2, 53–79. [Google Scholar] [CrossRef]
  540. Ahmed, F.; Sieu, B.; Gavrilova, M.L. Score and Rank-Level Fusion for Emotion Recognition Using Genetic Algorithm. In Proceedings of the 2018 IEEE 17th International Conference on Cognitive Informatics & Cognitive Computing (ICCI*CC), IEEE, Berkeley, CA, USA, 7 October 2018; pp. 46–53. [Google Scholar] [CrossRef]
  541. Slimani, K.; Kas, M.; El Merabet, Y.; Ruichek, Y.; Messoussi, R. Local Feature Extraction Based Facial Emotion Recognition: A Survey. Int. J. Electr. Comput. Eng. 2020, 10, 4080–4092. [Google Scholar] [CrossRef]
  542. Maheshwari, D.; Ghosh, S.K.; Tripathy, R.K.; Sharma, M.; Acharya, U.R. Automated Accurate Emotion Recognition System Using Rhythm-Specific Deep Convolutional Neural Network Technique with Multi-Channel EEG Signals. Comput. Biol. Med. 2021, 134, 104428. [Google Scholar] [CrossRef]
  543. Zatarain Cabada, R.; Rodriguez Rangel, H.; Barron Estrada, M.L.; Cardenas Lopez, H.M. Hyperparameter Optimization in CNN for Learning-Centered Emotion Recognition for Intelligent Tutoring Systems. Soft Comput. 2020, 24, 7593–7602. [Google Scholar] [CrossRef]
  544. Poria, S.; Hazarika, D.; Majumder, N.; Naik, G.; Cambria, E.; Mihalcea, R. MELD: A Multimodal Multi-Party Dataset for Emotion Recognition in Conversations. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, Florence, Italy, 28 July–2 August 2019; Association for Computational Linguistics: Florence, Italy, 2019; pp. 527–536. [Google Scholar] [CrossRef] [Green Version]
  545. Xu, Y.; Sun, Y.; Liu, X.; Zheng, Y. A Digital-Twin-Assisted Fault Diagnosis Using Deep Transfer Learning. IEEE Access 2019, 7, 19990–19999. [Google Scholar] [CrossRef]
  546. Daneshfar, F.; Kabudian, S.J.; Neekabadi, A. Speech Emotion Recognition Using Hybrid Spectral-Prosodic Features of Speech Signal/Glottal Waveform, Metaheuristic-Based Dimensionality Reduction, and Gaussian Elliptical Basis Function Network Classifier. Appl. Acoust. 2020, 166, 107360. [Google Scholar] [CrossRef]
  547. Shi, W.; Jiang, M. Fuzzy Wavelet Network with Feature Fusion and LM Algorithm for Facial Emotion Recognition. In Proceedings of the 2018 IEEE International Conference of Safety Produce Informatization (IICSPI), Chongqing, China, 10–12 December 2018; IEEE: Chongqing, China, 2018; pp. 582–586. [Google Scholar] [CrossRef]
  548. Yildirim, S.; Kaya, Y.; Kılıç, F. A Modified Feature Selection Method Based on Metaheuristic Algorithms for Speech Emotion Recognition. Appl. Acoust. 2021, 173, 107721. [Google Scholar] [CrossRef]
  549. Bellamkonda, S.S. Facial Emotion Recognition by Hyper-Parameter Tuning of Convolutional Neural Network Using Genetic Algorithm. 2021. Available online: http://urn.kb.se/resolve?urn=urn:nbn:se:bth-22308 (accessed on 14 June 2022).
  550. Jalili, L.; Cervantes, J.; García-Lamont, F.; Trueba, A. Emotion Recognition from Facial Expressions Using a Genetic Algorithm to Feature Extraction. In Intelligent Computing Theories and Application; Huang, D.-S., Jo, K.-H., Li, J., Gribova, V., Bevilacqua, V., Eds.; Lecture Notes in Computer Science; Springer International Publishing: Cham, Switzerland, 2021; Volume 12836, pp. 59–71. [Google Scholar] [CrossRef]
  551. Sun, L.; Li, Q.; Fu, S.; Li, P. Speech Emotion Recognition Based on Genetic Algorithm–Decision Tree Fusion of Deep and Acoustic Features. ETRI J. 2022, 44, 462–475. [Google Scholar] [CrossRef]
  552. Madhoushi, Z.; Hamdan, A.R.; Zainudin, S. Sentiment Analysis Techniques in Recent Works. In Proceedings of the 2015 Science and Information Conference (SAI), London, UK, 28–30 August 2015; IEEE: London, UK, 2015; pp. 288–291. [Google Scholar] [CrossRef]
  553. Li, G.; Zhou, X.; Cao, L. AI Meets Database: AI4DB and DB4AI. In Proceedings of the 2021 International Conference on Management of Data, Virtual Event, China; 2021; pp. 2859–2866. Available online: https://dbgroup.cs.tsinghua.edu.cn/ligl/papers/sigmod21-tutorial-paper.pdf (accessed on 14 June 2022). [CrossRef]
  554. Arnau-Gonzalez, P.; Katsigiannis, S.; Arevalillo-Herraez, M.; Ramzan, N. BED: A New Data Set for EEG-Based Biometrics. IEEE Internet Things J. 2021, 8, 12219–12230. [Google Scholar] [CrossRef]
  555. Stappen, L.; Schuller, B.; Lefter, I.; Cambria, E.; Kompatsiaris, I. Summary of MuSe 2020: Multimodal Sentiment Analysis, Emotion-Target Engagement and Trustworthiness Detection in Real-Life Media. In Proceedings of the 28th ACM International Conference on Multimedia, Seattle, WA, USA; ACM: Seattle, WA, USA, 2020; pp. 4769–4770. Available online: https://dl.acm.org/doi/pdf/10.1145/3394171.3421901 (accessed on 14 June 2022). [CrossRef]
  556. Poria, S.; Majumder, N.; Mihalcea, R.; Hovy, E. Emotion Recognition in Conversation: Research Challenges, Datasets, and Recent Advances. IEEE Access 2019, 7, 100943–100953. [Google Scholar] [CrossRef]
  557. Petta, P.; Pelachaud, C.; Cowie, R. (Eds.) Emotion-Oriented Systems: The Humaine Handbook; Cognitive Technologies; Springer: Berlin, Germany; London, UK, 2011. [Google Scholar]
  558. Busso, C.; Bulut, M.; Lee, C.-C.; Kazemzadeh, A.; Mower, E.; Kim, S.; Chang, J.N.; Lee, S.; Narayanan, S.S. IEMOCAP: Interactive Emotional Dyadic Motion Capture Database. Lang Resour. Eval. 2008, 42, 335–359. [Google Scholar] [CrossRef]
  559. Douglas-Cowie, E.; Campbell, N.; Cowie, R.; Roach, P. Emotional Speech: Towards a New Generation of Databases. Speech Commun. 2003, 40, 33–60. [Google Scholar] [CrossRef] [Green Version]
  560. McKeown, G.; Valstar, M.; Cowie, R.; Pantic, M.; Schroder, M. The SEMAINE Database: Annotated Multimodal Records of Emotionally Colored Conversations between a Person and a Limited Agent. IEEE Trans. Affect. Comput. 2012, 3, 5–17. [Google Scholar] [CrossRef] [Green Version]
  561. Koelstra, S.; Muhl, C.; Soleymani, M.; Lee, J.-S.; Yazdani, A.; Ebrahimi, T.; Pun, T.; Nijholt, A.; Patras, I. DEAP: A Database for Emotion Analysis; Using Physiological Signals. IEEE Trans. Affect. Comput. 2012, 3, 18–31. [Google Scholar] [CrossRef] [Green Version]
  562. Katsigiannis, S.; Ramzan, N. DREAMER: A Database for Emotion Recognition Through EEG and ECG Signals from Wireless Low-Cost Off-the-Shelf Devices. IEEE J. Biomed. Health Inform. 2018, 22, 98–107. [Google Scholar] [CrossRef] [Green Version]
  563. GitHub. EEG-Datasets. Available online: https://github.com/meagmohit/EEG-Datasets (accessed on 17 August 2022).
  564. Olivas, E.S.; Guerrero, J.D.M.; Martinez-Sober, M.; Magdalena-Benedito, J.R.; Serrano, L. Handbook of Research on Machine Learning Applications and Trends: Algorithms, Methods and Techniques; IGI Global: Hershey, PA, USA, 2009. [Google Scholar]
  565. Haerpfer, C.; Inglehart, R.; Moreno, A.; Welzel, C.; Kizilova, K.; Diez-Medrano, J.; Lagos, M.; Norris, P.; Ponarin, E.; Puranen, B. World Values Survey Wave 7 (2017–2022) Cross-National Data-Set. 2022. Available online: https://www.worldvaluessurvey.org/WVSDocumentationWV7.jsp (accessed on 14 June 2022). [CrossRef]
  566. Sýkorová, K.; Flegr, J. Faster Life History Strategy Manifests Itself by Lower Age at Menarche, Higher Sexual Desire, and Earlier Reproduction in People with Worse Health. Sci. Rep. 2021, 11, 11254. [Google Scholar] [CrossRef]
  567. Wlezien, C. Patterns of Representation: Dynamics of Public Preferences and Policy. J. Politics 2004, 66, 1–24. [Google Scholar] [CrossRef] [Green Version]
  568. Kelley, K.; Preacher, K.J. On effect size. Psychol. Methods 2012, 17, 137–152. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  569. Wilkinson, L. Task Force on Statistical Inference, American Psychological Association, Science Directorate. Statistical methods in psychology journals: Guidelines and explanations. Am. Psychol. 1999, 54, 594–604. [Google Scholar] [CrossRef]
  570. Durlak, J.A. How to select, calculate, and interpret effect sizes. J. Pediatric Psychol. 2009, 34, 917–928. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  571. Courville, T.; Thompson, B. Use of structure coefficients in published multiple regression articles: β is not enough. Educ. Psychol. Meas. 2001, 61, 229–248. [Google Scholar] [CrossRef] [Green Version]
  572. Johnson, J.W. A heuristic method for estimating the relative weight of predictor variables in multiple regression. Multivar. Behav. Res. 2000, 35, 1–19. [Google Scholar] [CrossRef]
  573. Depuydt, C.E.; Jonckheere, J.; Berth, M.; Salembier, G.M.; Vereecken, A.J.; Bogers, J.J. Serial type-specific human papillomavirus (HPV) load measurement allows differentiation between regressing cervical lesions and serial virion productive transient infections. Cancer Med. 2015, 4, 1294–1302. [Google Scholar] [CrossRef]
  574. Funder, D.C.; Ozer, D.J. Evaluating effect size in psychological research: Sense and nonsense. Adv. Methods Pract. Psychol. Sci. 2019, 2, 156–168. [Google Scholar] [CrossRef]
  575. Pogrow, S. How effect size (practical significance) misleads clinical practice: The case for switching to practical benefit to assess applied research findings. Am. Stat. 2019, 73, 223–234. [Google Scholar] [CrossRef] [Green Version]
  576. Tabassi, E.; Wilson, C. A novel approach to fingerprint image quality. In Proceedings of the International Conference on Image Processing (ICIP'05), Genoa, Italy, 11–14 September 2005; IEEE: Genoa, Italy, 2005; pp. 37–40. [Google Scholar]
  577. El-Abed, M.; Giot, R.; Charrier, C.; Rosenberger, C. Evaluation of biometric systems: An svm-based quality index. In Proceedings of the Third Norsk Information Security Conference, NISK; 2010; pp. 57–68. Available online: https://hal.archives-ouvertes.fr/hal-00995094/ (accessed on 14 June 2022).
  578. ISO 13407:1999. Human Centred Design Process for Interactive Systems. Available online: https://www.iso.org/obp/ui/#iso:std:iso:13407:ed-1:v1:en (accessed on 10 May 2022).
  579. Giot, R.; El-Abed, M.; Rosenberger, C. Fast computation of the performance evaluation of biometric systems: Application to multibiometrics. Future Gener. Comput. Syst. 2013, 29, 788–799. [Google Scholar] [CrossRef]
  580. Mansfield, A. ISO/IEC 19795-1:2006; Information Technology–Biometric Performance Testing and Reporting–Part 1: Principles and Framework. 2006. Available online: https://www.iso.org/standard/41447.html (accessed on 14 June 2022).
  581. ISO/IEC FCD 19792; Information Technology—Security Techniques—Security Evaluation of Biometrics. Available online: https://webstore.iec.ch/preview/info_isoiec19792%7Bed1.0%7Den.pdf (accessed on 10 May 2022).
  582. Rane, S. Standardization of biometric template protection. IEEE MultiMedia 2014, 21, 94–99. [Google Scholar] [CrossRef]
  583. Dube, A.; Singh, D.; Asthana, R.K.; Walia, G.S. A Framework for Evaluation of Biometric Based Authentication System. In Proceedings of the 2020 3rd International Conference on Intelligent Sustainable Systems, ICISS, Thoothukudi, India, 3–5 December 2020; IEEE: Thoothukudi, India, 2020; pp. 925–932. [Google Scholar]
  584. Mannepalli, K.; Sastry, P.N.; Suman, M. FDBN: Design and development of Fractional Deep Belief Networks for speaker emotion recognition. Int. J. Speech Technol. 2016, 19, 779–790. [Google Scholar] [CrossRef]
  585. Al-Shayea, Q.; Al-Ani, M. Biometric face recognition based on enhanced histogram approach. Int. J. Commun. Netw. Inf. Secur. 2018, 10, 148–154. [Google Scholar] [CrossRef]
  586. Valiyavalappil Haridas, A.; Marimuthu, R.; Sivakumar, V.G.; Chakraborty, B. Emotion recognition of speech signal using Taylor series and deep belief network based classification. Evol. Intell. 2020, 15, 1145–1158. [Google Scholar] [CrossRef]
  587. Arora, M.; Kumar, M. AutoFER: PCA and PSO based automatic facial emotion recognition. Multimed. Tools Appl. 2021, 80, 3039–3049. [Google Scholar] [CrossRef]
  588. Karmarkar, U.R.; Plassmann, H. Consumer neuroscience: Past, present, and future. Organ. Res. Methods 2019, 22, 174–195. [Google Scholar] [CrossRef]
  589. Seitamaa-Hakkarainen, P.; Huotilainen, M.; Mäkelä, M.; Groth, C.; Hakkarainen, K. The Promise of Cognitive Neuroscience in Design Studies. Available online: https://dl.designresearchsociety.org/drs-conference-papers/drs2014/researchpapers/62 (accessed on 14 June 2022).
  590. Su, F.; Xia, L.; Cai, A.; Ma, J. A dual-biometric-modality identification system based on fingerprint and EEG. In Proceedings of the IEEE 4th International Conference on Biometrics Theory, Applications and Systems, BTAS, Washington, DC, USA, 27–29 September 2010; IEEE: Washington, DC, USA, 2010; pp. 3–8. [Google Scholar]
  591. Pal, S.; Mitra, M. Increasing the accuracy of ECG based biometric analysis by data modelling. Measurement 2012, 45, 1927–1932. [Google Scholar] [CrossRef]
  592. Singh, Y.N.; Singh, S.K.; Gupta, P. Fusion of electrocardiogram with unobtrusive biometrics: An efficient individual authentication system. Pattern Recognit. Lett. 2012, 33, 1932–1941. [Google Scholar] [CrossRef]
  593. Lourenço, A.; Silva, H.; Fred, A. Unveiling the biometric potential of finger-based ECG signals. Comput. Intell. Neurosci. 2011, 2011, 1–8. [Google Scholar] [CrossRef] [Green Version]
  594. Wahabi, S.; Member, S.; Pouryayevali, S.; Member, S. On evaluating ECG biometric systems: Session-dependence and body posture. IEEE Trans. Inf. Forensics Secur. 2014, 9, 2002–2013. [Google Scholar] [CrossRef]
  595. Havenetidis, K. Encryption and Biometrics: Context, methodologies and perspectives of biological data. J. Appl. Math. Bioinform. 2013, 3, 141. [Google Scholar]
  596. Sanjeeva Reddy, M.; Narasimha, B.; Suresh, E.; Subba Rao, K. Analysis of EOG signals using wavelet transform for detecting eye blinks. In Proceedings of the 2010 International Conference on Wireless Communications & Signal Processing, WCSP 2010, Suzhou, China, 21–23 October 2010; pp. 1–3. [Google Scholar]
  597. Punsawad, Y.; Wongsawat, Y.; Parnichkun, M. Hybrid EEG-EOG brain-computer interface system for practical machine control. In Proceedings of the 2010 Annual International Conference of the IEEE Engineering in Medicine Biology Society, EMBC 2010, Buenos Aires, Argentina, 31 August–4 September 2010; pp. 1360–1363. [Google Scholar]
  598. Zapata, J.C.; Duque, C.M.; Rojas-Idarraga, Y.; Gonzalez, M.E.; Guzmán, J.A.; Botero, B. Data fusion applied to biometric identification–A review. In Colombian Conference on Computing; Springer: Cham, Switzerland, 2017; pp. 721–733. [Google Scholar]
  599. Gutu, D. A Study of Facial Electromyography for Improving Image Quality Assessment. Ph.D. Thesis, University of Toyama, Toyama, Japan, 2015. [Google Scholar]
  600. Jain, A.K.; Ross, A.; Prabhakar, S. An introduction to biometric recognition. IEEE Trans. Circuits Syst. Video Technol. 2004, 14, 4–20. [Google Scholar] [CrossRef] [Green Version]
  601. National Research Council. Biometric Recognition: Challenges and Opportunities; The National Academies Press: Washington, DC, USA, 2010; p. 182. [Google Scholar] [CrossRef]
  602. Bhatia, R. Biometrics and face recognition techniques. Int. J. Adv. Res. Comput. Sci. Softw. Eng. 2013, 3, 93–99. [Google Scholar]
  603. Sabhanayagam, T.; Venkatesan, V.P.; Senthamaraikannan, K. A comprehensive survey on various biometric systems. Int. J. Appl. Eng. Res. 2018, 13, 2276–2297. [Google Scholar]
  604. Delac, K.; Grgic, M. A survey of biometric recognition methods. In Proceedings of Elmar-2004, 46th International Symposium on Electronics in Marine, Zadar, Croatia, 16–18 June 2004; IEEE: Zadar, Croatia, 2004; pp. 184–193. [Google Scholar]
  605. Kataria, A.N.; Adhyaru, D.M.; Sharma, A.K.; Zaveri, T.H. A survey of automated biometric authentication techniques. In Proceedings of the 2013 Nirma University International Conference on Engineering (NUiCONE), Ahmedabad, India, 28–30 November 2013; IEEE: Ahmedabad, India, 2013; pp. 1–6. [Google Scholar]
  606. Khairwa, A.; Abhishek, K.; Prakash, S.; Pratap, T. A comprehensive study of various biometric identification techniques. In Proceedings of the 2012 Third International Conference on Computing, Communication and Networking Technologies (ICCCNT’12), Coimbatore, India, 26–28 July 2012; IEEE: Coimbatore, India, 2012; pp. 1–6. [Google Scholar]
  607. Ooms, K.; Dupont, L.; Lapon, L.; Popelka, S. Accuracy and precision of fixation locations recorded with the Low-cost Eye Tribe tracker in different experimental setups. J. Eye Mov. Res. 2015, 8, 1–24. [Google Scholar] [CrossRef]
  608. Lopez-Basterretxea, A.; Mendez-Zorrilla, A.; Garcia-Zapirain, B. Eye/head tracking technology to improve HCI with iPad applications. Sensors 2015, 15, 2244–2264. [Google Scholar] [CrossRef] [Green Version]
  609. Harinda, E.; Ntagwirumugara, E. Security & privacy implications in the placement of biometric-based ID card for Rwanda Universities. J. Inf. Secur. 2015, 6, 93. [Google Scholar] [CrossRef] [Green Version]
  610. Ibrahim, D.R.; Tamimi, A.A.; Abdalla, A.M. Performance analysis of biometric recognition modalities. In Proceedings of the 2017 8th International Conference on Information Technology (ICIT); IEEE: Amman, Jordan, 2017; pp. 980–984. Available online: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=8079977 (accessed on 14 June 2022).
  611. Vats, S.; Kaur, H. A Comparative Study of Different Biometric Features. Int. J. Adv. Res. Comput. Sci. 2016, 7, 30–35. [Google Scholar] [CrossRef]
  612. Yu, F.X.; Suo, Y.N. Application of gesture recognition based on the somatosensory kinect sensor in human-computer interaction framework. Rev. Fac. Ing. 2017, 32, 580–585. [Google Scholar]
  613. Meitram, R.; Choudhary, P. Palm vein recognition based on 2D Gabor filter and artificial neural network. J. Adv. Inf. Technol. 2018, 9, 68–72. [Google Scholar] [CrossRef]
  614. Ahmed, A.A.E.; Traore, I. A new biometric technology based on mouse dynamics. IEEE Trans. Dependable Secur. Comput. 2007, 4, 165–179. [Google Scholar] [CrossRef]
  615. Trewin, S.; Swart, C.; Koved, L.; Martino, J.; Singh, K.; Ben-David, S. Biometric authentication on a mobile device: A study of user effort, error and task disruption. In Proceedings of the 28th Annual Computer Security Applications Conference, ACSAC, New York, NY, USA, 3–7 December 2012; pp. 159–168. [Google Scholar] [CrossRef]
  616. Haghighat, M.; Abdel-Mottaleb, M.; Alhalabi, W. Discriminant Correlation Analysis: Real-Time Feature Level Fusion for Multimodal Biometric Recognition. IEEE Trans. Inf. Forensics Secur. 2016, 11, 1984–1996. [Google Scholar] [CrossRef]
  617. Flook, B. This is the 'biometric war' Michael Saylor was talking about. Wash. Bus. J. 2013, 9, 91–98. [Google Scholar]
  618. Islam, M. Feature and score fusion based multiple classifier selection for iris recognition. Comput. Intell. Neurosci. 2014, 2014, 380585. [Google Scholar] [CrossRef]
  619. De Leersnyder, J.; Mesquita, B.; Kim, H.S. Where Do My Emotions Belong? A Study of Immigrants’ Emotional Acculturation. Pers. Soc. Psychol. Bull. 2011, 37, 451–463. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  620. Vuong, Q.H.; Napier, N.K. Acculturation and Global Mindsponge: An Emerging Market Perspective. Int. J. Intercult. Relat. 2015, 49, 354–367. [Google Scholar] [CrossRef]
  621. Vuong, Q.-H. Global Mindset as the Integration of Emerging Socio-Cultural Values through Mindsponge Processes: A Transition Economy Perspective. In Global Mindsets: Exploration and Perspectives; Routledge: London, UK, 2016; pp. 109–126. [Google Scholar]
  622. Vuong, Q.-H.; Bui, Q.-K.; La, V.-P.; Vuong, T.-T.; Nguyen, V.-H.T.; Ho, M.-T.; Nguyen, H.-K.T.; Ho, M.-T. Cultural Additivity: Behavioural Insights from the Interaction of Confucianism, Buddhism and Taoism in Folktales. Palgrave Commun. 2018, 4, 143. [Google Scholar] [CrossRef] [Green Version]
  623. Vuong, Q.-H.; Ho, M.-T.; Nguyen, H.-K.T.; Vuong, T.-T.; Tran, T.; Hoang, K.-L.; Vu, T.-H.; Hoang, P.-H.; Nguyen, M.-H.; Ho, M.-T.; et al. On How Religions Could Accidentally Incite Lies and Violence: Folktales as a Cultural Transmitter. Palgrave Commun. 2020, 6, 82. [Google Scholar] [CrossRef]
  624. Ho, M.-T.; Mantello, P.; Nguyen, H.-K.T.; Vuong, Q.-H. Affective Computing Scholarship and the Rise of China: A View from 25 Years of Bibliometric Data. Hum. Soc. Sci. Commun. 2021, 8, 282. [Google Scholar] [CrossRef]
  625. FaceReader. Reference Manual Version 7. Tool for Automatic Analysis of Facial Expressions. Available online: http://sslab.nwpu.edu.cn/uploads/1500604789-971697563f64.pdf (accessed on 2 March 2022).
  626. Kaklauskas, A.; Abraham, A.; Milevicius, V. Diurnal Emotions, Valence and the Coronavirus Lockdown Analysis in Public Spaces. Eng. Appl. Artif. Intell. 2021, 98, 104122. [Google Scholar] [CrossRef]
  627. Sun, Z.; Li, Q.; Liu, Y.; Zhu, Y. Opportunities and Challenges for Biometrics. China’s E-Sci. Blue Book 2021, 101–125. [Google Scholar]
  628. Albuquerque, V.H.C.D.; Damaševičius, R.; Tavares, J.M.R.; Pinheiro, P.R. EEG-based biometrics: Challenges and applications. Comput. Intell. Neurosci. 2018, 2018, 5483921. [Google Scholar] [CrossRef] [Green Version]
  629. Fierrez, J.; Morales, A.; Vera-Rodriguez, R.; Camacho, D. Multiple classifiers in biometrics. Part 2: Trends and challenges. Inf. Fusion 2018, 44, 103–112. [Google Scholar] [CrossRef]
  630. Sivaraman, S. Top 10 Trending Biometric Technology for 2022. Available online: https://blog.mantratec.com/Top-10-trending-Biometric-technology-for-2022 (accessed on 2 March 2022).
Figure 1. Plutchik’s wheel of emotions, biometrics and neuroscience sensors, and trends.
Figure 2. Neuroscience and biometric branches analyzing AFFECT in various sciences and fields.
Figure 3. Resting state TMS brain scan image [287].
Figure 4. Raw images of fPET and fMRI scans [288].
Figure 5. DTI can be used to construct a transversely isotropic model by overlaying axonal fiber tractography on a finite element mesh: (a) DTI-informed finite element model; tractography shows complex fibers from (b) the dorsal view, (c) the right lateral side view, and (d) the posterior view. Tract position and direction are color-coded: red for right-left, blue for foot-head, green for anterior-posterior [289].
Figure 6. Sample of various kinds of eye-tracking tools: (a) eye-tracking glasses [314]; (b) helmet-mounted [315]; (c) remote or table [316].
Figure 7. Facial expression recognition: (a) feature point tracking; (b) dense flow tracking [317].
Figure 8. Placement of fEMG electrodes and a sample of a filtered EMG signal [319].
Figure 9. Other examples of biometric traits.
Figure 10. Real-time Vilnius Happiness Index (a) and the mean magnitudes of valence, by the hour, on weekdays (b).
Figure 11. Distribution of correlations based on 15 criteria applied to 169 countries, their publications, and citations, as a CSP map.
Figure 12. CSP map showing the success of countries in terms of the numbers of publications on AFFECT recognition (CSPN) in Web of Science journals with impact factor.
Figure 13. CSP map showing the success of countries in terms of the number of citations of their publications on AFFECT recognition (CSPC) in Web of Science journals with impact factor.
Figure 14. CSP map showing the number of articles on AFFECT recognition and the numbers of citations in Web of Science journals with impact factor.
Table 1. Traditional non-invasive neuroscience methods.
Methods | Author(s) | Description
Electroencephalography (EEG) | [111,253,254,255,256,257,258,259,260,261,262,263,264,265,266] | EEGs capture brainwave variations, using recorded amplitudes to monitor mental states that include alpha waves (relaxation), beta waves (wakefulness), delta waves (sleep), and theta waves (calmness) [255]. An EEG signal comprises five brain waves, and measuring the activity of certain brain areas can reveal the state of the subject's cortical activation. Each wave is characterized by different amplitudes and frequencies, and corresponds to distinct cognitive states [265] (an illustrative band-power sketch follows after this table).
Magnetoencephalography (MEG) | [111,253,254,255,256,259,260,267] | Using magnetic potentials, an MEG records brain activity at the scalp level. A helmet with sensitive detectors is placed on the subject's head to track the signal [255], and the MEG detects the magnetic fields produced by the brain's electrical activity [111].
Transcranial Magnetic Stimulation (TMS) (Figure 3) | [111,251,253,255,258,260,267] | TMS modulates the activity of certain brain areas located 1–2 cm below the skull, without reaching the neocortex, using magnetic induction [255]. When TMS is used, short electromagnetic impulses are applied at the scalp level. This instrument can stimulate or inhibit a particular cortical area [111].
Near Infrared Spectroscopy (NIRS) | [267,268,269] | NIRS measures hemodynamic alterations accompanying brain activation and is a simple bedside technique [269]. NIRS makes use of the near-infrared region of the electromagnetic spectrum (about 700–2500 nm). Measurements are taken of light scattered from the surface of and through a sample, and NIR reflectance spectra can give rapid insight into the properties of a material without altering the sample [268].
Steady-State Topography (SST) | [251,253,255,256,260] | SST can be applied to track high-speed changes and measure the activity of the human brain. This tool is very commonly used in neuromarketing research and cognitive neuroscience [255].
Functional Magnetic Resonance Imaging (fMRI) (Figure 4) | [111,251,253,254,255,256,258,259,260,261,263,264,266,267] | fMRI is suitable for use within neuromarketing studies, as brain activity can be measured in subjects performing certain tasks or experiencing marketing stimuli. It allows for the observation of deep brain structures, and hence can reveal patterns [255]. fMRI can also measure increases in oxygen levels in the blood flow to the brain and can detect the active cortical regions [111].
Positron Emission Tomography (PET) (Figure 4) | [111,251,253,254,256,259,260,261,267] | The subject is injected with a radioactive substance, and the flow of the substance is then measured. Significant increases in the flow are seen in activated areas [111].
Diffusion Tensor Imaging (DTI) (Figure 5) | [267,270,271] | This is an MRI-based neuroimaging technique that allows the user to estimate the location, anisotropy and orientation of the brain's white matter tracts [271]. DTI makes it possible to visualize and characterize white matter fasciculi in two and three dimensions [270].
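Table 1 repeatedly refers to the delta, theta, alpha, and beta frequency bands that EEG-based studies quantify. The following is a minimal illustrative sketch, not drawn from any of the reviewed papers, of how such band powers can be computed with standard Python tooling; the 256 Hz sampling rate, the band limits, and the synthetic test signal are assumptions made only for this example.

```python
# Illustrative EEG band-power extraction for the frequency bands named in Table 1.
import numpy as np
from scipy.signal import welch
from scipy.integrate import simpson

FS = 256  # sampling rate in Hz (assumed)
BANDS = {                 # approximate band limits in Hz
    "delta": (0.5, 4),    # sleep
    "theta": (4, 8),      # calmness
    "alpha": (8, 13),     # relaxation
    "beta": (13, 30),     # wakefulness
    "gamma": (30, 45),
}

def band_powers(eeg: np.ndarray, fs: int = FS) -> dict:
    """Absolute power per band, integrated from Welch's power spectral density."""
    freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs)
    powers = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        powers[name] = simpson(psd[mask], x=freqs[mask])
    return powers

if __name__ == "__main__":
    t = np.arange(0, 10, 1 / FS)                      # 10 s synthetic segment
    eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.default_rng(0).normal(size=t.size)
    print(band_powers(eeg))                           # alpha power should dominate
```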
Table 2. Physiological and behavioral biometrics.
Technique | Author(s) | Description
Physical/Physiological Features
Eye Tracking (ET) (Figure 6) | [111,251,253,254,255,256,257,258,259,260,261,264,265,266,267] | ET determines the areas at which the subject is looking and for how long, and also tracks the movement of the subject's eyes and changes in pupil dilation while the subject looks at stimuli. With this technique, behavior and cognition can be studied without measuring brain activity [255]. By measuring eye movements and visual attention, an eye tracker determines the point of regard [265].
Blinking | [261,264,293] | Eye blinking forms the basis of the new biometric emotions identifier proposed by Abo-Zahhad et al. [293]. These authors outline where eye blinking signals come from and give an overview of the features of the EOG signals from which the eye blinking waveform is extracted.
Iris characteristics | – | User-oriented examinations were applied to find the relationships between personality and three common iris characteristics: pigment dots, crypts, and contraction furrows [294]. Dark-eyed individuals typically have higher scores for neuroticism and extraversion [295], sociability [296], and ease of emotional arousal [297].
Facial Action Coding (FC)/Facial Expression Analysis Surveys (Figure 7) | [253,254,255,256,257,258,260,261,263,264,265,298] | FC uses a video camera to track micro-expressions that correspond to certain subconscious reactions. The activity of the facial muscles is tracked [255]. Scientists and practitioners have developed various open datasets (KaoKore Dataset, CelebFaces Attributes Dataset, etc.) and applied elicitation techniques (gamification, virtual reality) in practice.
Facial Electromyography (fEMG) (Figure 8) | [251,253,254,255,256,259,260,261,262,263,298,299] | fEMG is used in measuring and evaluating the physiological properties of facial muscles [255].
Odor | [300] | This is a method of emotion recognition based on an individual's odor [300]. An emotional mood, for example a period of depression, may affect body odor [301].
Keystroke dynamics and mouse movements (Figure 9) | [302] | AFFECT states can be determined by how a person moves a computer mouse while sitting at a computer.
Skin Conductance (SC)/Galvanometer or Galvanic Skin Response (GSR) | [111,251,253,255,256,258,260,261,262,264,265,267] | SC is highly correlated with the rate of perspiration, and is often linked to stress as well as to the processes happening in the nervous system [261]. SC methods measure arousal based on tiny changes in conductance that occur when something activates the autonomic nervous system [255]. The sympathetic branch of the autonomic nervous system controls the skin's sweat glands, and the activity of the glands determines the galvanic skin response [265].
Heart rate (HR)/Electrocardiogram (ECG) | [19,111,251,256,261,303] | An ECG is used to measure the electrical activity of the heart [261]. An ECG relies on cardiac electrical activity and measures the electrical impulses that travel through the heart with each beat, causing the heart muscle to pump blood. In ECGs of a normal heartbeat, the timing of the lower and top chambers of the heart is charted [303].
Respiratory Rate Assessment (RRA) | [111,261,304] | Respiratory rate, one of the fundamental vital signs, is sensitive to various pathological situations (clinical deterioration, pneumonia, adverse cardiac events, etc.), as well as stressors [304].
Skin temperature (SKT) | [305] | SKT data can be used to measure the thermal responses of human skin. SKT depends on the complex relationship between blood perfusion in the skin layers, heat exchange with the environment, and the central warmer regions of the skin [305].
Photoplethysmography (PPG) or Blood volume pulse (BVP) | [305] | Changes in the amplitudes of PPG signals are related to the level of tension in a human being. PPG is a simple, non-invasive method of taking measurements of the cardiac synchronous changes in the blood volume [305].
Trapezium electromyogram | [306] | EMG is a technique that can be used to evaluate and record the electrical activity generated by skeletal muscle [306], for example the trapezius muscle [307].
Neurotransmitter (NT) | [251,308] | Brain neurotransmitters are particular chemical substances that act as messengers in chemical synaptic transmissions and can transmit emotive information. They have excitatory and inhibitory abilities [308].
Voice/Speech/Voice Pitch Analysis (VPA) | [263,267,300,309,310] | This is a method of emotion recognition that relies on the person's voice.
Implicit Association Test (IAT) | [255,264,311] | IAT measures individual behavior and experience by assessing the reaction times of subjects to determine their inner attitudes. The subjects are given two cognitive tasks, and measurements are taken of the speed at which they associate two distinct concepts (brands, advertisements, etc.) with two distinct assessed features. IATs can be used to identify hierarchies of products by means of comparisons [255].
Mouse Tracking (MT) | [257,312] | Recognition of a user's emotions is possible based on their mouse movements. Users can be classified by extracting features from raw data on mouse movements and employing complex machine learning techniques (e.g., a support vector machine (SVM)) and basic machine learning techniques (e.g., k-nearest neighbor) [312] (see the illustrative sketch after this table).
Signature (Figure 9) | [298,299,300,309] | A person's emotions can be identified from their handwriting style, and in particular their signature.
Gait (Figure 9) | [298,299,300,309] | This method allows for emotion recognition based on a person's walking style or gait [300].
Lip Movement | [299] | Lip movement measurement is a recently developed form of biometric emotion recognition that is very similar to the way a deaf person determines what is being said by tracking lip movements [299].
Gesture | [298,309] | Gesture recognition is used to identify emotions rather than a person, and gestures are grouped into certain categories [298].
Keystroke/Typing Recognition (Figure 9) | [169,300] | In this method, the unique characteristics of a person's typing style are used for emotion identification purposes [300].
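Table 2 notes that mouse-tracking studies classify users by extracting features from raw cursor trajectories and feeding them to an SVM or a k-nearest-neighbor classifier [312]. The sketch below is a hypothetical, self-contained illustration of that pipeline using scikit-learn; the kinematic feature definitions, the synthetic "calm" versus "agitated" trajectories, and all parameter values are assumptions for demonstration only, not a reproduction of any reviewed study.

```python
# Hypothetical mouse-tracking pipeline: kinematic features + SVM / k-NN classifiers.
import numpy as np
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def trajectory_features(xy: np.ndarray, dt: float = 0.01) -> np.ndarray:
    """Mean/peak speed, mean absolute acceleration, and total path length."""
    step = np.diff(xy, axis=0)                      # displacement per sample
    speed = np.linalg.norm(step, axis=1) / dt
    accel = np.diff(speed) / dt
    return np.array([speed.mean(), speed.max(), np.abs(accel).mean(),
                     np.linalg.norm(step, axis=1).sum()])

rng = np.random.default_rng(0)
X, y = [], []
for label, jitter in [(0, 1.0), (1, 5.0)]:          # synthetic "calm" vs "agitated" cursor paths
    for _ in range(50):
        path = np.cumsum(rng.normal(0.0, jitter, size=(200, 2)), axis=0)
        X.append(trajectory_features(path))
        y.append(label)
X, y = np.array(X), np.array(y)

for clf in (SVC(kernel="rbf"), KNeighborsClassifier(n_neighbors=5)):
    print(type(clf).__name__, cross_val_score(clf, X, y, cv=5).mean())
```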
Table 3. An overview of studies on arousal, valence, affective attitudes, and emotional and physiological states (AFFECT) recognition.
Stimulus | AFFECT | Methods | Reference
Recording of dances, videoAnger, fear, grief, and joyGSR, eye movement (Figure 6)[334]
Neurophysiological research from 2009 to 2016Overview of the existing works in emotionEEG[335]
Affective stimuliSurprise, disgust, anger, fear, happiness, and sadnessEEG[336]
The visual stimuli, black and white photographs of 10 different modelsHappy, sadMEG[337]
20 face actors, each displaying happy, neutral, and fearful facial expressionsHappy, neutral, fearfulMEG[338]
Task-irrelevant emotional and neutral picturesPleasant, unpleasantTMS[339]
A subset of music videos from the Dataset for Emotions Analysis using Physiological signals (DEAP) datasetValence, arousalfNIRS, EEG[340]
Emotional faces for the emotion perception testPleasant, unpleasant, neutralfMRI[341]
-StressPET[342]
VideoHappiness, sadness, disgust, anxiety, pleasant, unpleasant, neutralPET[343]
Facial Emotion Selection Test (FEST)Positive, negativeDTI[344]
Real time biometric-emotional data collection from depersonalized passersby Neutral, happiness, sadness, surprised, anger, scared, valence, arousal, disgust, interest, confusion, boredomEmotional, Affective and Biometrical States Analytics of the Built Environment Method[345]
Real time data collection Happy, sad, angry, surprised, scared, disgusted, valence, arousalMethod of an Affective Analytics of Demonstration Sites[346]
Scanning a human-centered built environment, real time data collectionSadness, disgust, happiness, anger, fear, surprise, boredom, neutral, arousal, valence, confusion, and interest Affect-Based Built Environment Video Analytics[347]
Remote real time data Happiness, arousal, valenceVideo Neuro-advertising Method[93,348]
Smelling stripsHappy, radiant, well-being, soothed, energized, romantic, sophisticated, sensual, adventurous, comforted, amused, interested, nostalgic, revitalized, self-confident, surprised, free, desirable, daring, excitedIRT[349]
TextPositive and negative valenceEye tracking (ET)[350]
21 video fragmentsHigh/low arousal, high/moderate/low valenceEye tracking (ET)[351]
CryptsFeelings, tendermindedness, warmth, trust and positive emotionsIris[294]
The simulation environmentWellness/malaise, relaxation/tension, fatigue/excitementRetina[352]
ColorsSurprise, Happiness, Disgust, Anger, Sadness and FearBlinking, heart rate[353]
HSV color spaceFear, disgust, surprise, joy, anticipation, sadness, anger, trust Blinking[354]
Review of existing
novel facial expression recognition systems
Anger, disgust, fear, happiness, sadness, surprise and
neutral
Facial expression recognition[355]
Destination promotional videosPleasure, arousalSkin conductance, facial electromyography[355]
Games scenario between a human user
and a 3D humanoid agent
Arousal, valence, fear, frustrated, relaxed, joyful, excitedElectromyography, skin conductance[356]
Dramatic filmReal-time emotion estimationEEG, Heart Rate, Galvanic Skin Response[357]
Emotional state of a driver while in an automobileHappy, angerElectrocardiogram (ECG)[358]
MusicPleasure, unpleasureHeart and respiratory rates[359]
Trier Social Stress TestStress, relaxRespiratory rate and heart rate[360]
Voice- and speech-pattern analysisNormal, angry, panicVoice, speech[361]
Implicit anxiety-related self-conceptShame, guilt proneness, anxiety, anger-hostilityImplicit Association Test[362]
Case studiesSelf-control, happiness, anger, fear, sadness, surprise, and anxietyMouse Tracking[302]
Academic study websiteNeutral, positive, negativeMouse Tracking[363]
Motor improvisation taskJoy, sadness, and a neutral control emotionSignature[364]
-Neutral, joy, anger, sadnessGait[365]
TextNeutral, joy, surprise, fear, anger, disgust, sadnessLip Movement[366]
DatasetAnger, disgust, fear, happiness, sadness, and surpriseKeystroke dynamics[367]
Recall of past emotional life episodesValence, arousalEEG[368]
Physiological emotional database for real participantsValence, arousal Peripheral signals, EEG[369]
Data from wearable sensors on subject’s skinHigh/neutral/low arousal and valenceECG, EEG, electromyography (EMG) [370]
Real time heartbeat rate and skin conductanceHigh/low arousal and valenceGSR, temperature, breathing rate, blood pressure, EEG[371]
Multimedia contents based on IPTV, mobile social network service, and blog servicePleasant, unpleasantGSR, skin temperature, heart rate[372]
Stress stimuliHigh/low valence, high/low arousalGSR, heart rate, ECG[373]
CCD-capture human face, measure user’s physiological dataPleasant, unpleasantGSR, photoplethysmogram (PPG), skin temperature[374]
Music videosHigh/low arousal, high/low valenceEEG[375]
Detect the current mood of subjectsHigh/low arousal, high/low valenceEEG[376]
DEAP databaseJoy, fear, sadness, relaxationEEG, back-propagation neural network[377]
Hjorth features, statistics features, high order crossing featuresHappy, calm, sad, scaredEEG, CNN, LSTM recurrent neural networks[378]
Thirty film clipsSerenity, hope, joy, awe, love, gratitude, amusement, interest, pride, inspiration EEG[379]
Transcendental meditationEcstasyEEG[380]
Ultimatum gameAcceptanceEEG[381]
Driving a car equippedTrustEEG, GSR[382]
12 prototypes that were designed based on the framework of diachronic opposite emotionsAmazement, happinessEEG, SD tests[383]
Audio-visual emotion databasePleasure, irritation, sorrow, amazement, disgust, and panic-[384]
Sleep measuresGriefEEG[385]
Real episodes from subjects’ livesGrief, angerEEG[386]
Virtual environment consisting of three types of cuesPensiveness relaxation, non-arousal, stressEEG[387]
Patient with dramatic, episodic, seizure-related rage and violenceRage and aggressionVideo-EEG recording[388]
DEAP databaseRage EEG, multiclass-common spatial patterns[389]
Brainstem auditory evoked potentials Rage and self-injurious behaviorEEG, brainstem evoked potentials (BAEPs)[390]
Acoustic annoyanceAnnoyanceEEG[391]
70 dBA white noise and pure tones at 160 Hz, 500 Hz and 4000 HzAnnoyanceEEG[392]
30 pictures from International Affective Picture System Neutral, joy, sadness anger, surprise, valence (positive and negative), contempt, fear, disgustEEG[393]
Movie clipsAnger, fear, anxiety, disgust, contempt, joy, happinessEEG[394]
Emotional factorAggressivenessEEG[395]
Buss–Durkee questionnaireAggressivenessEEG[396,397]
Reward anticipationAnticipationEEG[398]
Structured Clinical Interview for DSM-IVAnticipationEEG, fMRI[399]
DEAP databaseHigh/low valence and arousalEEG [400,401,402,403,404,405]
Reading and reflection task about MuslimsDisapprovalEEG, ANOVA[406]
Simulated train drivingFatigue and distractionEEG, Multi-type feature extraction, CatB-FS algorithm[407]
Faces (the participant’s own face, the face of a stranger, and a celebrity’s face)AdmirationEEG, 18-Items Narcissistic Admiration and Rivalry Questionnaire[408]
Presentation of 12 virtual agentsAcceptanceEEG and the virtual agent’s acceptance questionnaire (VAAQ) [409]
English prosocial and opposite antisocial words in a sentenceApproval and disapprovalEEG, ANOVA[410]
Data from Facebook commentsEnjoyment (peace and ecstasy), sadness (disappointment and despair), fear (anxiety and terror), anger (annoyance and fury), disgust (dislike and loathing) surprise, other (neutral)Natural language processing (NLP); convolutional neural network (CNN) and long short-term memory (LSTM); Random Forest and support vector machine (SVM),
standard Vietnamese social media emotion corpus (UIT-VSMEC)
[411]
Video clipsPride, love, amusement, joy, inspiration, gratitude, awe, serenity, interest, hope fNIRS[412]
User’s interaction with a web pageArousal/valence
anxiety and aggressiveness
Facial expressions, Facial Action Coding System, specialized questionnaires[413]
An investment game that uses artificial agentsTrustEEG[285]
Simulated autonomous systemTrustEEG and GSR[382]
The iCV-MEFED dataset. For each subject in the iCV-MEFED dataset, five sample images were captured.Neutral, angry, contempt, happy, happily surprised, surprisingly fearful, surprisedFacial emotion recognition (Figure 7), CNN; Inception-V3 network[414]
Dynamic emotional facial expressions were generated by using FACSGenContempt, disgust, sadness, neutralANOVA, Participants completed emotion scales[415]
Film clipsPride, love, amusement, joy, inspiration, gratitude, awe, serenity, interest, hopeEEG, multidimensional scaling (MDS), intra-class correlation coefficients (ICCs)[379]
Simulated driving systemVigilance EEG and forehead electrooculogram (EOG), eye tracking (Figure 6)[416]
DEAP datasetOptimism, pessimism, calmEEG, CNN[166]
MusicRelaxing-calm, sad-lonely, amazed-surprised, quiet-still, angry-fearful, happy-pleasedBinary relevance (BR), label powerset (LP), random k-label sets (RAKEL), SVM[417]
MusicHappiness, love, anger and sadnessEEG, SVM, Multi-Layer Perceptron (MLP), and K-nearest Neighbor (K-NN)[418]
Three sets of picturesAnticipationFacial emotions (Figure 7), action observation network (AON), two-alternative forced-choice procedure, Reaction times (RT), ANOVA[419]
Individuals enacted aggressive actions, angry facial
expressions and other non-aggressive emotional gestures
Aggressive actions and angerKinect infrared sensor camera: hand movement, body posture, head gesture, face (Figure 9), and speech. SVM
and the rule-based features
[420]
Images of faces from the Ekman and Friesen series of Pictures of Facial AffectGriefFacial Expression of Emotion Test (Figure 7)[421]
MusicSoothing, engaging, annoying and boring FBS fusion of three-channel forehead biosignals, ECG[422]
FilmsAmusement, anger, grief, and fearFingertip blood oxygen saturation (OXY), GSR, HR[423]
Polish emotional database, database consists of 12 emotional statesRage, anger, annoyance, grief, sadness, pensiveness, ecstasy, joy, serenity, terror, fear, apprehensionSpeech, KNN Algorithm[424]
VideoNonverbal behaviors signaling dominance and submissivenessImplicit association test, body language, MANOVA[425]
Music High/low valence, high/low arousalEMG, EEG, HRV, GSR[426]
The external auditory canal is warmed or cooled with water or airHigh and low arousalElectrodermal activity (EDA), HRV, activity tracker, EMG, SKT [427]
After-image experiments, direct visual observation, photography of the eyes, recording of the corneal reflexHigh/low valence, high/low arousalGSR, EMG[428]
Assessment of emotional states experienced by racing driversSadness, fear, anger, surprise, happiness, and disgustECG, EMG, respiratory rate, GSR[429]
Dataset of standardized facial expressionsHappiness,
sadness, anger, disgust, fear, and surprise
Facial Action Coding (FC)[430]
Neighbor soundsArousal, valencefEMG, heart rate (HR), electrodermal activity (EDA)[431]
Audio visual stimuliJoy, sadness, anger, fearECG[432]
Playing with the infant to elicit laughterJoySkin temperature (SKT)[433]
Two different kinds of video inducing happiness and sadnessHappiness, sadnessPhotoplethysmography (PPG), skin temperature (SKT) [434]
International Affecting Picture System (IAPS) picturesJoy, sadness, fear, disgust, neutrality, amusementElectromyogram signal (EMG), respiratory volume (RV), skin temperature (SKT), skin conductance (SKC), blood volume pulse (BVP), heart rate (HR)[435]
Movie and music video clipsArousal, valenceElectrooculogram (EOG), electrocardiogram (EEG)
trapezium electromyogram (EMG)
[436]
Audio/visual Anger, happiness, sadness, pleasureGSR, EMG, respiratory rate, ECG[437]
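Many of the Table 3 entries (for example, the DEAP-based studies) follow the same binary protocol: continuous self-reported valence or arousal ratings are split at the scale midpoint into high/low classes, and per-trial physiological features are then classified. The snippet below sketches that protocol in Python under stated assumptions; the feature matrix, the ratings, the class-dependent shift, and the classifier settings are placeholders rather than a reproduction of any reviewed study.

```python
# Sketch of a high/low valence protocol: midpoint split of self-report ratings,
# per-trial features, and cross-validated classification.  All data are synthetic.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_trials, n_features = 40, 160                       # e.g., 32 channels x 5 band powers

# placeholder self-assessment ratings on a 1-9 scale, split at the midpoint
ratings = np.concatenate([rng.uniform(1, 5, 20), rng.uniform(5, 9, 20)])
labels = (ratings > 5.0).astype(int)                 # 1 = high valence, 0 = low valence

# placeholder EEG features with a small class-dependent shift so the toy example is learnable
features = rng.normal(size=(n_trials, n_features)) + labels[:, None] * 0.3

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
print("mean CV accuracy:", cross_val_score(clf, features, labels, cv=5).mean())
```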
Table 4. Descriptive statistics for the dependent variables of two models.
Descriptive Statistic | Model 1 (CSPN): Publications—Country Success | Model 2 (CSPC): Times Cited—Country Success
Mean | 0.1354 | 0.9279
Median | 0.0785 | 0.3297
Maximum | 0.7642 | 7.7034
Minimum | 0.0015 | 0.0000
Standard Deviation | 0.1557 | 1.3893
Skewness | 1.5533 | 2.4316
Kurtosis | 5.3614 | 9.8641
Observations | 166 | 165
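For readers who wish to reproduce such summaries on their own data, the following short sketch shows how the statistics listed in Table 4 can be computed with NumPy/SciPy. The input series is a synthetic placeholder, and since the table does not state which kurtosis convention was used, both the Fisher (excess) and Pearson values are printed.

```python
# Illustrative computation of Table 4-style descriptive statistics on a placeholder series.
import numpy as np
from scipy import stats

cspn = np.random.default_rng(3).gamma(1.0, 0.14, size=166)   # placeholder CSPN values

summary = {
    "Mean": cspn.mean(),
    "Median": np.median(cspn),
    "Maximum": cspn.max(),
    "Minimum": cspn.min(),
    "Standard Deviation": cspn.std(ddof=1),
    "Skewness": stats.skew(cspn),
    "Kurtosis (Fisher/excess)": stats.kurtosis(cspn),
    "Kurtosis (Pearson)": stats.kurtosis(cspn, fisher=False),
    "Observations": cspn.size,
}
for name, value in summary.items():
    print(f"{name:>25}: {value:.4f}")
```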
Table 5. Goodness-of-fit testing for two models.
Independent Variable | Model 1 (CSPN): Publications—Country Success | Model 2 (CSPC): Times Cited—Country Success
GDP per capita | 0.7725 *** (1.2062) | 0.6368 *** (7.1524)
GDP per capita in PPP | 0.6975 *** (8.4298) | 0.6467 *** (7.3418)
Ease of doing business ranking | −0.4821 *** (−4.7652) | −0.4390 *** (−4.2317)
Corruption perceptions index | 0.7624 *** (1.5319) | 0.6341 *** (7.1014)
Human development index | 0.6717 *** (7.8530) | 0.5347 *** (5.4799)
Global gender gap | 0.4797 *** (4.7348) | 0.3354 *** (3.0834)
Happiness index | 0.7037 *** (8.5774) | 0.5315 *** (5.4340)
Environmental performance index | 0.6939 *** (8.3444) | 0.5166 *** (5.2256)
Freedom and control | −0.5808 *** (−6.1782) | −0.3832 *** (−3.5932)
Economic freedom | 0.6535 *** (7.4765) | 0.5801 *** (6.1681)
Democracy Index | 0.6227 *** (6.8912) | 0.4429 *** (4.2777)
Unemployment rate | −0.1860 (−1.6398) | −0.1642 (−1.4412)
Healthy life expectancy | 0.6312 *** (7.0471) | 0.5194 *** (5.2635)
Fragile state index | −0.7229 *** (−9.0606) | −0.5405 *** (−5.5634)
Economic decline index | −0.6358 *** (−7.1339) | −0.5597 *** (−5.8487)
Standardized beta coefficients; *** significant at p < 0.001.
Table 6. Descriptive statistics for two models.
Descriptive Statistic | Model 1 (CSPN): Publications—Country Success | Model 2 (CSPC): Times Cited—Country Success
Pearson's correlation coefficient (|r|) | 0.6272 | 0.5142
Coefficient of determination (R2) | 0.6943 | 0.5114
Adjusted R2 | 0.6191 | 0.3912
Standard deviation | 0.1557 | 1.3693
p values (probability level) | 0.0000 | 0.0000
F | 9.2356 | 4.2570
Table 7. Standardized beta coefficient values of the dependent variables.
No. | Independent Variable | Model 1 (CSPN): Publications—Country Success | Model 2 (CSPC): Times Cited—Country Success
1 | GDP per capita | 0.7735 ** | −0.0853
2 | GDP per capita in PPP | −0.5123 * | 0.5304 *
3 | Ease of doing business ranking | 0.2535 | 0.1599
4 | Corruption perceptions index | 0.2392 | 0.3633
5 | Human development index | 0.1697 | −0.1836
6 | Global gender gap | −0.0228 | 0.0703
7 | Happiness index | 0.0800 | −0.0916
8 | Environmental performance index | −0.0601 ** | 0.1819
9 | Freedom and control | −0.0299 | 0.0846
10 | Economic freedom | 0.4558 | 0.3239
11 | Democracy Index | −0.1524 | 0.0577
12 | Unemployment rate | 0.0353 | 0.0552
13 | Healthy life expectancy | 0.0047 | 0.0696
14 | Fragile state index | −0.0008 | 0.0246
15 | Economic decline index | 0.0147 | −0.0301
Standardized beta coefficients: * significant at p < 0.1, ** significant at p < 0.01.
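The standardized beta coefficients reported in Tables 5 and 7 are conventionally obtained by z-scoring the dependent variable and every predictor before an ordinary least-squares fit, which makes the coefficients directly comparable across predictors measured on very different scales. The sketch below illustrates this with made-up data; the number of observations, the predictor names in the comments, and all values are assumptions, not a re-estimation of the models in this review.

```python
# Minimal sketch of standardized beta estimation: z-score y and X, then fit OLS.
import numpy as np

rng = np.random.default_rng(1)
n = 166                                    # placeholder number of countries
X = rng.normal(size=(n, 3))                # placeholder predictors (e.g., GDP, HDI, happiness)
y = 0.6 * X[:, 0] + 0.2 * X[:, 1] + rng.normal(scale=0.5, size=n)

def zscore(a: np.ndarray) -> np.ndarray:
    """Standardize to zero mean and unit (sample) standard deviation."""
    return (a - a.mean(axis=0)) / a.std(axis=0, ddof=1)

Xz = np.column_stack([np.ones(n), zscore(X)])      # intercept + standardized predictors
beta, *_ = np.linalg.lstsq(Xz, zscore(y), rcond=None)
print("standardized betas:", np.round(beta[1:], 4))  # intercept dropped
```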
Table 8. How country success and its factors influence the two indicators.
 | Model 1 (CSPN): Publications—Country Success | Model 2 (CSPC): Times Cited—Country Success
When a country's success increases by 1%, the indicator improves by | 1.962% | 2.101%
The 17 independent variables explain the dependent variable under analysis by | 89.5% | 54.0%
Table 9. Benefits and limitations of biometric technologies.
Tool | Benefits | Limitations
Electroencephalography (EEG) Can be used to measure rapid changes in neural activity by the millisecond [588]
Minimally invasive and/or commercial research packages are available [588]
Participants can move around and benefit from enriched/social environments [588]
Uses portable instruments and natural environments; there is a long tradition of well-controlled experiments; measurement processes requiring several hours are possible in practice [589]
It is difficult to pinpoint neural signals from particular brain areas (poor spatial resolution) [588]
Measurements from structures deep within the brain (e.g., nucleus accumbens) are not possible [588]
Published studies on biometrics based on this signal have used high-cost medical equipment [590]
Subjects have reported discomfort, since conductive gel must be applied to the scalp to improve electrode contact [590]
Functional magnetic resonance imaging (fMRI) Has the ability to observe activity in small structures [588]
Differentiates signal from neighboring areas [588]
Measurements of the whole brain are possible [588]
Physically restrictive; participants lie on their back in the scanner and cannot move around [588]
Expensive, and equipment is in high demand [588]
Equipment cannot be removed from the laboratory; the sequence of the activities is difficult to monitor [589]
MEG (magnetoencephalography) Some MEG study protocols are quite well suited for design studies; there is a long tradition of well-controlled experiments based on EEG; optimal spatiotemporal resolution [589]Equipment cannot be removed from the laboratory; the location of existing brain activity is relatively difficult to determine [589]
Electrocardiogram (ECG)Highly reliable source providing precise features of the electrical and physiological activity taking place within an individual; high performance has been noted in prior research on this signal [591]; it can easily be fused with other signals [592]One of the great difficulties listed in the literature is a lack of user acceptance, as its implementation at the physical level makes it fairly uncomfortable [593]; body posture can also affect cardiac signals [594]
MRI (magnetic resonance imaging) [589]Good for studies comparing groups of people Equipment cannot be removed from the laboratory
PET (positron emission tomography) [589]Good for comparing groups of people or natural tasks Radioactive tracer is injected into participants; equipment cannot be removed from the laboratory
Eye tracking [588]Offers strong nuanced data on visual attention and gaze pathways, and can be integrated with pupillometryDoes not measure inferences, the valence of the response, thoughts, or emotions
Iris [595]Unique data; input is stable throughout lifetime; non-intrusive Large data template; images are frequently improperly focused; single-source; high cost
NIRS (near-infrared spectroscopy) [589]Uses portable instruments and natural environments; some NIRS study protocols are well suited for design studies; measurement processes requiring several hours are possible in practice Difficulties in determining the location of brain activity; few groups are using NIRS for cognitive studies as yet
Transcranial magnetic stimulation (TMS/tDCS) [588]Can be used to show causalityLimited to investigating the function of brain surface areas
Can generally only lessen (TMS/tDCS) or increase (tDCS) neural activity in a general sense; cannot test for specific levels of activity or influence specific circuits
Forehead electrooculogram (EOG)These signals are low cost, and are not invasive [596]Electrodes used for the acquisition of the signals can present instability to eye flicker [597]; signals are highly affected by noises in the immediate vicinity [596]
Skin conductance response (SCR), heart rate, pupil dilation [588]Simple; well validated
Unobtrusive equipment; allows for more natural interactions with the environment
Cannot distinguish between positive and negative arousal
Lips [598]Easy acquisition and lip characteristics; it is possible to extract the outline even if the person has a beard or a moustacheAn image of the lips cannot be acquired when they are moving
Facial electromyography (fEMG), facial affective coding [599]This is a precise and sensitive method for measuring emotional expression
Unlike self-reports, fEMG does not depend on language and does not require cognitive effort or memory
Yields large amounts of data and is continuous and scalable (hence more credible)
Dynamic tracking of emotional (potentially unconscious) responses to ongoing stimuli/information
Can measure facial muscle activities for the sake of balancing weakly evocative emotional stimuli
Less intrusive than other physiological measures such as fMRI and EEG
Automatic facial encoding software/algorithms are available
The technique is intrusive and may alter natural expression
The number of muscles that can be triggered is limited by how many electrodes can be attached to the face
Requires electrodes to be directly attached to the face (in a lab)
Certain medicines that act on the nervous system, such as muscle relaxants and anticholinergics, can impact the final electromyography (EMG) result
GaitConvenient and non-intrusive (2D); subjects can be evaluated covertly, without their knowledge [595]During the assessment stage, light affects the results; clothing may affect detection [46]
Data may alter throughout a lifetime (injuries, training, footwear); specialist personnel required for data processing; large data template [595]
Body motion [595] Unique and various sources of data, small template sizeTime consuming; subject must cooperate with reader; specialist personnel required for data processing
Table 10. Comparison of biometric technologies by seven characteristics (traits).
Technology | Universality | Uniqueness or Distinctiveness | Permanence | Collectability | Performance | Acceptability | Circumvention
Iris/pupilHigh [141,602,603,604,605,606]High [141,602,603,604,605,606]High [141,602,603,604,605,606]Medium [141,602,603,604,605,606]High [141,602,603,604,605,606]Low [141,602,604,605,606]
Medium [603]
High [141,602]
Low [603,604,605,606]
FaceHigh [141,602,603,604,605,606]Low [141,602,604,605,606]
Medium [603]
Medium [141,602,603,604,605,606]High [141,602,603,604,605,606]Low [141,602,603,604,605,606]High [141,602,603,604,605,606]Low [141,602]
High [603,604,605,606]
OdorHigh [602,603,604,605,606]High [602,603,604,605,606]High [602,603,604,605,606]Low [602,603,604,605,606]High [602,603]
Low [604,605,606]
Low [602]
Medium [603,604,605,606]
Low [602,603,604,605,606]
Keystroke dynamics and mouse movements, Mouse TrackingLow [141,602,604,605,606]Low [141,602,604,605,606]Low [141,602,604,605,606]Medium [141,602,604,605,606]Low [141,602,604,605,606]Medium [141,602,604,605,606]Medium [141,602,604,605,606]
Skin temperature -thermogramHigh [141,604,605,606]High [141,604,605,606]Low [141,604,605,606]High [141,604,605,606]Medium [141,604,605,606]High [141,604,605,606]High [141]
Low [604,605,606]
Voice/Speech/Voice Pitch Analysis (VPA)Medium [141,602,604,605,606]Low [141,602,604,605,606]Low [141,602,604,605,606]Medium [141,602,604,605,606]Low [141,602,604,605,606]High [141,602,604,605,606]Low [141,602]
High [604,605,606]
SignatureLow [141,602,603,604,605,606]Low [141,602,603,604,605,606]Low [141,602,603,604,605,606]High [141,602,603,604,605,606]Low [141,602,604,605,606]
Medium [606]
High [141,602,603,604,605,606]Low [141,602]
High
[603,604,605,606]
GaitMedium [602,604,605,606]
High [603]
Low [141,604,605,606]
Medium [603]
Low [141,604,605,606]
Medium [603]
High [602,603,604,605,606]Low [602,603,604,605,606]High [602,604,605,606]
Medium [603]
Medium [602,603,604,605,606]
Table 11. Comparison of biometric technologies by various attributes.
Technology | Ease of Use | Error Incidence | Accuracy | User Acceptance | Long Term Stability | Cost | Size of Template | Security | Socially Introduced | Social Acceptability | Popularity | Speed
Eye Tracking (ET) 0.5°–1° [607] Low-High [608]
Iris/pupilMedium [602,603,609]Lighting [602,609]
Lighting, glasses [603]
Very High [602,609]
High [320,603,610,611]
Medium [602,609]High [602,609,610]
Medium [320,603]
High [320,603,611]Small [320]Medium [320]
High [603]
Very high [609]
1995 [603]Medium-Low [610,611]Medium [603]Medium [603]
FaceMedium [602,609]
High
[603]
Lighting, age glasses, hair [602,603,609]High [602,609]
Low [320,602]
Medium-Low [610,611]
Medium [602,609]Medium [602,609]
Low [320,602]
High [320]
Medium [602,610,611]
Large [320]
Low [320]
Medium [602,610,611]
2000 [603]High [610,611]High [603]Medium [603]
Keystroke dynamics and mouse movements, Mouse TrackingLow [602]Device, weather [602]Low [602] Low [602]Medium [602] Low [602]2005 [603] Low [603]Medium [603]
Voice/Speech/Voice Pitch Analysis (VPA)High [602,603,609]Noise, colds [602,603,609]High [602,609]
Low [320,603]
Medium [610,611]
High [602,609]Medium [602,603,609]
Low [320]
Medium [320,610,611]
Low [603]
Small [320]Low [320]
High [603]
Medium [609]
1998 [603]High [610,611]High [603]High [603]
SignatureHigh [602,603,609]Changing signature [602,603,609]High [602,609]
Medium [320,603]
Low [610,611]
High [602]
Very High [609]
Medium [602,609]
Low [320,603]
Low [320]
Medium [603,610,611]
Medium [320]Low [320]
High [603]
Medium [609]
1970 [603]High [610,611]High [603]High [603]
Gait Medium [610] Medium [610] Low [610]
Lip Movement Medium [603] Medium [603]Medium [603]Small [603]High [603]
Gesture Low [612]
Table 12. Comparison of performance metrics for biometric technologies by various authors.
Technology | FAR (False Acceptance Rate) | FRR (False Rejection Rate) | CER (Crossover Error Rate) | FTE (Failure to Enroll)
Iris/pupil0.94% [603]
0.0001–0.94 [613]
2.4649% [614]
0.99% [603]
0.99–0.91 [613]
2.4614% [614]
0.01% [603]0.50% [603]
Face1% [603]
16% [614]
10% [603]
16% [614]
3.1% [615]
Keystroke dynamics and mouse movements, Mouse Tracking7% [603]
0.01% [614]
0.10% [603]
4% [614]
1.80% [603]
Voice/Speech/Voice Pitch Analysis (VPA)2% [603,613]
7% [614]
10% [603,613]
7% [614]
6% [603]0.5% [615]
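As background to Table 12, FAR is the share of impostor comparisons accepted at a given decision threshold, FRR is the share of genuine comparisons rejected at that threshold, and the CER (often called the equal error rate) is the operating point at which the two coincide; FTE denotes the failure-to-enroll rate and is measured during enrollment rather than matching. The sketch below shows, on synthetic score distributions, how FAR, FRR, and the CER can be computed; the score model and threshold grid are illustrative assumptions rather than values from the cited studies.

```python
# Illustrative computation of FAR, FRR, and the crossover/equal error rate
# from a matcher's similarity scores (synthetic data only).
import numpy as np

rng = np.random.default_rng(7)
genuine = rng.normal(0.7, 0.1, size=1000)     # same-person comparison scores
impostor = rng.normal(0.4, 0.1, size=1000)    # different-person comparison scores

thresholds = np.linspace(0.0, 1.0, 1001)
far = np.array([(impostor >= t).mean() for t in thresholds])  # false acceptance rate
frr = np.array([(genuine < t).mean() for t in thresholds])    # false rejection rate

idx = np.argmin(np.abs(far - frr))            # threshold where FAR and FRR cross
print(f"CER/EER ~ {(far[idx] + frr[idx]) / 2:.3%} at threshold {thresholds[idx]:.2f}")
```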
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
