Article

Embodied Emotion Recognition Based on Life-Logging

1
Department of Emotion Engineering, University of Sangmyung, Seoul 03016, Korea
2
Team of Technology Development, Emotion Science Center, Seoul 03044, Korea
3
Department of Intelligence Informatics Engineering, University of Sangmyung, Seoul 03016, Korea
*
Author to whom correspondence should be addressed.
Sensors 2019, 19(23), 5308; https://doi.org/10.3390/s19235308
Submission received: 23 October 2019 / Revised: 22 November 2019 / Accepted: 30 November 2019 / Published: 2 December 2019
(This article belongs to the Section Intelligent Sensors)

Abstract

Embodied emotion arises from the interaction among a person’s physiological responses, behavioral patterns, and environmental factors. However, most methods for determining embodied emotion have considered only fragmentary independent variables, not their inter-connectivity. This study proposes a method for determining embodied emotion that considers the interactions among three factors based on life-logging: the physiological response, the behavioral pattern, and an environmental factor. The physiological response was analyzed through heart rate variability (HRV) variables. The behavioral pattern was calculated from features of Global Positioning System (GPS) locations that indicate spatiotemporal properties. The environmental factor was analyzed as ambient noise, an external stimulus. These data were mapped to the emotion reported at the corresponding time. Emotion was evaluated on a seven-point scale for arousal level and valence level according to Russell’s model of emotion. The data were collected from 79 participants in daily life for two weeks. After pre-processing, the relationships among the data were analyzed by multiple regression analysis. As a result, significant differences between the arousal and valence levels of emotion were observed based on these relationships. The contributions of this study can be summarized as follows: (1) emotion was recognized in real life for more practical application; (2) the interactions that determine the levels of arousal and positive emotion were distinguished by analyzing the relationships in individuals’ life-log data. This verified that emotion can change according to the interaction among the three factors, which was overlooked in previous emotion recognition.

1. Introduction

The theory of the embodied mind has recently emphasized that emotion should be conceptualized as operating through the inter-connectivity of the body, behavior, and environment, because the intrinsic function of emotion is adaptive survival in the environment [1,2]. Human physiological changes and behaviors depend on the environment, and the interactions between them are a mechanism for coping with it [3]. Most scholars agree that there are correlations among the physiological response, behavior, and environment. Nevertheless, emotion has been recognized from fragmentary independent variables without considering the relationships among these three main factors: the physiological response, behavior, and environmental factors. A heuristic understanding of embodied emotion that misses the connections among the three factors therefore remains primitive. Moreover, embodied emotions recognized in the laboratory have been difficult to apply to real life, because laboratory settings are limited compared to the experiences of a complex, real-life environment. Ecological validity is undermined when emotion is recognized in laboratory settings, and emotional expressions tend to be reduced due to social desirability [4]. Therefore, a field study is necessary to test whether emotional factors measured in laboratory settings are applicable in real environments [5,6,7].
Field studies of emotion recognition have collected data and measured emotion using wearable sensors or smartphones [4,8,9]. Life-logging applications are particularly suitable for investigations conducted in the field, and the increased use of wearable devices and self-tracking behaviors has been highlighted [10]. Life-logging is the process of automatically recording aspects of one’s life in digital form [11]. Such studies have mainly measured the autonomic nervous system (ANS) to analyze physiological responses in daily life [12,13,14,15,16,17]. ANS activity includes the sympathetic and parasympathetic nervous systems, which have been measured through cardiovascular, respiratory, and electrodermal responses [18,19]. In particular, the photoplethysmogram (PPG) has been increasingly measured with portable devices as wearable devices such as smart watches have become commercialized [14,16]. The PPG signal is therefore attracting attention as a measure of ANS activity that can be obtained in daily life.
Behavioral patterns associated with emotions have been based on individuals’ own mobility patterns in life-logging studies. Stress and depression have been correlated with smaller variation in mobility [20,21]. An individual’s movement patterns can be measured by the Global Positioning System (GPS) [21,22], which can easily be recorded with sensors built into a smartphone. Emotions are inherent in physiological mechanisms of adaptation to the environment [23] and are therefore affected by environmental factors.
The environment changes physiological responses and human behavior according to emotions, and vice versa [24]; it also affects emotions directly. Embodied emotions are highly related to environmental factors such as sound [25,26], and exposure to ambient noise in daily life has been reported to affect negative emotions and arousal [27,28,29]. Louder or long-lasting noise has been reported to impact emotion negatively, and uncontrollable noise has an even greater emotional impact. On the other hand, some results show that white noise has a positive effect [30,31]. Schuller et al. [32] suggested that arousal is highly correlated with loudness and that valence is negatively correlated with spectral flux and spectral harmonicity. Arousal and valence are the dimensions of Russell’s emotional model [33]. These results indicate that ambient noise has a physical and emotional impact on humans and can be one of the factors that induce emotions. Moreover, field studies of real-time noise monitoring have shown that environmental noise can be measured using smartphones [34,35,36] and can therefore easily be captured in life-logging.
Despite proposed applications of emotion recognition in real life, there is still little research that recognizes emotion in real time by considering all three factors. Therefore, this study attempts to recognize emotions from the interactions of three factors in real life: physiological responses measured by PPG, behavior measured by GPS, and ambient noise as an environmental factor measured by sound recording. The contributions of this study can be summarized as follows: (1) emotion was recognized in real life for more practical applications; (2) the proposed method analyzed the interactions of more determinants of emotion compared with previous emotion-recognition methods that employ fewer factors.

2. Method

2.1. Hypothesis

This study hypothesized that the interactions among the physiological response, the behavioral pattern, and the ambient noise would differ in the emotional arousal and valence.

2.2. Participant

Seventy-nine participants (35 males) without cardiovascular disease were selected by convenience sampling. Their average age was 23 (±3) years. All participants received a detailed explanation and provided consent before the field test, and were compensated ($140.38) for their participation.

2.3. Data Collection

Data were collected in a field test, not in a usual laboratory environment, to ensure the authenticity of the physical experience and environmental factors of daily life. Seventy-nine participants took part in this field test, which lasted 5 h every day for 2 weeks, including weekends. They were given a wearable device and received guidance on the smartphone application developed for this field test. The participants were asked to wear the device, connected to their smartphones, throughout the fixed time from 12 pm to 6 pm daily. These hours could be working time or rest time depending on the participant, but only people who agreed to measure data continuously during this time participated in the experiment. A notification function in the application ensured that measurements were taken continuously throughout the experiment without missing data: the application sent a notification to the researcher when the connection with the sensor was lost or data were not measured for a certain period, and the researcher then asked the participant via messenger to resume measurement. The collected data consisted of physiological and behavioral responses, based on the photoplethysmogram and GPS location, along with environmental factors based on ambient noise. Physiological responses were measured separately with the wearable device, and the behavioral and environmental factors were measured with the GPS sensor and microphone embedded in the smartphone, respectively. To minimize disturbance of daily activities, the wireless PPG sensor, which can measure from one finger, was worn on the less frequently used hand (mostly the left hand), as shown in Figure 1.
The GPS and ambient sound data were collected automatically by the smartphone that the participants carried during daily activities, which was convenient for the participants: they only had to connect the application and the sensor at the start of the experiment to collect data.
Participants received an emotional assessment request from the application on the hour, every hour. They answered two questions, about how aroused and how pleasant they felt, by assessing their overall emotional state during the previous hour. These two questions correspond to the two independent dimensions, the arousal axis and the valence axis, that constitute emotion in Russell’s two-dimensional circumplex model [33]. Russell’s model is one of the most representative emotional models, and arousal and valence levels have been rated on seven-point Likert scales in other emotional assessment studies [37,38,39,40,41,42]. Therefore, in this study, participants self-reported their emotions on a seven-point scale: the participant checked one of the radio buttons from 1 to 7 in the application and clicked the submit button. As with the other data, if a participant did not respond to an emotional assessment, the application detected this and alerted the investigator, who then asked the participant to complete the evaluation. This experimental procedure was approved by the Institutional Review Board of Sangmyung University, Seoul, Korea (BE2017-22).

2.4. Measurement of Physiological Response by Analyzing HRV

2.4.1. Recording and Signal Processing

The PPG signals were recorded at a sampling rate between 50 and 90 Hz with the wireless PPG sensing system (Emotion Science Research Center Inc., Seoul, Korea). Zero-padding and cubic spline interpolation were applied to stabilize the sampling rate at 80 Hz. After interpolation, only the frequency components between 0.75 and 2.5 Hz, corresponding to the range between 50 bpm and 150 bpm, were retained by a Butterworth bandpass filter for noise cancelation. Peaks were detected in the PPG signal by a peak detection algorithm. The peak-to-peak interval (PPI), the interval between detected peaks, was calculated by detecting the dominant frequency with the Fast Fourier Transform (FFT) while sliding a 120 s accumulation window over the raw signal at 1 s intervals.
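The interpolation, band-pass filtering, and peak-detection steps above can be sketched with SciPy as follows. This is a minimal sketch, not the authors’ implementation: the filter order and the minimum peak distance of 0.4 s (i.e., 150 bpm) are illustrative assumptions.

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import butter, filtfilt, find_peaks

def preprocess_ppg(signal, timestamps, fs_out=80.0):
    """Resample an unevenly sampled PPG signal to fs_out Hz with a cubic
    spline, then band-pass filter to 0.75-2.5 Hz (the cardiac band)."""
    t_uniform = np.arange(timestamps[0], timestamps[-1], 1.0 / fs_out)
    resampled = CubicSpline(timestamps, signal)(t_uniform)
    # 2nd-order Butterworth band-pass; filtfilt avoids phase distortion
    b, a = butter(2, [0.75, 2.5], btype="bandpass", fs=fs_out)
    filtered = filtfilt(b, a, resampled)
    return t_uniform, filtered

def peak_to_peak_intervals(t, filtered, fs=80.0):
    """Detect systolic peaks and return the PPI series in seconds."""
    # require peaks at least 0.4 s apart (max 150 bpm)
    peaks, _ = find_peaks(filtered, distance=int(0.4 * fs))
    return np.diff(t[peaks])
```

For example, a synthetic 1.2 Hz (72 bpm) pulse wave sampled at 60 Hz yields PPIs close to 1/1.2 ≈ 0.83 s after this pipeline.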

2.4.2. HRV Analysis in Time Domain

Beats per minute (BPM) were calculated from the mean peak-to-peak interval (PPI) within a 60 s window as:
$$\mathrm{BPM} = \frac{60}{\frac{1}{N}\sum_{i=1}^{N} \mathrm{PPI}_i}$$
where $N$ is the number of peak-to-peak intervals in the window. All heart rate variability (HRV) variables were calculated by HRV analysis with a window size of 180 s and an interval size of 60 s. The standard deviation of all PPIs (SDNN) for each 3-min segment of the recording was calculated as:
$$\mathrm{SDNN} = SD(\mathrm{PPI}) = \sqrt{\frac{1}{N-1}\sum_{i=1}^{N}\left[\mathrm{Mean}(\mathrm{PPI}) - \mathrm{PPI}_i\right]^2}$$
where $SD(\mathrm{PPI})$ is the standard deviation of the PPIs. The root mean square of differences between adjacent PPIs (RMSSD) was calculated as:
$$\mathrm{RMSSD} = \sqrt{\frac{1}{N-1}\sum_{i=2}^{N}\left(\mathrm{PPI}_i - \mathrm{PPI}_{i-1}\right)^2}.$$
The proportion of successive PPI differences greater than 50 ms (pNN50) was derived by dividing their number by the total number of PPIs:
$$\mathrm{pNN50} = \frac{\mathit{NN50\ count}}{\mathit{total\ NN\ count}}$$
where $\mathit{NN50\ count}$ is the number of successive PPI differences greater than 50 ms and $\mathit{total\ NN\ count}$ is the total number of PPIs.
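The four time-domain variables can be computed directly from a PPI series. A minimal sketch, assuming the intervals are expressed in milliseconds (so BPM uses 60,000 ms per minute):

```python
import numpy as np

def hrv_time_domain(ppi_ms):
    """Time-domain HRV features from peak-to-peak intervals in milliseconds."""
    ppi = np.asarray(ppi_ms, dtype=float)
    bpm = 60000.0 / ppi.mean()            # beats per minute
    sdnn = ppi.std(ddof=1)                # sample SD of all intervals
    diffs = np.diff(ppi)                  # successive interval differences
    rmssd = np.sqrt(np.mean(diffs ** 2))  # RMS of successive differences
    pnn50 = np.mean(np.abs(diffs) > 50.0) # fraction of |differences| > 50 ms
    return {"BPM": bpm, "SDNN": sdnn, "RMSSD": rmssd, "pNN50": pnn50}
```

For instance, `hrv_time_domain([800, 820, 790, 870, 800])` gives BPM ≈ 73.5 and pNN50 = 0.5, since two of the four successive differences exceed 50 ms.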

2.4.3. HRV Analysis in Frequency Domain

Very low frequency (VLF), the power in the frequency range of 0.0033–0.04 Hz, was analyzed as an indicator of sympathetic activity as:
$$\mathrm{VLF} = \sum_{i = 0.0033/df}^{0.04/df} \mathit{Power}_i, \quad df = \frac{\mathit{SamplingRate}(\mathrm{PPI})}{\mathit{Length}(\mathrm{PPI})} = \frac{1}{\mathit{Time}}$$
where $\mathit{Power}$ is the power spectrum of the PPI series analyzed by FFT and $df$ is the frequency resolution. Low frequency (LF), the power in the frequency range of 0.04–0.15 Hz, was analyzed as an indicator of both sympathetic and parasympathetic activity as:
$$\mathrm{LF} = \sum_{i = 0.04/df}^{0.15/df} \mathit{Power}_i.$$
High frequency (HF), the power in the frequency range of 0.15–0.4 Hz, was analyzed as an indicator of parasympathetic activity as:
$$\mathrm{HF} = \sum_{i = 0.15/df}^{0.4/df} \mathit{Power}_i.$$
The VLF, LF, and HF components were also expressed as percentages and normalized values. The percentage of each variable was calculated by dividing it by the total power, where total power is the power spectrum in the band between 0.0033 and 0.4 Hz:
$$\mathrm{TotalPower} = \sum_{i = 0.0033/df}^{0.4/df} \mathit{Power}_i,$$
$$\mathrm{VLF}(\%) = \frac{\mathrm{VLF}}{\mathrm{TotalPower}}, \quad \mathrm{LF}(\%) = \frac{\mathrm{LF}}{\mathrm{TotalPower}}, \quad \mathrm{HF}(\%) = \frac{\mathrm{HF}}{\mathrm{TotalPower}}.$$
The normalized variables were calculated as the natural logarithms of VLF, LF, and HF:
$$\mathrm{lnVLF} = \ln(\mathrm{VLF}), \quad \mathrm{lnLF} = \ln(\mathrm{LF}), \quad \mathrm{lnHF} = \ln(\mathrm{HF}).$$
VLF, LF, and HF were also combined into the LF/HF and VLF/HF ratios, which represent the homeostasis of sympathetic and parasympathetic activity [43]:
$$\mathrm{LF/HF\ ratio} = \frac{\mathrm{LF}}{\mathrm{HF}}, \quad \mathrm{VLF/HF\ ratio} = \frac{\mathrm{VLF}}{\mathrm{HF}}.$$
Peak power, the power in the band within ±0.015 Hz of the peak frequency, is an indicator of homeostasis [43]:
$$\mathrm{PeakPower} = \sum_{i = (\mathrm{PeakHz} - 0.015)/df}^{(\mathrm{PeakHz} + 0.015)/df} \mathit{Power}_i$$
where Peak Hz is the frequency of the highest peak in the power spectrum between 0.04 and 0.26 Hz:
$$\mathrm{PeakHz} = \operatorname*{argmax}_{0.04/df \,\le\, i \,\le\, 0.26/df}(\mathit{Power}_i) \times df.$$
The coherence ratio, the peak power divided by the difference between total power and peak power, is an indicator of emotional stability [43]:
$$\mathrm{Coherence\ Ratio} = \frac{\mathrm{PeakPower}}{\mathrm{TotalPower} - \mathrm{PeakPower}}.$$
Dominant power is the power of the highest peak in the full power spectrum range of 0–0.5 Hz, and Dominant Hz is its frequency:
$$\mathrm{Dominant\ Power} = \mathit{Power}_{\operatorname{argmax}(\mathit{Power})}, \quad \mathrm{Dominant\ Hz} = \operatorname{argmax}(\mathit{Power}) \times df.$$
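The band-power computation can be sketched as below. This assumes the PPI series has already been evenly resampled (the 4 Hz rate here is a hypothetical choice, not stated in the paper) and that DC is removed before the FFT:

```python
import numpy as np

def hrv_frequency_domain(ppi_s, fs=4.0):
    """Band powers of an evenly resampled PPI series (seconds) sampled at fs Hz."""
    x = np.asarray(ppi_s, dtype=float) - np.mean(ppi_s)  # remove DC component
    power = np.abs(np.fft.rfft(x)) ** 2 / len(x)         # one-sided power spectrum
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)          # df = fs / len(x)

    def band(lo, hi):
        return float(power[(freqs >= lo) & (freqs < hi)].sum())

    vlf, lf, hf = band(0.0033, 0.04), band(0.04, 0.15), band(0.15, 0.4)
    total = band(0.0033, 0.4)
    dom = np.argmax(power[1:]) + 1                       # skip the DC bin
    return {
        "VLF": vlf, "LF": lf, "HF": hf,
        "LF/HF": lf / hf if hf > 0 else float("nan"),
        "TotalPower": total,
        "DominantHz": float(freqs[dom]),
    }
```

As a sanity check, a PPI series oscillating at 0.1 Hz should concentrate its power in the LF band, with Dominant Hz near 0.1.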

2.5. Measurement of Behavior Patterns by Analyzing GPS Location

GPS locations were classified into two states, stationary and transition, based on distance using the K-means algorithm. A GPS location was assigned to the stationary state when the latitude and longitude changed by less than 1 km per hour, and to the transition state when they changed by more than 1 km per hour [21]. Six behavioral-pattern variables were defined by analyzing GPS locations accumulated for 10 min at 1-min intervals. Location Variance, the variability of a participant’s GPS location, was calculated as the logarithm of the sum of the latitude and longitude variances:
$$\mathrm{Location\ Variance} = \log\left(\sigma_{lat}^2 + \sigma_{lng}^2\right)$$
where $\sigma_{lat}^2$ is the variance of latitude and $\sigma_{lng}^2$ is the variance of longitude. Number of Clusters is the number of location clusters found by the K-means algorithm. Entropy is the variability of the time spent at the location clusters:
$$\mathrm{Entropy} = -\sum_{i=1}^{N} p_i \log p_i$$
where $i$ is the location cluster, $N$ is the number of clusters, and $p_i$ is the proportion of time spent in cluster $i$. Circadian Movement is the regularity of the movement pattern in daily life:
$$\mathrm{Circadian\ Movement} = \log\left(E_{lat} + E_{lng}\right), \quad E = \frac{1}{i_N - i_1}\sum_{i=i_1}^{i_N} psd(f_i)$$
where $f$ is a frequency bin obtained from the GPS locations by least-squares spectral analysis, $i_1$ to $i_N$ are the indices of the frequency bins corresponding to a 24-h period, and $psd(f_i)$ is the power spectral density at frequency bin $f_i$. The logarithm was applied to correct the skewed distribution. Transition Time is the percentage of time during which a participant was in the non-stationary state. Total Distance is the accumulated distance in kilometers between consecutive location samples:
$$\mathrm{Total\ Distance} = \sum_{i=1}^{N-1} 111.19 \times \frac{180}{\pi} \times \arccos\left[\sin(lat_i)\sin(lat_{i+1}) + \cos(lat_i)\cos(lat_{i+1})\cos(lng_{i+1} - lng_i)\right]$$
where $i$ is the GPS location index, $N$ is the total number of GPS locations, and 111.19 is the conversion factor from degrees of arc to kilometers.
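Three of the GPS features can be sketched as follows. The helper names are illustrative, and the clustering step that produces the per-cluster dwell times is assumed to have been done separately (e.g., by K-means):

```python
import numpy as np

EARTH_KM_PER_DEG = 111.19  # km per degree of great-circle arc

def total_distance_km(lat, lng):
    """Great-circle distance accumulated over consecutive GPS fixes (degrees)."""
    lat, lng = np.radians(lat), np.radians(lng)
    cosang = (np.sin(lat[:-1]) * np.sin(lat[1:])
              + np.cos(lat[:-1]) * np.cos(lat[1:]) * np.cos(np.diff(lng)))
    ang = np.arccos(np.clip(cosang, -1.0, 1.0))  # central angle in radians
    return float(np.sum(np.degrees(ang)) * EARTH_KM_PER_DEG)

def location_variance(lat, lng):
    """Log of the summed latitude/longitude variances across fixes."""
    return float(np.log(np.var(lat) + np.var(lng)))

def cluster_entropy(time_per_cluster):
    """Shannon entropy of the share of time spent in each location cluster."""
    p = np.asarray(time_per_cluster, dtype=float)
    p = p / p.sum()
    p = p[p > 0]  # 0 * log(0) is taken as 0
    return float(-np.sum(p * np.log(p)))
```

For example, two fixes one degree of longitude apart on the equator give a distance of 111.19 km, and splitting time equally between two clusters gives an entropy of ln 2.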

2.6. Measurement of Environmental Factors by Analyzing Ambient Noise

Environmental factors based on ambient noise were analyzed from raw sound signals. Ambient noise was recorded every second in windows accumulated over 5 s. The features extracted from the raw signal were its volume and frequency components, i.e., sound amplitude and sound frequency. Sound amplitude was computed by averaging the raw signal over 1 min:
$$\mathrm{SoundAmplitude} = \frac{1}{m}\sum_{i=1}^{m} \mathit{Amplitude}_i$$
where $\mathit{Amplitude}$ is the amplitude of the ambient noise and $m$ is the window size. Sound frequency was defined as the dominant component of the power spectrum in the frequency domain:
$$\mathrm{SoundFrequency} = \operatorname{argmax}(\mathit{Power}) \times df, \quad df = \frac{\mathit{SamplingRate}(\mathit{Amplitude})}{\mathit{Length}(\mathit{Amplitude})} = \frac{1}{\mathit{Time}}.$$
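A sketch of the two ambient-noise features is below. Taking the mean *absolute* amplitude as the loudness proxy is an assumption (the paper does not specify whether raw or rectified samples were averaged), as is skipping the DC bin when locating the dominant frequency:

```python
import numpy as np

def sound_features(samples, fs):
    """Mean absolute amplitude and dominant frequency of one audio window."""
    x = np.asarray(samples, dtype=float)
    amplitude = float(np.mean(np.abs(x)))          # loudness proxy
    power = np.abs(np.fft.rfft(x - x.mean())) ** 2 # one-sided power spectrum
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)    # df = fs / len(x)
    dominant_hz = float(freqs[np.argmax(power[1:]) + 1])  # skip the DC bin
    return amplitude, dominant_hz
```

A pure 440 Hz tone sampled at 8 kHz for one second should yield a mean absolute amplitude near 2/π and a dominant frequency of 440 Hz.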

2.7. Statistical Analysis

The relationships among the physiological response, the behavioral pattern, and the ambient noise, conditioned on emotion, were analyzed in three steps. First, a pre-processing step interpolated and normalized the data samples. Second, the correlations between the 31 variables measuring the physiological response, the behavioral pattern, and the ambient noise were analyzed by multiple regression. Finally, the hypothesis of this study, that the significant relationships resulting from multiple regression differ depending on emotion, was verified by ANOVA. Since the variables have different scales, data interpolation and standardization were performed to make them comparable: the data were interpolated by averaging and standardized by z-score. After pre-processing, a multiple regression model was constructed by setting one of the 31 variables as the dependent variable and the remaining 30 as independent variables. The same procedure was repeated with every variable as the dependent variable in turn.
Correlations between the dependent variable and the independent variables were analyzed by multiple regression, which is convenient for analyzing multiple independent variables against one dependent variable. To derive significant correlations, the following assumptions were checked. First, multi-collinearity, a strong correlation among the independent variables, was ruled out: independent variables exhibiting multi-collinearity should be removed to prevent error, so the variance inflation factor (VIF), an indicator of multi-collinearity, was verified to be less than 10. Second, autocorrelation, a strong correlation among the residuals, was tested by the Durbin-Watson statistic; no autocorrelation was present, since the Durbin-Watson statistics were greater than 1 and less than 3. Third, the normality and homogeneity of the residuals were tested by the Kolmogorov-Smirnov test (p > 0.1) and the Breusch-Pagan test (p > 0.05). All assumptions for multiple regression were satisfied; therefore, this study analyzed the multiple regression models. Finally, the existence of significant independent variables affecting the dependent variable was verified (p < 0.05). The suitability of each regression model was verified by an adjusted r-squared greater than 0.6. The standardized coefficients (β) obtained from the multiple regression indicate the influence of each independent variable on the dependent variable.
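The VIF and Durbin-Watson checks can be implemented with NumPy alone. A minimal sketch (not the authors’ code): the VIF of each column is computed by regressing it on the remaining columns, and the Durbin-Watson statistic is the ratio of summed squared successive residual differences to summed squared residuals.

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of design matrix X."""
    X = np.asarray(X, dtype=float)
    out = []
    for j in range(X.shape[1]):
        y = X[:, j]
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(len(y)), others])  # intercept + others
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ coef
        r2 = 1.0 - resid.var() / y.var()                # R^2 of this regression
        out.append(1.0 / (1.0 - r2))                    # VIF = 1 / (1 - R^2)
    return np.array(out)

def durbin_watson(residuals):
    """Durbin-Watson statistic; near 2 means no first-order autocorrelation."""
    e = np.asarray(residuals, dtype=float)
    return float(np.sum(np.diff(e) ** 2) / np.sum(e ** 2))
```

Independent predictors give VIFs near 1, nearly collinear predictors give VIFs far above the paper’s threshold of 10, and white-noise residuals give a Durbin-Watson statistic near 2.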
The standardized coefficients for each person were stored in matrix form, as shown in Figure 2. The subjective emotion labels were then mapped to the standardized coefficient matrix to form a data structure for ANOVA, as shown in Figure 3. The emotion labels indicate the subjective questionnaire scores, evaluated on seven-point scales for the two emotion questionnaires of arousal level and valence level. The standardized coefficient data of the 79 participants, mapped to emotion labels, were then analyzed for differences between arousal levels and between valence levels by ANOVA.

3. Results

Significant correlations among the physiological responses, the behavioral patterns, and the ambient noise were analyzed by multiple regression over two weeks of data samples from the 79 participants. An example of multiple regression results is shown in Table 1. The significant correlations derived from the multiple regression were then analyzed by ANOVA to test whether they differed depending on emotion level. Specifically, we confirmed whether the standardized coefficients obtained through the multiple regression analysis differed according to the three emotion levels. The emotion at the time of each significant correlation was taken from the subjective emotion recorded by the participants, rated on seven-point scales for arousal and valence, respectively. The level of arousal was classified into three levels based on the median of the assessment: scores larger than the median (5–7 points) were classified as arousal, the median (4 points) as neutral, and scores smaller than the median (1–3 points) as relaxation. Likewise, valence scores of 5–7 points were classified as positive emotion, the median value (4 points) as neutral, and 1–3 points as negative emotion. The number of samples by arousal level was 1907 arousal, 1699 neutral, and 777 relaxation, while by valence level, 2300 samples were rated positive, 1098 neutral, and 985 negative. Descriptive statistics on the mean and standard deviation of the standardized coefficients are presented according to the levels of arousal in Table 2 and according to the levels of valence in Table 3.
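The collapsing of seven-point ratings into three levels around the scale median can be expressed as a small helper; the label strings here are illustrative (for arousal they correspond to arousal/neutral/relaxation, for valence to positive/neutral/negative):

```python
def emotion_level(score, median=4):
    """Collapse a 7-point rating into three levels around the scale median:
    5-7 -> 'high', 4 -> 'mid', 1-3 -> 'low'."""
    if score > median:
        return "high"
    if score < median:
        return "low"
    return "mid"
```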
The significant statistics that distinguish the three levels of arousal are presented in Table 4, Table 5, Table 6, Table 7, Table 8, Table 9, Table 10, Table 11, Table 12, Table 13, Table 14, Table 15, Table 16, Table 17, Table 18, Table 19, Table 20 and Table 21. A significant ANOVA result (i.e., p < 0.05 in the ANOVA row of the tables) indicates that at least one pair among the three levels of emotion differs, but not which pair. Therefore, the pairs (arousal-neutral, neutral-relaxation, and arousal-relaxation) showing significant differences were analyzed in a post-hoc analysis; in this study, an independent t-test, a common method for comparing two levels, was used (i.e., p < 0.05 in the t-test row of the tables). This paper presents only the results with significant differences between all pairs (all emotions). The correlations that varied depending on the level of arousal were divided into relationships within physiological or environmental variables, relationships between physiological and behavioral variables, and relationships between physiological and environmental variables (Figure 4). There were many correlations between physiological variables, especially pNN50, SDNN, lnHF, Dominant Power, Dominant Hz, Peak Hz, and Coherence ratio. The relationships between physiological and behavioral variables that differed according to the level of arousal were the correlations between lnHF and Entropy, lnHF and Circadian Movement, Dominant Hz and Transition Time, and Dominant Hz and Total Distance. The relationships between physiological and environmental variables that differed according to the level of arousal were the correlations between RMSSD and Sound Amplitude, and between Dominant Hz and Sound Frequency. There was also a correlation between Sound Amplitude and Sound Frequency.
There was no correlation between the behavioral variables.
Significant correlations and the statistics that distinguish the three levels of valence are presented in Table 22, Table 23, Table 24, Table 25, Table 26, Table 27, Table 28, Table 29, Table 30, Table 31 and Table 32. Correlations that varied according to the level of valence were classified into relationships within physiological variables, relationships between physiological and behavioral variables, between physiological and environmental variables, and between behavioral and environmental variables (Figure 5). There were fewer relationships within the physiological variables in the valence results than in the arousal results. The only significant relationship between physiological and behavioral variables was between Peak Power and Total Distance. The significant correlations between physiological and environmental variables were between the VLF/HF ratio and Sound Amplitude, and between Dominant Hz and Sound Frequency.

4. Discussion and Conclusions

Embodied emotion differs from the previous view of emotion in that it considers the interactions among body, behavior, and environment to be essential. Therefore, this study aimed to recognize embodied emotion by analyzing correlations among physiological changes, behavior, and environment. Physiological responses were determined from cardiovascular responses, as the autonomic nervous system (ANS) has been monitored to recognize emotions in many previous studies [44,45,46]. Individuals’ behavioral patterns were determined from features of GPS (global positioning system) locations, on the supposition that lifestyle patterns are associated with emotion [21]. The amplitude and frequency components of ambient sound were considered as environmental factors, since ambient sound in particular has been related to emotion and physiological arousal in daily life [25,26].
This study verified that the interactions that determine arousal differ from those that determine valence by analyzing individuals’ life-log data. There were more connections between the physiological variables in the arousal results (Figure 4) than in the valence results (Figure 5). In addition, in the arousal results there was no direct connection between behavioral and environmental variables, while both were associated with physiological variables, as shown in Figure 4. The relationships between physiological and behavioral variables were also more pronounced for arousal than for valence. These results reflect the close relation between autonomic nervous system responses and physiological arousal [47,48,49,50]. The Coherence ratio, an indicator of physiological coherence, was associated with VLF and LF, indicators of sympathetic activation, and with VLF (%), an indicator of parasympathetic activation. This is consistent with previous theories that physiological coherence is determined by the way the sympathetic and parasympathetic nerves are controlled [51]. In addition, lnHF, another indicator of parasympathetic activity, was linked to the dominant rhythms of cardiac activity (Dominant Power, Dominant Hz), which were in turn connected to the Coherence ratio. This suggests that cardiac activity varies with the degree of parasympathetic activation, which might be associated with physiological coherence. This association between the autonomic nervous system and physiological coherence was also connected with the indicators of the regularity of life patterns (Entropy, Circadian Movement) and movement patterns (Transition Time, Total Distance). These results suggest that physiological homeostasis coincides with behavioral homeostasis, in line with the polyvagal theory [51,52].
The relationships between physiological variables and ambient noise (Sound Amplitude and Sound Frequency) are consistent with previous findings that ambient noise is related to arousal [25,26].
The relationships among the physiological, behavioral, and environmental variables were more systemic in the valence results, as shown in Figure 5. This means there was a body-behavior-environment connection in the valence results, whereas in the arousal results the behavioral and environmental variables were each connected only to physiological variables. It seems that conscious, cognitive judgment processes are necessary to determine the valence level of emotion, whereas the arousal level is determined by unconscious, autonomic physiological control [53].
In summary, the arousal levels of embodied emotion were represented by more prominent interactions with physiological responses, while the valence levels were represented by a balanced relationship among the physiological, behavioral, and environmental variables. These results suggest that the arousal level indicates the regulation of behavioral and physiological homeostasis to cope with the environment, while the valence level indicates a process of cognitive judgment that takes the environment and behavior into consideration. However, because this study was a field test, the experimental controls were less stringent than in laboratory studies. Participants occasionally needed to remove the device, such as when washing their hands, and the researchers may not have detected it. Cross-validation of these results is also necessary to establish their feasibility and consistency, as this study analyzed two weeks of data from 79 participants in their twenties; additional studies are needed to ensure reproducibility. Nevertheless, this study is valuable because it analyzed practical data, and it serves as an interpretive guide for recognizing embodied emotion from life-log data with deep learning or machine learning.

Author Contributions

A.C., H.L., and Y.J. designed the study based on an investigation of previous studies and performed the experiments; H.L. analyzed the raw data; Y.G. organized the database; A.C. performed the statistical analysis and wrote the manuscript; M.W. conceived the study and was in charge of overall direction and planning.

Funding

This work was supported by an Electronics and Telecommunications Research Institute (ETRI) grant funded by the Korean government [18ZS1100, Core Technology Research for Self-Improving Artificial Intelligence System] and by a National Research Foundation of Korea (NRF) grant funded by the Korean government (MSIT) (NRF-2018R1A2B6008901).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Lee, S.; Gong, G.; Lee, S.G. Entity-event lifelog ontology model (EELOM) for lifeLog ontology schema definition. In Proceedings of the 2010 12th International Asia-Pacific Web Conference, Busan, Korea, 6–8 April 2010; pp. 344–346. [Google Scholar]
  2. Clark, A. Embodied, situated, and distributed cognition. In A Companion to Cognitive Science; William, B., George, G., Eds.; Blackwell Publishers: Malden, MA, USA, 2017; pp. 506–517. [Google Scholar]
  3. Niedenthal, P.M. Embodying emotion. Science 2007, 316, 1002–1005. [Google Scholar] [CrossRef] [PubMed]
  4. Quiroz, J.C.; Geangu, E.; Yong, M.H. Emotion Recognition Using Smart Watch Sensor Data: Mixed-Design Study. JMIR Ment. Health 2018, 5, e10153. [Google Scholar] [CrossRef] [PubMed]
  5. Parasuraman, R.; Wilson, G.F. Putting the brain to work: Neuroergonomics past, present, and future. Hum. Factors 2008, 50, 468–474. [Google Scholar] [CrossRef] [PubMed]
  6. Fairclough, S.H. Fundamentals of physiological computing. Interact. Comput. 2009, 21, 133–145. [Google Scholar] [CrossRef]
  7. Wilhelm, F.H.; Grossman, P. Emotions beyond the laboratory: Theoretical fundaments, study design, and analytic strategies for advanced ambulatory assessment. Biol. Psychol. 2010, 84, 552–569. [Google Scholar] [CrossRef] [PubMed]
  8. Shapsough, S.; Hesham, A.; Elkhorazaty, Y.; Zualkernan, I.A.; Aloul, F. Emotion recognition using mobile phones. In Proceedings of the 2016 IEEE 18th International Conference on e-Health Networking, Applications and Services (Healthcom), Munich, Germany, 14–16 September 2016; pp. 1–6. [Google Scholar]
  9. Brouwer, A.M.; van Dam, E.; Van Erp, J.B.; Spangler, D.P.; Brooks, J.R. Improving real-life estimates of emotion based on heart rate: A perspective on taking metabolic heart rate into account. Front. Hum. Neurosci. 2018, 12, 284. [Google Scholar] [CrossRef]
  10. Fischer, T.; Riedl, R. Lifelogging for organizational stress measurement: Theory and Applications. In Lifelogging for Organizational Stress Measurement; Springer: Cham, Switzerland, 2019; pp. 1–37. [Google Scholar]
  11. Doherty, A.R.; Caprani, N.; Conaire, C.Ó.; Kalnikaite, V.; Gurrin, C.; Smeaton, A.F.; O’Connor, N.E. Passively recognising human activities through lifelogging. Comput. Hum. Behav. 2011, 27, 1948–1958. [Google Scholar] [CrossRef]
  12. Haag, A.; Goronzy, S.; Schaich, P.; Williams, J. Emotion recognition using bio-sensors: First steps towards an automatic system. In Tutorial and Research Workshop on Affective Dialogue Systems; Springer: Berlin/Heidelberg, Germany, June 2004; pp. 36–48. [Google Scholar]
  13. Liu, Y.; Sourina, O.; Nguyen, M.K. Real-time EEG-based emotion recognition and its applications. In Transactions on Computational Science XII; Springer: Berlin/Heidelberg, Germany, 2011; pp. 256–277. [Google Scholar]
  14. Carpenter, A.; Frontera, A. Smart-watches: A potential challenger to the implantable loop recorder? Europace 2016, 18, 791–793. [Google Scholar] [CrossRef]
  15. Thanapattheerakul, T.; Mao, K.; Amoranto, J.; Chan, J.H. Emotion in a Century: A Review of Emotion Recognition. In Proceedings of the 10th International Conference on Advances in Information Technology, ACM, Bangkok, Thailand, 10–13 December 2018; p. 17. [Google Scholar]
  16. Koshy, A.N.; Sajeev, J.K.; Nerlekar, N.; Brown, A.J.; Rajakariar, K.; Zureik, M.; Teh, A.W. Smart watches for heart rate assessment in atrial arrhythmias. Int. J. Cardiol. 2018, 266, 124–127. [Google Scholar] [CrossRef]
  17. Jiang, S.; Li, Z.; Zhou, P.; Li, M. Memento: An Emotion-driven Lifelogging System with Wearables. ACM Trans. Sens. Netw. (TOSN) 2019, 15, 8. [Google Scholar] [CrossRef]
  18. Kreibig, S.D. Autonomic nervous system activity in emotion: A review. Biol. Psychol. 2010, 84, 394–421. [Google Scholar] [CrossRef] [PubMed]
  19. Siegel, E.H.; Sands, M.K.; Van den Noortgate, W.; Condon, P.; Chang, Y.; Dy, J.; Barrett, L.F. Emotion fingerprints or emotion populations? A meta-analytic investigation of autonomic features of emotion categories. Psychol. Bull. 2018, 144, 343. [Google Scholar] [CrossRef] [PubMed]
  20. Sano, A.; Picard, R.W. Stress recognition using wearable sensors and mobile phones. In Proceedings of the 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction, Geneva, Switzerland, 2–5 September 2013; pp. 671–676. [Google Scholar]
  21. Saeb, S.; Zhang, M.; Karr, C.J.; Schueller, S.M.; Corden, M.E.; Kording, K.P.; Mohr, D.C. Mobile phone sensor correlates of depressive symptom severity in daily-life behavior: An exploratory study. J. Med. Internet Res. 2015, 17, e175. [Google Scholar] [CrossRef] [PubMed]
  22. Riener, A.; Ferscha, A.; Aly, M. Heart on the road: HRV analysis for monitoring a driver’s affective state. In Proceedings of the 1st International Conference on Automotive User Interfaces and Interactive Vehicular Applications, ACM, Essen, Germany, 21–22 September 2009; pp. 99–106. [Google Scholar]
  23. Clark, A.; Chalmers, D. The extended mind. Analysis 1998, 58, 7–19. [Google Scholar] [CrossRef]
  24. Varela, F.J.; Thompson, E.; Rosch, E. The Embodied Mind: Cognitive Science and Human Experience; MIT Press: Cambridge, MA, USA, 2017. [Google Scholar]
  25. Sanders, M.S.; McCormick, E.J. Applied anthropometry, work-space design and seating. Hum. Factors Eng. Des. 1993, 7, 19–23. [Google Scholar]
  26. Miller, J.D. Effects of noise on people. J. Acoust. Soc. Am. 1974, 56, 729–764. [Google Scholar] [CrossRef]
  27. Wallenius, M.A. The interaction of noise stress and personal project stress on subjective health. J. Environ. Psychol. 2004, 24, 167–177. [Google Scholar] [CrossRef]
  28. Smyth, J.; Ockenfels, M.C.; Porter, L.; Kirschbaum, C.; Hellhammer, D.H.; Stone, A.A. Stressors and mood measured on a momentary basis are associated with salivary cortisol secretion. Psychoneuroendocrinology 1998, 23, 353–370. [Google Scholar] [CrossRef]
  29. Cohen, S.; Glass, D.C.; Singer, J.E. Apartment noise, auditory discrimination, and reading ability in children. J. Exp. Soc. Psychol. 1973, 9, 407–422. [Google Scholar] [CrossRef]
  30. Söderlund, G.; Sikström, S.; Smart, A. Listen to the noise: Noise is beneficial for cognitive performance in ADHD. J. Child Psychol. Psychiatry 2007, 48, 840–847. [Google Scholar] [CrossRef]
  31. Ravindran, R.; Devi, R.S.; Samson, J.; Senthilvelan, M. Noise-stress-induced brain neurotransmitter changes and the effect of Ocimum sanctum (Linn) treatment in albino rats. J. Pharmacol. Sci. 2005, 98, 354–360. [Google Scholar] [CrossRef] [PubMed]
  32. Schuller, B.; Hantke, S.; Weninger, F.; Han, W.; Zhang, Z.; Narayanan, S. Automatic recognition of emotion evoked by general sound events. In Proceedings of the 2012 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Kyoto, Japan, 25–30 March 2012; pp. 341–344. [Google Scholar]
  33. Russell, J.A. A circumplex model of affect. J. Personal. Soc. Psychol. 1980, 39, 1161. [Google Scholar] [CrossRef]
  34. Swanepoel, D.W.; Myburgh, H.C.; Howe, D.M.; Mahomed, F.; Eikelboom, R.H. Smartphone hearing screening with integrated quality control and data management. Int. J. Audiol. 2014, 53, 841–849. [Google Scholar] [CrossRef] [PubMed]
  35. Kardous, C.A.; Shaw, P.B. Evaluation of smartphone sound measurement applications. J. Acoust. Soc. Am. 2014, 135, EL186–EL192. [Google Scholar] [CrossRef] [PubMed]
  36. Murphy, E.; King, E.A. Testing the accuracy of smartphones and sound level meter applications for measuring environmental noise. Appl. Acoust. 2016, 106, 16–22. [Google Scholar] [CrossRef]
  37. Chanel, G.; Ansari-Asl, K.; Pun, T. Valence-arousal evaluation using physiological signals in an emotion recall paradigm. In Proceedings of the 2007 IEEE International Conference on Systems, Man and Cybernetics, Montreal, QC, Canada, 7–10 October 2007; pp. 2662–2667. [Google Scholar]
  38. Mauss, I.; Robinson, M. Measures of emotion: A review. Cogn. Emot. 2009, 23, 209–237. [Google Scholar] [CrossRef]
  39. Feldman, L.A. Valence focus and arousal focus: Individual differences in the structure of affective experience. J. Personal. Soc. Psychol. 1995, 69, 153. [Google Scholar] [CrossRef]
  40. Ray, R.D.; McRae, K.; Ochsner, K.N.; Gross, J.J. Cognitive reappraisal of negative affect: Converging evidence from EMG and self-report. Emotion 2010, 10, 587. [Google Scholar] [CrossRef]
  41. Barrett, L.F. Feelings or words? Understanding the content in self-report ratings of experienced emotion. J. Personal. Soc. Psychol. 2004, 87, 266. [Google Scholar] [CrossRef]
  42. Barrett, L.F. Discrete emotions or dimensions? The role of valence focus and arousal focus. Cogn. Emot. 1998, 12, 579–599. [Google Scholar] [CrossRef]
  43. McCraty, R.; Tomasino, D. Psychophysiological coherence. Stress Health Dis. 2006, 5, 288. [Google Scholar]
  44. Lacey, J.I. Somatic response patterning and stress: Some revisions of activation theory. Psychol. Stress Issues Res. 1967, 14–37. [Google Scholar]
  45. Graham, F.K.; Clifton, R.K. Heart-rate change as a component of the orienting response. Psychol. Bull. 1966, 65, 305. [Google Scholar] [CrossRef] [PubMed]
  46. Sokolov, E.N. Perception and the Conditioned Reflex; Pergamon Press: Oxford, UK, 1963. [Google Scholar]
  47. Malmo, R.B. Activation: A neuropsychological dimension. Psychol. Rev. 1959, 66, 367. [Google Scholar] [CrossRef] [PubMed]
  48. Duffy, E. The psychological significance of the concept of "arousal" or "activation". Psychol. Rev. 1957, 64, 265. [Google Scholar] [CrossRef]
  49. Lindsley, D.B. Available online: https://psycnet.apa.org/record/1951-07758-022 (accessed on 20 September 2019).
  50. Darrow, C.W.; Jost, H.; Solomon, A.P.; Mergener, J.C. Autonomic indications of excitatory and homeostatic effects on the electroencephalogram. J. Psychol. 1942, 14, 115–130. [Google Scholar] [CrossRef]
  51. Porges, S.W. The Polyvagal Theory: Neurophysiological Foundations of Emotions, Attachment, Communication, and Self-regulation (Norton Series on Interpersonal Neurobiology); W. W. Norton & Company: New York, NY, USA, 2011; ISBN 978-0-393-70906-3. [Google Scholar]
  52. Porges, S.W.; Doussard-Roosevelt, J.A.; Portales, A.L.; Greenspan, S.I. Infant regulation of the vagal “brake” predicts child behavior problems: A psychobiological model of social behavior. Dev. Psychobiol. 1996, 29, 697–712. [Google Scholar] [CrossRef]
  53. Porges, S.W.; Greenspan, S.I. Regulatory disorders II: Psychophysiologic perspectives. NIDA Res. Monogr. 1991, 114, 173–181. [Google Scholar]
Figure 1. The wearable device for sensing photoplethysmogram (PPG) signals and the mobile application for PPG, Global Positioning System (GPS), ambient noise, and self-report data acquisition.
Figure 2. A sample of the standardized coefficient matrix. The standardized coefficient, beta (β), indicates the influence of each independent variable on the dependent variable.
Figure 3. Data structures at each analysis step. The standardized coefficients resulting from the multiple regression were formed into a matrix, and the subjective emotion labels were mapped onto this matrix to form the data structure for the ANOVA.
Figure 4. A schematic representation of correlations that demonstrate the differences in arousal of emotions. The letters in red indicate physiological variables, blue indicate behavioral variables, and green indicate environmental variables. The arrows represent the correlation between the two variables. The red arrows represent the correlations within physiological variables, the green arrows represent the correlations within environmental variables, and the black arrows represent the correlations between the different construct variables.
Figure 5. A schematic representation of correlations that distinguish the differences in valence emotions. The letters in red indicate physiological variables, blue indicate behavioral variables, and green indicate environmental variables. The arrows represent the correlation between the two variables. The red arrows represent the correlations within physiological variables, the green arrows represent the correlations within environmental variables, and the black arrows represent the correlations between the different construct variables.
Table 1. An example of the significant causal relations identified by multiple regression in the two weeks of data from 79 participants. All assumptions of multiple regression were satisfied: there was no autocorrelation in the residuals (Durbin-Watson = 2.264), the residuals were normally distributed (Kolmogorov-Smirnov Z = 0.089, p = 0.728), and the residuals were homoscedastic (Breusch-Pagan F = 0.994, p = 0.499). Multiple regression was run to predict lnHF from location variance, circadian movement, transition time, total distance, pNN50, peak Hz, and coherence ratio; only variables unaffected by multicollinearity (VIF < 10) were entered. A significant regression equation was found (F(25, 34) = 40.231, p < 0.001, Adj. R² = 0.943; regression df = 25, residual df = 34). Transition time was a significant predictor of lnHF.

| Dependent Variable | Test | Statistic | Value |
|---|---|---|---|
| lnHF | Multiple regression (model fit) | Adj. R² | 0.943 |
| | | F | 40.231 |
| | | Sig. | 0.000 |

| Independent Variable | Unstandardized Coefficient (B) | p | VIF |
|---|---|---|---|
| (constant) | −10,437,988,370.481 | 0.661 | – |
| Location Variance | 0.012 | 0.687 | 2.759 |
| Circadian Movement | 0.028 | 0.166 | 3.673 |
| Transition Time | −0.059 | 0.076 | 7.764 |
| Total Distance | 0.064 | 0.116 | 8.050 |
| pNN50 | 0.020 | 0.792 | 8.528 |
| Peak Hz | 0.050 | 0.115 | 3.704 |
| Coherence Ratio | 0.040 | 0.247 | 4.322 |
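The diagnostic workflow summarized in Table 1 (an OLS fit, a Durbin-Watson check for residual autocorrelation, and VIF screening for multicollinearity) can be sketched with NumPy. This is a minimal illustration on synthetic data, not the study's dataset or code; the predictor names in the comments are only placeholders for the life-log features.

```python
import numpy as np

def ols_diagnostics(X, y):
    """Fit OLS with an intercept; return standardized coefficients,
    adjusted R-squared, the Durbin-Watson statistic, and per-predictor VIFs."""
    n, k = X.shape
    Xc = np.column_stack([np.ones(n), X])
    beta, *_ = np.linalg.lstsq(Xc, y, rcond=None)
    resid = y - Xc @ beta
    ss_res = resid @ resid
    ss_tot = ((y - y.mean()) ** 2).sum()
    r2 = 1 - ss_res / ss_tot
    adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k - 1)
    # Durbin-Watson: values near 2 indicate no first-order autocorrelation.
    dw = np.sum(np.diff(resid) ** 2) / ss_res
    # Standardized (beta) coefficients: slope scaled by sd(x)/sd(y).
    std_beta = beta[1:] * X.std(axis=0, ddof=1) / y.std(ddof=1)
    # VIF_j = 1 / (1 - R^2 of predictor j regressed on the other predictors).
    vif = []
    for j in range(k):
        others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        bj, *_ = np.linalg.lstsq(others, X[:, j], rcond=None)
        rj = X[:, j] - others @ bj
        r2j = 1 - (rj @ rj) / ((X[:, j] - X[:, j].mean()) ** 2).sum()
        vif.append(1 / (1 - r2j))
    return std_beta, adj_r2, dw, np.array(vif)

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 3))   # e.g. transition time, total distance, pNN50
y = 0.6 * X[:, 0] - 0.3 * X[:, 1] + rng.normal(scale=0.5, size=60)
std_beta, adj_r2, dw, vif = ols_diagnostics(X, y)
print(adj_r2, dw, vif)         # predictors would be kept only when VIF < 10
```

The standardized coefficients returned here correspond to the β values collected into the matrix of Figure 2 before the emotion labels are mapped on.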
Table 2. Descriptive statistics of the standardized coefficients with significant differences between arousal levels.

| Independent | Dependent | Statistic | Arousal | Neutral | Relaxation |
|---|---|---|---|---|---|
| BPM | VLF | Mean | −5,745,442,617 | −17,308,431,507 | 8,428,219,435 |
| | | SD | 270,176,000,000 | 245,484,000,000 | 231,354,000,000 |
| pNN50 | Dominant Power | Mean | −0.423 | −0.613 | −0.659 |
| | | SD | 1.866 | 2.63 | 3.945 |
| RMSSD | pNN50 | Mean | 0.013 | 0.008 | 0.01 |
| | | SD | 0.05 | 0.041 | 0.046 |
| SDNN | pNN50 | Mean | 0.025 | 0.032 | 0.034 |
| | | SD | 0.071 | 0.099 | 0.089 |
| SDNN | lnHF | Mean | −0.055 | −0.05 | −0.141 |
| | | SD | 0.838 | 0.931 | 0.93 |
| SDNN | VLF/HF ratio | Mean | −0.001 | −0.003 | −0.007 |
| | | SD | 0.073 | 0.062 | 0.082 |
| SDNN | Peak Power | Mean | −0.045 | −0.089 | −0.056 |
| | | SD | 0.345 | 0.53 | 0.433 |
| LF(%) | VLF(%) | Mean | −0.001 | −0.003 | −0.006 |
| | | SD | 0.031 | 0.053 | 0.082 |
| LF(%) | HF(%) | Mean | −0.001 | −0.003 | −0.008 |
| | | SD | 0.04 | 0.068 | 0.098 |
| lnLF | BPM | Mean | 0.001 | 0 | −0.001 |
| | | SD | 0.022 | 0.017 | 0.022 |
| lnHF | Entropy | Mean | −1193.169 | −140,776.746 | 14,580.387 |
| | | SD | 37,241.047 | 3,206,125.223 | 642,719.545 |
| lnHF | Circadian Movement | Mean | 0 | 0.001 | 0 |
| | | SD | 0.006 | 0.016 | 0.004 |
| lnHF | Dominant Hz | Mean | −0.097 | −0.112 | −0.119 |
| | | SD | 0.236 | 0.249 | 0.254 |
| lnHF | Peak Hz | Mean | 0.032 | 0.04 | 0.041 |
| | | SD | 0.08 | 0.092 | 0.091 |
| LF/HF ratio | pNN50 | Mean | −0.004 | −0.006 | 0 |
| | | SD | 0.052 | 0.06 | 0.067 |
| Dominant Power | lnHF | Mean | −0.041 | 0.004 | 0.035 |
| | | SD | 0.902 | 0.853 | 0.872 |
| Dominant Hz | Dominant Power | Mean | −0.008 | −0.138 | −0.055 |
| | | SD | 0.791 | 2.046 | 1.176 |
| Dominant Hz | Coherence ratio | Mean | −0.042 | −0.06 | −0.064 |
| | | SD | 0.259 | 0.215 | 0.225 |
| Dominant Hz | Sound Frequency | Mean | −0.002 | −0.003 | 0.003 |
| | | SD | 0.077 | 0.051 | 0.044 |
| Peak Power | Coherence ratio | Mean | 0.156 | 0.127 | 0.111 |
| | | SD | 0.456 | 0.359 | 0.339 |
| Peak Hz | RMSSD | Mean | −0.075 | 0.2 | −0.013 |
| | | SD | 1.933 | 4.134 | 1.17 |
| Peak Hz | Peak Power | Mean | 0.043 | 0.119 | 0.216 |
| | | SD | 1.258 | 0.878 | 1.729 |
| Coherence ratio | pNN50 | Mean | 0.006 | −0.002 | 0.001 |
| | | SD | 0.077 | 0.08 | 0.076 |
| Coherence ratio | VLF(%) | Mean | −105,267,518.5 | −2,016,911,138 | −48,597,422.56 |
| | | SD | 15,299,232,245 | 17,846,539,923 | 20,110,510,824 |
| Coherence ratio | LF(%) | Mean | −132,068,389.2 | −2,021,703,780 | −84,532,472.78 |
| | | SD | 14,109,144,081 | 17,824,867,958 | 18,840,262,720 |
| Coherence ratio | HF(%) | Mean | −217,938,238.3 | −2,389,073,268 | −85,954,745.84 |
| | | SD | 16,032,965,209 | 21,954,646,140 | 22,568,564,779 |
| Coherence ratio | Dominant Hz | Mean | −0.065 | −0.074 | −0.093 |
| | | SD | 0.309 | 0.345 | 0.325 |
| Transition Time | Dominant Hz | Mean | −0.015 | −0.015 | 0.011 |
| | | SD | 0.259 | 0.175 | 0.194 |
| Total Distance | Dominant Hz | Mean | 0.009 | 0.019 | −0.003 |
| | | SD | 0.182 | 0.195 | 0.236 |
| Sound Amplitude | RMSSD | Mean | 0.005 | 0.376 | 0.02 |
| | | SD | 0.757 | 7.924 | 1.492 |
| Sound Amplitude | Sound Frequency | Mean | 0.062 | 0.078 | 0.094 |
| | | SD | 0.245 | 0.266 | 0.286 |
| Sound Frequency | Sound Amplitude | Mean | 0.075 | 0.086 | 0.109 |
| | | SD | 0.299 | 0.314 | 0.337 |
Table 3. Descriptive statistics of the standardized coefficients with significant differences between valence levels.

| Independent | Dependent | Statistic | Positive | Neutral | Negative |
|---|---|---|---|---|---|
| Total Distance | Peak Power | Mean | −0.033 | −0.128 | −0.01 |
| | | SD | 0.717 | 2.004 | 0.751 |
| pNN50 | LF(%) | Mean | −118,639,038.5 | 1,079,730,567 | 1,891,445,746 |
| | | SD | 21,820,377,528 | 18,397,399,419 | 29,119,919,937 |
| pNN50 | HF(%) | Mean | −209,743,703.2 | 1,227,721,976 | 2,344,251,772 |
| | | SD | 25,563,848,513 | 21,374,411,771 | 35,526,468,774 |
| VLF(%) | LF(%) | Mean | −0.002 | −0.007 | −0.001 |
| | | SD | 0.041 | 0.081 | 0.036 |
| VLF(%) | HF(%) | Mean | −0.002 | −0.009 | −0.002 |
| | | SD | 0.052 | 0.104 | 0.046 |
| LF(%) | VLF(%) | Mean | −0.003 | −0.007 | −0.001 |
| | | SD | 0.055 | 0.088 | 0.029 |
| LF(%) | HF(%) | Mean | −0.003 | −0.009 | −0.001 |
| | | SD | 0.064 | 0.109 | 0.041 |
| HF(%) | VLF(%) | Mean | −0.002 | −0.006 | −0.001 |
| | | SD | 0.037 | 0.068 | 0.021 |
| HF(%) | LF(%) | Mean | −0.002 | −0.006 | −0.001 |
| | | SD | 0.036 | 0.066 | 0.022 |
| lnHF | VLF(%) | Mean | −6,230,158.347 | −808,663,867.5 | 139,285,674.2 |
| | | SD | 5,364,187,185 | 14,013,185,177 | 5,257,502,233 |
| lnHF | LF(%) | Mean | −5,812,633.013 | −910,997,723.4 | 128,324,064 |
| | | SD | 5,522,380,316 | 16,960,721,918 | 5,205,893,477 |
| lnHF | HF(%) | Mean | −19,292,545.01 | −1,142,833,043 | 160,581,520.7 |
| | | SD | 6,721,689,128 | 21,115,017,843 | 6,435,510,213 |
| VLF/HF ratio | VLF | Mean | −6,771,606,848 | 4,806,843,674 | 5,214,155,475 |
| | | SD | 191,933,000,000 | 89,138,008,055 | 97,000,248,564 |
| VLF/HF ratio | Sound Amplitude | Mean | −0.001 | 0.007 | −0.002 |
| | | SD | 0.051 | 0.07 | 0.057 |
| Dominant Hz | VLF | Mean | 4,564,168,257 | 941,743,170 | −10,176,921,026 |
| | | SD | 93,659,975,658 | 111,036,000,000 | 224,181,000,000 |
| Dominant Hz | Total Power | Mean | −9,252,548,826 | 387,775,431.7 | 15,059,942,652 |
| | | SD | 190,801,000,000 | 222,377,000,000 | 381,999,000,000 |
| Sound Amplitude | Transition Time | Mean | 0.024 | −0.007 | 0.004 |
| | | SD | 0.327 | 0.395 | 0.227 |
| Sound Amplitude | Total Distance | Mean | −0.023 | 0.023 | 0.031 |
| | | SD | 0.49 | 0.481 | 0.789 |
| Sound Amplitude | VLF/HF ratio | Mean | −0.011 | 0.013 | 0.002 |
| | | SD | 0.285 | 0.141 | 0.304 |
| Sound Frequency | Dominant Hz | Mean | 0 | −0.01 | 0.008 |
| | | SD | 0.143 | 0.167 | 0.138 |
Table 4. Results of the one-way ANOVA showing significant differences among the Arousal, Neutral, and Relaxation levels for variables related to BPM in the multiple regression analysis. Differences between pairs of emotion levels were verified by independent t-tests.

| Dependent Variable | Test | Statistic | lnLF |
|---|---|---|---|
| BPM | ANOVA | F | 3.173 |
| | | p | 0.042 |
| | T-test: Arousal-Neutral | t | 1.487 |
| | | p | 0.137 |
| | T-test: Neutral-Relaxation | t | 0.488 |
| | | p | 0.625 |
| | T-test: Arousal-Relaxation | t | −2.358 |
| | | p | 0.018 |
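The test sequence reported in Tables 4–20 (a one-way ANOVA across the three emotion levels, followed by pairwise t-tests on the standardized coefficients) can be sketched as follows. This is an illustrative NumPy implementation on synthetic data, not the study's code; the group means and sizes are arbitrary, and a Welch-style t statistic is used here rather than the paper's pooled-variance t-test.

```python
import numpy as np

def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA across the given sample groups."""
    all_x = np.concatenate(groups)
    grand = all_x.mean()
    ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    df_between = len(groups) - 1
    df_within = len(all_x) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

def welch_t(a, b):
    """Welch's t statistic for two independent samples (unequal variances)."""
    va, vb = a.var(ddof=1) / len(a), b.var(ddof=1) / len(b)
    return (a.mean() - b.mean()) / np.sqrt(va + vb)

rng = np.random.default_rng(1)
# Hypothetical standardized coefficients grouped by self-reported arousal level.
arousal = rng.normal(0.05, 0.08, 200)
neutral = rng.normal(0.00, 0.08, 200)
relaxation = rng.normal(-0.05, 0.08, 200)

F = one_way_anova_f([arousal, neutral, relaxation])
t_ar = welch_t(arousal, relaxation)   # follow-up pairwise comparison
print(F, t_ar)
```

A significant omnibus F would then justify the pairwise comparisons between emotion levels, mirroring the ANOVA-then-t-test structure of the tables.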
Table 5. Results of the one-way ANOVA showing significant differences among the Arousal, Neutral, and Relaxation levels for variables related to RMSSD in the multiple regression analysis. Differences between pairs of emotion levels were verified by independent t-tests.

| Dependent Variable | Test | Statistic | Peak Hz | Sound Amplitude |
|---|---|---|---|---|
| RMSSD | ANOVA | F | 4.067 | 3.466 |
| | | p | 0.017 | 0.031 |
| | T-test: Arousal-Neutral | t | −2.346 | −2.023 |
| | | p | 0.019 | 0.043 |
| | T-test: Neutral-Relaxation | t | 1.964 | 1.782 |
| | | p | 0.050 | 0.075 |
| | T-test: Arousal-Relaxation | t | 1.143 | 0.401 |
| | | p | 0.253 | 0.688 |
Table 6. Results of the one-way ANOVA showing significant differences among the Arousal, Neutral, and Relaxation levels for variables related to pNN50 in the multiple regression analysis. Differences between pairs of emotion levels were verified by independent t-tests.

| Dependent Variable | Test | Statistic | SDNN | RMSSD | LF/HF ratio | Coherence Ratio |
|---|---|---|---|---|---|---|
| pNN50 | ANOVA | F | 5.367 | 4.233 | 3.496 | 3.946 |
| | | p | 0.005 | 0.015 | 0.030 | 0.019 |
| | T-test: Arousal-Neutral | t | −2.200 | 2.625 | 0.969 | 2.502 |
| | | p | 0.028 | 0.009 | 0.332 | 0.012 |
| | T-test: Neutral-Relaxation | t | −0.292 | −1.109 | −2.200 | −0.885 |
| | | p | 0.770 | 0.268 | 0.028 | 0.376 |
| | T-test: Arousal-Relaxation | t | 3.257 | −2.004 | 1.994 | −2.092 |
| | | p | 0.001 | 0.045 | 0.046 | 0.036 |
Table 7. Results of the one-way ANOVA showing significant differences among the Arousal, Neutral, and Relaxation levels for variables related to VLF in the multiple regression analysis. Differences between pairs of emotion levels were verified by independent t-tests.

| Dependent Variable | Test | Statistic | BPM |
|---|---|---|---|
| VLF | ANOVA | F | 3.109 |
| | | p | 0.045 |
| | T-test: Arousal-Neutral | t | 1.032 |
| | | p | 0.302 |
| | T-test: Neutral-Relaxation | t | −2.518 |
| | | p | 0.012 |
| | T-test: Arousal-Relaxation | t | 1.681 |
| | | p | 0.093 |
Table 8. Results of the one-way ANOVA showing significant differences among the Arousal, Neutral, and Relaxation levels for variables related to VLF(%) in the multiple regression analysis. Differences between pairs of emotion levels were verified by independent t-tests.

| Dependent Variable | Test | Statistic | LF(%) | Coherence Ratio |
|---|---|---|---|---|
| VLF(%) | ANOVA | F | 3.597 | 3.813 |
| | | p | 0.027 | 0.022 |
| | T-test: Arousal-Neutral | t | 1.022 | 2.793 |
| | | p | 0.307 | 0.005 |
| | T-test: Neutral-Relaxation | t | 1.117 | −2.338 |
| | | p | 0.264 | 0.019 |
| | T-test: Arousal-Relaxation | t | −2.604 | 0.096 |
| | | p | 0.009 | 0.924 |
Table 9. Results of the one-way ANOVA showing significant differences among the Arousal, Neutral, and Relaxation levels for variables related to LF(%) in the multiple regression analysis. Differences between pairs of emotion levels were verified by independent t-tests.

| Dependent Variable | Test | Statistic | Coherence Ratio |
|---|---|---|---|
| LF(%) | ANOVA | F | 4.166 |
| | | p | 0.016 |
| | T-test: Arousal-Neutral | t | 2.905 |
| | | p | 0.004 |
| | T-test: Neutral-Relaxation | t | −2.413 |
| | | p | 0.016 |
| | T-test: Arousal-Relaxation | t | 0.086 |
| | | p | 0.931 |
Table 10. Results of the one-way ANOVA showing significant differences among the Arousal, Neutral, and Relaxation levels for variables related to HF(%) in the multiple regression analysis. Differences between pairs of emotion levels were verified by independent t-tests.

| Dependent Variable | Test | Statistic | LF(%) | Coherence Ratio |
|---|---|---|---|---|
| HF(%) | ANOVA | F | 3.375 | 4.056 |
| | | p | 0.034 | 0.017 |
| | T-test: Arousal-Neutral | t | 1.023 | 2.841 |
| | | p | 0.306 | 0.005 |
| | T-test: Neutral-Relaxation | t | 1.050 | −2.376 |
| | | p | 0.294 | 0.018 |
| | T-test: Arousal-Relaxation | t | −2.550 | 0.204 |
| | | p | 0.011 | 0.838 |
Table 11. Results of the one-way ANOVA showing significant differences among the Arousal, Neutral, and Relaxation levels for variables related to lnHF in the multiple regression analysis. Differences between pairs of emotion levels were verified by independent t-tests.

| Dependent Variable | Test | Statistic | SDNN | Dominant Power |
|---|---|---|---|---|
| lnHF | ANOVA | F | 4.967 | 3.343 |
| | | p | 0.007 | 0.035 |
| | T-test: Arousal-Neutral | t | −0.150 | −1.178 |
| | | p | 0.881 | 0.239 |
| | T-test: Neutral-Relaxation | t | 2.256 | −0.832 |
| | | p | 0.024 | 0.406 |
| | T-test: Arousal-Relaxation | t | −2.901 | 2.556 |
| | | p | 0.004 | 0.011 |
Table 12. Results of the one-way ANOVA showing significant differences among the Arousal, Neutral, and Relaxation levels for variables related to the VLF/HF ratio in the multiple regression analysis. Differences between pairs of emotion levels were verified by independent t-tests.

| Dependent Variable | Test | Statistic | SDNN |
|---|---|---|---|
| VLF/HF ratio | ANOVA | F | 3.417 |
| | | p | 0.033 |
| | T-test: Arousal-Neutral | t | 0.474 |
| | | p | 0.636 |
| | T-test: Neutral-Relaxation | t | 1.509 |
| | | p | 0.132 |
| | T-test: Arousal-Relaxation | t | −2.475 |
| | | p | 0.013 |
Table 13. Results of the one-way ANOVA showing significant differences among the Arousal, Neutral, and Relaxation levels for variables related to Peak Power in the multiple regression analysis. Differences between pairs of emotion levels were verified by independent t-tests.

| Dependent Variable | Test | Statistic | SDNN | Peak Hz |
|---|---|---|---|---|
| Peak Power | ANOVA | F | 3.013 | 6.761 |
| | | p | 0.049 | 0.001 |
| | T-test: Arousal-Neutral | t | 2.513 | −1.537 |
| | | p | 0.012 | 0.124 |
| | T-test: Neutral-Relaxation | t | −1.600 | −1.476 |
| | | p | 0.110 | 0.140 |
| | T-test: Arousal-Relaxation | t | −0.873 | 3.456 |
| | | p | 0.383 | 0.001 |
Table 14. Results of the one-way ANOVA showing significant differences among the Arousal, Neutral, and Relaxation levels for variables related to Peak Hz in the multiple regression analysis. Differences between pairs of emotion levels were verified by independent t-tests.

| Dependent Variable | Test | Statistic | lnHF |
|---|---|---|---|
| Peak Hz | ANOVA | F | 5.202 |
| | | p | 0.006 |
| | T-test: Arousal-Neutral | t | −2.246 |
| | | p | 0.025 |
| | T-test: Neutral-Relaxation | t | −0.179 |
| | | p | 0.858 |
| | T-test: Arousal-Relaxation | t | 3.058 |
| | | p | 0.002 |
Table 15. Results of the one-way ANOVA showing significant differences among the Arousal, Neutral, and Relaxation levels for variables related to the Coherence ratio in the multiple regression analysis. Differences between pairs of emotion levels were verified by independent t-tests.

| Dependent Variable | Test | Statistic | Dominant Hz | Peak Power |
|---|---|---|---|---|
| Coherence ratio | ANOVA | F | 4.194 | 5.807 |
| | | p | 0.015 | 0.003 |
| | T-test: Arousal-Neutral | t | 1.690 | 1.553 |
| | | p | 0.091 | 0.120 |
| | T-test: Neutral-Relaxation | t | 0.466 | 1.090 |
| | | p | 0.641 | 0.276 |
| | T-test: Arousal-Relaxation | t | −2.737 | −3.311 |
| | | p | 0.006 | 0.001 |
Table 16. Results of the one-way ANOVA showing significant differences among the Arousal, Neutral, and Relaxation levels for variables related to Dominant Power in the multiple regression analysis. Differences between pairs of emotion levels were verified by independent t-tests.

| Dependent Variable | Test | Statistic | pNN50 | Dominant Hz |
|---|---|---|---|---|
| Dominant Power | ANOVA | F | 3.095 | 3.013 |
| | | p | 0.045 | 0.049 |
| | T-test: Arousal-Neutral | t | 2.117 | 2.365 |
| | | p | 0.034 | 0.018 |
| | T-test: Neutral-Relaxation | t | 0.296 | −1.278 |
| | | p | 0.767 | 0.201 |
| | T-test: Arousal-Relaxation | t | −2.342 | −1.400 |
| | | p | 0.019 | 0.162 |
Table 17. Results of the one-way ANOVA showing significant differences among the Arousal, Neutral, and Relaxation levels for variables related to Dominant Hz in the multiple regression analysis. Differences between pairs of emotion levels were verified by independent t-tests.

| Dependent Variable | Test | Statistic | Transition Time | Total Distance | lnHF | Coherence Ratio |
|---|---|---|---|---|---|---|
| Dominant Hz | ANOVA | F | 6.946 | 3.524 | 3.559 | 3.252 |
| | | p | 0.001 | 0.030 | 0.029 | 0.039 |
| | T-test: Arousal-Neutral | t | 0.014 | 0.014 | 1.407 | 0.626 |
| | | p | 0.989 | 0.989 | 0.159 | 0.531 |
| | T-test: Neutral-Relaxation | t | −3.155 | −3.155 | 0.657 | 1.297 |
| | | p | 0.002 | 0.002 | 0.511 | 0.195 |
| | T-test: Arousal-Relaxation | t | 3.326 | 3.326 | −2.642 | −2.570 |
| | | p | 0.001 | 0.001 | 0.008 | 0.010 |
Table 18. Results of the one-way ANOVA showing significant differences across the Arousal, Neutral, and Relaxation levels for the variables correlated with Entropy in the multiple regression analysis. Differences between each pair of emotion levels were verified by independent t-tests.

| Dependent Variable | Test | Statistic | lnHF |
|---|---|---|---|
| Entropy | ANOVA | F | 3.538 |
| | | p | 0.029 |
| | T-test (Arousal-Neutral) | t | 1.900 |
| | | p | 0.058 |
| | T-test (Neutral-Relaxation) | t | −1.914 |
| | | p | 0.056 |
| | T-test (Arousal-Relaxation) | t | 1.069 |
| | | p | 0.285 |
Table 19. Results of the one-way ANOVA showing significant differences across the Arousal, Neutral, and Relaxation levels for the variables correlated with Circadian Movement in the multiple regression analysis. Differences between each pair of emotion levels were verified by independent t-tests.

| Dependent Variable | Test | Statistic | lnHF |
|---|---|---|---|
| Circadian Movement | ANOVA | F | 3.648 |
| | | p | 0.026 |
| | T-test (Arousal-Neutral) | t | −2.202 |
| | | p | 0.028 |
| | T-test (Neutral-Relaxation) | t | 1.621 |
| | | p | 0.105 |
| | T-test (Arousal-Relaxation) | t | 1.614 |
| | | p | 0.107 |
Table 20. Results of the one-way ANOVA showing significant differences across the Arousal, Neutral, and Relaxation levels for the variables correlated with Sound Amplitude in the multiple regression analysis. Differences between each pair of emotion levels were verified by independent t-tests.

| Dependent Variable | Test | Statistic | Sound Frequency |
|---|---|---|---|
| Sound Amplitude | ANOVA | F | 5.174 |
| | | p | 0.006 |
| | T-test (Arousal-Neutral) | t | −0.878 |
| | | p | 0.380 |
| | T-test (Neutral-Relaxation) | t | −1.571 |
| | | p | 0.116 |
| | T-test (Arousal-Relaxation) | t | 3.191 |
| | | p | 0.001 |
Table 21. Results of the one-way ANOVA showing significant differences across the Arousal, Neutral, and Relaxation levels for the variables correlated with Sound Frequency in the multiple regression analysis. Differences between each pair of emotion levels were verified by independent t-tests.

| Dependent Variable | Test | Statistic | Dominant Hz | Sound Amplitude |
|---|---|---|---|---|
| Sound Frequency | ANOVA | F | 3.314 | 6.380 |
| | | p | 0.036 | 0.002 |
| | T-test (Arousal-Neutral) | t | 0.359 | −1.420 |
| | | p | 0.720 | 0.156 |
| | T-test (Neutral-Relaxation) | t | −2.788 | −1.355 |
| | | p | 0.005 | 0.175 |
| | T-test (Arousal-Relaxation) | t | 2.141 | 3.574 |
| | | p | 0.032 | 0.000 |
Table 22. Results of the one-way ANOVA showing significant differences across the Positive, Neutral, and Negative levels for the variables correlated with VLF in the multiple regression analysis. Differences between each pair of emotion levels were verified by independent t-tests.

| Dependent Variable | Test | Statistic | VLF/HF Ratio | Dominant Hz |
|---|---|---|---|---|
| VLF | ANOVA | F | 3.238 | 4.067 |
| | | p | 0.039 | 0.017 |
| | T-test (Positive-Neutral) | t | −1.811 | 0.959 |
| | | p | 0.070 | 0.338 |
| | T-test (Neutral-Negative) | t | −0.099 | 1.409 |
| | | p | 0.921 | 0.159 |
| | T-test (Positive-Negative) | t | 1.953 | −2.698 |
| | | p | 0.051 | 0.007 |
Table 23. Results of the one-way ANOVA showing significant differences across the Positive, Neutral, and Negative levels for the variables correlated with VLF(%) in the multiple regression analysis. Differences between each pair of emotion levels were verified by independent t-tests.

| Dependent Variable | Test | Statistic | LF(%) | HF(%) | lnHF |
|---|---|---|---|---|---|
| VLF(%) | ANOVA | F | 3.359 | 4.107 | 4.281 |
| | | p | 0.035 | 0.017 | 0.014 |
| | T-test (Positive-Neutral) | t | 1.820 | 2.166 | 2.370 |
| | | p | 0.069 | 0.030 | 0.018 |
| | T-test (Neutral-Negative) | t | −2.329 | −2.366 | −2.083 |
| | | p | 0.020 | 0.018 | 0.037 |
| | T-test (Positive-Negative) | t | 1.094 | 0.913 | 0.744 |
| | | p | 0.274 | 0.361 | 0.457 |
Table 24. Results of the one-way ANOVA showing significant differences across the Positive, Neutral, and Negative levels for the variables correlated with LF(%) in the multiple regression analysis. Differences between each pair of emotion levels were verified by independent t-tests.

| Dependent Variable | Test | Statistic | pNN50 | VLF(%) | HF(%) | lnHF |
|---|---|---|---|---|---|---|
| LF(%) | ANOVA | F | 3.001 | 3.872 | 3.952 | 4.002 |
| | | p | 0.050 | 0.021 | 0.019 | 0.018 |
| | T-test (Positive-Neutral) | t | −1.509 | 2.394 | 2.145 | 2.291 |
| | | p | 0.131 | 0.017 | 0.032 | 0.022 |
| | T-test (Neutral-Negative) | t | −0.750 | −1.997 | −2.315 | −1.931 |
| | | p | 0.453 | 0.046 | 0.021 | 0.054 |
| | T-test (Positive-Negative) | t | 2.243 | 0.185 | 0.861 | 0.674 |
| | | p | 0.025 | 0.853 | 0.389 | 0.500 |
Table 25. Results of the one-way ANOVA showing significant differences across the Positive, Neutral, and Negative levels for the variables correlated with HF(%) in the multiple regression analysis. Differences between each pair of emotion levels were verified by independent t-tests.

| Dependent Variable | Test | Statistic | pNN50 | VLF(%) | LF(%) | lnHF |
|---|---|---|---|---|---|---|
| HF(%) | ANOVA | F | 3.367 | 3.938 | 3.438 | 4.057 |
| | | p | 0.035 | 0.020 | 0.032 | 0.017 |
| | T-test (Positive-Neutral) | t | −1.548 | 2.415 | 1.941 | 2.294 |
| | | p | 0.122 | 0.016 | 0.052 | 0.022 |
| | T-test (Neutral-Negative) | t | −0.857 | −2.014 | −2.244 | −1.946 |
| | | p | 0.392 | 0.044 | 0.025 | 0.052 |
| | T-test (Positive-Negative) | t | 2.387 | 0.204 | 0.953 | 0.739 |
| | | p | 0.017 | 0.839 | 0.341 | 0.460 |
Table 26. Results of the one-way ANOVA showing significant differences across the Positive, Neutral, and Negative levels for the variables correlated with the VLF/HF ratio in the multiple regression analysis. Differences between each pair of emotion levels were verified by independent t-tests.

| Dependent Variable | Test | Statistic | Sound Amplitude |
|---|---|---|---|
| VLF/HF ratio | ANOVA | F | 3.149 |
| | | p | 0.043 |
| | T-test (Positive-Neutral) | t | −2.552 |
| | | p | 0.011 |
| | T-test (Neutral-Negative) | t | 1.008 |
| | | p | 0.313 |
| | T-test (Positive-Negative) | t | 1.280 |
| | | p | 0.201 |
Table 27. Results of the one-way ANOVA showing significant differences across the Positive, Neutral, and Negative levels for the variables correlated with Peak Power in the multiple regression analysis. Differences between each pair of emotion levels were verified by independent t-tests.

| Dependent Variable | Test | Statistic | Total Distance |
|---|---|---|---|
| Peak Power | ANOVA | F | 3.186 |
| | | p | 0.041 |
| | T-test (Positive-Neutral) | t | 1.991 |
| | | p | 0.047 |
| | T-test (Neutral-Negative) | t | −1.810 |
| | | p | 0.070 |
| | T-test (Positive-Negative) | t | 0.858 |
| | | p | 0.391 |
Table 28. Results of the one-way ANOVA showing significant differences across the Positive, Neutral, and Negative levels for the variables correlated with Dominant Hz in the multiple regression analysis. Differences between each pair of emotion levels were verified by independent t-tests.

| Dependent Variable | Test | Statistic | Sound Frequency |
|---|---|---|---|
| Dominant Hz | ANOVA | F | 3.826 |
| | | p | 0.022 |
| | T-test (Positive-Neutral) | t | 1.784 |
| | | p | 0.075 |
| | T-test (Neutral-Negative) | t | −2.671 |
| | | p | 0.008 |
| | T-test (Positive-Negative) | t | 1.473 |
| | | p | 0.141 |
Table 29. Results of the one-way ANOVA showing significant differences across the Positive, Neutral, and Negative levels for the variables correlated with Total Power in the multiple regression analysis. Differences between each pair of emotion levels were verified by independent t-tests.

| Dependent Variable | Test | Statistic | Dominant Hz |
|---|---|---|---|
| Total Power | ANOVA | F | 3.305 |
| | | p | 0.037 |
| | T-test (Positive-Neutral) | t | −1.260 |
| | | p | 0.208 |
| | T-test (Neutral-Negative) | t | −1.055 |
| | | p | 0.291 |
| | T-test (Positive-Negative) | t | 2.473 |
| | | p | 0.013 |
Table 30. Results of the one-way ANOVA showing significant differences across the Positive, Neutral, and Negative levels for the variables correlated with Total Distance in the multiple regression analysis. Differences between each pair of emotion levels were verified by independent t-tests.

| Dependent Variable | Test | Statistic | Sound Amplitude |
|---|---|---|---|
| Total Distance | ANOVA | F | 4.284 |
| | | p | 0.014 |
| | T-test (Positive-Neutral) | t | −2.509 |
| | | p | 0.012 |
| | T-test (Neutral-Negative) | t | −0.280 |
| | | p | 0.779 |
| | T-test (Positive-Negative) | t | 2.473 |
| | | p | 0.013 |
Table 31. Results of the one-way ANOVA showing significant differences across the Positive, Neutral, and Negative levels for the variables correlated with Transition Time in the multiple regression analysis. Differences between each pair of emotion levels were verified by independent t-tests.

| Dependent Variable | Test | Statistic | Sound Amplitude |
|---|---|---|---|
| Transition Time | ANOVA | F | 3.660 |
| | | p | 0.026 |
| | T-test (Positive-Neutral) | t | 2.333 |
| | | p | 0.020 |
| | T-test (Neutral-Negative) | t | −0.769 |
| | | p | 0.442 |
| | T-test (Positive-Negative) | t | −1.852 |
| | | p | 0.064 |
Table 32. Results of the one-way ANOVA showing significant differences across the Positive, Neutral, and Negative levels for the variables correlated with Sound Amplitude in the multiple regression analysis. Differences between each pair of emotion levels were verified by independent t-tests.

| Dependent Variable | Test | Statistic | VLF/HF Ratio |
|---|---|---|---|
| Sound Amplitude | ANOVA | F | 6.852 |
| | | p | 0.001 |
| | T-test (Positive-Neutral) | t | −3.389 |
| | | p | 0.001 |
| | T-test (Neutral-Negative) | t | 2.918 |
| | | p | 0.004 |
| | T-test (Positive-Negative) | t | −0.374 |
| | | p | 0.708 |
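The statistical procedure behind Tables 15–32 can be sketched in code: a one-way ANOVA tests whether a variable differs across the three emotion levels, and pairwise independent t-tests then locate which pairs of levels differ. The sketch below is illustrative, not the authors' code; it uses only the Python standard library, and the sample values are made-up example numbers, not study data.

```python
# Illustrative sketch of the ANOVA + pairwise t-test procedure (assumed
# equal-variance, two-sided tests); group values below are hypothetical.
from statistics import mean

def one_way_anova_f(*groups):
    """F statistic for a one-way ANOVA over the given groups."""
    grand = mean(x for g in groups for x in g)
    n_total = sum(len(g) for g in groups)
    k = len(groups)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n_total - k))

def independent_t(a, b):
    """t statistic for a two-sample t-test with pooled variance."""
    var_a = sum((x - mean(a)) ** 2 for x in a) / (len(a) - 1)
    var_b = sum((x - mean(b)) ** 2 for x in b) / (len(b) - 1)
    pooled = ((len(a) - 1) * var_a + (len(b) - 1) * var_b) / (len(a) + len(b) - 2)
    return (mean(a) - mean(b)) / (pooled * (1 / len(a) + 1 / len(b))) ** 0.5

# Hypothetical "Dominant Hz" samples grouped by self-reported arousal level.
arousal = [0.11, 0.14, 0.13, 0.15, 0.12]
neutral = [0.10, 0.11, 0.12, 0.10, 0.11]
relaxation = [0.08, 0.09, 0.10, 0.09, 0.08]

print(f"ANOVA F = {one_way_anova_f(arousal, neutral, relaxation):.3f}")
for name, (a, b) in {"Arousal-Neutral": (arousal, neutral),
                     "Neutral-Relaxation": (neutral, relaxation),
                     "Arousal-Relaxation": (arousal, relaxation)}.items():
    print(f"{name}: t = {independent_t(a, b):.3f}")
```

In practice, the p-values reported in the tables would be obtained from the F and t distributions (e.g., via `scipy.stats.f_oneway` and `scipy.stats.ttest_ind`); the pure-Python version above only reproduces the test statistics.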

Share and Cite

MDPI and ACS Style

Cho, A.; Lee, H.; Jo, Y.; Whang, M. Embodied Emotion Recognition Based on Life-Logging. Sensors 2019, 19, 5308. https://doi.org/10.3390/s19235308
