Article

Influence of Multimodal Emotional Stimulations on Brain Activity: An Electroencephalographic Study

Department of Computer Science, Tokyo Institute of Technology, Yokohama 226-8502, Japan
* Author to whom correspondence should be addressed.
Sensors 2023, 23(10), 4801; https://doi.org/10.3390/s23104801
Submission received: 20 April 2023 / Revised: 5 May 2023 / Accepted: 12 May 2023 / Published: 16 May 2023
(This article belongs to the Section Biomedical Sensors)

Abstract

This study aimed to reveal the influence of emotional valence and sensory modality on neural activity in response to multimodal emotional stimuli using scalp EEG. Twenty healthy participants completed an emotional multimodal stimulation experiment with three stimulus modalities (audio, visual, and audio-visual), all derived from the same video sources and carrying one of two emotional components (pleasure or unpleasure); EEG data were collected under the resulting six experimental conditions and one resting state. We analyzed power spectral density (PSD) and event-related potential (ERP) components in response to the multimodal emotional stimuli for spectral and temporal analysis. The PSD results showed that single-modality (audio-only/visual-only) emotional stimulation differed from multimodality (audio-visual) stimulation over a wide range of brain regions and frequency bands, and that these differences arose from the change in modality rather than from the change in emotional degree. The most pronounced N200-to-P300 potential shifts occurred under single-modality rather than multimodal emotional stimulation. This study suggests that emotional saliency and sensory processing efficiency play a significant role in shaping neural activity during multimodal emotional stimulation, with the sensory modality being the more influential factor in PSD. These findings contribute to our understanding of the neural mechanisms involved in multimodal emotional stimulation.

1. Introduction

Multimodal emotional stimulation is widely used in modern society for evoking various emotional states in humans via external sensory inputs, including auditory, visual, olfactory, and tactile cues. These stimulations are prevalent in different media, such as movies, animation, and virtual reality, and they can be used in education, marketing, advertising, psychological counseling, and entertainment [1,2,3]. A study on the response of the human brain to these stimuli is critical for understanding the process of emotional multimodal stimulation. Electroencephalography (EEG) is commonly used to analyze and understand how emotional multimodal stimulation affects human brain activity. EEG measures the electrical activity of the brain by using electrodes placed on the scalp and recording the signals generated by neurons. By analyzing EEG data, researchers can better understand how these stimulations affect specific areas of the brain and elicit emotional responses in humans [4].
Brain–computer interfaces are promising for multimodal emotion recognition [5,6]. EEG signals can be used to recognize emotions from visual and audio-visual stimuli [7], and the fusion of EEG signals and audio-visual features can improve emotion recognition performance [8]. However, there is a lack of research on the process of multimodal emotional stimulation and how it affects human brain activity. Therefore, research in this area could be significant in improving the effectiveness of these stimulations. This research can assist in the development of new methods used for designing more effective emotional stimuli and for attaining a better understanding of how various multimodal stimulations impact different individuals.
Recent studies using EEG have investigated the neural correlates of emotional processing. Jiang et al. [9] found that alpha and beta oscillations in the prefrontal cortex are modulated by the valence and arousal of emotional stimuli during an affective priming task. In another study, Mu et al. [10] found that beta and gamma oscillations in the temporal and parietal regions were associated with the processing of social feedback during interpersonal trust games. Schelenz et al. [11] investigated the multisensory integration of dynamic emotional faces and voices and proposed a method for simultaneous EEG and fMRI measurements. Additionally, Li et al. [12] demonstrated that alpha oscillations in the prefrontal cortex are associated with cognitive reappraisal during an emotion regulation task, whereas Yu et al. [13] found that delta and theta oscillations in the anterior cingulate cortex and insula are modulated by the perceived intensity of emotional pain during a social exclusion task. The medial temporal pole was observed to be responsible for driving amygdala activity in response to emotional stimuli by Sonkusare et al. [14]. Ahirwal and Kose investigated correlated EEG channels, and the results were examined with respect to audio-visual emotion classification [15]. In the spectral domain, Balconi and Vanutelli investigated how vocal and visual stimulation, congruence, and lateralization affect brain oscillations in emotional interspecies interactions [16]. Temporal analyses were conducted by Balconi and Carrera [17], and they showed that the cross-modal integration of an emotional face and voice leads to the P2 ERP effect. These studies highlight the continued use of EEG in elucidating the neural mechanisms underlying emotional processing.
EEG signals have been used to investigate the neural mechanisms underlying sensory processing, including visual, auditory, and somatosensory processing. For instance, Gallotto et al. [18] investigated the neural correlates of visual perception using EEG and found that alpha and beta oscillations were modulated by attentional demands during visual processing. Craddock et al. [19] found that alpha oscillations in the somatosensory cortex were associated with the detection of tactile stimuli. Klasen et al. [20] found that emotional information processing in multimodal environments is highly integrated, with interactions and regulatory effects between different sensory channels. Additionally, Bugos et al. [21] used EEG to investigate the neural mechanisms underlying auditory processing and found that theta oscillations in the auditory cortex are modulated by musical features. Zimmer et al. [22] found that emotional cues presented in multiple sensory modalities can improve visual attention. Ross et al. [23] investigated the neural correlates of somatosensory processing using EEG and found that gamma oscillations in the somatosensory cortex are associated with the processing of vibrotactile stimulation. Brain functional area studies were utilized by Aricò et al. [24], revealing that EEG alpha power is associated with attentional changes during cognitive tasks and virtual reality immersion sensory stimulations. The effects of prolonged waking-auditory stimulation on EEG synchronization and cortical coherence were investigated by Cantero et al. [25]. Baumgartner et al. [26] found that different types of sensory stimuli evoke emotions in different ways, but the essential characteristics of emotional experience are similar. These studies demonstrate the utility of EEG in uncovering the role of oscillatory activity in sensory processing and its relationship with perception and attention.
However, previous EEG research has tended to focus on the influence of stimulation modality and of emotional stimuli separately. The effects of multimodal emotional stimulation on EEG have not been thoroughly studied, especially spatiotemporal analyses comparing single-stimulus-modality and multi-stimulus-modality conditions. Overall, much remains to be understood regarding the neural mechanisms underlying emotional and sensory processing.
Our study aimed to investigate the changes in human brain activity in response to multimodal emotional stimulation and to assess the effects of the emotional mode and sensory modality, individually and together, with respect to EEG. Movies are recognized as useful naturalistic stimuli for neuroimaging studies and have been utilized in previous work [27]. Accordingly, we selected short film clips as the source of multimodal (visual + auditory) emotional stimulation and recruited 20 participants. We assessed two emotions (pleasure/unpleasure) and three modes (audio/visual/audio-visual); thus, a total of six conditions were assessed by EEG and self-assessment data analyses. The subjective evaluation of the self-assessment metric (SAM) and the objective PSD and ERP parameters were analyzed in combination, and correlation calculations were performed to assess whether significant changes in these parameters were associated with emotional states and/or multimodality. The results of this study can be used to gain an in-depth understanding of the relevant changes in human brain activity in response to multimodal emotional stimuli, thus contributing to research and application in the areas of emotion regulation, psychotherapy, and EEG-based brain–computer interfaces (BCIs).
Highlights: This study is the first attempt to use film clips as a stimulus source for multimodal audio-visual stimulation, and it investigates the effect of three stimulation modalities on emotion regulation from the perspective of EEG. It is the first analysis of multimodal emotional stimulus EEG data in the frequency domain, with functional brain area and multi-band analyses, and in the time domain, with the N200 and P300 components, under both the single-stimulus-modality and multi-stimulus-modality conditions of multimodal emotional stimulation, indicating specific physiological significance.

2. Materials and Methods

2.1. Experimental Setup

2.1.1. Participant Information

Following previous EEG experiments on emotion mode and sensory modality [28,29,30], twenty individuals were invited to participate in this study: 11 men and 9 women, aged 24.7 ± 1.9 years. All participants were international students at the Tokyo Institute of Technology and native Chinese speakers; all were right-handed and had no history of mental illness or recent mental or physical trauma. All participants provided written informed consent to participate in this study. The study was conducted in accordance with the principles of the Declaration of Helsinki and was approved by the Ethics Committee of the Tokyo Institute of Technology.

2.1.2. Emotional Stimulus

Twenty emotional video clips were selected to elicit two emotions: pleasure or unpleasure (sadness). Each emotion was represented by 10 video clips, each lasting 30 s. Stimulus video clips were sourced from the New Standardized Emotional Film Database for Asian Culture [31]; detailed information on the selected video clips is given in Supplementary Table S1. We trimmed the original videos from the database to 30 s to meet the stimulus duration requirements of this study.

2.1.3. Experimental Protocol

During the experiment, subjects sat in a comfortable chair with their hands placed flat on a table. A 24-inch monitor with 1024 × 768 resolution was connected to a computer that sent control commands to play the experimental instructions and visual stimuli on the screen. Subjects wore AirPods Pro 2 in noise-canceling mode while audio stimuli were played. The distance between the participants and the monitor was approximately 90 cm, with their chins resting on a chinrest to prevent head movement. The experimental setup is shown in Figure 1.
Before the experiment, a questionnaire survey was conducted to ascertain the physical and mental condition of the subjects; its contents are provided in Supplementary Material S3 (Questionnaire before emotion EEG experiment). After the questionnaire, depending on the type of stimulus, there were three experimental conditions/modalities: audio stimulus, visual stimulus, and audio-visual stimulus. A resting condition was used as the control.
Audio stimulus modality: The film clip was played with a black screen, and only audio was provided.
Visual stimulus modality: The film clip was played while muted, and only visual stimulus was provided.
Audio-visual stimulus modality: The film clip was played with original audio and visual information.
Resting state: No audio or visual information was provided.
Each subject first completed a resting-state recording so that resting-state EEG could be measured. During the resting state, the subjects maintained the same sitting posture as in the stimulus tasks. The duration of the resting-state EEG recording was 300 s.
The experimental workflow for the three stimulus modalities of multimodal emotional stimulation is shown in Figure 2.
The multimodal emotional conditioning task consisted of 60 trials, each lasting approximately 2.5 min. At the beginning of each trial, a 5 s cue indicated the starting time, followed by an 8 s cross-fixation to direct the participant’s attention. The stimulus lasted 30 s and was pseudo-randomly presented with one of the two emotional stimuli (pleasure or unpleasure) in one of the three stimulus modalities; the pseudo-random protocol is described in Supplementary Material S2 (a minimal sketch of such a schedule is shown below). After the presentation of the stimulus, participants completed a self-assessment questionnaire to rate the type and degree of emotion experienced, which took approximately 20 s. Then, there was a 60 s rest without any visual or auditory stimulation. For each subject, the entire experiment, including the requisite rest during the emotional conditioning task and the resting-state EEG measurement, lasted approximately 2.5 h; a 5 min break was provided after every 20 trials to avoid fatigue effects.
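For illustration, a minimal sketch of how a 60-trial pseudo-random schedule of this kind could be generated is shown below. The exact randomization constraints used by the authors are given in Supplementary Material S2; the function and seed here are hypothetical, not the authors' code.

```python
import random

EMOTIONS = ["pleasure", "unpleasure"]
MODALITIES = ["audio", "visual", "audio-visual"]

def trial_schedule(seed: int = 0) -> list:
    """Build a 60-trial schedule: 10 trials per emotion x modality condition,
    shuffled into a pseudo-random order (illustrative only)."""
    trials = [(emotion, modality)
              for emotion in EMOTIONS
              for modality in MODALITIES
              for _ in range(10)]
    random.Random(seed).shuffle(trials)
    return trials
```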

2.2. Data Acquisition

2.2.1. EEG Data Acquisition

In this study, a 32-channel EEG system (Brain Products GmbH, Gilching, Germany) was used to collect EEG data, and the electrodes were as follows: Fp1, Fp2, F3, F4, F7, F8, FT9, FT10, FC5, FC6, T7, T8, CP5, CP6, TP9, TP10, P7, P8, FC1, FCz, FC2, C3, Cz, C4, CP1, CP2, P3, Pz, P4, O1, O2, and Oz. The positions of the channels on the scalp are shown in Figure 3.
Before the experiment, each subject wore a headset for EEG collection. A conductive gel was applied to the scalp to reduce the impedance between each electrode and the scalp to below 10 kΩ. Using BrainVision Recorder software, an amplifier was used to amplify the weak electrical signals from the brain, and the data acquisition system digitized and stored the data. We used FCz as the reference electrode, with a sampling frequency of 500 Hz. The data from these experiments are detailed in Table 1.
EEG data were divided into seven categories, one per experimental condition. Audio unpleasure, visual unpleasure, and audio-visual unpleasure EEG data were collected under the audio, visual, and audio-visual stimulus modalities, respectively, while film clips with unpleasure content were presented. Audio pleasure, visual pleasure, and audio-visual pleasure EEG data were collected under the same three modalities while film clips with pleasure content were presented. Resting-state EEG data were collected in the resting state.
The collected data comprised 20 (subjects) × 2 (emotional classes) × 3 (stimulus modalities) × 10 (trials) = 1200 trials, with 200 trials for each combination of emotion and stimulus modality. The duration of each trial was 30 s. In addition, 20 resting-state trials (1 per subject) were collected before the main experiment, each lasting 300 s.

2.2.2. Self-Assessment Data

In each trial of the multimodal emotional conditioning task, self-assessment data were collected post-stimulus. Following the self-assessment metric (SAM) [32], participants self-reported scores during each trial, as shown in Figure 4.
During the self-assessment in each trial, subjects reported the level of emotional arousal they experienced as an integer score from 0 to 10. If the participant judged the emotion to be pleasure, the score was recorded as is; if the emotion was unpleasure, the score was recorded as its negative. All readings were recorded by the researcher.
Self-assessment data were used as a subjective index to validate multimodal emotional stimulation, and the data were explored using EEG analysis.

2.3. Data Analysis

2.3.1. EEG Data Preprocessing

When preprocessing EEG data to enhance data quality, we utilized several methods to improve the accuracy and reliability of the data. Specifically, the following steps were taken:
Filtering: We applied a 50 Hz notch filter to remove power-line interference from the EEG data. Additionally, we re-referenced the EEG electrodes to the average potential to further enhance data quality. These steps helped reduce noise and artifacts in the EEG signals.
Artifact removal: We identified and removed EEG segments with amplitudes greater than 100 µV, which are commonly considered noise in EEG data. This step further reduced the impact of artifacts on the data.
ICA decomposition: We used independent component analysis (ICA) to identify and separate the independent sources of the EEG signals. This step helps identify components that are not related to the neural activity of interest, such as eye movements or muscle artifacts. The ICA results were then passed to ICLabel.
ICLabel: We used the ICLabel tool in EEGLAB [33] to classify the independent components obtained by ICA. ICLabel is a commonly used automatic classification tool that classifies independent components into different categories, including eye movement, muscle artifact, and neural activity. We used ICLabel to remove components corresponding to eye movements and muscle artifacts in order to improve the accuracy of our results.
By utilizing these methods, we were able to preprocess the EEG data and prepare it for further analysis. The combination of filtering, artifact removal, ICA decomposition, and ICLabel classification helped enhance data quality and reduce the impact of noise and artifacts on the data.
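As a rough illustration of this pipeline, a minimal sketch using the MNE library in Python (which the authors used for PSD analysis) together with the mne-icalabel port of ICLabel might look as follows. The authors performed ICLabel classification in EEGLAB, so this is an assumed equivalent, not their exact code, and the amplitude-based rejection of segments exceeding 100 µV is indicated only as a comment.

```python
import mne
from mne.preprocessing import ICA
from mne_icalabel import label_components  # Python port of EEGLAB's ICLabel

def preprocess(raw: mne.io.Raw) -> mne.io.Raw:
    """Notch filter, average re-reference, and ICA/ICLabel artifact removal."""
    raw = raw.copy().load_data()
    raw.notch_filter(freqs=50.0)        # remove 50 Hz power-line interference
    raw.set_eeg_reference("average")    # re-reference to the average potential
    # (Amplitude-based rejection of segments exceeding 100 µV would precede ICA;
    #  ICLabel also expects average-referenced, band-passed data.)
    ica = ICA(method="infomax", fit_params=dict(extended=True), random_state=0)
    ica.fit(raw)
    labels = label_components(raw, ica, method="iclabel")["labels"]
    ica.exclude = [idx for idx, label in enumerate(labels)
                   if label in ("eye blink", "muscle artifact")]
    return ica.apply(raw)  # remove eye/muscle components, keep neural activity
```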

2.3.2. Power Spectral Density (PSD) Analysis of EEG

For the analysis of EEG data after preprocessing, we first conducted frequency domain analysis.
In frequency domain analysis, the main index used was PSD, which results from the spectral analysis of the power of EEG signals. Different frequency bands of EEG can be obtained under the above experimental conditions, along with the distribution of PSD across regions; thus, the significance of brain activity under different experimental conditions can be specifically explained [34]. Moreover, comparisons between conditions can be conducted readily using PSD. Before explaining the PSD calculation method adopted in this study, we specify the strategy for the PSD analysis. As this experiment pertained to emotion and stimulation modalities, we used frequency band and functional brain area analyses together. We aimed to reveal the activity in each frequency range and each functional brain area corresponding to the EEG obtained under the different experimental conditions.

EEG Spectral Analysis

EEG spectral band research is a well-recognized analysis method in frequency domain studies and has significance in the study of emotion and sensation. In this method, EEG signals are divided into various frequency bands [35].
Commonly used sub-spectral bands include delta: 1–4 Hz; theta: 4–8 Hz; alpha: 8–14 Hz; beta: 14–30 Hz; and gamma: 30–100 Hz [36]. In this EEG study with emotional and multimodal (visual, auditory, and audio-visual) stimulations, we selected three bands—delta, theta, and alpha—and used a bandpass filter to obtain sub-spectrum bands.
Increases in delta and theta waves are associated with negative emotions, such as unpleasure, while increases in alpha waves are associated with positive emotions, such as pleasure [37]. In addition, different stimulation modalities can elicit different EEG patterns that can be used to differentiate stimulus types, such as visual, auditory, or multimodal stimuli. These frequency bands are particularly relevant in the analysis of emotion and stimulus patterns as they provide valuable insights into the physiological changes that occur in response to different emotional and stimulus conditions [38]. By analyzing these specific frequency bands, we can better understand the neural mechanisms underlying emotional processing, such as unpleasure and pleasure.
For example, we can compare pleasure and unpleasure emotional stimuli by comparing the intensity changes in different frequency bands. In addition, we compared the effects of visual, auditory, and audio-visual stimuli on different frequency bands, thereby assessing the effects of different stimulation modalities on emotional and sensory responses.
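As a simple illustration, the band decomposition described above can be obtained by band-pass filtering one copy of the data per band. The sketch below uses MNE with the band edges defined above; the function name and array shapes are our assumptions, not the authors' code.

```python
import numpy as np
import mne

BANDS = {"delta": (1.0, 4.0), "theta": (4.0, 8.0), "alpha": (8.0, 14.0)}  # Hz

def split_into_bands(data: np.ndarray, sfreq: float = 500.0) -> dict:
    """Return one band-passed copy (channels x samples) per frequency band."""
    data = data.astype(np.float64)  # MNE's filtering routines expect float64
    return {band: mne.filter.filter_data(data, sfreq, l_freq=lo, h_freq=hi)
            for band, (lo, hi) in BANDS.items()}
```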

EEG Functional Brain Mapping

The study of divided functional brain areas is important in the field of brain science [39]. Functional brain division refers to the subdivision of the entire brain into smaller regions as per their contribution to different functions. In this study, five functional areas were defined as corresponding to the EEG channels utilized in the experiment: prefrontal, temporal, central, parietal, and occipital [40]. With the exception of the original reference channel FCz, the remaining 31 channels were allocated as per Figure 5.
The prefrontal cortex (Fp1, Fp2, F3, and F4) is mainly responsible for thought, cognition, and behavioral control.
The temporal cortex (F7, F8, FT9, FT10, FC5, FC6, T7, T8, CP5, CP6, TP9, TP10, P7, and P8) is primarily responsible for language processing, auditory cognition, and memory processing.
The central cortex (FC1, FC2, C3, Cz, and C4) is primarily responsible for functions, such as perception, sensation, and sensory fusion.
The parietal cortex (CP1, CP2, P3, Pz, and P4) is primarily responsible for body perception, spatial orientation, and hand–eye coordination.
The occipital cortex (O1, O2, and Oz) is primarily responsible for visual processing and cognition.
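For bookkeeping in analysis code, this grouping translates directly into a lookup table. The dictionary below lists the channel assignments exactly as defined above; the dictionary itself is our illustration, not the authors' code.

```python
# Functional brain areas and their EEG channels, as defined above.
# FCz is excluded because it served as the original reference electrode.
REGIONS = {
    "prefrontal": ["Fp1", "Fp2", "F3", "F4"],
    "temporal": ["F7", "F8", "FT9", "FT10", "FC5", "FC6", "T7", "T8",
                 "CP5", "CP6", "TP9", "TP10", "P7", "P8"],
    "central": ["FC1", "FC2", "C3", "Cz", "C4"],
    "parietal": ["CP1", "CP2", "P3", "Pz", "P4"],
    "occipital": ["O1", "O2", "Oz"],
}
```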
For each trial, each channel’s EEG was band-pass filtered, and its PSD was calculated using the Welch method. The following process was performed using the MNE library in Python [41,42].
Segmentation: We divided the preprocessed EEG signals into windows using the default settings of the library’s Welch PSD routine, which include overlapping windows. Adjacent windows overlapped by a fixed percentage of their length, typically 50%, which reduces the variance of the PSD estimate.
PSD estimation: We used a Welch-method PSD function to estimate the PSD of each window of each EEG channel. The default settings for this function included overlapping windows, a Hann window, and a window size of 250 samples.
Denoting a one-channel EEG from one trial as vector X, the PSD can be determined as follows:
$$P_{\mathrm{psd}}(\omega) = \frac{1}{n}\left|\sum_{m=0}^{n-1} X_m \cdot e^{-j\omega m}\right|^2$$
where X is a vector with n elements, as indicated by the recordings from one channel.
Average PSD: The Welch PSD function returns a PSD estimate for each channel, averaged across all windows. The average PSD was then calculated from these estimates for each channel. Using the process shown in Figure 6, we obtained the PSD for each channel in each frequency band under the different experimental conditions.
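Putting these steps together, a minimal sketch of the per-region, per-band average PSD using MNE’s Welch routine might look like the following. The window size of 250 samples, 50% overlap, and Hann window follow the settings above; the function name and the decibel conversion step are our assumptions.

```python
import numpy as np
from mne.time_frequency import psd_array_welch

def average_band_psd(data: np.ndarray, sfreq: float = 500.0,
                     fmin: float = 8.0, fmax: float = 14.0) -> float:
    """Average Welch PSD in dB for one trial's region data (channels x samples)."""
    psds, freqs = psd_array_welch(data, sfreq, fmin=fmin, fmax=fmax,
                                  n_per_seg=250, n_overlap=125,  # 50% overlap
                                  window="hann")
    psd_db = 10.0 * np.log10(psds)  # convert power to decibels
    # Window averaging is done internally; average over channels and frequencies.
    return float(psd_db.mean())
```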

2.3.3. Event-Related Potential Analysis for Temporal EEG

For this study, we selected six channels (Cz, C3, C4, P3, Pz, and P4) for the analysis of event-related potentials (ERPs). ERPs are time-locked electrical responses of the brain that occur in response to specific events, such as the presentation of a visual or auditory stimulus. The N200 and P300 ERP components are well-established indicators of cognitive processing [43].
The N200 component occurs approximately 200 ms after a stimulus and is typically associated with early sensory processing. It reflects the brain’s response to the physical features of a stimulus, such as brightness or pitch. In contrast, P300 occurs approximately 300 ms after the stimulus and reflects the allocation of attention and semantic processing. It is often used to measure cognitive processing related to decision making, working memory, and attention [44].
In this study, we focused on the N200 and P300 components and analyzed the six channels listed above. We aimed to investigate the differences in ERP responses between conditions and to determine whether there were any significant differences in cognitive processing [45]. The process for the temporal EEG analyses is shown in Figure 7.

3. Results and Discussion

3.1. Self-Assessment

First, we considered the subjective self-assessment of the emotions induced by multimodal emotional stimulation. Each of the 20 subjects experienced 60 trials of stimulation across 6 conditions: 2 emotions (pleasure/unpleasure) × 3 modalities (audio/visual/audio-visual). The self-assessment results for each subject under the experimental conditions are shown in Table 2.
By calculating the self-assessments of all subjects for the six experimental conditions, we obtained the average self-assessment score [46,47] for each experimental condition shown in Table 2.
We refer to pleasure as PL and unpleasure as UNPL. The self-assessment results of all subjects in the six conditions were audio pleasure = 6.50 ± 2.40; visual pleasure = 6.17 ± 2.51; audio-visual pleasure = 7.23 ± 1.91; audio unpleasure = −6.17 ± 2.35; visual unpleasure = −5.91 ± 2.30; and audio-visual unpleasure = −7.00 ± 2.02, respectively. Using one-way ANOVA [48], we first validated the significance of the self-evaluation across the six conditions, where the basic elements of each condition were the individual subjects; that is, we used 20 elements for each of the 6 sets of data. The obtained p-value was 9.43 × 10⁻⁷⁸ < 0.1, showing significant differences among the six conditions across the 20 subjects.
Next, we compared pairs of conditions within the six conditions. The p-values from the one-way ANOVA were as follows: for audio pleasure and visual pleasure, p = 0.48 > 0.1, indicating no significant difference; for audio unpleasure and visual unpleasure, p = 0.61 > 0.1, indicating no significant difference; for audio pleasure and audio-visual pleasure, p = 0.06 < 0.1, indicating a significant difference; for visual pleasure and audio-visual pleasure, p = 0.02 < 0.1, indicating a significant difference; for audio unpleasure and audio-visual unpleasure, p = 0.01 < 0.1, indicating a significant difference; and for visual unpleasure and audio-visual unpleasure, p = 0.02 < 0.1, indicating a significant difference.
According to the results of one-way ANOVA, we concluded that the audio and visual conditions have no significant difference (p > 0.1) for the two emotional states of pleasure and unpleasure. However, the audio-visual condition was significantly different (p < 0.1) from audio- and visual-only conditions.
The results also suggest that a combination of multiple sensory modalities (audio-visual modality) can have a greater impact on mental states than stimulation from a single sensory modality (audio only or visual only) in view of subjective assessments.
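For reference, the omnibus and pairwise tests reported above can be reproduced with SciPy’s one-way ANOVA. In the sketch below, the `scores` dict is a hypothetical placeholder: it simulates 20 per-subject scores per condition from the reported means and standard deviations, and would be replaced with the real SAM data.

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(0)
# Hypothetical per-subject mean SAM scores (20 per condition), simulated from
# the reported means/SDs; replace with the real self-assessment data.
scores = {
    "audio_PL": rng.normal(6.50, 2.40, 20),
    "visual_PL": rng.normal(6.17, 2.51, 20),
    "audio_visual_PL": rng.normal(7.23, 1.91, 20),
    "audio_UNPL": rng.normal(-6.17, 2.35, 20),
    "visual_UNPL": rng.normal(-5.91, 2.30, 20),
    "audio_visual_UNPL": rng.normal(-7.00, 2.02, 20),
}
f_all, p_all = f_oneway(*scores.values())  # omnibus test across the 6 conditions
# Pairwise comparison, e.g., audio pleasure vs. audio-visual pleasure
f_pair, p_pair = f_oneway(scores["audio_PL"], scores["audio_visual_PL"])
print(f"omnibus p = {p_all:.3g}, audio PL vs. audio-visual PL p = {p_pair:.3g}")
```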

3.2. Comparison of PSD between Resting EEG and Multimodal Stimulation EEG

To explore the neurological effects of multimodal emotional stimulation in the six conditions, we first compared the PSD between the stimulated and pre-stimulated phases. Taking the start of stimulation as 0 s, we used the 0–30 s EEG (30 s) for the stimulated phase and the −5–0 s EEG (5 s) for the pre-stimulated phase. The stimulus and pre-stimulus EEG were extracted by epoching the original EEG data, and the changes before and after stimulation were compared. The extracted stimulus and pre-stimulus EEG datasets were preprocessed, and the PSD of each band and functional brain area was calculated to obtain the pre-stimulated and stimulated PSD in decibels. We then calculated the percentage change in PSD for each frequency band, experimental stimulus condition, and functional area using the following formula:
$$\text{Change of percentage}_{PSD} = \frac{PSD_{\text{stimulated}} - PSD_{\text{pre-stimulated}}}{\left|PSD_{\text{pre-stimulated}}\right|} \times 100\%$$
where the absolute value of $PSD_{\text{pre-stimulated}}$ is used in the denominator because PSD values in decibels can be negative. For the data of the 20 subjects, we calculated their respective results and used a paired t-test to assess statistical significance. We considered results with a p-value < 0.1 as significantly different and denoted 0.05 < p < 0.1 as *, 0.01 < p < 0.05 as **, and p < 0.01 as ***. The results are shown in Table 3 and Figure 8.
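The percentage change and the paired t-test over the 20 subjects can be expressed compactly; a minimal sketch follows, assuming per-subject PSD values in dB as inputs (the function name is ours).

```python
import numpy as np
from scipy.stats import ttest_rel

def psd_change_and_test(stim_psd: np.ndarray, pre_psd: np.ndarray):
    """stim_psd, pre_psd: per-subject PSD values in dB, shape (20,).

    Returns the per-subject percentage change (absolute value in the
    denominator, since dB values can be negative) and the paired-t p-value."""
    change = (stim_psd - pre_psd) / np.abs(pre_psd) * 100.0
    _, p_value = ttest_rel(stim_psd, pre_psd)  # paired test over the 20 subjects
    return change, p_value
```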
There was a statistically significant difference between the stimulated and pre-stimulated PSD only in the alpha band, and none in the delta and theta bands. Therefore, we chose only the alpha band and drew the topography of the percentage change in PSD. The six conditions were as follows.
The occipital PSD increased by 2.177% under auditory stimulation with unpleasure emotion. Temporal, parietal, and occipital PSDs decreased by 1.814%, 2.449%, and 2.845%, respectively, under visual stimulation with unpleasure emotion. The PSDs in the temporal, parietal, and occipital lobes decreased by 1.514%, 1.867%, and 2.176%, respectively, under audio-visual stimulation with unpleasure emotion.
Temporal and occipital PSDs increased by 1.465% and 2.696%, respectively, under auditory stimulation with pleasure emotion. Temporal, parietal, and occipital PSDs decreased by 1.347%, 1.759%, and 1.876%, respectively, under visual stimulation with pleasure emotion. Under audio-visual stimulation with pleasure emotion, temporal, parietal, and occipital PSDs increased by 0.359%, 0.473%, and 0.07%, respectively.
These results suggest that changes in the PSD in the brain are influenced by sensory and emotional factors and that the effects of these factors vary by brain region and specific sensory and emotional conditions.
In the comparison between the pre-stimulated and stimulated conditions, only the alpha band showed a statistically significant difference among the selected bands. This indicates that different frequency bands in the EEG are related to different brain processes. Alpha is associated with attention and relaxation, and changes in alpha activity are associated with different cognitive and emotional states [49]. Thus, PSD changes in the alpha band may be more sensitive to sensory and emotional conditions than those in the other bands, which would explain these observations. The results show that changes in the PSD of the brain under different sensory and emotional conditions are complex and multifaceted.
In terms of sensory conditions, the results suggest that auditory stimuli have different effects on the brain than visual stimuli. For example, PSD in the temporal and occipital regions increased in response to auditory stimulation, whereas PSD in the temporal, parietal, and occipital regions decreased in response to visual stimulation. In response to audio-visual stimuli, the PSD increased in the temporal regions but decreased in the parietal and occipital regions. These results suggest that different sensory inputs have different effects on brain function and that the combination of auditory and visual stimuli can produce unique patterns of PSD changes.
In terms of emotional conditions, the results suggest that unpleasure and pleasure emotions affect the brain differently. Under the unpleasure condition, the PSD was reduced, with the greatest reduction in the occipital regions. In contrast, in the pleasure condition, the PSD increased, with the greatest increase in the occipital region. These results suggest that emotions have profound effects on brain function and that different emotions can produce different patterns of PSD changes.

3.3. Emotion-Related PSD Analysis

To explore the role of the emotional component of the stimulus in this experiment, we performed emotion-related PSD analyses. For each stimulation modality, we compared pleasure against unpleasure, giving three pairs of comparisons: Audio Pleasure vs. Audio Unpleasure, Visual Pleasure vs. Visual Unpleasure, and Audio-Visual Pleasure vs. Audio-Visual Unpleasure. Taking the start of stimulation as 0 s, we used the 0–30 s EEG in the stimulated phase for both the pleasure and unpleasure conditions. The PSD of each functional area was calculated for the delta, theta, and alpha bands. The percentage change was used to measure the change in the pleasure state relative to the unpleasure state, calculated as follows:
$$\text{Change of percentage}_{PSD} = \frac{PSD_{\text{pleasure stimulated}} - PSD_{\text{unpleasure stimulated}}}{\left|PSD_{\text{unpleasure stimulated}}\right|} \times 100\%$$
Since the calculated $PSD_{\text{unpleasure stimulated}}$ had a negative value, we used its absolute value as a correction. We calculated the respective results for the data from the 20 subjects and used the t-test for statistical significance. Results with a p-value < 0.1 were considered statistically significant, and we denoted 0.05 < p < 0.1 as *, 0.01 < p < 0.05 as **, and p < 0.01 as ***. The calculated results are shown in Table 4.
As there were statistically significant differences in the range of the three bands, we drew the topography of the percentage change in the three bands, as shown in Figure 9.
We observed that the results for the delta band differed markedly from those for theta and alpha. In the delta band, in the parietal and occipital regions, the pleasure PSD was greater than the unpleasure PSD. However, in the theta and alpha bands, in the parietal and occipital regions, the pleasure PSD was lower than the unpleasure PSD. The influence of each sensory stimulation modality differed among the three bands. The difference between the theta and alpha patterns is evident from the topography diagram. Although the changing trend of theta is approximately the same as that of alpha, the degree of change in alpha is greater; that is, alpha is more sensitive to changes in emotion.
Combining the results of the t-test and functional brain area calculations, we can observe the following.
Delta band: The PSD of the delta wave changed significantly under the audio and audio-visual modalities; however, no similar change was observed under the visual modality. In the audio modality, the PSD of delta waves did not change significantly in the frontal and central regions but was slightly reduced by 0.171% in other regions. These results suggest that delta waves may be involved in emotional processing in certain contexts.
Theta band: The PSD of the theta waves changed significantly in all audio, visual, and audio-visual modalities. In the audio modality, the PSD increased in the temporal, central, and parietal regions (0.148%, 0.174%, and 0.221%, respectively) and decreased in the occipital region (−0.234%). Under visual conditions, the PSD increased in the temporal and parietal regions (0.134% and 0.238%, respectively) but did not change significantly in other regions. In the audio-visual modality, the PSD increased in the central and temporal regions (0.123% and 0.104%, respectively). These results suggest that theta waves are sensitive to emotional states.
Alpha band: The PSD of the alpha band changed significantly in all audio, visual, and audio-visual modalities. In the audio modality, the PSD increased in the central and parietal regions (0.362% and 0.699%, respectively) but did not change significantly in the temporal regions. In the visual mode, the PSD increased (0.269%) in the occipital region and decreased (−0.104%) in the temporal region, with no significant changes in other regions. In the audio-visual condition, the PSD increased in the central and occipital regions (0.0492% and 0.034%, respectively) and decreased in the temporal and parietal regions (−0.031% and −0.130%, respectively).
The results indicated that the change in emotional state had a significant impact on the change in theta activity. The PSD in the central and parietal regions significantly increased under the unpleasure condition, while the theta activity in the temporal and parietal regions decreased under the pleasure condition. This suggests that changes in the theta activity, a hallmark of emotional processing in the brain, may be more complex than previously thought. The specific regions affected by changes in theta activity may vary according to the emotional state.
Furthermore, the results indicated that the change in emotional state also had a significant impact on the change in alpha activity. The PSD of the occipital region significantly increased with respect to the pleasure emotion, whereas the alpha activity of the temporal region significantly decreased with respect to the unpleasure emotion. These results suggest that alpha activities may be involved in emotional processing, particularly in the occipital region, which is thought to be involved in visual processing. The specific regions affected by changes in alpha activity may depend on emotional state and sensory input [50].
Findings in the delta band suggest that changes in the emotional state may be less sensitive to changes in the delta activity than in other bands. The PSD in the frontal regions was significantly lower in the pleasure emotion, suggesting that changes in delta activity may be related to emotional processing within this brain region [51]. The specific regions affected by changes in delta activity may vary according to the emotional state and sensory input.
Thus, regarding the effect of emotion on PSD, the analysis found that the emotional state of the participants had a significant impact on PSD, with all three frequency bands (delta, theta, and alpha) showing emotion-dependent changes in activity. Emotion perception shows serial dependence within and between sensory modalities [52]. This suggests that emotions can modulate neural activity in specific brain regions and frequency bands, which may have implications for understanding emotional processing and regulation.
This section investigated the changes in PSD in response to emotional stimuli across different frequency bands, focusing on the emotional states of pleasure and unpleasure. We explored the changes in PSD across different sensory inputs and functional brain regions to better understand the complex interplay between emotional states and brain functions. The findings suggest that different frequency bands exhibit different patterns of change in response to emotional stimuli and that the specific affected regions also vary relative to the emotional state and sensory input.

3.4. Stimulus-Modality-Related PSD Analysis

Next, we focused on the influence of the stimulus modality on PSD in the frequency domain. We used multimodality stimuli (audio-visual) and single-modality stimuli (audio and visual) in our experiment. That is, we wanted to compare the changes in PSD between the multimodality stimulus and the two single-modality stimuli. Therefore, we performed comparisons in four categories: audio-visual unpleasure vs. audio unpleasure (AV-A-UNPL), audio-visual pleasure vs. audio pleasure (AV-A-PL), audio-visual unpleasure vs. visual unpleasure (AV-V-UNPL), and audio-visual pleasure vs. visual pleasure (AV-V-PL). Here, we used the percentage change to evaluate the change in PSD as well; that is, we evaluated the change in PSD after adding additional stimuli (multimodal) compared with a single modality, which is calculated as follows.
$$\text{Change of percentage}_{PSD} = \frac{PSD_{\text{multi-stimulated}} - PSD_{\text{uni-stimulated}}}{\left|PSD_{\text{uni-stimulated}}\right|} \times 100\%$$
Since the calculated $PSD_{\text{uni-stimulated}}$ had a negative value, we used its absolute value as a correction. For the data of the 20 subjects, we calculated their respective results and used the t-test to assess statistical significance. Results with a p-value < 0.1 were considered statistically significant, and we denoted 0.05 < p < 0.1 as *, 0.01 < p < 0.05 as **, and p < 0.01 as ***. The calculated results are shown in Table 5.
As there were statistically significant differences in the range of the three bands, we drew the topography of the percentage change in the three bands, as shown in Figure 10.
Although the apparent difference in the four comparisons is the additional stimulus, there are also some differences between the single-modal and multimodal emotional conditions. We aimed to understand whether these differences in emotion caused the significant differences in PSD. Therefore, we used Pearson correlation analysis to test whether the change in emotion was related to the PSD change attributed to single- or multimodal stimulation. We used the Pearson correlation method, together with a t-test (one-way ANOVA), to calculate p-values and evaluate correlations in all cases [53]. The results are presented in Figure 11.
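As a sketch of this step, SciPy’s Pearson correlation can be applied to the per-subject SAM changes and PSD changes. Both arrays below are hypothetical placeholders standing in for the real per-subject values.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
sam_change = rng.normal(0.0, 1.0, 20)  # hypothetical per-subject SAM-score changes
psd_change = rng.normal(0.0, 1.0, 20)  # matching per-subject PSD percentage changes
r, p = pearsonr(sam_change, psd_change)  # does the emotion change track the PSD change?
print(f"Pearson r = {r:.3f}, p = {p:.3g}")
```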
The SAM-PSD correlation analysis revealed that the addition of visual to audio stimuli increased brain activity in the frontal and parietal regions, whereas the addition of audio to visual stimuli increased brain activity in the temporal and occipital regions. These findings suggest that the saliency of the emotional cue and the processing efficiency of the sensory modality perform important roles in shaping neural activity patterns. Moreover, the Pearson correlation analysis revealed that changes in emotions were not the direct cause of the significant differences in PSD, indicating that the differences in PSD are likely due to differences in the stimuli themselves.
From these results, we concluded that only two comparisons showed statistically significant correlations, and these two comparisons did not show significant differences in the PSD t-test. Therefore, in the comparison of stimulation modalities, the difference in PSD comes from the difference in the stimulus itself and not directly from a change in emotion. We combined the previous results to arrive at the following analysis.
Delta (1–4 Hz):
In the unpleasure state, adding visual to audio stimuli led to a statistically significant decrease in delta activity across all brain regions, ranging from 0.8126% to 1.2049%. Adding audio to visual stimuli led to a statistically significant increase in delta activity in the central, parietal, and occipital regions, ranging from 0.5585% to 0.6931%. In the pleasure state, adding visual to audio stimuli led to a statistically significant decrease in activity in the central, parietal, and occipital regions, ranging from 1.3092% to 1.5284%. Adding audio to visual stimuli led to a statistically significant increase in activity in the frontal, temporal, and occipital regions, ranging from 0.3061% to 0.5248%.
Theta (4–8 Hz):
In the unpleasure state, adding visual to audio stimuli led to a statistically significant increase in theta activity in the frontal and parietal regions, ranging from 0.1514% to 0.7576%, with a p-value below 0.05 in the occipital region. Adding audio to visual stimuli led to a statistically significant increase in activity in all regions, ranging from 0.2996% to 0.8381%. In the pleasure state, adding visual to audio stimuli led to a statistically significant increase in activity in the frontal and parietal regions, ranging from 0.2488% to 0.7675%. Adding audio to visual stimuli led to a statistically significant increase in activity in the temporal and occipital regions, ranging from 0.3063% to 0.4241%.
Alpha (8–14 Hz):
In the unpleasure state, adding visual to audio stimuli led to a statistically significant increase in alpha activity in the frontal, parietal, and occipital regions, ranging from 0.3093% to 4.4406%. Adding audio to visual stimuli led to a statistically significant increase in activity in all regions, ranging from 0.1413% to 0.2344%. In the pleasure state, adding visual to audio stimuli led to a statistically significant increase in activity in the frontal and parietal regions, ranging from 0.2535% to 5.0590%. Adding audio to visual stimuli led to a statistically significant increase in activity in the temporal and occipital regions, ranging from 0.0259% to 0.4241%.
Therefore, the three bands showed significant changes in activity when audio and visual stimuli were combined compared to when they were presented separately. In addition, all three bands showed changes in activity that were dependent on the emotional state of participants.
However, some differences were observed between bands. Delta activity consistently decreased when visual stimuli were added to audio stimuli, across all regions and emotional states. Theta activity significantly increased in the frontal and parietal regions when audio stimuli were combined with visual stimuli, regardless of the emotional state. Alpha activity showed more complex changes, with different regions and emotional states exhibiting different patterns. In summary, these results suggest that the addition of different sensory inputs can modulate neural activity in specific brain regions and that the pattern of changes varies across frequency bands and brain regions. Overall, we compared three pairs for emotional changes and four pairs for stimulus modality changes and identified all statistically significant percentage changes in the PSDs. The results corresponding to each band for each condition are summarized in Figure 12.
Figure 12 compares the PSD changes of the three emotion-related comparisons (left) with those of the four stimulus-modality-related comparisons (right). The significant changes in PSD with regard to stimulus modality (i.e., audio or visual) were larger than those due to the emotional state. It can be concluded that in multimodal emotional stimulation, the effect of a change in the stimulation modality itself is significantly greater than that of a change in emotion.
Therefore, the effect of stimulus modality on PSD can be discussed as follows: the analysis found that the addition of a sensory input (audio or visual) modulates neural activity in specific brain regions and frequency bands. Although emotional perception via different modalities shares common neural mechanisms [54], the pattern of changes in neural activity in response to different sensory inputs varies across frequency bands and brain regions. In particular, the influence of the visual stimulus was greater than that of the audio stimulus, especially in the delta and alpha bands, with the difference being significantly greater than that of any other group. The emotional components of the stimuli also influence the EEG spectral domain and differ in response to positive and negative emotional stimuli [55]. The pleasure conditions led to a decrease in delta band PSD and an increase in PSD for both the theta and alpha bands. In contrast, the unpleasure conditions led to an increase in delta band PSD and a decrease in PSD in both the theta and alpha bands.
This finding that the visual stimulus has a greater influence on PSD than the auditory stimulus, particularly in the delta and alpha frequency bands, can be attributed to several factors. First, the visual system has a larger and more direct pathway to the brain than the auditory system, which involves more processing and integration of information from different sources. This direct pathway allows faster and more efficient transmission of visual information, leading to stronger and more immediate effects on brain oscillations. Furthermore, visual stimuli tend to be more salient and attention-grabbing than auditory stimuli, as they provide more detailed and complex information about the environment. This may lead to greater activation of attention and cognitive processing systems, resulting in stronger and more widespread effects on brain oscillations. It is also possible that the differences observed in this study were due to the specific characteristics of the visual and auditory stimuli used. For example, the visual stimuli may have been more engaging or appealing to the participants, leading to stronger effects on PSD. Similarly, the auditory stimuli may have been less salient or more difficult to discriminate, resulting in weaker effects on PSD [56].
Moreover, incorporating the results from Section 3.3, we found some differences between single-modal and multimodal emotional conditions; however, these emotional differences were not the cause of the significant differences in PSD, which instead resulted from differences in the stimulus modality itself. Therefore, it can be concluded that in multimodal emotional stimulation, it is often the stimulation modality itself, rather than the emotional state, that is responsible for changes in the PSD.

3.5. Temporal ERP Analysis

To obtain the ERP results, we used the workflow described in the Materials and Methods section. First, we preprocessed the raw data, selected the Cz, C3, C4, P3, Pz, and P4 channels, applied an FIR band-pass filter (1–40 Hz), and averaged the data from the six channels to obtain the temporal EEG for each condition. We took the stimulus start time as 0 s and epoched the data time-locked from −500 ms to 500 ms. The processed data were used for the N200 and P300 calculations, as shown in Figure 13.
For the calculation of N200, we chose an average potential range of 180–220 ms, and we chose an average potential range of 280–320 ms for P300. Thus, the six conditions for N200 and P300 were obtained as per Table 6 and visualized in Figure 14.
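A minimal sketch of this ERP pipeline with MNE might look like the following. The event array and epoch bookkeeping are assumptions on our part; the channel list, filter band, epoch window, and the 180–220 ms and 280–320 ms averaging windows follow the description above.

```python
import numpy as np
import mne

CHANNELS = ["Cz", "C3", "C4", "P3", "Pz", "P4"]

def n200_p300(raw: mne.io.Raw, events: np.ndarray, event_id: dict):
    """Mean N200 (180-220 ms) and P300 (280-320 ms) amplitudes in µV."""
    raw = raw.copy().load_data().filter(1.0, 40.0, fir_design="firwin")
    epochs = mne.Epochs(raw, events, event_id, tmin=-0.5, tmax=0.5,
                        baseline=(None, 0), picks=CHANNELS, preload=True)
    evoked = epochs.average()                  # average over trials
    waveform = evoked.data.mean(axis=0) * 1e6  # average the 6 channels; V -> µV
    t = evoked.times
    n200 = waveform[(t >= 0.18) & (t <= 0.22)].mean()
    p300 = waveform[(t >= 0.28) & (t <= 0.32)].mean()
    return n200, p300
```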
Here, we investigated the impact of emotional valence and stimulus modality on the amplitude values of the N200 and P300 components of event-related potentials (ERPs). We analyzed the amplitude values of these components under six different conditions, namely, audio unpleasure, visual unpleasure, audio-visual unpleasure, audio pleasure, visual pleasure, and audio-visual pleasure.
Our results showed that the highest amplitude of the N200 component was observed in the visual pleasure condition (−2.54 µV), while the lowest amplitude was found in the audio unpleasure condition (−7.07 µV). Similarly, the highest amplitude of the P300 component was observed in the audio pleasure condition (5.84 µV), while the lowest amplitude was observed in the visual pleasure condition (0.02 µV).
Our findings suggest that the saliency of the emotional cue and the processing efficiency of the sensory modality are important factors in shaping the amplitude values of ERP components. The higher N200 amplitude in the visual pleasure condition may be due to the faster and more efficient processing of visual stimuli compared to auditory stimuli, and the pleasure emotion may have a more salient visual cue [57]. Conversely, the lower N200 amplitude in the audio unpleasure condition may be due to the decreased processing of auditory stimuli in the unpleasure state and reduced attention and concentration [58].
In the case of the P300 component, the higher amplitude value in the audio pleasure condition may be attributed to the more salient auditory cue in the pleasure emotion, which may enhance attention and cognitive processing [59]. The lower amplitude value in the visual pleasure condition may be due to the faster and more efficient processing of visual stimuli, and the pleasure emotion may have a less salient visual cue, leading to a decreased allocation of attentional resources [60].
Therefore, the ERP analysis revealed that the saliency of the emotional cue and the processing efficiency of the sensory modality were important factors in shaping the amplitude values of the N200 and P300 components. Emotional information processing in multimodal environments is highly integrated, with interactions and regulatory effects between different sensory channels [20]. Both the N200 [17] and P300 components were influenced by the emotional multimodal stimulation in the experiment. The highest and lowest amplitudes of the N200 component were observed in the visual pleasure and audio unpleasure conditions, respectively. Similarly, the highest and lowest amplitudes of the P300 component were observed in the audio pleasure and visual pleasure conditions, respectively. These findings suggest that more salient sensory cues lead to higher amplitude values for the corresponding ERP component.

4. Conclusions

This study investigated the impact of emotional states and stimulus modality on brain activity using three analytical methods: SAM, PSD, and ERP. SAM was used to validate the subjectively experienced emotions, PSD was used to quantify changes in brain activity in response to different stimuli, and ERP was used to measure the amplitudes of the N200 and P300 components. Our findings shed light on the neural mechanisms underlying emotional processing across sensory modalities. Highlights: In multimodal emotional stimulation, it is often the stimulation modality itself, rather than the emotional state, that is responsible for changes in PSD. The saliency of emotional cues and the processing efficiency of the sensory modality are important factors in shaping the amplitude values of the N200 and P300 components. More salient sensory cues lead to higher amplitude values for the corresponding ERP components in multimodal emotional environments. Overall, our study provides a comprehensive understanding of the neural mechanisms underlying emotional processing across sensory modalities, mainly via EEG spatiotemporal analysis. Our findings have important implications for understanding the neural mechanisms underlying emotional processing and can aid the development of more effective interventions for emotion-related applications. In future research, we will consider more complex types of multimodal stimuli and their effects on the brain under multimodal emotional stimulation.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/1424-8220/23/10/4801/s1, Supplementary Table S1: The corresponding relationship between experimental videos and sources from the film database (Unpleasure (sad: bs), pleasure: yy), S2. Pseudo-random protocol for the experiment, S3. Questionnaire before emotion EEG experiment.

Author Contributions

Conceptualization, C.G., H.U. and Y.M.; methodology, C.G. and Y.M.; software, C.G.; validation, C.G., H.U. and Y.M.; formal analysis, C.G.; investigation, C.G.; resources, C.G.; data curation, C.G.; writing—original draft preparation, C.G.; writing—review and editing, C.G., H.U. and Y.M.; visualization, C.G.; supervision, H.U. and Y.M.; project administration, Y.M.; funding acquisition, C.G., H.U. and Y.M. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the following: the Japan Science and Technology Agency (JST) CREST, grant number JPMJCR21C5; JST COI-NEXT, grant number JPMJPF2101; the Japan Society for the Promotion of Science (JSPS) KAKENHI, grant number 22K17630; JST, the establishment of university fellowships towards the creation of science technology innovation, grant number JPMJFS2112; and the Tokyo Institute of Technology (Tokyo Tech) Academy for Convergence of Materials and Informatics (TAC-MI).

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the local Ethics Committee of the Tokyo Institute of Technology, Japan.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data recorded from our experimental work are currently under further analysis; the data will be uploaded when this process is complete.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Berthold-Losleben, M.; Habel, U.; Brehlss, A.K.; Freiherr, J.; Losleben, K.; Schneider, F.; Amunts, K.; Kohn, N. Implicit affective rivalry: A behavioral and fMRI study combining olfactory and auditory stimulation. Front. Behav. Neurosci. 2018, 12, 313.
  2. Castiblanco Jimenez, I.A.; Gomez Acevedo, J.S.; Olivetti, E.C.; Marcolin, F.; Ulrich, L.; Moos, S.; Vezzetti, E. User Engagement Comparison between Advergames and Traditional Advertising Using EEG: Does the User’s Engagement Influence Purchase Intention? Electronics 2023, 12, 122.
  3. Bazzani, A.; Ravaioli, S.; Trieste, L.; Faraguna, U.; Turchetti, G. Is EEG Suitable for Marketing Research? A Systematic Review. Front. Neurosci. 2020, 14, 594566.
  4. Shu, L.; Xie, J.; Yang, M.; Li, Z.; Li, Z.; Liao, D.; Xu, X.; Yang, X. A review of emotion recognition using physiological signals. Sensors 2018, 18, 2074.
  5. He, Z.; Li, Z.; Yang, F.; Wang, L.; Li, J.; Zhou, C.; Pan, J. Advances in multimodal emotion recognition based on brain–computer interfaces. Brain Sci. 2020, 10, 687.
  6. Torres, E.P.; Torres, E.A.; Hernández-Álvarez, M.; Yoo, S.G. EEG-based brain-computer interfaces are an active area of research for emotion recognition. Sensors 2020, 20, 5083.
  7. Murugappan, M.; Juhari, M.R.B.M.; Nagarajan, R.; Yaacob, S. An investigation on visual and audiovisual stimulus based emotion recognition using EEG. Int. J. Med. Eng. Inform. 2009, 1, 342–356.
  8. Xing, B.; Zhang, H.; Zhang, K.; Zhang, L.; Wu, X.; Shi, X.; Yu, S.; Zhang, S. Exploiting EEG signals and audiovisual feature fusion for video emotion recognition. IEEE Access 2019, 7, 59844–59861.
  9. Jiang, J.; Bailey, K.; Xiao, X. Midfrontal theta and posterior parietal alpha band oscillations support conflict resolution in a masked affective priming task. Front. Hum. Neurosci. 2018, 12, 175.
  10. Mu, Y.; Han, S.; Gelfand, M.J. The role of gamma interbrain synchrony in social coordination when humans face territorial threats. Soc. Cogn. Affect. Neurosci. 2017, 12, 1614–1623.
  11. Schelenz, P.; Klasen, M.; Reese, B.; Regenbogen, C.; Wolf, D.; Kato, Y.; Mathiak, K. Multisensory integration of dynamic emotional faces and voices: Method for simultaneous EEG-fMRI measurements. Front. Hum. Neurosci. 2013, 7, 729.
  12. Li, W.; Li, Y.; Cao, D. The effectiveness of emotion cognitive reappraisal as measured by self-reported response and its link to EEG alpha asymmetry. Behav. Brain Res. 2021, 400, 113042.
  13. Yu, W.; Wu, X.; Chen, Y.; Liang, Z.; Jiang, J.; Misrani, A.; Su, Y.; Peng, Y.; Chen, J.; Tang, B.; et al. Pelvic pain alters functional connectivity between anterior cingulate cortex and hippocampus in both humans and a rat model. Front. Syst. Neurosci. 2021, 15, 642349.
  14. Sonkusare, S.; Nguyen, V.T.; Moran, R.; van der Meer, J.; Ren, Y.; Koussis, N.; Dionisio, S.; Breakspear, M.; Guo, C. Intracranial-EEG evidence for medial temporal pole driving amygdala activity induced by multi-modal emotional stimuli. Cortex 2020, 130, 32–48.
  15. Ahirwal, M.K.; Kose, M.R. Audio-visual stimulation based emotion classification by correlated EEG channels. Health Technol. 2020, 10, 7–23.
  16. Balconi, M.; Vanutelli, M.E. Vocal and visual stimulation, congruence and lateralization affect brain oscillations in interspecies emotional positive and negative interactions. Soc. Neurosci. 2016, 11, 297–310.
  17. Balconi, M.; Carrera, A. Cross-modal integration of emotional face and voice in congruous and incongruous pairs: The P2 ERP effect. J. Cogn. Psychol. 2011, 23, 132–139.
  18. Gallotto, S.; Sack, A.T.; Schuhmann, T.; de Graaf, T.A. Oscillatory correlates of visual consciousness. Front. Psychol. 2017, 8, 1147.
  19. Craddock, M.; Poliakoff, E.; El-Deredy, W.; Klepousniotou, E.; Lloyd, D.M. Pre-stimulus alpha oscillations over somatosensory cortex predict tactile misperceptions. Neuropsychologia 2017, 96, 9–18.
  20. Klasen, M.; Kreifelts, B.; Chen, Y.-H.; Seubert, J.; Mathiak, K. Neural processing of emotion in multimodal settings. Front. Hum. Neurosci. 2014, 8, 822.
  21. Bugos, J.A.; Bidelman, G.M.; Moreno, S.; Shen, D.; Lu, J.; Alain, C. Music and visual art training increase auditory-evoked theta oscillations in older adults. Brain Sci. 2022, 12, 1300.
  22. Zimmer, U.; Wendt, M.; Pacharra, M. Enhancing allocation of visual attention with emotional cues presented in two sensory modalities. Behav. Brain Funct. 2022, 18, 1–14.
  23. Ross, B.; Dobri, S.; Jamali, S.; Bartel, L. Entrainment of somatosensory beta and gamma oscillations accompany improvement in tactile acuity after periodic and aperiodic repetitive sensory stimulation. Int. J. Psychophysiol. 2022, 177, 11–26.
  24. Aricò, P.; Magosso, E.; De Crescenzio, F.; Ricci, G.; Piastra, S.; Ursino, M. EEG Alpha Power Is Modulated by Attentional Changes during Cognitive Tasks and Virtual Reality Immersion. Comput. Intell. Neurosci. 2019, 2019, 7051079.
  25. Cantero, J.L.; Atienza, M.; Salas, R.M.; Dominguez-Marin, E. Effects of Prolonged Waking-Auditory Stimulation on Electroencephalogram Synchronization and Cortical Coherence during Subsequent Slow-Wave Sleep. J. Neurosci. 2002, 22, 4702–4708.
  26. Baumgartner, T.; Esslen, M.; Jäncke, L. From emotion perception to emotion experience: Emotions evoked by pictures and classical music. Int. J. Psychophysiol. 2006, 60, 34–43.
  27. Jääskeläinen, I.P.; Sams, M.; Glerean, E.; Ahveninen, J. Movies and narratives are useful naturalistic stimuli for neuroimaging studies. NeuroImage 2021, 224, 117445.
  28. Mulders, D.; De Bodt, C.; Lejeune, N.; Courtin, A.; Liberati, G.; Verleysen, M.; Mouraux, A. Dynamics of the perception and EEG signals triggered by tonic warm and cool stimulation. PLoS ONE 2020, 15, e0231698.
  29. Murugappan, M.; Murugappan, S. Human emotion recognition through short time Electroencephalogram (EEG) signals using Fast Fourier Transform (FFT). In Proceedings of the 2013 IEEE 9th International Colloquium on Signal Processing and its Applications, Kuala Lumpur, Malaysia, 8–10 March 2013; pp. 289–294.
  30. Dmochowski, J.; Sajda, P.; Dias, J.; Parra, L. Correlated components of ongoing EEG point to emotionally laden attention—A possible marker of engagement? Front. Hum. Neurosci. 2012, 6, 112.
  31. Deng, Y.; Yang, M.; Zhou, R. A New Standardized Emotional Film Database for Asian Culture. Front. Psychol. 2017, 8, 1941.
  32. Betella, A.; Verschure, P.F.M.J. The Affective Slider: A Digital Self-Assessment Scale for the Measurement of Human Emotions. PLoS ONE 2016, 11, e0148037.
  33. Delorme, A.; Makeig, S. EEGLAB: An open-source toolbox for analysis of single-trial EEG dynamics. J. Neurosci. Methods 2004, 134, 9–21.
  34. Stancin, I.; Cifrek, M.; Jovic, A. A Review of EEG Signal Features and Their Application in Driver Drowsiness Detection Systems. Sensors 2021, 21, 3786.
  35. Rahman, M.M.; Sarkar, A.K.; Hossain, M.A.; Hossain, M.S.; Islam, M.R.; Hossain, M.B.; Quinn, J.M.W.; Moni, M.A. Recognition of human emotions using EEG signals: A review. Comput. Biol. Med. 2021, 136, 104696.
  36. Liao, D.; Shu, L.; Liang, G.; Li, Y.; Zhang, Y.; Zhang, W.; Xu, X. Design and Evaluation of Affective Virtual Reality System Based on Multimodal Physiological Signals and Self-Assessment Manikin. IEEE J. Electromagn. RF Microw. Med. Biol. 2020, 4, 216–224.
  37. Knyazev, G.G. Motivation, emotion, and their inhibitory control mirrored in brain oscillations. Neurosci. Biobehav. Rev. 2007, 31, 377–395.
  38. Mensen, A.; Marshall, W.; Tononi, G. EEG Differentiation Analysis and Stimulus Set Meaningfulness. Front. Psychol. 2017, 8, 1748.
  39. De Haan, E.H.F.; Corballis, P.M.; Hillyard, S.A.; Marzi, C.A.; Seth, A.; Lamme, V.A.F.; Volz, L.; Fabri, M.; Schechter, E.; Bayne, T.; et al. Split-Brain: What We Know Now and Why This is Important for Understanding Consciousness. Neuropsychol. Rev. 2020, 30, 224–233.
  40. Ding, L.; Shou, G.; Yuan, H.; Urbano, D.; Cha, Y.-H. Lasting Modulation Effects of rTMS on Neural Activity and Connectivity as Revealed by Resting-State EEG. IEEE Trans. Biomed. Eng. 2014, 61, 2070–2080.
  41. Ong, Z.Y.; Saidatul, A.; Ibrahim, Z. Power Spectral Density Analysis for Human EEG-based Biometric Identification. In Proceedings of the International Conference on Computational Approach in Smart Systems Design and Applications (ICASSDA), Kuching, Malaysia, 15–17 August 2018; pp. 1–6.
  42. Gramfort, A.; Luessi, M.; Larson, E.; Engemann, D.A.; Strohmeier, D.; Brodbeck, C.; Parkkonen, L.; Hämäläinen, M.S. MNE software for processing MEG and EEG data. Neuroimage 2014, 86, 446–460.
  43. Patel, S.H.; Azzam, P.N. Characterization of N200 and P300: Selected Studies of the Event-Related Potential. Int. J. Med. Sci. 2005, 2, 147–154.
  44. Haider, A.; Fazel-Rezai, R. Application of P300 Event-Related Potential in Brain-Computer Interface. Event-Relat. Potentials Evoked Potentials 2017, 1, 19–36.
  45. Kaufmann, T.; Hammer, E.; Kübler, A. ERPs contributing to classification in the “P300” BCI. In Proceedings of the 5th International BCI Conference, Graz, Austria, 22–24 September 2011; pp. 136–139.
  46. Uzun, S.S.; Dirim, S.Y.; Yildirim, E. Emotion primitives estimation from EEG signals using Hilbert Huang transform. In Proceedings of the IEEE-EMBS International Conference on Biomedical and Health Informatics (BHI 2012), Hong Kong, China, 5–7 January 2012; pp. 623–626.
  47. Marín-Morales, J.; Higuera-Trujillo, J.L.; Greco, A.; Guixeres, J.; Llinares, C.; Gentili, C.; Scilingo, E.P.; Alcañiz, M.; Valenza, G. Real vs. immersive-virtual emotional experience: Analysis of psycho-physiological patterns in a free exploration of an art museum. PLoS ONE 2019, 14, e0223881.
  48. Suk, H.-J.; Irtel, H. Emotional response to color across media. Color Res. Appl. 2010, 35, 64–77.
  49. Lagopoulos, J.; Xu, J.; Rasmussen, I.; Vik, A.; Malhi, G.S.; Eliassen, C.F.; Arntsen, I.E.; Sæther, J.G.; Hollup, S.; Holen, A.; et al. Increased Theta and Alpha EEG Activity During Nondirective Meditation. J. Altern. Complement. Med. 2009, 15, 1187–1192.
  50. Grasso, P.A.; Pietrelli, M.; Zanon, M.; Làdavas, E.; Bertini, C. Alpha oscillations reveal implicit visual processing of motion in hemianopia. Cortex 2020, 122, 81–96.
  51. Izhar, L.I.; Babiker, A.; Rizki, E.E.; Lu, C.K.; Abdul Rahman, M. Emotion self-regulation in neurotic students: A pilot mindfulness-based intervention to assess its effectiveness through brain signals and behavioral data. Sensors 2022, 22, 2703.
  52. Van der Burg, E.; Toet, A.; Brouwer, A.-M.; Van Erp, J.B.F. Serial Dependence of Emotion Within and Between Stimulus Sensory Modalities. Multisens. Res. 2021, 35, 151–172.
  53. Boddy, C.R. Corporate Psychopaths, Bullying, Conflict and Unfair Supervision in the Workplace. Corp. Psychopaths 2011, 100, 44–62.
  54. Schirmer, A.; Adolphs, R. Emotion Perception from Face, Voice, and Touch: Comparisons and Convergence. Trends Cogn. Sci. 2017, 21, 216–228.
  55. Costa, T.; Rognoni, E.; Galati, D. EEG phase synchronization during emotional response to positive and negative film stimuli. Neurosci. Lett. 2006, 406, 159–164.
  56. Chaplin, T.A.; Rosa, M.; Lui, L.L. Auditory and Visual Motion Processing and Integration in the Primate Cerebral Cortex. Front. Neural Circuits 2018, 12, 93.
  57. Magnée, M.J.C.M.; de Gelder, B.; van Engeland, H. Atypical processing of fearful face–voice pairs in pervasive developmental disorder: An ERP study. Clin. Neurophysiol. 2008, 119, 2004–2010.
  58. Kemp, A.; Hopkinson, P.J.; Hermens, D.; Rowe, D.L.; Sumich, A.L.; Clark, C.R.; Drinkenburg, W.; Abdi, N.; Penrose, R.; McFarlane, A.; et al. Fronto-temporal alterations within the first 200 ms during an attentional task distinguish major depression, non-clinical participants with depressed mood and healthy controls: A potential biomarker? Hum. Brain Mapp. 2008, 30, 602–614.
  59. Lu, Z.; Li, Q.; Gao, N.; Yang, J.; Bai, O. Happy emotion cognition of bimodal audiovisual stimuli optimizes the performance of the P300 speller. Brain Behav. 2019, 9, e01479.
  60. Lucas, N.; Vuilleumier, P. Effects of emotional and non-emotional cues on visual search in neglect patients: Evidence for distinct sources of attentional guidance. Neuropsychologia 2008, 46, 1401–1414.
Figure 1. Illustration of the experimental setup.
Figure 2. Workflow for stimulation representation.
Figure 3. EEG electrode distribution.
Figure 4. Self-assessment metric.
Figure 5. Brain functional area allocation with EEG electrodes.
Figure 6. Block diagram of spectral EEG analysis.
Figure 7. Block diagram of temporal EEG analysis.
Figure 8. Topography distribution for change in PSD with pre-stimulated/stimulated conditions.
Figure 9. Topography distribution for changes in PSD with different emotional conditions.
Figure 10. Topography distribution for the change in PSD with multi-stimulated/uni-stimulated pleasure/unpleasure conditions.
Figure 11. p-value from the Pearson correlation analysis between SAM and PSD in the multi-stimulated/uni-stimulated pleasure/unpleasure conditions. The four comparative pairs are represented in each sub-figure (a–d), and each figure consists of three frequency bands and five brain functional area results.
Figure 12. Sum of the absolute value for significant changes corresponding to emotional change pairs and stimulus modality change pairs. The three on the left are emotional changes, and the four on the right are stimulus modality changes.
Figure 13. Temporal illustration for six stimulated conditions; those within 180–220 ms are highlighted as N200, those within 280–320 ms are highlighted as P300, and 0 s is the stimulation start time.
Figure 14. Depiction of potential changes for six stimulated experimental conditions corresponding with N200 and P300 values.
Table 1. EEG data collected during the experiment.

| Audio-Unpleasure | Visual-Unpleasure | Audio-Visual-Unpleasure | Audio-Pleasure | Visual-Pleasure | Audio-Visual-Pleasure | Resting State |
|---|---|---|---|---|---|---|
| 200 trials | 200 trials | 200 trials | 200 trials | 200 trials | 200 trials | 20 trials |
Table 2. Self-assessment results of each subject in different experimental conditions.

| Subject | Audio Pleasure | Visual Pleasure | Audio-Visual Pleasure | Audio Unpleasure | Visual Unpleasure | Audio-Visual Unpleasure |
|---|---|---|---|---|---|---|
| Sub01 | 7.8 | 8.2 | 7.4 | −5.3 | −4.6 | −8.3 |
| Sub02 | 5.2 | 7.4 | 6.5 | −6.7 | −8.3 | −6.1 |
| Sub03 | 6.4 | 5.5 | 7.9 | −5.0 | −6.2 | −7.9 |
| Sub04 | 7.5 | 4.2 | 7.8 | −8.0 | −4.3 | −6.8 |
| Sub05 | 6.0 | 5.2 | 7.1 | −5.4 | −5.2 | −7.5 |
| Sub06 | 5.6 | 8.0 | 8.5 | −7.0 | −3.8 | −6.4 |
| Sub07 | 6.5 | 7.1 | 3.8 | −6.6 | −4.1 | −6.7 |
| Sub08 | 5.2 | 5.1 | 6.1 | −7.9 | −6.7 | −8.5 |
| Sub09 | 7.9 | 7.7 | 7.7 | −8.5 | −5.6 | −6.9 |
| Sub10 | 3.4 | 5.3 | 8.2 | −5.9 | −6.3 | −5.1 |
| Sub11 | 7.2 | 8.3 | 7.8 | −4.4 | −3.6 | −7.9 |
| Sub12 | 7.2 | 7.4 | 7.5 | −2.7 | −7.3 | −6.3 |
| Sub13 | 4.8 | 6.0 | 6.0 | −8.3 | −7.2 | −7.0 |
| Sub14 | 7.6 | 3.9 | 8.2 | −6.2 | −3.3 | −6.1 |
| Sub15 | 6.9 | 4.7 | 6.6 | −4.8 | −8.6 | −7.1 |
| Sub16 | 6.9 | 7.7 | 7.0 | −4.6 | −8.1 | −6.3 |
| Sub17 | 4.7 | 4.7 | 8.0 | −6.6 | −7.0 | −7.8 |
| Sub18 | 6.6 | 5.2 | 7.4 | −6.2 | −5.2 | −7.9 |
| Sub19 | 8.3 | 7.4 | 8.0 | −8.4 | −6.3 | −7.5 |
| Sub20 | 8.2 | 3.4 | 7.0 | −4.8 | −6.4 | −5.8 |
| Average | 6.50 ± 2.40 | 6.17 ± 2.51 | 7.23 ± 1.91 | −6.17 ± 2.35 | −5.90 ± 2.30 | −7.00 ± 2.02 |
Table 3. Percentage change in PSD with pre-stimulated/stimulated conditions, considering frequency bands and functional areas. p-values are provided in parentheses.
Stimulus Types Band Change in Percentage (%)
Frontal / Temporal / Central / Parietal / Occipital
Audio Unpleasure Delta
Theta
Alpha 2.177 * (0.0745)
Visual Unpleasure Delta
Theta
Alpha −1.814 ** (0.0434) −2.449 ** (0.039) −2.845 ** (0.020)
Audio-visual Unpleasure Delta
Theta
Alpha −1.514 ** (0.0418) −1.867 ** (0.0412) −2.176 ** (0.0218)
Audio Pleasure Delta
Theta
Alpha 1.465 * (0.0818) 2.696 ** (0.0374)
Visual Pleasure Delta
Theta
Alpha −1.347 * (0.0506) −1.759 ** (0.0470) −1.876 ** (0.0382)
Audio-visual Pleasure Delta
Theta
Alpha 0.359 ** (0.0236) 0.727 * (0.0740) 0.473 ** (0.0127) 0.070 ** (0.0113)
0.05 < p < 0.1 as *, 0.01 < p < 0.05 as **.
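For readers reproducing tables of this shape, the sketch below shows one plausible way to derive the percentage changes and significance marks used in Tables 3–5; the paper does not name the statistical test, so the paired t-test across subjects is an assumption, as are all variable names.

```python
# Hedged sketch: percentage change in band power plus a significance mark,
# using a paired t-test across subjects (an assumption; the paper does not
# specify the test behind Tables 3-5).
import numpy as np
from scipy.stats import ttest_rel

def percent_change_with_stars(pre: np.ndarray, post: np.ndarray):
    """pre/post: per-subject band power for one functional area, same subject order."""
    change_pct = 100.0 * (post.mean() - pre.mean()) / pre.mean()
    _, p = ttest_rel(post, pre)  # paired test across subjects
    stars = "***" if p < 0.01 else "**" if p < 0.05 else "*" if p < 0.1 else ""
    return change_pct, p, stars
```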
Table 4. Change in PSD with emotional status, considering frequency bands and functional areas.
Stimulus Types Band Change in Percentage (%)
Frontal / Temporal / Central / Parietal / Occipital
Audio
PL-UNPL
Delta −0.171 ** (0.0221)
Theta 0.148 * (0.098) 0.174 ** (0.046) 0.221 ** (0.026) −0.234 ** (0.0477)
Alpha 0.411 *** (0.008) 0.362 ** (0.024) 0.699 *** (0.0016)
Visual
PL-UNPL
Delta −0.415 ** (0.011) −0.253 * (0.0796)
Theta 0.134 * (0.0956) 0.238 ** (0.0596)
Alpha 0.269 ** (0.0378) 0.104 ** (0.0240)
Audio-visual
PL-UNPL
Delta −0.4942 ** (0.0449) −0.443 *** (0.0032) −0.4889 *** (0.027) −0.432 ** (0.011)
Theta 0.123 ** (0.0132) 0.104 ** (0.0163)
Alpha 0.0492 ** (0.0331) 0.031 *** (0.0014) 0.034 ** (0.041) 0.130 *** (0.0018)
0.05 < p < 0.1 as *, 0.01 < p < 0.05 as **, and p < 0.01 as ***.
Table 5. Changes in PSD with respect to multi-stimulated/uni-stimulated and pleasure/unpleasure conditions, considering frequency bands and functional areas.
Stimulus Types Band Change in Percentage (%)
Frontal / Temporal / Central / Parietal / Occipital
AV-A UNPL Delta −1.205 *** (0.010) −1.055 *** (0.009) −1.022 *** (0.002) −1.014 *** (0.002) −0.813 *** (0.008)
Theta 0.607 * (0.063) 0.369 ** (0.040) 0.758 *** (0.003)
Alpha 0.309 * (0.095) −2.752 *** (0.0001) −1.821 *** (0.0001) −3.382 ** (0.0001) −4.441 *** (0.0002)
AV-V UNPL Delta 0.693 *** (0.001) 0.633 *** (0.003) 0.559 *** (0.008) 0.605 *** (0.004)
Theta 0.838 *** (0.0007) 0.369 *** (0.0007) 0.380 *** (0.0026) 0.362 *** (0.002) 0.300 ** (0.0027)
Alpha 0.234 ** (0.020) −0.141 ** (0.030) −0.141 * (0.075) −0.182 ** (0.045) −0.198 ** (0.040)
AV-A PL Delta −1.528 ** (0.020) −0.721 * (0.082) −1.309 *** (0.004) −1.381 *** (0.003) −1.088 ** (0.011)
Theta 0.123 ** (0.0132) 0.768 ** (0.030)
Alpha −3.1379 *** (0.0001) −1.942 *** (0.0004) −3.811 ** (0.0002) −5.060 *** (0.0001)
AV-V PL Delta 0.525 * (0.095) 0.408 * (0.054) 0.306 * (0.095) 0.335 * (0.080) 0.405 ** (0.043)
Theta 0.359 *** (0.0017) 0.306 ** (0.014) 0.311 *** (0.004) 0.313 *** (0.005)
Alpha 0.197 * (0.077)
0.05 < p < 0.1 as *, 0.01 < p < 0.05 as **, and p < 0.01 as ***.
Table 6. N200 and P300 values for the six stimulated experimental conditions.

| Condition | N200 (μV) | P300 (μV) |
|---|---|---|
| Audio_UNPL | −7.07 | 2.19 |
| Visual_UNPL | −4.23 | 1.63 |
| Audio-Visual_UNPL | −6.68 | 2.44 |
| Audio_PL | −7.13 | 5.84 |
| Visual_PL | −2.54 | 0.02 |
| Audio-Visual_PL | −7.27 | 1.91 |