Article

A Family Emotional Support System for MCS Patients Based on an EEG-to-Visual Translation Mechanism: Design, Implementation, and Preliminary Validation

School of Industrial Design, Hubei University of Technology, Wuhan 430068, China
*
Author to whom correspondence should be addressed.
Appl. Sci. 2025, 15(20), 11149; https://doi.org/10.3390/app152011149
Submission received: 4 September 2025 / Revised: 11 October 2025 / Accepted: 15 October 2025 / Published: 17 October 2025

Abstract

Featured Application

The proposed EEG-based emotion visualization system provides a novel tool for enhancing emotional support and reducing psychological distress among families of patients in a minimally conscious state (MCS). By translating simulated EEG signals into intuitive visual feedback, the system enables caregivers to perceive the emotional states of MCS patients, thereby fostering emotional connection and alleviating anxiety. This approach offers a practical application in clinical and home-care settings to support both neurological rehabilitation and family mental well-being.

Abstract

(1) Patients in a minimally conscious state (MCS) and their families face prolonged emotional distress and psychological challenges due to the uncertainty of recovery and limited means of emotional communication. This study aims to develop an EEG-based emotion visualization system to support affected families by translating patients’ neural activity into perceivable emotional imagery. (2) Using simulated MCS patient EEG data corresponding to different emotional states, we designed a dynamic visual interface via TouchDesigner to convert bio-signals into real-time emotional animations. User tests involving questionnaires and interviews were conducted to evaluate the system’s performance. (3) The results demonstrate that the system accurately conveys emotional states, enhances caregivers’ perception of patients’ internal conditions, and significantly alleviates family members’ anxiety. (4) These findings suggest that EEG-based emotion visualization offers a viable and compassionate tool for supporting MCS families, providing new pathways for interdisciplinary research combining neuroscience and design while establishing a foundation for future clinical and home-care applications.

1. Introduction

Brain injury or cerebral hemorrhage can lead to Disorders of Consciousness (DOC), which include Coma, Vegetative State (VS), Minimally Conscious State (MCS), and Emergence from MCS (EMCS) [1]. Their clinical and neurophysiological characteristics are shown in Table 1.
The treatment and recovery process is lengthy and fraught with uncertainty, imposing a substantial psychological burden on family members as primary caregivers. In the early phase of DOC, families typically maintain high hopes for the patient’s awakening and recovery; however, when no significant improvement is observed over time, their mental well-being deteriorates [2]. Long-term caregiving demands considerable time, energy, and financial resources, and this responsibility may extend indefinitely or even for life, leading to caregiver exhaustion, reduced quality of life, and significant economic strain [3,4,5]. Studies show that family members’ psychological distress intensifies as time passes, with frequent symptoms of anxiety and depression [6]. During the rehabilitation of MCS patients, caregivers’ hopes often shift—from an initial focus on patient recovery to aspirations centered on improving quality of life or achieving other meaningful goals [7]. Currently, family members bear a heavy psychological burden and lack hope, primarily due to three factors:
(1)
The recovery process for patients is highly uncertain: many MCS patients may remain in a minimally conscious state for extended periods, making it difficult to return to normal life and work. This uncertainty makes it challenging for families to set specific rehabilitation goals [8,9].
(2)
Despite ongoing advances in medical technology, limitations persist in the treatment and rehabilitation of MCS. When family members learn of these medical boundaries, their confidence in the possibility of a full recovery is shaken [10,11].
(3)
The inadequacy of social support systems leaves MCS families under significant economic and psychological strain throughout the long caregiving journey, with insufficient help and support, resulting in feelings of isolation and helplessness [4,12].
Recent studies have provided new evidence for evaluating the consciousness and affective states of MCS patients. Cruse et al. detected consistent EEG signals in 19% of VS patients during motor-imagery tasks, revealing the potential for residual consciousness [13]; in an fMRI study, Owen et al. observed significant activation in the supplementary motor area and parahippocampal gyrus when a patient imagined playing tennis or navigating, further supporting preserved conscious function in some patients [14]. Huang et al. implemented an EEG-based brain–computer interface system that played emotional video clips while analyzing frequency-band power in real time, successfully identifying the emotional states of three patients and suggesting that emotion-related regions such as the amygdala and prefrontal cortex remain responsive [1]. Furthermore, Sun et al. applied spinal cord stimulation (SCS) and Hua et al. used transauricular vagus nerve stimulation (taVNS) to modulate thalamocortical networks, improving Coma Recovery Scale-Revised (CRS-R) scores and EEG activity in DOC patients and thereby supporting affective functional plasticity [15,16,17]. Building on these breakthroughs, the present study focuses primarily on MCS patients.
This study proposes an “Emotional Support System” that establishes an emotion feedback channel via an EEG-to-visual translation mechanism, converting patients’ EEG activity into a visual language perceptible by family members. Departing from the tool-rational framework of conventional medical design, the system emphasizes the symbolic translation of emotion and the construction of meaning. It aims to strengthen the emotional bond between patients and their relatives, provide psychological comfort, alleviate caregivers’ anxiety and depression, and create a more positive, supportive rehabilitation environment for both patients and their families. The research leverages the high temporal resolution of EEG [18] and brain–computer interface (BCI) technologies, combined with biosignal visualization, to develop an interactive system in TouchDesigner. User testing confirmed the system’s effectiveness in reducing psychological stress among caregivers, offering a novel emotional support paradigm for families of patients in an MCS.
The main contributions of this paper are as follows:
(1)
Developed a training-free, EEG-based translation method that realizes an innovative system for visualizing the emotions of MCS patients;
(2)
Introduced an intervention that harnesses and cultivates even the most rudimentary patient responses, transforming them into repeatable communication channels to provide effective support for MCS families;
(3)
Validated through user testing, the system’s benefits for emotional support in MCS families provide a reference for interdisciplinary research at the nexus of design and neuroscience.

2. Related Research Overview

2.1. Current Status of Emotional Support for Families of MCS Patients

Although patients in an MCS may retain some degree of awareness, their inability to express emotions actively places a sustained and substantial psychological burden on their caregivers, as shown in Figure 1. Studies have shown that relatives of MCS and VS patients commonly experience anxiety, depression, and chronic grief; this so-called “ambiguous loss” hinders their ability to complete the emotional mourning process [19,20]. Caregiver burden correlates closely with their psychological symptoms, and a lack of social support exacerbates their emotional distress [21,22]. In particular, after patient discharge, caregivers often face resource shortages and social isolation, further intensifying their stress levels [23]. Therefore, it is essential to establish ongoing psychological support and social assistance systems for this population.

2.2. EEG Signals and Emotion Recognition

EEG measures the electrical activity of the cerebral cortex and serves as a crucial tool for studying cognitive and emotional processes, as shown in Figure 2. Early neuroscience studies demonstrated that the spectral characteristics of EEG signals vary across different emotional states. For example, alpha-band power increases during calm or relaxed states, while high-frequency beta-band activity is enhanced under stress or arousal [24,25]. Contemporary emotion recognition research leverages these features by employing machine learning algorithms to classify EEG signals and identify dimensions such as valence and arousal. A common approach involves mapping emotional models onto the two-dimensional Russell circumplex (valence–arousal plane) or the three-dimensional Mehrabian PAD model (Pleasure, Arousal, Dominance), extracting power spectral density or nonlinear features from the EEG, and then training classifiers to discriminate between emotional states [26,27]. Parallel studies in China have also explored this domain, for instance by applying deep learning techniques to EEG-based emotion classification [28,29].
The widespread adoption of consumer-grade wearable EEG devices (e.g., Emotiv, NeuroSky) has enabled the integration of real-time EEG-based emotion recognition into interactive systems and artistic installations. Several research groups have demonstrated prototype systems that translate EEG signals into visual or auditory feedback. For example, the Echo Wave team at Toronto Metropolitan University employed the Emotiv INSIGHT headset to acquire EEG data [30] and utilized TouchDesigner to generate abstract visual particle effects, thereby allowing participants to observe their emotional states on screen. These findings indicate that EEG-based emotion visualization techniques are viable and offer novel avenues for nonverbal emotional expression and interaction.

2.3. Emotion Visualization Technology

Emotion visualization is designed to convey users’ subjective emotional states through intuitive visual means and has recently attracted widespread attention in human–computer interaction, affective computing, and digital art [31,32]. This approach maps physiological signals, such as EEG and heart rate variability, onto visual elements like color, shape, or particle motion to present emotions directly. In visual design, warm hues typically signify excitement or pleasure, whereas cool hues denote calmness or sadness [31]. Wearable devices and EEG acquisition technologies provide the data foundation for emotion visualization. For instance, Marín Morales et al. combined EEG and heart rate data to enable real-time emotion recognition and dynamic feedback within virtual reality environments, thereby supporting emotion regulation and immersive interaction [33]. Moreover, recent research has converted physiological signals into artistic graphics for mental health interventions and art therapy, fostering emotion externalization and self-reflection [34]. By fusing multimodal sensing with dynamic visual expression, emotion visualization not only enriches human–computer interaction experiences but also shows broad potential applications in psychological healing and digital art.

3. Design Model and Methods

This study employs a design-thinking approach combined with iterative practice, proposing a cross-modal emotion translation framework: EEG → emotion mapping model → visual output. The overall design process comprises a literature review and needs assessment, construction of the emotion mapping model, design of the interactive interface, and development and refinement of a prototype system. User requirements analysis, derived from surveys and systematic literature synthesis, clarified the primary concerns of family members interacting with MCS patients (see Figure 3). Based on this, the study established design objectives and functional positioning aimed at helping family members intuitively perceive patients’ emotional changes, thereby alleviating anxiety and helplessness, and providing emotional support. Subsequently, the emotional mapping relationship between EEG signals and visual representation was developed, and an emotion visualization interface was implemented using the TouchDesigner platform. Finally, the system underwent preliminary testing and iterative optimization.

3.1. Emotion Translation Design Framework

The framework is designed with EEG signals as the input, emotional states as the core variables, and dynamic visual elements as the output. The emotion translation process is divided into three key stages—EEG feature analysis, graphical feature analysis, and mapping between the two domains—as illustrated in Figure 4.
The system begins by preprocessing the input EEG signals, which includes artifact removal, filtering, and segmentation, to eliminate non-neural noise and extract stable signal segments. It then extracts features indicative of emotional states, with a particular focus on the energy distribution across different frequency bands. In the frequency domain analysis, alpha (α) waves (8–12 Hz) and beta (β) waves (13–30 Hz) are considered key indicators for identifying emotional states, as their energy variations are closely linked to arousal levels [35]. Specifically:
(1)
In states of heightened arousal (e.g., intense concentration, stress, or excitement), β wave energy increases significantly, while α wave energy tends to decrease, indicating a higher level of neural activation.
(2)
Conversely, in calm states (e.g., relaxation or rest), α wave energy increases, reflecting a relaxed cortical rhythm, while β wave activity is comparatively reduced [35].
To better quantify these emotional states, the system incorporates the β/α power ratio as part of the feature vector. EEG signals are classified based on this β/α ratio: when the ratio exceeds a predefined threshold, the emotional state is classified as “excited”; otherwise, it is classified as “calm.”
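The threshold rule above can be made concrete with a short sketch. The band limits follow the text (alpha 8–12 Hz, beta 13–30 Hz) and the 512 Hz sampling rate reported later for the TGAM device; the threshold value of 1.2 is an illustrative placeholder, not the study’s calibrated value.

```python
# Minimal sketch of the beta/alpha ratio classification described above.
import numpy as np
from scipy.signal import welch

FS = 512  # sampling rate in Hz (NeuroSky TGAM, see Section 4.1.1)

def band_power(signal, lo, hi):
    """Integrate Welch power spectral density over [lo, hi] Hz."""
    freqs, psd = welch(signal, fs=FS, nperseg=FS * 2)
    mask = (freqs >= lo) & (freqs <= hi)
    return np.trapz(psd[mask], freqs[mask])

def classify_emotion(eeg_segment, threshold=1.2):
    """Return 'excited' when beta/alpha power exceeds the threshold, else 'calm'."""
    alpha = band_power(eeg_segment, 8, 12)
    beta = band_power(eeg_segment, 13, 30)
    ratio = beta / (alpha + 1e-12)  # guard against division by zero
    return ("excited" if ratio > threshold else "calm"), ratio
```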
Emotional mapping serves as a pivotal component in this process, referring to the transformation of processed EEG signals into corresponding physical or statistical parameters, thereby establishing a mapping between EEG features and visual elements. In EEG-based emotion visualization research, the central task is to intuitively present EEG data through visualization techniques. During the visual mapping phase, color, shape, size, and spatial position function as the primary variables in visual representation. The color dimension encompasses hue, brightness, and saturation; shapes refer to basic geometric forms; size denotes either the area of a shape or the thickness of lines; and position indicates the spatial distribution of elements within a coordinate system. Specific mapping relationships are constructed between EEG features and these visual variables: for example, alpha wave power can be mapped to the saturation level of a visual element, beta wave frequency may correspond to the speed of motion in a visual animation, and an increasing beta/alpha ratio might trigger a dynamic expansion in the size of the graphic element.
Furthermore, once emotional states are classified and identified based on EEG patterns, these categories can be translated into more concrete visualization parameters. For instance, the emotion of “excitement” might be represented by rapidly dispersing red particles with high saturation, while a “calm” state could be depicted through slowly aggregating cyan-blue graphics with low saturation. This translation not only relies on the principle of synesthetic perception in emotion psychology, where colors evoke associative emotional responses, but also considers users’ intuitive understanding of visual symbols and their perceptual preferences [36,37]. Through dynamic, EEG-driven visual feedback, the system establishes a closed-loop emotional interaction: EEG data generate corresponding visual animations, and patients’ family members gain emotional awareness and psychological comfort through these visual effects, enabling them to indirectly perceive and interpret the patient’s internal emotional state.
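The feature-to-visual mapping can be expressed as a small parameter function. This is a hedged sketch only: the output ranges for saturation, motion speed, and scale, and the normalization constants, are illustrative choices rather than the exact values used in the authors’ TouchDesigner network.

```python
# Sketch of the EEG-feature -> visual-variable mapping described above.

def lerp(lo, hi, t):
    """Linear interpolation with t clamped to [0, 1]."""
    t = max(0.0, min(1.0, t))
    return lo + (hi - lo) * t

def map_features_to_visuals(alpha_power, beta_power, alpha_max, beta_max):
    ratio = beta_power / (alpha_power + 1e-12)
    return {
        # alpha power -> color saturation (calmer signal, softer color)
        "saturation": lerp(0.2, 1.0, alpha_power / alpha_max),
        # beta power -> animation speed
        "motion_speed": lerp(0.1, 2.0, beta_power / beta_max),
        # rising beta/alpha ratio -> the graphic element expands
        "scale": lerp(1.0, 2.5, ratio / 3.0),
        # categorical hue follows the excited/calm split in the text
        "hue": 0.0 if ratio > 1.2 else 0.5,  # 0.0 ~ red, 0.5 ~ cyan
    }
```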

3.2. Core Design Methodology

3.2.1. EEG Acquisition and Simulation

The EEG processing workflow consists of two main modules: data acquisition and management, and feature analysis and mapping, as shown in Figure 5. For data acquisition, this study simulates MCS conditions using healthy volunteers to generate EEG signals, aiming to validate the system’s feasibility. The experiment employed the NeuroSky device to record EEG data. A multi-scenario emotion induction paradigm was introduced, in which volunteers were exposed to various emotional stimuli, such as listening to family conversations and public narratives, to evoke distinct emotional states.
In the feature analysis and mapping module, EEG data undergo preprocessing steps such as denoising and band-pass filtering, after which signals from representative frequency bands are extracted and their statistical features, such as power spectral density, are computed. The analysis primarily focuses on identifying the distinct EEG patterns between low-arousal states and emotionally stimulated states. By comparing EEG features across different emotional conditions, a mapping model between emotion and EEG is established. This model provides a neural signal foundation for subsequent visual representations and emotion recognition tasks.
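A minimal sketch of the acquisition-side preprocessing in Figure 5 follows: band-pass filtering plus a simple amplitude-threshold artifact rejection before the signal is segmented into analysis windows. The filter cutoffs, order, window length, and artifact threshold are reasonable defaults assumed for illustration, not values reported by the authors.

```python
# Sketch of denoising, filtering, and segmentation for the simulated EEG.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 512  # Hz

def preprocess(raw, lo=1.0, hi=45.0, order=4):
    """Zero-phase Butterworth band-pass to remove drift and high-frequency noise."""
    b, a = butter(order, [lo / (FS / 2), hi / (FS / 2)], btype="band")
    return filtfilt(b, a, raw)

def segment(signal, window_s=2.0, artifact_uv=120.0):
    """Split into fixed windows, dropping windows with extreme amplitudes (blinks, motion)."""
    n = int(window_s * FS)
    windows = [signal[i:i + n] for i in range(0, len(signal) - n + 1, n)]
    return [w for w in windows if np.max(np.abs(w)) < artifact_uv]
```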

3.2.2. Visualization Implementation

The visualization interface of the system was developed using TouchDesigner, a node-based visual programming tool that supports real-time data stream processing and dynamic visual generation. It is widely used in the development of interactive art installations and context-aware systems. In this study, we propose the construction of an “Emotional Visualization Library” tailored for recognizing emotional states in MCS patients, which serves as the foundational framework for emotion visualization.
This library is designed to support a visual mapping framework encompassing up to ten parameter dimensions: visual form, clarity, structural complexity, motion dynamics, color saturation, brightness/contrast, spatial arrangement, transparency/levels, depth perception, and feedback interactivity. By integrating internal modules such as particle dynamics, morphological language encoding, color strategies, and rhythm control, the framework enables precise tuning of visual parameters based on individual EEG-derived emotional features. It dynamically generates personalized visual outputs.
This library significantly enhances the system’s adaptability to various states of consciousness, and it provides a solid technical foundation for future implementation of automated recommendation algorithms and affective interventions in home-based care scenarios. See Figure 6 for details.
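One way to picture the library is as a set of presets, each fixing the ten parameter dimensions named above. The entries below are invented for the sketch to show the data shape; they are not taken from the authors’ library.

```python
# Illustrative presets over the ten visual parameter dimensions.
CALM_PRESET = {
    "form": "starfield", "clarity": 0.9, "complexity": 0.3,
    "motion": 0.2, "saturation": 0.3, "brightness": 0.6,
    "layout": "radial", "transparency": 0.5, "depth": 0.7,
    "interactivity": "low",
}

EXCITED_PRESET = {
    "form": "particle_burst", "clarity": 0.7, "complexity": 0.8,
    "motion": 0.9, "saturation": 0.9, "brightness": 0.8,
    "layout": "dispersed", "transparency": 0.3, "depth": 0.5,
    "interactivity": "high",
}
```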
In the preliminary implementation, three representative visual scenes were designed to verify the system’s expressive capacity: fluttering particles were used to represent a relaxed, low-tension emotional state; butterfly wing flaps symbolized aroused, active emotions; and starfield particles conveyed a sense of tranquility and deep contemplation. Each visual unit was dynamically driven by simulated EEG data in real time using Python 3.11 scripts or CHOP nodes. Through expression mapping and logical operations, these visuals rendered the shifting states of consciousness across different emotional dimensions in MCS patients. This approach provides both a perceptual and technical foundation for future emotional intervention experiments and user experience evaluations, as shown in Figure 7.
(1)
In the fluttering particle scene, relaxed emotions were depicted through blue-green particles moving in smooth, low-frequency oscillations, exhibiting periodic motion patterns. In contrast, excited states were represented by red particle streams with increased speed and turbulence effects, signaling heightened arousal.
(2)
The butterfly wing-flapping scene was parameterized to simulate emotional shifts through particle dispersion: during calm states, the flapping amplitude remained small with limited particle spread; under aroused states, the amplitude increased significantly, particle dispersion expanded, wingbeat frequency accelerated, and the tip trajectories became more visually prominent.
(3)
In the starfield particle scene, dynamic transitions were achieved through parameter adjustments: under calm conditions, blue-green star clusters rotated at a constant, slow angular velocity with stable particle density; when excitement was detected, red turbulent particles were activated, angular velocity increased, trajectories became more complex, and particle density intensified, resulting in a visually striking display.
All scene parameters were validated through pre-experiments and conform to ISO 9241-303:2011 standards for visual displays in human–computer interaction [38]. The objective was to offer intuitive and diverse emotional visualizations that facilitate a more nuanced understanding and analysis of individual emotional states.
In TouchDesigner, the SOP (Surface Operator) is used to generate basic geometries, while the MAT (Material Operator) defines surface colors and textures. The CHOP (Channel Operator) processes real-time numerical data to drive animation parameters. The Particle SOP (a particle node within the Surface Operator family) is responsible for particle generation and management, and the Noise CHOP introduces random variations to enhance visual complexity. The Ramp TOP (Texture Operator) controls color gradients, and the Light COMP (Component) adjusts ambient lighting conditions. The final visualization can be displayed through monitor devices, enabling an EEG-driven interactive experience.
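In a TouchDesigner network of this kind, incoming emotion channels are typically pushed onto node parameters from a CHOP Execute DAT callback. The sketch below illustrates the pattern; the node and parameter names ('particlesys', 'noise1', 'ramp1', and the channel names) are placeholders, not documented names from the authors’ project file.

```python
# Hedged sketch of a CHOP Execute DAT callback driving visual parameters.

def onValueChange(channel, sampleIndex, val, prev):
    # channel.name identifies the emotion dimension coming from the CHOP
    if channel.name == 'arousal':        # beta-driven channel
        # faster, more turbulent particles as arousal rises
        op('particlesys').par.birthrate = 50 + 200 * val
        op('noise1').par.amp = 0.2 + 0.8 * val
    elif channel.name == 'relaxation':   # alpha-driven channel
        # shift the color ramp toward cool, low-saturation tones
        op('ramp1').par.phase = 0.5 * val
    return
```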

3.2.3. System Architecture

The system consists of two core modules: an emotion analysis module and a visual presentation module, as shown in Figure 8. The emotion analysis module is responsible for extracting features from the incoming EEG signals and determining the corresponding emotional category. Based on the classification results, the visual presentation module dynamically updates the graphical interface in real time. These two modules communicate via a local network.
The prototype system is deployed on a standard personal computer (Intel i9 processor, 16 GB RAM), running the 64-bit Windows version of TouchDesigner. In the simulated environment, the system supports rapid data updates and visual rendering, ensuring real-time interactive performance. Although the current input utilizes EEG data from healthy individuals, the system architecture is designed to be compatible with real sensor input from patients.
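Since the paper does not specify the wire format of the local-network link between the two modules, the following sketch shows one plausible realization: the analysis module sends a small JSON packet per classification over UDP. The port number and message schema are assumptions for illustration.

```python
# Sketch of analysis-module -> visualization-module messaging over UDP.
import json
import socket
import time

TD_ADDR = ("127.0.0.1", 7000)  # assumed address/port of the visualization module

def send_emotion(sock, label, ratio):
    msg = json.dumps({"emotion": label,
                      "beta_alpha": round(ratio, 3),
                      "t": time.time()}).encode("utf-8")
    sock.sendto(msg, TD_ADDR)

# usage:
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# send_emotion(sock, "calm", 0.87)
```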

4. System Experiments and Results

4.1. Technical Implementation

The technical implementation of this study consists of three main subsystems: the EEG acquisition subsystem, the signal processing subsystem, and the visualization subsystem. Together, these components enable the real-time acquisition, processing, and visual representation of EEG signals, as shown in Figure 9.

4.1.1. EEG Acquisition Subsystem

The NeuroSky TGAM development kit was selected as the EEG acquisition device for the data collection subsystem. This device uses dry electrodes for non-invasive signal acquisition and connects to a computer via a USB interface, with a sampling rate of 512 Hz. The TGAM module is built around the highly integrated TGAT chip, which features analog-to-digital conversion, abnormal signal detection, electrooculographic (EOG) artifact suppression, and 50/60 Hz power line interference filtering. It stably outputs NeuroSky eSense parameters, such as attention and meditation levels, thereby enhancing signal quality and system robustness [39,40].
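For reference, eSense values from a TGAM module are usually read by parsing the ThinkGear packet framing (two 0xAA sync bytes, a payload, and a one-byte checksum). The sketch below follows that framing; the serial port name and baud rate are assumptions for a typical setup, not values reported in the paper.

```python
# Hedged sketch of reading NeuroSky eSense values over serial (pyserial).
import serial

def read_packet(ser):
    """Block until a valid ThinkGear payload arrives, then return it."""
    while True:
        if ser.read(1) != b'\xaa' or ser.read(1) != b'\xaa':
            continue  # resynchronize on the 0xAA 0xAA header
        length = ser.read(1)[0]
        if length > 169:          # invalid payload length, resync
            continue
        payload = ser.read(length)
        checksum = ser.read(1)[0]
        if (~sum(payload)) & 0xFF == checksum:
            return payload

def parse_esense(payload):
    """Extract attention/meditation rows; skip extended rows we don't use."""
    values, i = {}, 0
    while i < len(payload):
        code = payload[i]
        if code == 0x04:                      # attention, single value byte
            values['attention'] = payload[i + 1]; i += 2
        elif code == 0x05:                    # meditation, single value byte
            values['meditation'] = payload[i + 1]; i += 2
        elif code >= 0x80:                    # extended row: next byte is its length
            i += 2 + payload[i + 1]
        else:                                 # other single-byte rows
            i += 2
    return values

# usage sketch (port and baud are assumptions):
# ser = serial.Serial('COM3', 57600)
# print(parse_esense(read_packet(ser)))
```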

4.1.2. Signal Processing Subsystem

In the signal processing subsystem, a custom computer-side program performs preprocessing and feature extraction on the acquired EEG data. The extracted emotional feature parameters are then encapsulated and transmitted in real time to the visualization platform via serial communication through an Arduino development board.
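The hand-off described above can be as simple as writing one comma-separated feature frame per update to the Arduino’s serial port. The frame layout, port name, and baud rate below are assumed for the sketch.

```python
# Sketch of the desktop-side feature transmission over serial.
import serial

def send_features(ser, alpha_power, beta_power):
    ratio = beta_power / (alpha_power + 1e-12)
    ser.write(f"{alpha_power:.4f},{beta_power:.4f},{ratio:.4f}\n".encode())

# ser = serial.Serial('COM4', 115200)  # assumed port/baud for the Arduino board
```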

4.1.3. Visualization Subsystem

The visualization subsystem is implemented using TouchDesigner 2023 to construct dynamic visual scenes, as shown in Figure 10. EEG feature data transmitted from the Arduino board are received via TouchDesigner’s built-in serial communication module. These data are mapped to specific control channels based on emotional dimensions—for instance, Channel 1 corresponds to arousal (linked to increased β wave activity), while Channel 2 corresponds to relaxation (associated with enhanced α waves). These parameters are then routed through CHOP nodes and further propagated to TOP and SOP nodes to drive the key properties of visual elements. To mitigate abrupt visual fluctuations caused by EEG signal variability, a Lag CHOP node is employed for smoothing before visualization mapping, improving the overall coherence and naturalness of the animation.
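The Lag CHOP’s smoothing behaves like a first-order exponential filter; the sketch below shows the equivalent computation, with the smoothing coefficient chosen illustratively rather than matching the network’s actual lag settings.

```python
# First-order exponential smoothing, approximating the Lag CHOP.
class LagFilter:
    def __init__(self, alpha=0.1):
        self.alpha = alpha   # smaller alpha -> slower, smoother response
        self.state = None

    def step(self, x):
        if self.state is None:
            self.state = x   # initialize on the first sample
        else:
            self.state += self.alpha * (x - self.state)
        return self.state

# smoother = LagFilter(alpha=0.08)
# smooth_arousal = smoother.step(raw_arousal)  # per frame, before mapping
```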

4.2. System Demonstration and Validation

4.2.1. Participants

Six healthy adults were recruited for the experiment, with the following specific characteristics: participants were aged between 23 and 25 years, with a balanced gender ratio (3 males, 3 females), as shown in Table 2. Participants were recruited through a volunteer program targeting university students, ensuring fairness and representativeness in the selection process. All participants underwent comprehensive physical and psychological evaluations to confirm the absence of health conditions that could interfere with EEG data collection, such as neurological disorders, mental illnesses, severe sleep disturbances, or substance abuse. The exclusion criteria also included the use of medications that could affect EEG signals, such as sedatives, stimulants, anticonvulsants, and antidepressants. To ensure the accuracy and impartiality of the results, participants who consumed common psychoactive substances, including caffeine, energy drinks, or sugary beverages, were excluded, as these could significantly alter brain activity.
Before the experiment began, all participants were adequately informed about the purpose, procedure, potential risks, and safety precautions, ensuring their full understanding of the study. All participants voluntarily signed a written informed consent form, confirming that their consent met ethical standards and agreeing to the use of EEG data and audio recordings. All data were kept strictly confidential, with privacy protection measures in place. Personal information and experimental data were anonymized during storage, and all audio recordings had any identifying information removed to ensure privacy and security. After the experiment, all audio recordings were securely stored and scheduled for destruction within 5 years.
Furthermore, the informed consent forms and audio recordings were approved by the Ethics Committee of Hubei University of Technology, with ethical approval number HBUT20250042, granted on 3 September 2025. The study adhered to the relevant provisions of the Declaration of Helsinki to ensure the protection of participants’ rights and safety. Specifically, the ethical approval covered both EEG data collection and the use of audio recordings, ensuring that all procedures complied with ethical standards.

4.2.2. Stimulus Materials

For the auditory stimuli used in Phase B, a set of classic, widely known stories was carefully selected. These stories were emotionally rich and conveyed positive, uplifting messages. High-quality audio recordings were produced, with narrators reading in a calm, clear, and moderately paced tone. Each audio segment was kept to approximately three minutes, ensuring that the content could be delivered completely and coherently within the allotted time. The objective was to effectively elicit emotional responses from participants through engaging narrative content.
In Phase C, the stimuli consisted of pre-recorded voice messages from family members of six different participants. With prior consent, audio clips were collected in which the family members expressed love, recalled cherished memories, or offered words of encouragement and support. Each set of recordings was also limited to three minutes. These family-generated voice messages were imbued with genuine and heartfelt emotions, aiming to simulate scenarios of everyday emotional connection. This approach was designed to further investigate the differential impacts of various auditory sources on EEG-based emotional responses. All audio stimuli were stored in dedicated playback devices and were presented during the experiment according to a predefined sequence.

4.2.3. Experimental Paradigm

This study employed a longitudinal comparative experimental paradigm, consisting of three sequential phases: Phase A (no stimulus), Phase B (auditory stimulus with public storytelling), and Phase C (auditory stimulus with a family member’s voice).
Before the experiment, six participants were guided individually into a quiet, comfortable room with soft lighting. Following a standardized procedure, the experimenter fitted each participant with EEG equipment, ensuring proper contact between electrodes and the scalp to guarantee accurate and stable data acquisition. Participants were then instructed to lie on a specially designed bed frame, simulating an MCS. They were asked to relax, maintain a calm state, and acclimate to the experimental environment for approximately one minute. The experimental setting and participants are illustrated in Figure 11.
At each stage of the experiment, the system focused on variations in α and β waves that reflect the brain’s emotional states. EEG signals were collected in real time for one minute per scenario, across three different dynamic visualization conditions, totaling three minutes of EEG recording. After each stage, a three-minute rest interval was arranged to allow participants to recover and stabilize before proceeding to the next phase. Throughout the entire process, professional staff supervised the experiment from behind a one-way mirror to ensure procedural compliance and the validity of data collection. This rigorous setup provided reliable experimental data to verify both the functional effectiveness of the system and the expressiveness of its emotional visualization. By avoiding the ethical risks associated with collecting real EEG data from MCS patients, this experimental design offers a controlled and effective alternative EEG source for the development of the emotional visualization translation system. The specific experimental paradigm is illustrated in Figure 12.

4.2.4. Experimental Results

Regarding the dynamic changes in EEG energy values, the raw EEG signals were decomposed into power spectra across different frequency bands using the Fast Fourier Transform (FFT). The total power across all frequency points within each band was calculated to obtain a quantified measure of energy. To compare differences across experimental stages, a one-way repeated-measures ANOVA was conducted on data from the three stages. The significance level was set at p < 0.05, and the p-values, F-values, and effect sizes (η²) were reported. The results are summarized in Table 3: a significant difference was observed in the Low α band across stages (F = 11.623, p < 0.001, η² = 0.008434), whereas differences in the High α, Low β, and High β bands were not significant (p > 0.05).
This lack of significance may be attributed to multiple factors. On one hand, individual differences among participants (e.g., baseline EEG states, neural sensitivity) could have influenced the results [41,42,43]. On the other hand, during Stage A at the beginning of the experiment, no stimuli were presented and participants may not have fully adapted to the experimental environment, leaving their EEG energy relatively high and unstable, which could have contributed to the non-significant differences.
Nevertheless, the three stages still demonstrated distinguishable effects on the energy levels of α and β waves. In Stage A, despite the aforementioned disturbances, the overall EEG energy distribution remained relatively stable, characterized by high α-wave and low β-wave energy levels, indicative of a relaxed neural state with low emotional arousal. In Stage B, α and β wave energies exhibited synchronized fluctuations, suggesting that participants experienced emotional perturbations during this phase. By Stage C, there was a notable increase in β-wave energy accompanied by a slight decrease in α-wave energy. This pattern—marked by a surge in high-frequency β activity and suppression of α activity—reflects a typical emotional arousal response, strongly indicating that participants experienced emotional resonance upon hearing the voices of their loved ones. Detailed EEG data changes of individual participants are presented in Figure 13.
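The band-power computation and repeated-measures test described above can be reproduced with standard tooling. The sketch below assumes a long-format table with columns named participant, stage, and low_alpha_power; these column names and the single example band are illustrative, since the study analyzed four bands.

```python
# Sketch of FFT band power plus one-way repeated-measures ANOVA.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

FS = 512  # Hz

def fft_band_power(signal, lo, hi):
    """Total power over [lo, hi] Hz from the FFT power spectrum."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1 / FS)
    return float(spectrum[(freqs >= lo) & (freqs <= hi)].sum())

def rm_anova(df: pd.DataFrame):
    """One row per participant x stage; returns the F/p table."""
    res = AnovaRM(df, depvar='low_alpha_power',
                  subject='participant', within=['stage']).fit()
    return res.anova_table
```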
Second, in the subjective evaluation of the visual translation effects, participants rated the contextual relevance and overall impression of the three visual formats across different emotional scenarios, as shown in Figure 14. Results indicated that the butterfly flapping animation outperformed the others in most dimensions, particularly achieving a high contextual relevance score of 4.00 in the “calm and relaxed” scenario. It also received the highest ratings in terms of “attraction,” “empathy,” and “willingness to use,” demonstrating strong adaptability and user acceptance. The starry particle effect also reached a contextual relevance score of 4.00 in the “strong emotion” scenario, suggesting its effectiveness in conveying intense and complex emotional states. However, its lower rating in “emotional depth” may be attributed to its relatively abstract form of expression. In contrast, the fluttering particle format received lower scores overall, especially in the “calm and relaxed” scenario, where its relevance was only 3.00, and in “willingness to use,” which scored just 2.67. This suggests that while it may be more suited for expressing low-arousal emotional states, its overall expressiveness and user acceptance remain limited. Overall, the butterfly flapping animation, as a concrete visual form, was the most favored, showing strong cross-contextual expressiveness and communicative potential, and was more readily accepted by users.
The experimental results demonstrate that the system effectively identifies EEG fluctuation patterns under different stimulus conditions and translates them into visually perceivable representations with recognizable emotional characteristics. The incorporation of family members’ voices led to more pronounced emotional fluctuations, thereby enhancing the accuracy of recognition. Participant feedback further confirmed the system’s feasibility and potential in both emotional detection and visual representation. In response to suggestions, the system was optimized by adjusting color saturation and particle lifespan, resulting in visual outputs that are not only more emotionally distinctive but also aesthetically appealing.

4.3. Testing and Results from Family Users

During the system validation phase, a user testing questionnaire was administered to a group that included family members of MCS patients. A total of 102 family members participated in this evaluation. The participants’ demographic information is as follows: their ages ranged from 30 to 70 years, and the gender distribution was approximately equal. All participants were immediate family members of the patients (e.g., children, spouses, siblings), and the majority had over 2 years of caregiving experience. The recruitment of participants was conducted online through hospital social workers and patient community channels, with the aim of ensuring the representativeness and diversity of the sample.
During the user testing phase, family members participated indirectly by watching a system demonstration animation and completing a questionnaire, without directly interacting with the system. After viewing the animation, feedback from family members was gathered using a combination of a 5-point Likert scale questionnaire and semi-structured interviews. The questionnaire assessed various aspects, such as the effectiveness of emotional information transmission, emotional relief, and overall satisfaction, with ratings provided on a 5-point Likert scale (1 = strongly disagree, 5 = strongly agree). Reliability and validity analyses were conducted on the questionnaire data. The results revealed a Cronbach’s α coefficient of 0.914, indicating excellent internal consistency. The KMO value was 0.914, and Bartlett’s test of sphericity was significant (p < 0.001), rejecting the null hypothesis that the variables are unrelated, thus confirming the suitability of the data for factor analysis. Using principal component analysis during factor extraction, three main factors were identified: “emotional transmission effect,” “system soothing effect,” and “overall satisfaction.” These findings further support the reliability and validity of the questionnaire, confirming its effectiveness as a measurement tool.
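The reliability and validity checks reported above can be computed as follows, assuming the Likert responses are held in a DataFrame with one column per questionnaire item. The factor_analyzer package supplies the KMO and Bartlett tests; Cronbach’s α is computed directly from its definition.

```python
# Sketch of the questionnaire reliability/validity analysis.
import pandas as pd
from factor_analyzer.factor_analyzer import (calculate_bartlett_sphericity,
                                             calculate_kmo)

def cronbach_alpha(items: pd.DataFrame) -> float:
    """k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def questionnaire_checks(items: pd.DataFrame) -> dict:
    chi2, p = calculate_bartlett_sphericity(items)  # Bartlett's sphericity test
    _, kmo_total = calculate_kmo(items)             # overall KMO measure
    return {"cronbach_alpha": cronbach_alpha(items),
            "kmo": kmo_total, "bartlett_chi2": chi2, "bartlett_p": p}
```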
The feedback from participants was reflected in the following key areas: 70.59% of family members agreed that the system partially conveyed the emotional information of the patient; 67.65% of family members reported that their emotions were calmed and their anxiety alleviated after viewing the system; and 86.27% of family members rated the overall satisfaction with the system as 3 points or higher. Some family members noted that, despite the animation’s abstract nature, it carried symbolic meaning, bringing them a sense of warmth and hope. Additionally, some family members suggested incorporating personalized elements, such as family photos or familiar voices, in future versions of the system to enhance its emotional resonance.
Moreover, prior to experiencing the system, the questionnaire provided an in-depth look into the emotional state of family members during their caregiving roles. The results showed that 68.62% of family members often felt emotionally low, 71.56% frequently felt a lack of emotional support, and 68.63% often experienced feelings of helplessness or anxiety when caring for the patient. These findings highlight the significant psychological pressure faced by family members of MCS patients, who are in urgent need of effective emotional release and support.
During the interview phase, family members provided feedback on several core themes. First, many family members expressed that the system helped them better understand the patient’s emotional state through the symbolic nature of the animation, which deepened their emotional connection. Second, family members generally felt that the system alleviated their anxiety and provided emotional support throughout the caregiving process. Finally, some family members suggested that the system would benefit from more personalized features to better accommodate the diverse needs of different families. These valuable insights offer clear guidance for the future optimization and personalized design of the system.
In summary, the user testing results indicate that the designed emotion visualization system demonstrates strong reliability and validity, and effectively meets the emotional needs of MCS patients’ families while alleviating psychological stress. The positive user feedback not only verifies the system’s core functionality but also offers concrete directions for future improvements, including integration with real patient contexts and personalized design features. Additionally, preliminary trials with healthy participants have confirmed the system’s feasibility and the basic effectiveness of its emotion translation mechanism. Future studies should aim to evaluate its long-term impact in real-life caregiving scenarios and further refine the interaction and presentation design to provide more robust emotional and psychological support for patient families.

5. Discussion

5.1. Effectiveness of the EEG-Based Visual Emotion Support System

The EEG-based visual emotion support system proposed in this study translates the EEG signals of MCS patients into dynamic visual imagery, offering relief to families experiencing loneliness and anxiety. By employing cross-modal technology to construct an emotion mapping model, the system enables family members to intuitively “perceive” the emotional states of patients without verbal interaction. Experimental results indicate that, compared to generic storytelling content, auditory stimulation from a patient’s family member elicits stronger emotional responses, as reflected by greater fluctuations in EEG signals. Based on user feedback, visual elements such as “butterfly wing resonance” and “starry particle waves” were combined and refined to enhance the immersive quality of emotional visualization. These findings align with Lü et al.’s research on EEG signal stability [44], providing a reliable physiological basis for emotion recognition. Although EEG does not directly encode subjective intent, the system’s robust feature extraction mechanisms allow for effective reconstruction of emotional states, fostering deeper emotional understanding between humans and machines. Several family members reported in questionnaires that the system allowed them to “see” the patient’s responses, enhancing their confidence in communication and demonstrating the system’s strong potential for human-centered emotional care.

5.2. Significance for Families of MCS Patients

Emotional translation holds multifaceted practical significance in the caregiving of families with patients in an MCS. On one hand, both emotion and consciousness originate from neuronal activity and are inherently interconnected. As a critical component of consciousness, emotion monitoring may offer valuable insights into the presence of residual awareness in patients [45,46]. Moreover, since emotion is closely linked to higher-order cognitive functions such as language comprehension and decision-making, its assessment may facilitate inferences regarding the extent of preserved cognitive abilities and potential for recovery [47]. On the other hand, emotional translation systems are characterized by their non-invasive nature, low cost, and operational simplicity. Compared to traditional clinical assessment methods, they are more suited for home environments and can effectively alleviate caregiving burdens. These systems not only provide patients with a possible means of emotional expression but also enhance emotional bonding and caregiving confidence among family members. From a rehabilitation perspective, accurate emotion evaluation and regulation may also contribute to creating favorable conditions for functional recovery in MCS patients [48].

5.3. Potential Applications

This study presents an innovative pathway for interdisciplinary practices between design and neuroscience, offering broad potential for applications across various domains. In the context of family support, the system can provide emotional comfort to family members and serve as a communicative bridge with patients. In medical rehabilitation, it assists clinicians in diagnosing and monitoring patients’ emotional states, thereby supporting recovery therapies and psychological interventions. For emotion research and psychological analysis, the system introduces novel research methods and assessment tools [49]. In the fields of brain–computer interfaces and human–computer interaction, it enhances emotional feedback and facilitates the exploration of new interaction paradigms. When integrated with art therapy, it may further expand healing approaches and stimulate creative inspiration. Overall, this system is expected to play a significant role across multiple domains, including family care, medical treatment, scientific research, human–machine interaction, and the arts, with promising prospects for widespread application.

5.4. Limitations and Future Work

Nonetheless, the study faces several limitations. First, the current system is based on EEG data from healthy volunteers, and the emotion model has not yet been validated with data from actual MCS patients. The EEG signals in MCS patients tend to have low signal quality and weak emotional cues, meaning that existing emotion recognition algorithms may not effectively capture emotional responses in real-world conditions [46,50,51]. Therefore, future work should calibrate and refine the emotion model using EEG data from MCS patients to improve the system’s robustness in complex clinical settings.
Second, the process of emotional mapping could be affected by the designer’s subjective biases and cultural influences. For instance, the interpretation of color symbolism varies across cultures—red may symbolize “danger” in some cultures but “joy” in others. This variation could impact the accuracy of emotional expression. Future research should enhance the visual library and develop a dynamic visual symbol system that supports cultural adaptability, enabling users to adjust emotional mapping rules according to their own cultural background and preferences. This approach would improve the system’s universal applicability and cultural sensitivity. Moreover, the system currently relies on a single form of visual feedback and does not integrate other sensory channels. Future work should explore multimodal approaches, incorporating signals such as EEG, functional near-infrared spectroscopy (fNIRS), and heart rate variability (HRV) to improve the accuracy of emotional state detection and provide a more immersive experience.
The sample size in this study is limited, and a systematic analysis of gender differences was not conducted in this phase. Future research should aim to increase the sample size and investigate the potential effects of gender on EEG signals and emotional response patterns. This line of research could provide valuable insights for future system designs, particularly in the realm of personalized emotional support.
In deploying the system, several challenges remain. For example, the system must be calibrated to meet the specific needs of users in home care environments, with dynamic adjustments to ensure ease of use for non-professional caregivers. The user interface needs to be simplified so that family caregivers can operate the system without specialized training and receive the emotional support they require. Additionally, cross-cultural differences should be taken into account, particularly in the visual representation of emotional feedback. Optimizing the system to account for these cultural differences will provide a more personalized emotional support experience.
Future research can proceed in the following directions:
(1)
From a technical perspective, the emotion model should be calibrated with EEG data from MCS patients, and deep learning methods (such as LSTM networks) should be applied to enhance the robustness of signal feature extraction, especially for emotional data with low signal-to-noise ratios [52,53].
(2)
From a design perspective, a dynamic visual symbol system should be developed, allowing users to customize emotional mapping rules based on their cultural context and personal preferences. This will enhance the system’s adaptability across cultures and improve personalization [54].
(3)
From an ethical standpoint, a transparent explanatory framework should be implemented to clarify the system’s role as an “emotional support tool” and prevent families from over-medicalizing the visualized results. Additionally, integrating multimodal fusion technologies (such as combining fNIRS and EEG) with extended reality (XR) technologies can create a more inclusive emotional interaction environment, further enhancing the system’s immersive and emotional support capabilities [55,56,57].

6. Conclusions

This study presents and develops an EEG-based emotional visualization system designed to support the families of patients in an MCS. By capturing EEG signals elicited under various emotional scenarios, the system employs the TouchDesigner platform to achieve precise mappings from EEG features to dynamic visual elements, enabling real-time generation of visual representations that reflect the patient’s emotional state. Preliminary testing with healthy participants indicates that this novel approach to visualizing emotional expression can effectively convey affective information, offering emotional comfort and a sense of hope to the families of MCS patients. The findings strongly suggest that the integration of BCI technology with artistic visualization provides a promising new technological pathway for emotional support in MCS caregiving contexts. Future work will focus on optimizing the emotional mapping algorithms, incorporating EEG data from actual MCS patients, and conducting large-scale user studies along with rigorous clinical validation to comprehensively evaluate the system’s effectiveness and feasibility. This research aspires to revitalize the field of MCS rehabilitation and caregiving by delivering enriched, evidence-based strategies for emotional care and support for both patients and their families.

Author Contributions

Conceptualization, H.Z.; methodology, H.Z.; software, H.Z.; validation, H.Z.; formal analysis, H.Z. and X.L.; investigation, H.Z.; resources, H.Z.; data curation, H.Z.; writing—original draft preparation, H.Z.; writing—review and editing, H.Z.; visualization, H.Z.; supervision, H.Z.; project administration, H.Z.; funding acquisition, H.Z. and X.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Humanities and Social Science Fund of the Ministry of Education of China, grant number 24YJAZH070.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the Ethics Committee of Hubei University of Technology (protocol code HBUT20250042, 13 July 2025).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The original contributions presented in the study are included in the article; further inquiries can be directed to the corresponding author.

Acknowledgments

We thank all participants for their contributions to this study and sincerely appreciate the guidance and supervision, as well as the advice and assistance, provided by Xiaoying Li. The authors used ChatGPT (OpenAI, https://chatgpt.com/) solely for language translation purposes. No content generation, data analysis, or idea development was conducted using GenAI tools. The authors take full responsibility for the content of the manuscript.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
MCS: Minimally conscious state
EEG: Electroencephalogram
DOC: Disorders of Consciousness
VS: Vegetative State
EMCS: Emergence from MCS
SCS: Spinal cord stimulation
taVNS: Transauricular vagus nerve stimulation
BCI: Brain–computer interface
PAD: Pleasure, Arousal, Dominance
α: Alpha
β: Beta
SOP: Surface Operator
MAT: Material Operator
CHOP: Channel Operator
TOP: Texture Operator
COMP: Component
EOG: Electrooculographic
FFT: Fast Fourier Transform
XR: Extended reality

References

  1. Huang, H.; Xie, Q.; Pan, J.; He, Y.; Wen, Z.; Yu, R.; Li, Y. An EEG-Based Brain Computer Interface for Emotion Recognition and Its Application in Patients with Disorder of Consciousness. IEEE Trans. Affect. Comput. 2021, 12, 832–842. [Google Scholar] [CrossRef]
  2. Moretta, P.; Estraneo, A.; De Lucia, L.; Cardinale, V.; Loreto, V.; Trojano, L. A study of the psychological distress in family caregivers of patients with prolonged disorders of consciousness during in-hospital rehabilitation. Clin. Rehabil. 2014, 28, 717–725. [Google Scholar] [CrossRef] [PubMed]
  3. Cruzado, J.A.; Elvira de la Morena, M.J. Coping and distress in caregivers of patients with disorders of consciousness. Brain Inj. 2013, 27, 793–798. [Google Scholar] [CrossRef] [PubMed]
  4. Gosseries, O.; Schnakers, C.; Vanhaudenhuyse, A.; Martial, C.; Aubinet, C.; Charland-Verville, V.; Thibaut, A.; Annen, J.; Ledoux, D.; Laureys, S.; et al. Needs and Quality of Life of Caregivers of Patients with Prolonged Disorders of Consciousness. Brain Sci. 2023, 13, 308. [Google Scholar] [CrossRef]
  5. Pan, Y.; Sun, Y.; Liu, X.; Yang, J. Clinical Application and Mechanism Research on Acupuncture and Moxibustion for Disorders of Consciousness: A Review. World Chin. Med. 2025, 20, 1052–1055+1060. [Google Scholar]
  6. Blok, A.C.; Valley, T.S.; Weston, L.E.; Miller, J.; Lipman, K.; Krein, S.L. Factors Affecting Psychological Distress in Family Caregivers of Critically Ill Patients: A Qualitative Study. Am. J. Crit. Care 2023, 32, 21–30.
  7. Hill, D.L.; Boyden, J.Y.; Feudtner, C. Hope in the context of life-threatening illness and the end of life. Curr. Opin. Psychol. 2023, 49, 101513.
  8. Boegle, K.; Bassi, M.; Comanducci, A.; Kuehlmeyer, K.; Oehl, P.; Raiser, T.; Rosenfelder, M.; Sitt, J.D.; Valota, C.; Willacker, L.; et al. Informal Caregivers of Patients with Disorders of Consciousness: A Qualitative Study of Communication Experiences and Information Needs with Physicians. Neuroethics 2022, 15, 24.
  9. Kameda, N.; Suzuki, M. Caregivers' lived experience in trying to read slight movements in a child with severe brain injury: A phenomenological study. J. Clin. Nurs. 2018, 27, e1202–e1213.
  10. Chen, J.; Zeng, L.; Liu, X.; Wu, Q.; Jiang, J.; Shi, Y. Family surrogate decision-makers' perspectives in decision-making of patients with disorders of consciousness. Neuropsychol. Rehabil. 2023, 33, 1582–1597.
  11. Suppes, A.; Fins, J.J. Surrogate expectations in severe brain injury. Brain Inj. 2013, 27, 1141–1147.
  12. Steppacher, I.; Kissler, J. A problem shared is a problem halved? Comparing burdens arising for family caregivers of patients with disorders of consciousness in institutionalized versus at home care. BMC Psychol. 2018, 6, 58.
  13. Cruse, D.; Chennu, S.; Chatelle, C.; Bekinschtein, T.A.; Fernández-Espejo, D.; Pickard, J.D.; Laureys, S.; Owen, A.M. Bedside detection of awareness in the vegetative state: A cohort study. Lancet 2011, 378, 2088–2094.
  14. Owen, A.M.; Coleman, M.R.; Boly, M.; Davis, M.H.; Laureys, S.; Pickard, J.D. Detecting Awareness in the Vegetative State. Science 2006, 313, 1402.
  15. Han, J.; Xie, Q.; Wu, X.; Huang, Z.; Tanabe, S.; Fogel, S.; Hudetz, A.G.; Wu, H.; Northoff, G.; Mao, Y.; et al. The neural correlates of arousal: Ventral posterolateral nucleus-global transient co-activation. Cell Rep. 2024, 43, 113633.
  16. Hua, L.; Lai, H.; Yang, W.; Liu, Y.; Ye, X. Effect of transcutaneous auricular vagus nerve stimulation on patients with prolonged disorders of consciousness. Chin. J. Rehabil. Theory Pract. 2025, 31, 339–347.
  17. Sun, F.; Niu, H.; Yang, Y.; He, J.; Zhao, Y. A Comparative Study on the Clinical Effects of Short-term and Long-term Spinal Cord Stimulation in Patients with Prolonged Disorders of Consciousness. Med. J. Peking Union Med. Coll. Hosp. 2025, 16, 307–313.
  18. Li, R.; Yang, D.; Fang, F.; Hong, K.S.; Reiss, A.L.; Zhang, Y. Concurrent fNIRS and EEG for Brain Function Investigation: A Systematic, Methodology-Focused Review. Sensors 2022, 22, 5865.
  19. Giovannetti, A.M.; Leonardi, M.; Pagani, M.; Sattin, D.; Raggi, A. Burden of caregivers of patients in Vegetative State and Minimally Conscious State. Acta Neurol. Scand. 2013, 127, 10–18.
  20. Soeterik, S.M.; Connolly, S.; Playford, E.D.; Duport, S.; Riazi, A. The psychological impact of prolonged disorders of consciousness on caregivers: A systematic review of quantitative studies. Clin. Rehabil. 2017, 31, 1374–1385.
  21. Noohi, E.; Peyrovi, H.; Imani Goghary, Z.; Kazemi, M. Perception of social support among family caregivers of vegetative patients: A qualitative study. Conscious. Cogn. 2016, 41, 150–158.
  22. Pagani, M.; Giovannetti, A.M.; Covelli, V.; Sattin, D.; Leonardi, M. Caregiving for Patients in Vegetative and Minimally Conscious States: Perceived Burden as a Mediator in Caregivers' Expression of Needs and Symptoms of Depression and Anxiety. J. Clin. Psychol. Med. Settings 2014, 21, 214–222.
  23. Yu, N.Y.; Kanarsky, M.M.; Borisov, I.V.; Pradhan, P.; Yankevich, D.S.; Roshka, S.F.; Petrova, M.V.; Grechko, A.V. Post-discharge plight of patients with chronic disorders of consciousness: A systematic review of socioeconomic and health aspects. Russ. Open Med. J. 2022, 11, 412.
  24. Abhang, P.A.; Gawali, B.W.; Mehrotra, S.C. Introduction to EEG- and Speech-Based Emotion Recognition; Academic Press: Cambridge, MA, USA, 2016.
  25. Duda, A.T.; Clarke, A.R.; Barry, R.J. Mindfulness meditation alters neural oscillations independently of arousal. Int. J. Psychophysiol. 2024, 205, 112439.
  26. Bălan, O.; Moise, G.; Petrescu, L.; Moldoveanu, A.; Leordeanu, M.; Moldoveanu, F. Emotion Classification Based on Biophysical Signals and Machine Learning Techniques. Symmetry 2020, 12, 21.
  27. Wang, X.; Ren, Y.; Luo, Z.; He, W.; Hong, J.; Huang, Y. Deep learning-based EEG emotion recognition: Current trends and future perspectives. Front. Psychol. 2023, 14, 1126994.
  28. Li, J.Y.; Du, X.B.; Zhu, Z.L.; Deng, X.M.; Ma, C.X.; Wang, H.A. Deep Learning for EEG-based Emotion Recognition: A Survey. J. Softw. 2022, 34, 255–276.
  29. Qin, T.; Sheng, H.; Yue, L.; Jin, W. Review of Research on Emotion Recognition Based on EEG Signals. Comput. Eng. Appl. 2023, 59, 38–54.
  30. Derivative. The Echo Wave Project: Visualizing Mental Health with TouchDesigner. 2024. Available online: https://derivative.ca/community-post/echo-wave-project-visualizing-mental-health-touchdesigner/70734 (accessed on 14 June 2025).
  31. Lan, X.; Wu, Y.; Cao, N. Affective Visualization Design: Leveraging the Emotional Impact of Data. IEEE Trans. Vis. Comput. Graph. 2024, 30, 1–11.
  32. Qin, C.Y.; Constantinides, M.; Aiello, L.M.; Quercia, D. Heartbees: Visualizing crowd affects. In Proceedings of the IEEE VIS Arts Program (VISAP), Salt Lake City, UT, USA, 25–30 October 2020; IEEE: New York, NY, USA, 2020; pp. 1–8.
  33. Marín-Morales, J.; Higuera-Trujillo, J.L.; Greco, A.; Guixeres, J.; Llinares, C.; Scilingo, E.P.; Alcañiz, M.; Valenza, G. Affective computing in virtual reality: Emotion recognition from brain and heartbeat dynamics using wearable sensors. Sci. Rep. 2018, 8, 13657.
  34. Xu, Z.; Cho, Y. Exploring Artistic Visualization of Physiological Signals for Mindfulness and Relaxation: A Pilot Study. arXiv 2023, arXiv:2310.14343.
  35. Attar, E.T. Review of electroencephalography signals approaches for mental stress assessment. Neurosci. J. 2022, 27, 209–215.
  36. Muratbekova, M.; Shamoi, P. Color-Emotion Associations in Art: Fuzzy Approach. IEEE Access 2024, 12, 37937–37956.
  37. Rouw, R.; Case, L.; Gosavi, R.; Ramachandran, V. Color associations for days and letters across different languages. Front. Psychol. 2014, 5, 369.
  38. ISO 9241-303:2011. Available online: https://www.iso.org/standard/57992.html (accessed on 14 June 2025).
  39. EEG—Electroencephalography—BCI | NeuroSky. Available online: https://neurosky.com/biosensors/eeg-sensor/ (accessed on 14 June 2025).
  40. Li, Y.; Zeng, W.; Dong, W.; Han, D.; Chen, L.; Chen, H.; Kang, Z.; Gong, S.; Yan, H.; Siok, W.T.; et al. A Tale of Single-Channel Electroencephalography: Devices, Datasets, Signal Processing, Applications, and Future Directions. IEEE Trans. Instrum. Meas. 2025, 74, 1–20.
  41. Lopez, K.L.; Monachino, A.D.; Vincent, K.M.; Peck, F.C.; Gabard-Durnam, L.J. Stability, change, and reliable individual differences in electroencephalography measures: A lifespan perspective on progress and opportunities. NeuroImage 2023, 275, 120116.
  42. Wang, H.; Yin, N.; Xu, G. Advances in methods and applications of electroencephalogram microstate analysis. J. Biomed. Eng. 2023, 40, 163–170.
  43. Xu, Z.; Zhou, Y.; Wen, X.; Niu, Y.; Li, Z.; Xu, X.; Zhang, D.; Wu, X. Cross-subject personality assessment based on electroencephalogram functional connectivity and domain adaptation. J. Biomed. Eng. 2022, 39, 257–266.
  44. Zheng, W.L.; Zhu, J.Y.; Lu, B.L. Identifying Stable Patterns over Time for Emotion Recognition from EEG. IEEE Trans. Affect. Comput. 2019, 10, 417–429.
  45. Gao, J.; Wu, M.; Wu, Y.; Liu, P. Emotional consciousness preserved in patients with disorders of consciousness? Neurol. Sci. 2019, 40, 1409–1418.
  46. Pan, J.; Xie, Q.; Huang, H.; He, Y.; Sun, Y.; Yu, R.; Li, Y. Emotion-related consciousness detection in patients with disorders of consciousness through an EEG-based BCI system. Front. Hum. Neurosci. 2018, 12, 198.
  47. Caramazza, A.; Shelton, J.R. Domain-Specific Knowledge Systems in the Brain: The Animate-Inanimate Distinction. J. Cogn. Neurosci. 1998, 10, 1–34.
  48. De Luca, R.; Lauria, P.; Bonanno, M.; Corallo, F.; Rifici, C.; Castorina, M.V.; Trifirò, S.; Gangemi, A.; Lombardo, C.; Quartarone, A.; et al. Neurophysiological and Psychometric Outcomes in Minimal Consciousness State after Advanced Audio–Video Emotional Stimulation: A Retrospective Study. Brain Sci. 2023, 13, 1619.
  49. Islam, A.; Mohd Noor, N.F.; Abdul Rahman, S.S. Systematic mapping study of tools to identify emotions and personality traits. Discov. Artif. Intell. 2025, 5, 58.
  50. Gao, Y.; Xue, Y.; Gao, J. Emotion recognition from multichannel EEG signals based on low-rank subspace self-representation features. Biomed. Signal Process. Control 2025, 99, 106877.
  51. Zhang, Z.; Fort, J.M.; Giménez Mateu, L. Mini review: Challenges in EEG emotion recognition. Front. Psychol. 2024, 14, 1289816.
  52. Pan, J.; Yu, Y.; Wu, J.; Zhou, X.; He, Y.; Li, Y. Deep Neural Networks for Automatic Sleep Stage Classification and Consciousness Assessment in Patients with Disorder of Consciousness. IEEE Trans. Cogn. Dev. Syst. 2024, 16, 1589–1603.
  53. Wang, Z.; Yu, J.; Gao, J.; Bai, Y.; Wan, Z. MutaPT: A Multi-Task Pre-Trained Transformer for Classifying State of Disorders of Consciousness Using EEG Signal. Brain Sci. 2024, 14, 688.
  54. Ng, A.W.Y.; Siu, K.W.M.; Chan, C.C.H. The effects of user factors and symbol referents on public symbol design using the stereotype production method. Appl. Ergon. 2012, 43, 230–238.
  55. Liu, Z.; Shore, J.; Wang, M.; Yuan, F.; Buss, A.; Zhao, X. A systematic review on hybrid EEG/fNIRS in brain-computer interface. Biomed. Signal Process. Control 2021, 68, 102595.
  56. Zhao, Q.; Zhang, X.; Chen, G.; Zhang, J. EEG and fNIRS emotion recognition based on modality attention graph convolution feature fusion. J. Zhejiang Univ. Eng. Sci. 2023, 57, 1987–1997.
  57. Rakkolainen, I.; Farooq, A.; Kangas, J.; Hakulinen, J.; Rantala, J.; Turunen, M.; Raisamo, R. Technologies for Multimodal Interaction in Extended Reality—A Scoping Review. Multimodal Technol. Interact. 2021, 5, 81.
Figure 1. Status of families in an MCS.
Figure 2. Types of brain waves.
Figure 3. Scenario-based breakdown of family/carer needs.
Figure 4. Emotion translation model.
Figure 5. EEG signal processing flow.
Figure 6. Multi-dimensional parameter table for the emotional visual library.
Figure 7. Visualization of dynamic scenes.
Figure 8. System architecture.
Figure 9. Technical implementation framework.
Figure 10. TouchDesigner visualization for visual translation.
Figure 11. Experimental scenarios and participants.
Figure 12. Experimental paradigm.
Figure 13. Participant EEG analysis.
Figure 14. Participants' evaluation of the visualization.
Table 1. Comparative table of clinical and neurophysiological characteristics of different types of consciousness disorders.

DOC | Clinical Characteristics | Neurophysiological Characteristics
Coma | No conscious response, no speech, no awareness, eyes closed. | EEG shows highly synchronized, low-frequency slow-wave activity, with extremely low cerebral blood flow.
VS | No awareness of surroundings, but may show autonomic responses (e.g., breathing, heartbeat); eyes may open and close; no purposeful communication. | EEG may show irregular activity but lacks conscious response.
MCS | Limited conscious responses, such as reactions to naming or simple commands; occasional eye-opening. | EEG may display mixed low- and moderate-frequency activity, with higher brain activity in some patients.
EMCS | Partial recovery of consciousness, with improved communication abilities and self-awareness. | Neuroimaging and EEG show recovery of functional activity in some brain regions.
Table 2. Participant information.

Participant | Age | Gender | Health Assessment | Understanding of Emotional Visualization
P1 | 25 | Male | Qualified | Yes
P2 | 24 | Male | Qualified | Yes
P3 | 25 | Male | Qualified | No
P4 | 23 | Female | Qualified | Yes
P5 | 24 | Female | Qualified | Yes
P6 | 25 | Female | Qualified | Yes
Table 3. Statistical table of significant differences and effect sizes.

Group | F-Value | p-Value | η² (Effect Size) | Significance
Low Alpha | 11.623 | <0.001 | 0.008434 | Significant
High Alpha | 0.040 | 0.961 | 0.000024 | Not significant
Low Beta | 1.693 | 0.184 | 0.001029 | Not significant
High Beta | 0.059 | 0.942 | 0.000036 | Not significant
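For readers who wish to reproduce the kind of statistics reported in Table 3 on their own recordings, the sketch below shows one conventional way to obtain F, p, and η² for a single band-power measure using a one-way ANOVA. It is a minimal illustration assuming NumPy/SciPy and synthetic band-power samples (the function name and the three-condition setup are hypothetical), not the analysis pipeline used in this study.

```python
import numpy as np
from scipy import stats


def one_way_anova_with_eta_squared(*groups):
    """Return (F, p, eta-squared) for a one-way ANOVA across groups.

    eta-squared is the between-group sum of squares divided by the
    total sum of squares, i.e., the share of variance explained by
    group membership.
    """
    f_value, p_value = stats.f_oneway(*groups)
    all_values = np.concatenate(groups)
    grand_mean = all_values.mean()
    ss_between = sum(len(g) * (np.mean(g) - grand_mean) ** 2 for g in groups)
    ss_total = ((all_values - grand_mean) ** 2).sum()
    return f_value, p_value, ss_between / ss_total


# Hypothetical example: low-alpha band power under three stimulus conditions.
rng = np.random.default_rng(0)
calm = rng.normal(0.52, 0.10, 200)
neutral = rng.normal(0.50, 0.10, 200)
stress = rng.normal(0.47, 0.10, 200)

f, p, eta2 = one_way_anova_with_eta_squared(calm, neutral, stress)
print(f"F = {f:.3f}, p = {p:.4g}, eta^2 = {eta2:.6f}")
```

Note that, as in Table 3, a comparison can be statistically significant (small p) while its η² remains very small, which is why both values are reported.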
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
