Article

Design of an Interactive System by Combining Affective Computing Technology with Music for Stress Relief

Department of Digital Media Design, National Yunlin University of Science and Technology, Douliu 64002, Taiwan
*
Author to whom correspondence should be addressed.
Electronics 2025, 14(15), 3087; https://doi.org/10.3390/electronics14153087
Submission received: 13 May 2025 / Revised: 17 July 2025 / Accepted: 23 July 2025 / Published: 1 August 2025
(This article belongs to the Special Issue New Trends in Human-Computer Interactions for Smart Devices)

Abstract

In response to the stress commonly experienced by young people in high-pressure daily environments, a music-based stress-relief interactive system was developed by integrating music-assisted care with emotion-sensing technology. The design principles of the system were established through a literature review on stress, music listening, emotion detection, and interactive devices. A prototype was created accordingly and refined through interviews with four experts and eleven users participating in a preliminary experiment. The system is grounded in a four-stage guided imagery and music framework, along with a static activity model focused on relaxation-based stress management. Emotion detection was achieved using a wearable EEG device (NeuroSky’s MindWave Mobile device) and a two-dimensional emotion model, and the emotional states were translated into visual representations using seasonal and weather metaphors. A formal experiment involving 52 users was conducted. The system was evaluated, and its effectiveness confirmed, through user interviews and questionnaire surveys, with statistical analysis conducted using SPSS 26 and AMOS 23. The findings reveal that: (1) integrating emotion sensing with music listening creates a novel and engaging interactive experience; (2) emotional states can be effectively visualized using nature-inspired metaphors, enhancing user immersion and understanding; and (3) the combination of music listening, guided imagery, and real-time emotional feedback successfully promotes emotional relaxation and increases self-awareness.

1. Introduction

1.1. Research Background

Modern society’s high-pressure environment has led to rising stress levels, contributing to issues such as anxiety, sleep disorders, and even increased youth suicide rates, which have steadily climbed in the U.S. since 2006 [1]. Economic shifts have intensified academic pressures and widened social disparities, heightening psychological stress among youth [2]. In the meantime, emerging romantic interests and the need for companionship make emotional relationships a key factor in youth well-being [3].
With rapid technological advancement, increasing numbers of works are exploring psychology and emotion through art, including therapies that use art or music to treat psychological disorders. In recent years, many interactive devices and systems have been designed to create therapeutic scenarios. For instance, a research team at City University of Hong Kong, in collaboration with the Art Therapists Association, developed “Smart Ambience Therapy”, an interactive gaming device that integrates art therapy with virtual reality [4]. This system enables abused children to express themselves through gameplay, assisting therapists in identifying issues and supporting emotional recovery.
In conclusion, psychological therapy offers promising potential, and greater attention to mental health is essential. Amid rising stress, music and art therapy present meaningful paths to emotional relief and personal well-being.

1.2. Research Motivation

Young people often face considerable psychological pressure from daily life and academics, which can harm their physical and mental well-being. Without effective outlets, this pressure may build up and lead to psychological issues. Common stress-relief methods include eating, listening to music, gaming, and exercise, but choosing appropriate strategies and recognizing stress levels are essential. As technology becomes more embedded in daily life, perceptions of interactive devices have shifted. Murray [5] noted that personal engagement enhances happiness, relaxation, and involvement. Interactive devices offer immediacy, user choice, and feedback, encouraging active participation and enabling personalized support.
In this study, stress relief and interactive devices are combined, with the interactivity and choice offered by such devices directed toward stress relief as the primary concept and research focus. Experiments are conducted based on this stress-relief approach, with stress-relief effects achieved through interaction with the devices. Meanwhile, data from a wearable brainwave sensor (NeuroSky's MindWave Mobile device) are used to understand the relationship between the interactive device and stress relief, providing a new channel for stress-relief features in future interactive devices.

1.3. Research Objectives

In this research, emotional sensing technology was integrated with a music-based interactive installation aimed at developing a system that supports stress relief. The investigation was structured around three main areas: the analysis of existing studies on stress-relief systems and interactive games, the identification of design principles derived from the literature and these case studies, and the examination of musical elements and visual presentations that facilitate emotional relaxation.
To address the issue that stress is not always consciously perceived, a wearable electroencephalographic (EEG) device was employed to detect the users’ emotional states by translating brain activities into visual data. These data were reflected through visual feedback, enabling the participants to interpret and become aware of their emotional responses. This EEG-assisted approach was embedded within an interactive installation that combines music-assisted care with emotion visualization techniques, thereby realizing the application of affective computing technology—an area that remains underexplored in current interactive system design. Participants were invited to engage with the proposed system during experimental sessions, after which interviews and questionnaire surveys were conducted to evaluate the system’s effectiveness in promoting relaxation and emotional awareness.
More specifically, the questions investigated in this study include the following:
(1)
Can a wearable EEG device effectively detect users’ emotional states in the context of an interactive, music-based relaxation experience?
(2)
Does transforming brainwave data into visual feedback help users become more aware of their emotional responses?
(3)
Does the combination of EEG-based emotional detection, music-assisted care, and visual emotional feedback lead to improved subjective feelings of relaxation and emotional clarity among users?
Accordingly, the objectives of the research were as follows:
(1)
To explore the integration of music in interactive systems for stress reduction;
(2)
To propose design principles based on literature and expert input;
(3)
To develop an interactive system that uses music as its medium and integrates EEG signals from a wearable brainwave device to conduct stress-relief research;
(4)
To verify the role of emotional sensing in promoting relaxation through musical interaction and visual presentation; and
(5)
To examine the impact of visual emotional feedback on user awareness and engagement.

1.4. Limitations of This Study

This study primarily focuses on the usability and experiential aspects of a music-based interactive installation designed for stress relief, as well as on whether users report a sense of emotional relief or relaxation after the interactive experience. The following limitations are acknowledged:
(1)
The term music therapy, as used in this article, is not intended to denote clinical psychological treatment; to avoid misunderstanding, the term music-assisted care is used where appropriate;
(2)
All experiments were conducted in a controlled environment free from external disturbances to ensure a comfortable and secure experience;
(3)
Participants were aged 18 to 24; and
(4)
The evaluation of the system’s effectiveness is based on post-experience surveys and interviews, without comparisons between pre- and post-experience data or a control group.

1.5. Research Process of This Study

As shown in Figure 1, the research process of this study is divided into five stages, represented by the blue blocks, as described in the following.
(1)
Selection of research topic—involving the design of the research direction and selection of research goals.
(2)
Literature review—covering relevant studies in music-assisted care, interactive technology, and emotion sensing.
(3)
System development—including the establishment of design principles, system design, and prototype construction.
(4)
Experience and evaluation—comprising a preliminary experiment, prototype modification, and a formal experiment:
(a)
Preliminary experiment—involving initial field tests with users and experts, followed by user interviews and expert interviews to gather feedback;
(b)
Prototype modification—refining the prototype based on collected feedback;
(c)
Formal experiment—inviting additional users to test the updated prototype, followed by user interviews and a questionnaire survey, with EEG data and user feedback analyzed to assess emotional responses and user experience.
(5)
Analysis and conclusion—involving the analysis of research results, conclusions drawn from the study, and suggestions for future work.

2. Literature Review

In this section, literature pertaining to psychological stress, music therapy, emotion sensing, and interactive technology is examined. First, the sources of psychological stress and methods of stress relief are discussed, with particular attention paid to how stress is commonly experienced and alleviated among younger individuals. Second, the definition of music therapy is explored, along with its application models. Third, current definitions of emotion sensing are reviewed, the connection between brainwaves and emotions is analyzed, and brainwave measurement methods are considered through relevant case studies. In the fourth part, interactive technology is defined, and principles of human–computer interaction are integrated into the device design. Finally, based on the findings from these four domains, relevant elements are synthesized to form the foundation of the proposed work.

2.1. Psychological Stress

2.1.1. Definition of Psychological Stress

The term psychological stress originates from the concept of general adaptation syndrome [6], which was established through biological research and experiments on white rats. These studies demonstrated that both physical and psychological stress can result in physical or chemical harm to an organism. General adaptation syndrome describes a typical pattern of response exhibited by individuals when exposed to external stressors, comprising the following three stages.
(1)
Alarm stage—The body reacts to stress by going into “fight or flight” mode. Heart rate rises, and stress hormones like adrenaline are released to provide extra energy.
(2)
Resistance stage—The body starts to calm down, but stays alert. Hormone levels begin to drop, but if stress continues, they stay high. People may seem to cope, but can feel irritable, frustrated, or have trouble concentrating.
(3)
Exhaustion stage—Long-term stress wears the body out. Energy runs low, making it harder to cope. People may feel tired, hopeless, or emotionally drained, often experiencing fatigue, depression, or anxiety.
Lazarus and Folkman [7] described psychological stress as a challenge that arises when people feel that their resources are insufficient to handle demands from their environment. This imbalance creates pressure and emotional strain. People appraise stress as harm, loss, threat, or challenge, which causes emotional reactions. Today, stress often refers to how the body and mind react to pressures from daily life, work, or relationships [8]. While some stress can be helpful, excessive or prolonged stress can lower motivation and harm both physical and mental health.

2.1.2. Youth Life Stress

Hensley [9] surveyed around 300 university students using 52 common stress scenarios, including events like the death of a loved one, academic struggles, and peer pressure. About 80% reported related stress, with female students generally experiencing higher levels than males.
Villanova and Bownas [10] surveyed 200 students and identified seven major stressors: academic workload, interpersonal relationships, financial issues, racial discrimination, family death, sexual relationships, and the learning environment. Academic, financial, and environmental pressures were the most common.
Whitman et al. [11] focused on graduate students, finding four main stress sources: (1) advisor influence, especially regarding thesis work; (2) personal relationships and social isolation; (3) academic pressure from balancing coursework and research; and (4) concerns about careers and finances. Graduate students were most affected by academic demands and future uncertainty, while undergraduates felt more stress from interpersonal issues and academic workload, with concerns shifting as they progressed.
Overall, young people face stress mainly from academics, relationships, and future anxiety. Many lack healthy coping strategies, leading to internalized stress or harmful behaviors, which can result in missed opportunities for timely support or even irreversible outcomes.

2.1.3. Stress Relief Methods

Not all stress is harmful. Most people experience some level of daily stress, which is normal and unavoidable. However, stress becomes a concern when it becomes excessive, negatively affecting behavior, relationships, and health [12]. Selye [13] classified stress into two types: eustress and distress. Eustress is positive stress that enhances performance, motivation, and well-being. It is seen as constructive and energizing. In contrast, distress is negative stress that impairs functioning and harms health.
Young people are particularly vulnerable to distress. The persistently high suicide rate in this group suggests that many struggle with ineffective coping strategies. Thus, managing stress effectively is a major concern in modern society. Jaremko and Meichenbaum [14] proposed three main stress relief methods: (1) verbal expression—sharing emotions with others or through self-talk without judgment; (2) static activities—such as listening to music, meditating, or using aromatherapy to calm the mind; and (3) dynamic activities—including exercise, which improves mood through dopamine and serotonin release.
Wen [15] identified five key strategies for reducing stress and maintaining health: (1) changing perspective—reframing situations or shifting focus; (2) practicing relaxation—calming the body and mind through breathing and quiet reflection; (3) exercise—releasing tension and promoting physical ease; (4) maintaining belief—following personal or spiritual practices and routines; and (5) social support—seeking emotional help from others to improve coping.
Stress is widely recognized as a universal challenge, especially among young people during their academic years, when academic and social pressures are common. In this study, the static activity approach proposed by Jaremko and Meichenbaum [14] is adopted, with changing perspective and practicing relaxation selected as the primary stress-relief methods, supported by an additional medium, music, discussed in the following section.

2.2. Music Therapy

2.2.1. Definition of Music Therapy

The music-assisted care explored in this study for stress relief is derived from the concept of music therapy—a practice widely recognized as a therapeutic tool with diverse applications, though one that remains difficult to define precisely [16]. According to the American Music Therapy Association [17], music therapy is grounded in the therapeutic power of music and is carried out by trained professionals to achieve specific treatment outcomes. It is regarded as a non-invasive, human-centered approach aimed at reducing psychological stress and preventing illness.
Music has long been regarded as a form of communication and self-expression [18], serving as a bridge for inner dialogue and contributing to both physical and mental well-being. However, it is also emphasized that music functions as a supportive method rather than a direct medical treatment [19]. Based on these perspectives, music-assisted care in this study is defined as the use of music as a non-medical, supportive tool for alleviating psychological stress.

2.2.2. Music Therapy and Therapeutic Application Models

Although numerous methods of music therapy have been developed, their complexity and diversity have led to a vast and intricate system. The evolution of music therapy has been shaped by various theoretical foundations, especially from medicine and psychology. Liu [20] and other psychotherapy scholars have noted that early music therapy theories were largely based on distinct psychological frameworks. Robarts [21] further observed that modern music therapy is heavily influenced by traditional psychotherapy, with many approaches stemming from established schools such as psychodynamic therapy, guided imagery with music, and gestalt therapy. In practice, familiar therapeutic orientations often serve as the foundation for music therapy and are adapted to meet individual needs. Two widely referenced orientations are psychodynamic and guided imagery approaches [22], detailed as follows.
(A)
Psychodynamic-oriented music therapy—This approach treats music as a therapeutic medium comparable to verbal psychotherapy. Rooted in Freudian concepts—such as the unconscious, the ego, and transference—it emphasizes emotional expression, personality transformation, and an active therapeutic relationship [23,24]. Sessions are typically held once or twice a week. This method remains influential and is widely used in clinical practice.
(B)
Guided imagery and music therapy (GIM)—Developed by Helen Bonny in the 1970s [25], GIM combines psychodynamic theory with music-guided introspection. In a relaxed state, clients listen to selected classical music, which evokes emotional and symbolic imagery to support self-awareness and growth. A typical GIM session includes four phases: (1) initial discussion and music selection, (2) induction into a relaxed state, (3) music listening with therapist guidance, and (4) post-session integration of insights into daily life.
From prior discussions, it is known that music therapy in psychotherapy is typically conducted through a gradual, non-verbal process to understand a client’s physical and emotional state. Since the researcher in this study does not hold a counseling license, the four-phase guided imagery and music (GIM) framework has been adopted for this study to carry out music-assisted care, with therapist dialogue replaced by pre-recorded guided prompts developed in consultation with experts. Given the close integration of music in daily life and the variability in personal preferences for relaxation music, appropriate music for use in the proposed interactive stress-relief system has been selected based on preliminary brainwave data analysis from 11 users and expert consultation, rather than individualized real-time personalization. The influence of musical elements on emotional responses and the methods for detecting emotional changes will be addressed in the following section.

2.3. Emotional Detection

2.3.1. Definition of Emotion

Emotion is a complex psychological experience shaped by internal stimuli, cognition, and environmental factors, involving cognitive, physiological, and behavioral responses. Drever [26] described it as a state marked by physical changes, intense feelings, and impulses to act. Plutchik [27] defined emotion as a set of patterned bodily responses, which he visualized in his Wheel of Emotions, where emotions that are related are placed next to each other, and opposing emotions are positioned across from one another. Dworetsky [28] emphasized emotion as a combination of conscious experience and physiological reaction that can either drive or inhibit behavior. Norman [29] highlighted its role in speeding up decision-making through rapid interpretation of situations. Strongman [30] argued that while emotions are innate, they develop through age, cognition, socialization, and learning. Thus, emotional experiences emerge from a dynamic interplay of physiological, psychological, cognitive, behavioral, and environmental elements.

2.3.2. Music and Emotion

Music, with its rhythms, melodies, and pitches, triggers diverse emotional responses depending on factors like performance style and environment. Slower tempos often have calming effects, while faster rhythms can increase arousal, such as elevated heart rate or excitement [31]. Melodies, easily remembered and reinforced through daily exposure (e.g., Beethoven’s Für Elise), can evoke emotions, though features like volume, pitch range, and tonal baseline vary across composers and influence perception.
While similar emotional responses may occur to the same music, individual preferences and experiences play a significant role. Zentner [32] suggested that music can evoke deep emotional and cognitive responses that are often not consciously accessed. Emotional reactions, while influenced by general trends, are shaped by personal associations and familiarity.
Schellenberg et al. [33] found that both pitch and rhythm affect emotional interpretation. Their study revealed that pitch alterations had a stronger emotional impact than rhythm changes, although tempo indirectly influenced pitch perception. Wide pitch variations and fast tempos elicited more energetic, joyful responses, while narrow pitch ranges and slow tempos were linked to calmness and tranquility.

2.3.3. Classification of Emotions

Extensive research has led to the development of emotion classification models, foundational for emotion recognition systems. These models categorize emotional states, aiding psychologists in analyzing and identifying common emotional responses.
One prominent model is Plutchik’s Wheel of Emotions [27] mentioned previously, which divides emotions into eight primary dimensions. Emotions close together are more related, while those opposite each other represent opposing states. Similar to a color wheel, emotions can be blended to form complex experiences.
Another widely used model is Thayer’s two-dimensional emotion model [34], shown in Figure 2, which classifies emotions along two axes: valence (pleasantness) and arousal (activation level). This model defines four quadrants: (1) pleasant/excited, (2) anxious/angry, (3) sad/depressed, and (4) calm/relaxed. The horizontal axis indicates pleasure or relaxation, while the vertical axis represents energy and engagement. In the context of music, these dimensions help map the emotional response to musical elements and performance techniques.
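As a minimal illustration (not part of the proposed system), the quadrant lookup implied by Thayer's model can be sketched as follows, assuming valence and arousal are each normalized to the range [-1, 1]:

```python
def thayer_quadrant(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) pair, each in [-1, 1], to one of the
    four quadrants of Thayer's two-dimensional emotion model."""
    if valence >= 0 and arousal >= 0:
        return "pleasant/excited"   # high valence, high arousal
    if valence < 0 and arousal >= 0:
        return "anxious/angry"      # low valence, high arousal
    if valence < 0 and arousal < 0:
        return "sad/depressed"      # low valence, low arousal
    return "calm/relaxed"           # high valence, low arousal
```

The sign-based split is the simplest possible reading of the model; real systems would use continuous positions within the plane rather than hard quadrant boundaries.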
However, Juslin [35] noted that emotions are often complex blends rather than distinct categories. Emotion models cannot fully capture the nuanced coexistence of emotional states. To detect these variations, large volumes of physiological data, such as brainwave patterns, are necessary. In this study, wearable EEG devices are used for real-time emotion sensing.

2.3.4. Affective Computing

With technological advancements, Picard [36] introduced affective computing—an interdisciplinary field combining psychology, medicine, neuroscience, and engineering areas such as brainwave analysis, machine learning, and signal processing.
Affective computing aims to quantify emotions and integrate emotional awareness into everyday technologies. According to Picard [37], the field encompasses four key areas:
(1)
Emotion recognition, which involves identifying emotions through physiological signals or external inputs using computational methods;
(2)
Emotion expression, where computers respond appropriately to recognized emotions;
(3)
Having emotions, which explores enabling machines to exhibit emotional behaviors influenced by stimuli; and
(4)
Emotional intelligence, where systems regulate and balance their own emotional states.
This study focuses on emotion recognition, using sensors to detect real-time emotional signals—such as physiological responses—for emotion sensing. These signals are analyzed to determine the subject’s emotional changes.

2.3.5. Measurement of Brainwaves and Emotions

Emotions are shaped by physiological, psychological, and environmental factors, extending beyond subjective experiences to include behavioral responses and external stimuli. Since emotional responses originate in the brain, brainwave data has become a valuable tool in emotion detection and serves as the basis for this study.
Traditional emotion measurement relies on psychological and physiological indicators, including facial expressions, body language, speech, heart rate, and blood pressure [38]. Soleymani [39] proposed a classification method using brainwaves, pupillary responses, and self-assessments on a two-dimensional emotion scale, showing higher accuracy for brainwave-based analysis over self-report methods. The ground truth for emotion classification was based on Thayer’s two-dimensional model and confirmed through preliminary observation of EEG values and participant self-reports. Wu et al. [40] integrated EEG with Thayer’s model to adjust musical parameters and evoke targeted emotional states. Similarly, Valenza et al. [41] employed the International Affective Picture System (IAPS) and physiological signals such as ECG (electrocardiogram), EDR (electrodermal response), and RSP (respiratory signal pattern) to improve emotion recognition accuracy.
EEG is a common method for measuring brain activity, where α waves correlate with relaxation [42], and β waves with attention and perception [43]. Sammler et al. [44] used music to induce brainwave responses, while Petrantonakis and Hadjileontiadis [45] applied multi-channel EEG and the IAPS to classify emotions including anger, joy, and fear, achieving improved recognition through advanced cross-analysis techniques. These studies highlight the strong correlation between brainwaves and emotional states as well as the potential for enhanced accuracy through multimodal analysis.
In this study, the MindWave Mobile device [46], manufactured by NeuroSky, Inc. (San Jose, CA, USA), is used as the EEG instrument. It is paired with the eSense algorithm [46] to quantify users’ attention and meditation levels in real time. Specifically, the eSense algorithm uses proprietary metrics based on α and β wave activity to calculate attention and meditation scores. These values are presented on a 1–100 scale and categorized into five levels, as shown in Table 1.
As the primary focus of this study is to evaluate whether the integration of a brainwave device with music-based stress-relief elements can effectively induce relaxation, Thayer’s two-dimensional emotion model [34] was selected as the foundation for emotion recognition. The attention and meditation values generated by the MindWave Mobile 2 were interpreted as corresponding to the model’s valence and arousal dimensions. Accordingly, participants’ emotional states were classified into four categories—relaxation, blankness, tension, and anxiety—based on the adapted model.
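A minimal sketch of such a classification is given below. Both the 50-point split of the 1–100 eSense scales and the quadrant-to-label assignment are illustrative assumptions for exposition only; the authors' actual thresholds follow Table 1 and the adapted model.

```python
def classify_emotion(attention: int, meditation: int,
                     threshold: int = 50) -> str:
    """Classify an eSense (attention, meditation) reading, each on a
    1-100 scale, into one of the four adapted emotional states.

    NOTE: the 50-point threshold and the quadrant-to-label assignment
    below are illustrative guesses, not the authors' published mapping.
    """
    relaxed = meditation >= threshold  # treated here as low arousal
    focused = attention >= threshold   # treated here as high engagement
    if relaxed and not focused:
        return "relaxation"
    if not relaxed and not focused:
        return "blankness"
    if not relaxed and focused:
        return "anxiety"
    return "tension"                   # simultaneously relaxed and focused
```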

2.3.6. Brainwaves and Their Applications in Daily Life

Several applications have been developed to integrate brainwave data into daily life, providing interactive feedback for various purposes, as reviewed below.
(1)
Neurocam [47]—a wearable system that uses brainwave input to trigger smartphone-based recording, capturing five-second GIF (graphics interchange format) memory clips when emotional arousal exceeds a threshold; it is designed for entertainment, combining EEG sensing with visual memory capture.
(2)
Eunoia [48]—an interactive installation that converts brainwave data into sound waves, which ripple water in a pool to visually reflect the user’s mental state, using EEG input to adjust audio parameters and promote emotional observation through a multisensory experience.
(3)
Meditation Interaction Device [49]—a wearable system produced by InteraXon Inc. (Toronto, ON, Canada) that monitors relaxation through brainwave signals and uses colored lights, nature sounds, and fragrance feedback to guide meditation, with the degree of lotus flower bloom visualizing the user’s relaxation level for emotional observation.
(4)
Mind-Reading Tree [50]—an EEG-based artwork that visualizes user relaxation through a growing virtual tree and animated natural scenes, using brainwave input to trigger changes in projected visuals and ambient effects to promote calmness in an entertainment setting.
(5)
Muse 2 [51]—a brainwave-sensing device manufactured by InteraXon Inc. (Toronto, Canada) that tracks relaxation, focus, heart rate, and breathing via EEG and other physiological signals, displaying real-time data in smartphone charts to support meditation, stress reduction, and self-observation.
Most of the reviewed cases rely on brainwave values related to relaxation. Except for Neurocam, which involves user movement, the other systems encourage stillness—such as sitting or lying down—to maintain emotional stability. These interactive installations provide visual or physical feedback and are typically used in meditative states, enabling users to reflect on their emotional changes. The interaction feedback and data sources of each system are summarized in Table 2.
In this study, a music-based stress-relief interactive system is proposed, with the primary objective of allowing users to experience a sense of stress relief after interaction. Emotional changes are intended to be communicated through visual feedback following the session. In addition to monitoring participants’ stress levels using a wearable EEG device, the study also aims to foster users’ subjective awareness of having achieved relaxation.
From the reviewed literature above, it is evident that while emotions are universal, they remain difficult to capture accurately. The effects of music vary by individual, though slow rhythms, narrow pitch ranges, and ambiguous melodies have been shown to promote relaxation and increase α wave activity. EEG technology has been widely applied to support emotional stability and meditation, mainly through static methods and natural elements like trees, water, and wind. In most cases, music is not embedded in the device but left to user selection. The next section will explore literature on music integration in interactive devices.

2.4. Interactive Technology

2.4.1. Human–Computer Interaction

With technological advancement, many fields now employ human–computer interaction (HCI) as a medium for communication. A model introduced by Kantowitz and Sorkin [52] as well as Buxton and Baecker [53] outlines three core elements: human, interface, and computer. Sensory input is processed by the brain and expressed through gestures, which operate the interface; the interface then communicates with the system, which returns feedback, forming a complete loop. Furthermore, cognitive processes, tasks, and behaviors are considered [54]. Rogers, Sharp, and Preece [55] highlighted usability as central to system success, defining attributes such as effectiveness, efficiency, safety, utility, learnability, and memorability.
In this study, HCI principles are applied to the design of a music-based stress-relief system. A wearable EEG device is integrated to provide real-time emotional feedback, with EEG data analyzed to evaluate stress-relief effects, offering potential support for psychologists and counselors.

2.4.2. Case Studies of Music Applied in Interactive Technology

Several applications that integrate interactive sound experiences into everyday life or artistic settings are reviewed below.
(1)
McTrax [56]—a music placemat developed by McDonald’s Nederland, using conductive ink and Bluetooth to let users compose and mix music directly on the mat via smartphone, promoting intuitive music-making and brand interaction.
(2)
Archifon [57]—an interactive church installation where laser-pointing at architectural elements triggers unique sounds and illuminations, turning the space into a collaborative, immersive musical instrument.
(3)
LINES [58]—a sound art exhibition featuring suspended and wall-mounted lines that trigger diverse sounds upon contact, enabling multiple users to create music simultaneously and explore new musical expressions.
(4)
Embracing X Surroundings [59]—an installation blending traditional and natural sounds in a bamboo structure, where visitors physically engage with ambient audio and filtered light to deepen their connection with nature.
(5)
Dream Sound [60]—an audiovisual work that synchronizes frame-by-frame animation with music, using viewer-triggered sensors to vary soundtracks and visuals, enhancing subconscious perception of musical flow.
The above cases of music-based interactive systems are comprehensively analyzed in Table 3, focusing on the interactive feedback and data sources of each work.

2.5. A Summary of Literature Survey and Derivation of Design Principles

One of the objectives of this study is to examine the effectiveness of emotion-sensing technology in a music-based stress-relief interactive system. Literature has shown that music can elicit various emotional responses, with relaxation commonly associated with α waves (8–14 Hz). Therefore, α wave music, which is characterized by slow tempos, narrow pitch ranges, and ambient tones associated with relaxed states, is used as the primary auditory stimulus to promote stress relief.
Previous studies and case analyses indicate that devices designed for relaxation typically involve minimal physical movement, as excessive motion may disrupt relaxation and cause brainwave fluctuations. To assess users’ subjective relaxation experiences, participants are asked to perform specific stationary actions—such as closing/opening their eyes and listening to music—when relaxation is perceived, followed by interviews and questionnaire surveys.
The design principles established from the literature for designing the desired system are as follows.
(1)
Emotion-sensing is implemented using NeuroSky’s portable MindWave Mobile EEG device, reducing physical burden and facilitating data collection.
(2)
The system process is structured according to the static activity model by Jaremko and Meichenbaum [14], emphasizing relaxation, music, and meditation to guide emotional calming and cognitive restructuring.
(3)
The four-phase model of guided imagery and music by Bonny [25] is applied, focusing on the induction (relaxation/focus) and music listening (guided imagery) phases.
(4)
Music selections include crystal, α wave, and classical music, characterized by slow tempos and subtle melodies, intended to relax both mind and body.
(5)
EEG data are segmented and analyzed, with emotional feedback visualized through colorful images representing seasons, weather, oceans, and similar themes to reflect participants’ emotional states.

3. Methods

3.1. Concepts and Selections of Research Methods

An integrated research approach combining music-assisted care, interactive technology, and emotion sensing is proposed to develop a music-based stress-relief interactive system, named Serenity Island, in this study. The catharsis-oriented approach to music therapy [61] is used to guide emotional processing within the interactive system, while emotional tracking and visualization are enabled by the EEG device and software packages to assess the system’s effectiveness.
The research began with a literature review on youth stress, music therapy, emotional modeling, EEG applications, and human–computer interaction, from which design principles were derived to inform the development of a prototype of “Serenity Island”. Visual scene engagement and immersive music interaction are incorporated into the system to induce relaxation.
The study was conducted in two stages. The first stage was a preliminary experiment, in which both users and experts interacted with a prototype of the proposed system, “Serenity Island”. Following this, interviews were conducted to gather feedback for refining the prototype. The second stage was a formal experiment, where participants experienced the system and provided subjective feedback through adapted questionnaires. Additionally, participants’ brainwave data (focusing on values of focus and meditation) were analyzed to assess emotional responses and identify appropriate music. This mixed-methods approach—combining qualitative interviews, quantitative surveys, and real-time emotional data to generate visual and auditory feedback—enabled a comprehensive evaluation of the system’s usability and its effectiveness in stress relief. It supports the use of music-assisted care and visual scene presentations integrated with EEG-based interactive technologies for promoting emotional relaxation.

3.2. System Development

The research goal in this study is to construct a music-based stress relief interactive system that uses emotion-sensing technology to detect users’ emotional states and provide real-time visual and auditory feedback to promote relaxation. The development of the system followed a four-step process: (1) needs analysis—stress in younger populations was identified, and the need for a device combining EEG technology and music-assisted care was established; (2) system design—a wearable EEG device and the Unity software (ver. 6000.0.0) were used, following Bonny’s four-phase guided imagery model [25]; (3) system implementation—emotional states were mapped using Russell’s two-dimensional emotion model [62], with corresponding visual and auditory feedback designed for each state; and (4) product evaluation—the system’s effectiveness and user experience were assessed through two user testing phases: a preliminary experiment involving 11 participants and a formal experiment involving 52 participants, along with feedback from three counseling psychology experts.

3.3. Interview Survey

Interview surveys involve the researcher engaging in face-to-face conversations with the respondents, asking pre-planned questions designed to collect relevant data in a guided manner that encourages in-depth dialogue. This method allows for the collection of rich, detailed information that may not be easily obtained through questionnaires. Through verbal responses, participants can provide nuanced insights closely related to the research topic. The information gathered is then recorded, organized, and analyzed, enabling deeper understanding within the study. The purpose of this method is to offer respondents greater flexibility in their answers, allowing researchers to explore participants’ emotions, attitudes, and value judgments, while also encouraging the expression of detailed personal opinions [63].
In this study, the semi-structured interview approach [64] was adopted. An interview outline was prepared in advance, though no fixed question sequence was followed. Both the interviewer and the respondent were free to express their thoughts and feelings, allowing for more flexibility than in standardized interviews. A main theme and key areas of inquiry were defined, and individual interviews with users and experts were conducted, focusing primarily on participants’ feelings while ensuring a safe and supportive environment.

3.4. Questionnaire Survey

The questionnaire survey method adopted in this study is a research approach used to collect opinions and data from a target group through structured questionnaires. This method enables the rapid and direct collection of large volumes of standardized data using quantitative or rating scales to ensure consistency [65]. With subsequent statistical analysis, relationships or influences among various factors can be identified. Based on these analytical results, the effectiveness of the system developed in this study was evaluated. The implementation steps of the questionnaire survey in this study include: (1) timing—participants were asked to complete the questionnaire, which took 5 to 10 min, after experiencing the proposed system; (2) participants—questionnaires were distributed anonymously to 54 invited young participants who experienced the system, and a total of 52 valid responses were collected; (3) procedure—after the questionnaire items were explained, the participants were asked to complete the questionnaire.

4. System Design

4.1. Design Ideas of Proposed System

The proposed system, “Serenity Island”, is centered on the concept of a personal inner island. Participants are guided through sound and visuals, immersing themselves in a narrative experience. A wearable EEG device is integrated into the music-assisted relaxation process, allowing users’ brainwaves to influence changes in the audiovisual environment. Programmed parameters dynamically modify visual elements such as seasons, weather, and the ocean, creating endlessly varied effects that transport participants to a dreamlike island beyond reality.
The immersive experience is further enhanced through environmental design: a minimalist-style carpet, cozy lounge chairs, and slow, calming music together create a space of comfort and security. This carefully crafted atmosphere invites participants into a deeply immersive journey that is both surprising and soothing. Such a creative music-based stress-relief system is intended to leave participants enchanted and reluctant to leave.

4.2. Architecture of the Proposed System

As illustrated in Figure 3, the software for the proposed system, “Serenity Island”, was developed using Unity (ver. 6000.0.0), with brainwave data collected via the MindWave Mobile 2—a wearable EEG device by NeuroSky [46], as previously mentioned. While users sit and view the screen, their brainwaves are processed in real time by Unity to determine emotional states, which then trigger corresponding visual scenes displayed on a large screen.
The 3D visual scenes were designed using MAYA and textured in Substance Painter before being imported into Unity through plug-in integration. Emotional sensing data, processed via custom programming, dynamically adjusted visual elements such as lighting, color, and surface details in real time. This responsive design, combined with a stress-relief process, provided an immersive and emotionally adaptive audiovisual experience.

4.3. Interactive Experience Flow and Interface Design of the Work

The system “Serenity Island” is primarily designed to provide users with an experience in a comfortable and relaxing environment. The theme begins with the concept of “a solitary island in the sea”, using the installation to present changes in weather and seasons on each individual’s imagined island. Through this process, participants become more aware of their emotional states and discover the music that best helps them relax.
Bonny’s four-phase guided imagery and music (GIM) process [25] was adapted into four phases of “Serenity Island”—growing, stress, music listening, and viewing—to guide users into a relaxed emotional state in the interactive experience of the system, as described in the following.
(1)
Initial Setup
(i)
User seated—The user is guided by the system to be seated and to wear a brainwave headset, and the experimental procedure is explained.
(ii)
Login—A login screen is displayed, prompting a researcher to start the system.
(iii)
Device connection—A connection is established between the system and the EEG device.
(iv)
Standby screen—The island in the visual display appears barren, with no tree present, and an audio prompt instructs the user to focus intently on the island.
(2)
Growing Phase (Pre-Session Conversation)—
(i)
Tree growth initiated—Tree growth on the island is initiated when the user’s real-time EEG attention value surpasses a designated threshold.
(ii)
Tree growth accelerated—Higher attention levels accelerate tree growth, while increased meditation levels enhance the density of the island’s vegetation.
(3)
Stress Phase
(i)
Increasing pressure—After tree growth finishes, the ocean wave parameters are increased to create intense waves, heightening the sense of psychological pressure.
(ii)
Screen darkened—The system gradually darkens the scene to further create an immersive atmosphere of pressure.
(4)
Music Listening Phase
(i)
Immerse in music—The user is prompted by the system to relax their body, close their eyes, listen to the music, and fully immerse themselves in it for 2.5 min. The user is then prompted to open their eyes afterwards.
(ii)
Visual scene changes—Various visual scenes are displayed on the screen to guide the user through different emotional states, including: (a) four seasons; (b) varying plant sizes and leaf densities; (c) varying wind strength; (d) varying rain quantity.
(5)
Viewing Phase (Post-Session Integration)—
(i)
Emotional calming—The user continues music listening for an additional 2.5 min to settle their emotions.
(ii)
Emotion analysis—The user’s brainwaves are read to yield an emotion type with its visual display presented: (a) anxiety; (b) tension; (c) blankness; (d) relaxation. Each user was presented with a single emotional visualization during the viewing phase, customized based on their averaged EEG data during the music listening phase.
(6)
Session End
(i)
Showing emotion diagram—When the music ends, an emotion diagram, including the user’s “relaxation level” and the corresponding “emotional state”, is generated based on all the emotion data and presented on the screen for the user to view.
(ii)
Screen darkened—When the time is up, a notification indicating the end of detection is displayed, the screen gradually darkens, and the data are saved.
(iii)
Detection ended—A message appears, instructing the participant to remove the brainwave headset.
A flowchart of this process is shown in Figure 4, and Table 4 provides further details. EEG data were collected in real time during the music listening phase (Phase 4 in the table), but the emotion classification was based on averaged EEG values calculated over a 100-s interval within that phase. This emotional classification was then used to determine which of the four predefined seasonal scenes (spring, summer, autumn, or winter) was presented to the participant in the viewing phase (Phase 5). The visual content was not dynamically adjusted in real time but was selected once based on the user’s classified emotional quadrant.

4.4. Creation of Visual Scenes for Emotion Representation

The creation of the interactive visual scenes for emotion representation, called visual representation, on the screen of “Serenity Island” includes three major components—the emotion-sensing module, processing module, and feedback display module—which function as follows.
(1)
Emotion-sensing module—While seated comfortably, the user wears an EEG headset that captures brainwave signals and transmits them via a Bluetooth device to a computer.
(2)
Processing module—EEG data are analyzed in real time to identify the user’s emotional state and generate the corresponding visual and auditory feedback.
(3)
Feedback display module—Based on the analysis, the system presents dynamic visualizations on screen that reflect the user’s emotional changes.
The feedback display module consists of three independently functioning visualization components: ocean, mist, and island. Each includes dynamic elements and customizable parameters, allowing researchers to generate varied visual effects. These adjustable features are detailed in the subsequent sections.

4.4.1. Ocean Visualization and Parameter Control

The ocean visualization component rendered an animated mesh in real time to simulate wave motion, with adjustable parameters including: (a) wave height—controlled by a ‘wave scale’ (1–12), where higher values produce taller waves; (b) wave speed—adjusted to modify wave motion and frequency, with higher values resulting in faster waves; (c) sea—categorized into ‘above surface’ and ‘underwater’ views, managed by two separate cameras that switch from Camera A to Camera B when submerged, with an adjustable underwater filter tone. Table 5 outlines how each ocean parameter affects the visual output.

4.4.2. Mist Visualization and Parameter Control

The mist visualization component, developed with Unity and Amplify Shader Editor, enables intuitive adjustments for mist intensity, coverage, and color on various generated visual scenes. Two mist layers, in front and behind the camera, create atmospheric depth. The main adjustable parameters are: (a) mist intensity—ranges from 0 (thinnest) to 1 (thickest); (b) mist coverage—adjusted by four parameters: intensity, height, density gradient, and obscureness; and (c) mist color—controlled using three color palettes for easy modification. Table 6 shows the relationship between system parameters and visual outputs.

4.4.3. Island Visualization and Parameter Control

The island visualization component features five adjustable components—season, vegetation size, leaf density, wind, and rain—each independently controlled to produce diverse visual effects: (a) season—set by a float value from 0 to 4, corresponding to winter, spring, summer, and autumn; (b) vegetation size and leaf density—adjusted via a shared panel, with vegetation ranging from barren to full growth and leaf density from sparse to lush; (c) wind—simulated by directional animation of leaves, with strength adjustable from 0 (calm) to 1 (strong); (d) rain—created using Unity’s particle system, with intensity controlled from 0 (none) to 1 (heavy rain). Table 7 illustrates the relationship between system parameters and visual outputs.
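The parameter surface described above can be summarized in a brief sketch. The actual system runs in Unity; the Python below is only an illustration of the documented ranges, and all function and key names are assumptions rather than the authors’ code.

```python
# Illustrative sketch of the island visualization's control parameters,
# assuming the ranges described in the text (not the actual Unity code).

SEASONS = ["winter", "spring", "summer", "autumn"]  # season float 0-4

def island_params(season, vegetation, leaf_density, wind, rain):
    """Clamp each control to its documented range and resolve the season.

    season: float in [0, 4); the other controls are floats in [0, 1],
    where 0 means none/calm and 1 means full/strong.
    """
    clamp01 = lambda v: max(0.0, min(1.0, v))
    return {
        "season": SEASONS[int(season) % 4],
        "vegetation": clamp01(vegetation),
        "leaf_density": clamp01(leaf_density),
        "wind": clamp01(wind),
        "rain": clamp01(rain),
    }
```

For example, `island_params(1.5, 0.8, 1.2, -0.1, 0.5)` would yield a spring scene with out-of-range leaf density and wind values clamped to 1.0 and 0.0, respectively.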

4.4.4. Interactive Feedback and Triggers

The proposed system “Serenity Island” provides interactive feedback based on users’ real-time brainwave activity; EEG signals, captured via the NeuroSky MindWave Mobile 2 headset, are analyzed using the “eSense” algorithm to assess attention and meditation levels. Data are sent to Unity once per second and categorized into four emotional states, each linked to specific animations and color themes displayed on the large screen.
Two interactive phases guide the experience: (a) the growing phase (stress phase)—the virtual plants at the island’s center begin to grow after an audio guide, with tree growth driven by the attention levels, and the leaf and grass growth determined by the meditation levels, progressing linearly until the maximum size is reached; (b) the music listening phase (relaxation phase)—participants close their eyes to listen to music while the system records EEG data over a 100-s span starting at the 20-s mark, calculates average attention and meditation levels, and uses the results to determine a quadrant of an emotional coordinate system, triggering a corresponding graphic display when participants reopen their eyes. Figure 5 presents the emotional sensing flowchart.
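The timing logic of the music listening phase (1 Hz samples, a 100-s analysis window starting at the 20-s mark) can be sketched as follows. This is a minimal illustration under the assumptions stated in the comments; the authors’ implementation runs inside Unity and is not shown in the paper.

```python
# Sketch: extract the 100-s analysis window (seconds 20-119) from a
# 1 Hz stream of eSense readings and average attention and meditation.
# The data layout (one (attention, meditation) tuple per second) is an
# illustrative assumption.

def window_averages(samples, start_s=20, length_s=100):
    """Return (avg_attention, avg_meditation) over the analysis window."""
    window = samples[start_s:start_s + length_s]
    if not window:
        raise ValueError("recording shorter than the analysis window")
    avg_att = sum(a for a, _ in window) / len(window)
    avg_med = sum(m for _, m in window) / len(window)
    return avg_att, avg_med
```

A 150-s recording (the 2.5-min listening phase) thus contributes only its middle 100 s to the averages that determine the emotional quadrant.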
The emotional coordinate system mentioned above is based on the two-dimensional emotion model proposed by Russell [62], as mentioned previously. In this model, the horizontal axis represents valence (pleasure) and the vertical axis represents arousal (activation), categorizing emotions into four quadrants. For this study, the model was adapted by mapping the valence axis to the system’s meditation value and the arousal axis to the attention value. On the meditation dimension, higher values toward the right indicate deeper relaxation; on the attention dimension, higher values upward indicate stronger concentration. This adapted model is illustrated in Figure 6, in which the centered scale (−50 to +50) used on the two axes is a normalized version of the 0–100 eSense values in Table 1, and the four emotion categories in the four quadrants (relaxation, blankness, tension, and anxiety) are defined as follows:
  • Relaxation: in the range with meditation = (0, 50) and attention = (0, 50);
  • Blankness: in the range with meditation = (0, 50) and attention = (−50, 0);
  • Tension: in the range with meditation = (−50, 0) and attention = (0, 50); and
  • Anxiety: in the range with meditation = (−50, 0) and attention = (−50, 0).
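Under the normalization and quadrant definitions above, classification reduces to two sign tests on the centered values. The sketch below is illustrative (the actual logic runs in Unity); treating the midpoint 50 as the quadrant boundary is an assumption, since the paper does not specify boundary handling.

```python
def classify_emotion(meditation, attention):
    """Map 0-100 eSense values to a quadrant of the adapted Russell model.

    Values are first centered to the -50..+50 axis scale; 50 is treated
    as the neutral midpoint (an assumption for the exact boundary case).
    """
    med = meditation - 50   # valence axis: relaxation depth
    att = attention - 50    # arousal axis: concentration strength
    if med >= 0:
        return "relaxation" if att >= 0 else "blankness"
    return "tension" if att >= 0 else "anxiety"
```

For instance, averaged eSense values of (meditation 70, attention 30) fall in the blankness quadrant, while (30, 30) fall in the anxiety quadrant.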
The four quadrants of the emotional coordinate system thus correspond to the four emotional states: anxiety, tension, blankness, and relaxation. Each state was paired with corresponding visual effects and ambient soundscapes. Visual elements such as season, wind, rain, wave patterns (of the ocean), and overall color tone (of the mist) were adjusted accordingly, while the ambient music was modulated based on the related seasonal and weather settings. The resulting emotional state mapping is shown in Table 8.
During interaction, visual effects and ambient music provide feedback, forming an imagined island that mirrors the participant’s inner state. This enhances emotional awareness and helps identify suitable music. In the viewing phase, participants explore emotional feedback by modulating their brainwaves. Table 9 presents the corresponding visualizations of the four emotional states of anxiety, tension, blankness, and relaxation.

5. System Experience and Data Analysis

5.1. Preliminary Experiment

Based on the literature and system development, the “Serenity Island” prototype was completed and evaluated through a preliminary experiment comprising four parts: (1) user interaction and feedback interviews, (2) identification of optimal relaxation music, (3) expert engagement and post-use interviews, and (4) prototype refinement based on feedback. The details are described in the following.

5.1.1. User Interaction and Feedback Interviews

To evaluate the system’s effectiveness and gather user feedback, a field test was conducted involving direct interaction with the prototype.
(1)
Objective—Users tested four music tracks with relaxing elements while wearing an EEG headset. Brainwave data (average meditation values) were collected, and semi-structured interviews assessed whether visuals, music, and feedback effectively conveyed relaxation and aligned with emotional states.
(2)
Participants—11 persons aged 18–24.
(3)
Equipment—NeuroSky’s EEG headset and the prototype system “Serenity Island”.
(4)
Procedure—After setup and instructions, participants experienced four 2-min music sessions while observing the system in operation (with EEG monitoring and system resets between each) and then completed a semi-structured interview.
(5)
Interview process—Sessions lasting 10 to 15 min focused on two aspects, “theme and stress relief” and “visuals and emotions”, recorded via audio and written notes. The interview questions are listed in Table 10.
(6)
Findings from the user interviews—
(a)
About theme and stress relief—Most participants reported using music, rest, games, or exercise to relax and responded positively to natural sounds like water and birds. The prototype was seen as calming, though suggestions included adding more nature visuals and improving immersion. The guided narration received mixed feedback, with calls for a softer tone and clearer, better-paced audio.
(b)
About visuals and emotions—Participants generally noticed visual changes (e.g., tree growth) linked to emotional states, though some found them too subtle. The color scheme was viewed as calming, though bright or saturated elements were discouraged. For anxiety, darker tones and intense weather were preferred; for relaxation, visuals were mostly effective with minor suggestions to enhance immersion.

5.1.2. Identification of Optimal Relaxation Music

In the preliminary experiment, 11 participants each completed four rounds of the prototype experience, with the relaxation music being the only variable. Brainwave data were recorded every second, with irregular spikes or drops excluded based on duration and intensity. Participants’ attention and meditation values for each round were then averaged. Table 11 lists the music tracks by code name, while Table 12 and Table 13 present the corresponding attention and meditation data from the 44 sessions. Each entry includes averaged values for the growing phase (“Tree_A”) and music listening phase (“Music_A”), with participants labeled A through K and trial rounds indicated by “T”.
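The per-second cleaning step (“irregular spikes or drops excluded based on duration and intensity”) can be approximated with a simple threshold filter before averaging. The paper does not state the exact criterion, so the jump threshold below is a labeled assumption, and the code is only a sketch of the analysis, not the authors’ procedure.

```python
def remove_spikes(values, max_jump=30):
    """Drop samples that jump more than max_jump from the last kept value.

    values: per-second eSense readings (0-100). The max_jump threshold is
    an illustrative assumption; the study filtered spikes by duration and
    intensity without specifying numeric criteria.
    """
    if not values:
        return []
    kept = [values[0]]
    for v in values[1:]:
        if abs(v - kept[-1]) <= max_jump:
            kept.append(v)
    return kept

def round_average(values):
    """Average one round's readings after spike removal."""
    cleaned = remove_spikes(values)
    return sum(cleaned) / len(cleaned)
```

For example, in the sequence 50, 52, 95, 51 the reading of 95 would be discarded as a spike, and the round average would be computed over the remaining three samples.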
The data reveal several key points: (1) except for T3 (48.12), all music tracks (T1–T4) achieved average meditation values above 53 during the music listening phase, placing them in the “relaxation” or “blankness” zones of the emotional map; (2) the meditation values increased across all tracks during the music listening phase compared to the growing phase, indicating enhanced relaxation; and (3) T1 recorded the highest average meditation score at 60.09. During the growing phase, T1 and T3 showed average attention values above 51, indicating sustained focus. Although attention slightly declined during the music listening phase, the values for T1 and T3 remained above 47, reflecting continued engagement.
Overall, both T1 and T3 aligned with the study’s goals, but T1’s superior meditation score makes it the optimal soundtrack for use in the final version of the proposed system “Serenity Island”.

5.1.3. Expert Engagement and Post-Use Interviews

This part of the preliminary experiment involved three steps, as outlined below.
(1)
Invited experts—Three psychological counseling experts, specializing in psychotherapy, art therapy, and student counseling, were selected for this study. These professionals primarily work with university students and possess extensive experience in psychological support and therapy (see Table 14).
(2)
Before-interview activity—The experts participated in the same interactive experience with the prototype system as the 11 participants mentioned above.
(3)
Interview process—Each expert was interviewed for approximately 40 min using a semi-structured, one-on-one format. The interviews focused mainly on the two aspects “feasibility of the project concept” and “elements and important details of the work”, with a particular emphasis on integrating music-assisted care and wearable EEG technology. Sessions were audio-recorded and supplemented with notes. A list of interview questions is provided in Table 15.
(4)
Findings from expert interviews: (a) the system “Serenity Island” was commended for its innovation and entertainment value, though some counseling-based elements require refinement; (b) EEG-driven visual scenes were effective in helping users recognize their emotions; (c) all four music tracks were considered suitable for relaxation and were used to guide the final selection process; (d) the narration should adopt a gentler tone, and tree growth content should avoid evoking stressful memories; (e) emotion visuals were well received except for the “blankness” scene, which needs minor visual and color adjustments; (f) the atmosphere could be enhanced with soft yellow lighting and comforting elements like sofas and plush toys to foster a sense of safety.

5.1.4. Prototype Refinement Based on Feedback

Interview feedback from users and experts revealed two key areas for improvement: the system flow and emotional visuals within Serenity Island, and the physical environment, which the experts suggested adjusting to better support relaxation and atmosphere. More details are as follows.
(1)
System implementation improvements—
(i)
The guided narration was re-recorded with a softer tone.
(ii)
The growing phase narration was revised to: “Please recall a recent troubling matter that has left you indecisive—Serenity Island will transform this into growth energy”.
(iii)
The growth curve driven by attention and meditation values was adjusted.
(iv)
The transition animation between the growing and music listening phases was refined for smoother flow.
(v)
The four emotional scenes were revised, as detailed in Table 16.
(2)
Surrounding environment improvements (as shown in Table 17)—
(i)
A soft yellow night light was added to the experiment site.
(ii)
A sofa chair and carpet were added to the space.
(iii)
Cotton fabric was used around the area to cover and block nearby equipment.
(iv)
The screen was replaced with a 50-inch monitor.

5.2. Formal Experiment

With the refined prototype system and environment, a formal experiment was conducted, following the same structure as the preliminary test and consisting of five parts: (1) field tests with invited users, (2) analysis of users’ emotional states, (3) user interviews, (4) questionnaire-survey data collection, and (5) questionnaire-survey data analysis. These activities allowed for a comprehensive assessment of user experience and emotional responses through EEG data, interviews, and questionnaires. The first three parts are described in this section; the fourth and fifth parts, namely, the questionnaire-survey data collection and analysis, are presented in the next section.

5.2.1. Field Tests with Invited Users

To assess the effectiveness of the refined system “Serenity Island” in guiding participants toward relaxation, a formal experiment was conducted with the following structure.
(1)
Objective—To evaluate the stress-relief effectiveness of the improved system, the focus of the experiment was placed on its ability to facilitate a relaxed emotional state.
(2)
Participants—A total of 52 persons were invited to participate. Among the 52 participants, 29 identified as female and 23 as male. All participants were university students aged 18–24, recruited via campus announcements.
(3)
Procedure—The procedure, using the most effective relaxation music selected earlier, was conducted once and divided into three stages, with the corresponding atmospheres illustrated in Figure 7.
(a)
Stage I: growing and stress phases—Participants viewed a visual scene of a barren island on the screen while recalling personal worries. The EEG tracked their attention and meditation values, which influenced the growth of a virtual tree.
(b)
Stage II: music listening and viewing phases—Participants listened to the relaxing music while dynamic visuals, updated based on the EEG input, reflected their emotional states.
(c)
Stage III: session end—After music playback, the participants received visual feedback on the screen with a relaxation score and an emotional coordinate map derived from their EEG data.

5.2.2. Analysis of Users’ Emotional States

In the formal experiment, 52 participants experienced a four-phase guided process—growing, stress, music listening, and viewing—adapted from Bonny’s framework [25]. EEG data were recorded every second, with irregular spikes or drops removed during processing. Average attention and meditation values were calculated for each phase and plotted on an emotional coordinate system, as shown in Figure 8.
In the growing phase (yielding purple dots), participants were guided to reflect on negative thoughts. Attention values varied, with about half scoring between 60 and 75, and the remainder below 60. Meditation values mostly ranged from 30 to 40, suggesting that the opening guidance had a limited calming effect. During the stress phase (yielding green dots), values became more dispersed but largely concentrated in the anxious zone, likely due to residual emotional tension and the use of darker visual scenes on the screen.
In the music listening phase (yielding blue dots), meditation values increased significantly, often exceeding 50, reflecting a shift toward relaxed or blank mental states. Attention values declined, possibly due to the passive nature of closed-eye listening. In the viewing phase (yielding red dots), data became more scattered but mostly stayed within the relaxed zone, indicating varied yet generally calm reactions to the emotion-reflective visuals.
In summary, the four-phase guided experience on the proposed system “Serenity Island” effectively led the participants from anxious and tense states to relaxed and meditative ones, indicating that most achieved a state of psychological relaxation during the process.
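The per-phase EEG processing described above (removing irregular spikes or drops, averaging the per-second values of each phase, and locating the result on the two-dimensional emotion model) can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the z-score cleaning rule, the 50-point axis midpoint, and the quadrant labels are all assumptions.

```python
import numpy as np

def clean_series(values, z_thresh=3.0):
    """Remove irregular spikes/drops: discard samples more than z_thresh
    standard deviations from the series mean (an assumed cleaning rule)."""
    v = np.asarray(values, dtype=float)
    z = np.abs(v - v.mean()) / (v.std() + 1e-9)
    return v[z < z_thresh]

def phase_point(attention, meditation):
    """Average the per-second eSense values of one phase into a single
    (attention, meditation) point on the emotional coordinate system."""
    return clean_series(attention).mean(), clean_series(meditation).mean()

def quadrant(att, med, mid=50.0):
    """Label the quadrant of the two-dimensional emotion model; the
    midpoint of 50 on the 0-100 eSense range and the labels are assumptions."""
    if med >= mid:
        return "relaxed" if att < mid else "focused-calm"
    return "anxious" if att >= mid else "blank"
```

Under this sketch, a stress-phase point such as (attention 65, meditation 30) falls in the anxious zone, while a music-listening point such as (attention 30, meditation 70) falls in the relaxed zone, matching the dispersion patterns reported above.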

5.2.3. User Interviews

During the formal experiment, 52 participants engaged in individual, semi-structured interviews (10–15 min each) to provide feedback on the Serenity Island system. Discussions focused on the design concept, visual perception, and overall experience to evaluate alignment with intended goals and emotional effectiveness. Key findings include the following.
(1)
Ease of use and engagement—The system was seen as intuitive and engaging.
(2)
Emotional awareness and feedback—EEG-driven emotional visualization enhanced users’ self-awareness, and the relaxation feedback screen was clear.
(3)
Audio effectiveness—Guided prompts and music supported immersion, though improvements in audio recording quality were suggested.
(4)
Visual and environmental design—Lighting, visuals, and soft furnishings helped induce calmness and matched users’ emotional states.
(5)
System performance—Occasional frame drops occurred during EEG headset connection.
(6)
Session duration—Participants generally desired a longer experience.
These insights offer direction for refining the system’s interaction, aesthetics, and technical performance.

5.3. Questionnaire Survey

To further evaluate participants’ experiences with the system “Serenity Island”, a questionnaire was designed and administered to collect quantitative feedback following the formal experiment. The collected data were then analyzed with the statistical packages SPSS 26 and AMOS 23 to verify the effectiveness of the proposed system for stress relief. The details of these operations are presented in this section.

5.3.1. Questionnaire Design and Data Collection

A total of 52 questionnaires were collected from participants in the formal experiment. The questionnaire consisted of two parts, with the first focusing on the scale of user interaction experience, adapted from the QUIS scale by Chin, Diehl, and Norman [66] and tailored to this study. It included 16 items (T1–T16) rated on a five-point Likert scale [66] from “strongly disagree” (one point) to “strongly agree” (five points), as detailed in Table 18, with statistical results shown in Table 19.
In addition, to assess the degree of relaxation achieved by each participant after using the system, the second part of the questionnaire measured relaxation level with a scale designed based on the third edition of the Smith Relaxation States Inventory (SRSI-3) proposed by Smith [67], which includes nine items, labeled S1 through S9, as shown in Table 20. The corresponding statistical data are shown in Table 21.

5.3.2. Analysis of Questionnaire Survey Data—Introduction

The questionnaire data of the 52 participants, as described in Table 18, Table 19, Table 20 and Table 21, were analyzed statistically using SPSS 26 and AMOS 23 to verify the effectiveness of the proposed system for stress relief. This work was carried out through the following stages and steps.
(1)
Stage I: analysis of the reliability and validity of the questionnaire dataset—
(a)
Step 1: verifying the adequacy of the questionnaire dataset;
(b)
Step 2: finding the latent dimensions of the questions from the collected data;
(c)
Step 3: verifying the reliability of the collected questionnaire dataset;
(d)
Step 4: verifying the applicability of the structural model established with the dimensions;
(e)
Step 5: verification of the validity of the collected questionnaire data.
(2)
Stage II: analysis of the meanings of the latent dimensions of the dataset—
(a)
Step 1: deriving the meanings of the dataset of each latent dimension of the first scale;
(b)
Step 2: deriving the meanings of the dataset of each latent dimension of the second scale.
The above two stages of analysis are described in the following two sections.

5.3.3. Analysis of the Reliability and Validity of the Questionnaire Dataset

To evaluate the “user interaction experience” and “relaxation level” of the proposed system for stress relief using the data described in Table 18, Table 19, Table 20 and Table 21, the five steps outlined previously were conducted, as detailed in the following.
  • (A) Verification of the Adequacy of the Questionnaire Dataset
To verify the adequacy of the collected questionnaire data, the Kaiser–Meyer–Olkin (KMO) test and Bartlett’s test of sphericity were used [68]. A KMO value above 0.50 and a Bartlett’s test significance below 0.05 indicate that the data are suitable for further structural analysis. Using the data from Table 19 and Table 21 in SPSS, the KMO and Bartlett’s test results were calculated as shown in Table 22. The KMO values exceeded 0.50, and Bartlett’s significance was below 0.05, confirming the data adequacy for further analysis.
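Both adequacy checks follow standard closed-form definitions and can be reproduced outside SPSS. The following is a minimal sketch (the function name and the participants-by-items data layout are assumptions): KMO is computed from the anti-image (partial) correlations, and Bartlett's statistic from the determinant of the correlation matrix.

```python
import numpy as np
from scipy.stats import chi2

def kmo_bartlett(X):
    """KMO measure and Bartlett's test of sphericity for a
    (participants x items) data matrix, using the standard formulas."""
    n, p = X.shape
    R = np.corrcoef(X, rowvar=False)
    inv_R = np.linalg.inv(R)
    # Anti-image (partial) correlations from the inverse correlation matrix
    d = np.sqrt(np.diag(inv_R))
    P = -inv_R / np.outer(d, d)
    off = ~np.eye(p, dtype=bool)
    kmo = np.sum(R[off] ** 2) / (np.sum(R[off] ** 2) + np.sum(P[off] ** 2))
    # Bartlett's test: chi2 = -(n - 1 - (2p + 5)/6) * ln|R|, df = p(p-1)/2
    stat = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    df = p * (p - 1) / 2
    p_value = chi2.sf(stat, df)
    return kmo, stat, p_value
```

For a sample of 52 respondents whose items share a common factor, this sketch yields a KMO above 0.50 and a Bartlett significance below 0.05, the thresholds applied in Table 22.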
  • (B) Finding the Latent Dimensions of the Questions from the Collected Data
The structural analysis for the questionnaire survey aims to categorize the questions of each scale into meaningful subsets, each representing a latent dimension. For this purpose, exploratory factor analysis (EFA) using principal component analysis and the varimax method was conducted using SPSS. The results, based on the inputs from Table 19 and Table 21, are presented in Table 23 and Table 24 for the two scales: user interaction experience and relaxation level, respectively.
For the first scale of user interaction experience, the variables T1 through T16, representing the 16 questions asked about this scale, are divided into three groups—RA1 = (T13, T8, T4, T15, T5, T12, T1), RA2 = (T2, T3, T6, T14, T16), and RA3 = (T7, T11, T10, T9)—aligning with three latent dimensions termed in this study as “emotional experience”, “device experience”, and “interface design and perception”, respectively.
Likewise, for the second scale of relaxation level, variables S1 through S9, representing the nine questions, are divided into two groups: RB1 = (S4, S6, S2, S5, S9) and RB2 = (S3, S7, S1, S8), corresponding to two latent dimensions termed “physiological feedback” and “emotional feedback”, respectively. These results are comprehensively summarized in Table 25. Furthermore, the texts of the original questions for each latent dimension are shown in Table 26 and Table 27.
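The extraction step just described (principal-component extraction followed by varimax rotation, with each item assigned to the factor on which it loads most heavily in absolute value) can be sketched as follows. This is an illustrative re-implementation of the standard algorithm, not the SPSS procedure itself; the function names are assumptions.

```python
import numpy as np

def varimax(L, max_iter=100, tol=1e-6):
    """Varimax rotation of a loading matrix (standard SVD-based algorithm)."""
    p, k = L.shape
    R = np.eye(k)
    d = 0.0
    for _ in range(max_iter):
        LR = L @ R
        u, s, vt = np.linalg.svd(
            L.T @ (LR ** 3 - LR @ np.diag((LR ** 2).sum(axis=0)) / p))
        R = u @ vt
        d_new = s.sum()
        if d_new < d * (1 + tol):
            break
        d = d_new
    return L @ R

def efa_groups(X, n_factors):
    """Principal-component extraction plus varimax rotation; each item is
    assigned to the factor with the largest absolute rotated loading."""
    R = np.corrcoef(X, rowvar=False)
    vals, vecs = np.linalg.eigh(R)
    order = np.argsort(vals)[::-1][:n_factors]
    L = vecs[:, order] * np.sqrt(vals[order])
    return np.argmax(np.abs(varimax(L)), axis=1)
```

Applied to data with a clear two-factor structure, this grouping recovers the item subsets, which is how groupings such as RB1 and RB2 emerge from the loading tables.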
  • (C) Verifying the Reliability of the Collected Questionnaire Data
Reliability refers to the consistency of data across repeated measures [69] and was assessed in this study using Cronbach’s α coefficient [70]. A value above 0.35 indicates acceptable reliability, while a value above 0.70 suggests high reliability [71].
Based on data from Table 19 and Table 21, Cronbach’s α coefficients for the two scales and the five latent dimensions are summarized in Table 28. All values exceed 0.70 except two (0.649 and 0.668), which remain above the acceptable threshold of 0.35. These results indicate the data are sufficiently reliable for further analysis.
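Cronbach's α has a simple closed form, shown below as a minimal sketch over a participants-by-items score matrix (the function name is an assumption): α = k/(k−1) · (1 − Σ item variances / variance of the total score).

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a (participants x items) Likert score matrix."""
    X = np.asarray(items, dtype=float)
    k = X.shape[1]
    item_var = X.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = X.sum(axis=1).var(ddof=1)    # variance of each person's total
    return k / (k - 1) * (1 - item_var / total_var)
```

Perfectly consistent items yield α = 1, and values such as the 0.649 and 0.668 reported in Table 28 indicate moderate but acceptable internal consistency under the thresholds cited above.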
  • (D) Verification of Applicability of the Structural Model Established with the Dimensions
Before validating the questionnaire data, the appropriateness of the structural model based on the latent dimensions must first be verified. This was achieved using confirmatory factor analysis (CFA) with AMOS 23 software, resulting in two structural model graphs, shown in Figure 9. The CFA process also generated fit indices for the “user interaction experience” and “relaxation level” scales, as presented in Table 29. The fit indices χ2/df, CFI, and RMSEA for each scale are within acceptable ranges—1 to 5, greater than 0.9, and between 0.05 and 0.10, respectively—indicating a reasonably good fit between the structural model and the questionnaire data, as suggested by Hu and Bentler [72].
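The three reported fit indices follow standard formulas. The sketch below assumes the model and null (independence) chi-square values and the sample size n are available from the CFA output; the function name and argument layout are assumptions.

```python
def fit_indices(chi2_model, df_model, chi2_null, df_null, n):
    """Standard CFA fit indices: chi-square/df ratio, CFI, and RMSEA."""
    ratio = chi2_model / df_model
    # CFI compares the model's excess chi-square against the null model's
    cfi = 1 - max(chi2_model - df_model, 0) / max(
        chi2_null - df_null, chi2_model - df_model, 0)
    # RMSEA penalizes excess chi-square per degree of freedom and sample size
    rmsea = (max(chi2_model - df_model, 0) / (df_model * (n - 1))) ** 0.5
    return ratio, cfi, rmsea
```

For example, with hypothetical values χ²_model = 150 on 100 df, χ²_null = 1000 on 120 df, and n = 52, the sketch gives a χ²/df ratio of 1.5, CFI ≈ 0.943, and RMSEA ≈ 0.099, all inside the acceptable ranges stated above.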
  • (E) Verification of the Validity of the Collected Questionnaire Data
After confirming that the model structures of the two scales fit the questionnaire data, the validity of the data was analyzed. In Figure 9, all factor loading values (standardized regression weights) for the two scales—along the paths from RA1, RA2, and RA3 to questions T1–T16, and from RB1 and RB2 to questions S1–S9—are close to or exceed the threshold value of 0.5, indicating that the data have good construct validity. This is further supported by the construct validity values for all latent dimensions, computed via EFA and detailed in Table 30, where each value exceeds the 0.6 threshold.
The above steps (A) through (E) collectively verify the reliability and validity of the questionnaire data, allowing for further analysis of each latent dimension.

5.3.4. Analysis of Questionnaire Data About the Scale of User Interaction Experience

The scale of user interaction experience was designed to evaluate the participants’ perceptions of the system across three latent dimensions: “emotional experience”, “device experience”, and “interface design and perception”. A summary of the questionnaire analysis for each dimension is provided below.
  • (A) Data Analysis for the Latent Dimension of “Emotional Experience”
This dimension focuses on the participants’ emotional feeling while experiencing the proposed system, particularly whether they found the system enjoyable, engaging, pleasant, or relaxing (see Table 31).
(1)
Average scores for this “emotional experience” dimension ranged from 3.81 to 4.33, indicating generally positive emotional responses.
(2)
Item 6 (T12) received less than 80% agreement and had a standard deviation above 0.93, suggesting some divergence in opinion, though the average exceeded 4.02.
(3)
Item 1 (T13) showed the lowest agreement rate (65.4%) and a standard deviation of 0.95. This may be due to participants interpreting “daily life” in terms of practicality or frequency of use, rather than relaxation effectiveness.
(4)
Items 3 and 4 (T4 and T15) had agreement rates above 80%, with no negative responses, indicating that participants generally felt emotionally uplifted and relaxed.
(5)
Of the seven items, five received agreement rates above 80%, suggesting most participants were satisfied with the emotional feedback from the experience.
  • (B) Data Analysis for the Latent Dimension of “Device Experience”
This dimension was intended to assess the participants’ perceptions of the system’s operational smoothness, visual comfort, and the relevance of its stress-relief elements to daily life. The results are presented in Table 32.
(1)
Average scores for this “device experience” dimension ranged from 4.00 to 4.40, suggesting that the experience flow, system operation, and visual presentation were perceived as smooth, comfortable, and relatable.
(2)
Standard deviations for Items 8 and 9 (T2 and T3) were below 0.7, with no negative responses, indicating strong consensus regarding the system’s fluency.
(3)
Item 11 (T14) had the highest standard deviation (0.84), indicating varied views on connecting stress-relief elements to personal experience.
(4)
Four of the five items received over 85% agreement, reflecting high overall satisfaction with the device experience.
  • (C) Data Analysis for the Latent Dimension of “Interface Design and Perception”
This dimension was intended to assess the participants’ perceptions of the device content during the experience, including the overall visual effect and auditory feedback. The analysis results are presented in Table 33.
(1)
The average scores for this dimension of “interface design and perception” ranged from 4.21 to 4.29, indicating that the system’s interface and sensory feedback were found to be engaging.
(2)
All items in this dimension received scores above 4, reflecting generally positive perceptions regarding interface design and sensory experience.
(3)
All four items had agreement rates exceeding 80%, suggesting that most participants were satisfied with the system’s visual and auditory design.

5.3.5. Analysis of Questionnaire Data About the Scale of Relaxation Level

A key objective of this study was to evaluate the degree of relaxation experienced by users after engaging with the music-based stress-relief system “Serenity Island”. To this end, a relaxation-level scale was developed by adapting the Relaxation State Inventory by Smith [67] to fit the specific context of this study. The scale comprises two latent dimensions: “physiological feedback” and “emotional feedback”, designed to measure users’ perceived relaxation. A summary of the questionnaire analysis for each dimension is provided below.
  • (A) Data Analysis for the Latent Dimension of “Physiological Feedback”
This dimension primarily investigates the participants’ physiological states following the experience. The analysis results are shown in Table 34.
(1)
The average scores for this dimension ranged from 3.50 to 4.31, indicating that the participants generally experienced a sense of physical and psychological relaxation.
(2)
For Items 3, 4, and 5 (S2, S5, and S9), the average scores were mostly above 4, with Item 3 (S2) approaching 4, suggesting that participants reported a relaxed and softened physical state.
(3)
Higher standard deviations for Items 1 and 2 (S4 and S6) suggest diverse views on psychological relaxation, with average scores of 3.67 and 3.50 indicating less consensus. While physical relaxation was evident, some participants may have experienced lingering mental concerns that the system could not fully address.
(4)
Among the five items in this dimension, only one exceeded an 80% agreement rate, implying that participant opinions varied across these items.
  • (B) Data Analysis for the Latent Dimension of “Emotional Feedback”
This dimension focuses on the participants’ emotional responses after the experience, with the analysis results shown in Table 35.
(1)
The average scores for the “emotional feedback” dimension ranged from 4.17 to 4.40, indicating that participants generally felt relaxed, calm, and at ease after the experience.
(2)
The standard deviation for Item 7 (S7) was 0.86, suggesting some variation in responses; however, the average score of 4.19 still reflects that most participants felt rested and refreshed.
(3)
All four items in this dimension had agreement rates exceeding 80%, showing that the majority of participants experienced emotional relaxation following the session.

5.4. Data Analysis and Discussions

In this section, the procedures and results of both the preliminary and formal experiments are presented, along with data from expert interviews and user questionnaires. Experts provided generally positive feedback on the system’s content and flow, suggesting areas for improvement while recognizing its future potential. Questionnaires assessing interaction experience and emotional state demonstrated strong reliability and validity, with most average scores exceeding 4, indicating generally favorable user responses. After incorporating expert suggestions, user interview data were analyzed to assess alignment with the system’s stress-relief goals. Key findings, based on both qualitative and quantitative data, are summarized below (the numbers in parentheses indicate how many participants or experts expressed each view).
(1)
Most participants (48 individuals) reported achieving a sense of relaxation.
(2)
Many (39 individuals) found the EEG-driven transformation of emotions into seasonal and weather-based visuals to align well with their emotional perceptions.
(3)
Questionnaire responses showed consistent and positive feedback on physical relaxation.
(4)
EEG data indicated a shift toward relaxed and blank states after the music listening phase, especially during the viewing stage.
(5)
While physiological relaxation received high average ratings (3.94, 4.02, 4.31), psychological relaxation ratings (3.67, 3.50) were more varied—possibly due to participants’ unclear definitions of mental relaxation, despite physiological evidence of calm.
(6)
Participants rated interactivity (avg. 4.17) and overall satisfaction (avg. 4.02) highly.
(7)
A majority (43 individuals) felt the session duration was too short.
(8)
Although guided prompts were considered effective (47 individuals), their content and equipment should be carefully selected, ideally with input from counseling professionals.
(9)
Participants showed greater willingness to engage in future experiences (avg. 4.38).

6. Conclusions and Suggestions

6.1. Conclusions

Based on a review of literature on stress, music therapy, emotion sensing, and interactive technologies, a novel system called Serenity Island was developed to integrate music-assisted care with EEG-based emotion detection for stress relief. Grounded in Bonny’s four-stage guided imagery and music approach and a relaxation-based stress management model, the system used spoken prompts and real-time seasonal and weather visuals to guide and reflect users’ emotional states. Initial prototypes were refined through expert and user feedback during a preliminary experiment, leading to improvements implemented before a formal study. Two experimental phases involving 11 and 52 participants evaluated the system’s effectiveness through interviews and questionnaires. Statistical analysis confirmed positive user satisfaction and emotional engagement, demonstrating the system’s potential for stress relief and emotional awareness. Three core conclusions were drawn from these findings, aligned with the study’s goals.
(1)
The integration of emotion-sensing and music listening offers an innovative interactive experience—A wearable EEG device is utilized in this study to incorporate emotion detection into the proposed interactive music-based stress relief system “Serenity Island”. The system was developed using Unity3D (ver. 6000.0.0) and structured around Bonny’s four-stage guided imagery method [25]. Participants reported positive emotional experiences, and interview findings affirmed the novelty and creativity of this emotion-aware interaction model.
(2)
Emotions can be effectively visualized through natural metaphors like seasons and weather—The system translated users’ emotional states—measured through EEG-derived attention and meditation values—into dynamic visual scenes that changed in response to seasonal and weather metaphors. This approach provided intuitive, engaging feedback and maintained user interest through its visual interactivity.
(3)
The proposed system promotes emotional relaxation by combining music listening, guided imagery, and real-time emotion feedback—While participants initially experienced discomfort when reflecting on unpleasant memories, post-experience feedback revealed improved psychological calmness, physical relaxation, and reduced focus on stressors. These results confirm the system’s effectiveness in promoting relaxation through a combination of music listening and emotion-sensing feedback.

6.2. Suggestions for Future Research

This study explored the integration of emotion-sensing technology into music-based stress relief and interactive devices. However, owing to the study’s scope limitations and technical demands, several areas for improvement remain in the system design; these may be taken as future research directions, as described in the following.
(1)
Enhancing atmosphere and lighting—In terms of environmental atmosphere creation, more careful testing is needed to assess the visual effects of various color tones on the screen and the balance of ambient lighting.
(2)
Improving music selection—For the proposed music-based stress relief interactive system, adding a wider variety of relaxing music options would enhance the diversity of user choice.
(3)
Optimizing real-time feedback in emotion sensing—Regarding the proposed system’s feedback design, the feedback mechanism could be optimized to provide real-time responses or reduce the time interval for data processing to improve user interaction.
(4)
Extending relaxation experience—With regard to the proposed system’s flow design, extending the relaxation phase would allow users to enjoy a longer, more immersive stress-relief experience.
(5)
Enhancing emotional representation—In terms of presenting digital content, more emotional expressions may be added to the emotion coordinates, and additional emotional states may be incorporated into the feedback screens to enrich the content.
(6)
Incorporating psychologist guidance—During the system’s operation, it would be beneficial to incorporate in-depth discussions led by psychological counselors based on the individual case characteristics, replacing the role of guiding prompts, to unify the psychological and physiological states of the users.
(7)
Applying AI-driven dynamic musical adaptation or generation—Incorporating AI-driven dynamic musical adaptation or generation could enhance the system to provide users with a more affectively aware interactive visual experience, in which music responsively reflects their emotional and physiological states.
(8)
Validating the mapping between the (valence, arousal) and (meditation, attention) parameter pairs—Conducting a theoretically grounded and empirically tested validation of the mapping from “valence” and “arousal” to “meditation” and “attention”, respectively, would enhance the credibility of the proposed emotion model described in Figure 6 and strengthen its applicability across emotion-aware interactive systems.
(9)
Integrating additional physiological stress measures—Incorporating additional physiological biomarkers such as heart rate variability (HRV) or cortisol levels can enhance the objectivity and comprehensiveness of stress assessment while improving the accuracy of evaluating stress reduction effects within the system.
(10)
Incorporating a robust control condition—Including a well-defined control group, such as participants exposed to the same music without interactive visuals or engaged in a standardized relaxation method, can strengthen the validity of experimental findings.
(11)
Comparing interactive and non-interactive systems—Conducting controlled experiments to compare the effects of interactive systems (with EEG feedback and responsive visuals) against non-interactive methods using static imagery or music-only listening can clarify the specific contribution of system interactivity.
(12)
Exploring cross-cultural validation of emotional metaphors—Investigating the cultural universality of visual emotional metaphors (e.g., seasons, weather) through user studies involving participants from different cultural backgrounds can help to assess the clarity, relatability, and emotional accuracy of these visual representations.
(13)
Assessing effects of long-term engagement and stress-relief persistence—Investigating user retention, repeated usage behavior, and the sustained effectiveness of stress-relief responses over extended periods (e.g., days or weeks) could help to evaluate the durability of relaxation benefits and long-term user engagement with the system.

Author Contributions

Conceptualization, C.-M.W. and C.-H.L.; methodology, C.-M.W. and C.-H.L.; validation, C.-H.L.; investigation, C.-H.L.; data curation, C.-H.L.; writing—original draft preparation, C.-M.W.; writing—review and editing, C.-M.W.; visualization, C.-H.L.; supervision, C.-M.W.; funding acquisition, C.-M.W. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Hedegaard, H.; Curtin, S.C.; Warner, M. Suicide Mortality in the United States, 1999–2017. NCHS Data Brief 2018, 309, 1–8. [Google Scholar]
  2. Finkelstein, D.M.; Kubzansky, L.D.; Capitman, J.; Goodman, E. Socioeconomic differences in adolescent stress: The role of psychological resources. J. Adolesc. Health 2007, 40, 127–134. [Google Scholar] [CrossRef] [PubMed]
  3. Anderson, S.F.; Salk, R.H.; Hyde, J.S. Stress in romantic relationships and adolescent depressive symptoms: Influence of parental support. J. Fam. Psychol. 2015, 29, 339–348. [Google Scholar] [CrossRef] [PubMed]
  4. Ip, H.H.S.; Kwong, B. Smart Ambience Games for Children with Learning Difficulties. Lect. Notes Comput. Sci. 2006, 3942, 476–487. [Google Scholar] [CrossRef]
  5. Murray, J.H. Inventing the Medium: Principles of Interaction Design as a Cultural Practice; MIT Press: Cambridge, MA, USA, 2012. [Google Scholar]
  6. Selye, H. A Syndrome produced by Diverse Nocuous Agents. Nature 1936, 138, 32. [Google Scholar] [CrossRef]
  7. Lazarus, R.S.; Folkman, S. Stress, Appraisal, and Coping; Springer Publishing Company: New York, NY, USA, 1984. [Google Scholar]
  8. Clark, D.A.; Steer, R.A.; Beck, A.T. Common and specific dimensions of self-reported anxiety and depression: Implications for the cognitive and tripartite models. J. Abnorm. Psychol. 1994, 103, 645. [Google Scholar] [CrossRef]
  9. Hensley, W.E. The Measurement of Stress among College Students. Psychol. Rep. 1991, 68, 1235–1240. [Google Scholar] [CrossRef]
  10. Villanova, P.; Bownas, D.A. Dimensions of College Student Stress. J. Coll. Stud. Dev. 1984, 25, 105–113. [Google Scholar]
  11. Whitman, N.A.; Spendlove, D.C.; Clark, C.H. Student Stress: Effects and Solutions; Association for the Study of Higher Education: Washington, DC, USA, 1985. [Google Scholar]
  12. Jackson, E.M. Stress relief: The role of exercise in stress management. ACSM’s Health Fit. J. 2013, 17, 14–19. [Google Scholar] [CrossRef]
  13. Selye, H. Stress without Distress. In Psychopathology of Human Adaptation; Serban, G., Ed.; Springer: Boston, MA, USA, 1976; pp. 137–146. [Google Scholar]
  14. Meichenbaum, D.; Jaremko, M.E. Stress Reduction and Prevention; Plenum Press: New York, NY, USA, 1983. [Google Scholar]
  15. Wen, S.-S. Stress Management. San Min Book Co.: Taipei, Taiwan, 2017. (In Chinese) [Google Scholar]
  16. Davis, W.B.; Gfeller, K.E.; Thaut, M.H. An Introduction to Music Therapy: Theory and Practice, 3rd ed.; American Music Therapy Association: Silver Spring, MD, USA, 2008. [Google Scholar]
  17. American Music Therapy Association. What Is Music Therapy? Available online: https://www.musictherapy.org/about/musictherapy/ (accessed on 6 May 2024).
  18. Storr, A. Music and the Mind; Free Press: New York, NY, USA, 1992. [Google Scholar]
  19. Frenkel, M.; Ben-Arye, E.; Cohen, L. Complementary and alternative medicine (CAM) and supportive care in cancer: A synopsis of research perspectives and contributions by an interdisciplinary team. Support. Care Cancer 2007, 15, 565–568. [Google Scholar] [CrossRef]
  20. Liu, K.-H. Music Therapy: Theory and Practice. Guid. Couns. 1994, 104, 21–25. (In Chinese) [Google Scholar]
  21. Robarts, J.Z. Music Therapy and Children with Autism: Protocols for Intervention and Future Directions. In Music Therapy in the Treatment of Adults with Mental Disorders: Theoretical Bases and Clinical Interventions; Unkefer, R.F., Ed.; Macmillan Publishing Company: New York, NY, USA, 1998; pp. 123–145. [Google Scholar]
  22. Huang, C.-H.; Wu, H.-J. A Comparative Analysis of Theoretical Models of Music Therapy—A Case Study of the Effects of Orff Music Therapy Groups. J. Couns. Guid. 2004, 10, 1–29. (In Chinese) [Google Scholar]
  23. Unkefer, R.F.; Thaut, M. Music Therapy in the Treatment of Adults with Mental Disorders: Theoretical Bases and Clinical Interventions. Arts Psychother. 1990, 17, 91–104. [Google Scholar]
  24. Wolberg, L.R. The Technique of Psychotherapy, 4th ed.; Grune & Stratton, Inc.: Orlando, FL, USA; Harcourt Brace: Orlando, FL, USA, 1988. [Google Scholar]
  25. Bonny, H.L. Facilitating Guided Imagery and Music Sessions; ICM Books: Baltimore, MD, USA, 1978. [Google Scholar]
  26. Drever, J. A Dictionary of Psychology; Penguin Books: Oxford, UK, 1952. [Google Scholar]
  27. Plutchik, R. Chapter 1—A General Psychoevolutionary Theory of Emotion. In Theories of Emotion; Plutchik, R., Kellerman, H., Eds.; Academic Press: New York, NY, USA, 1980; pp. 3–33. [Google Scholar]
  28. Dworetsky, J.P. Psychology. West Publishing Company: St. Paul, MN, USA, 1988. [Google Scholar]
  29. Norman, D.A. Emotional Design: Why We Love (or Hate) Everyday Things; Basic Books: New York, NY, USA, 2004. [Google Scholar]
  30. Strongman, K.T. The Psychology of Emotion: Theories of Emotion in Perspective, 4th ed.; Wiley: Chichester, UK, 1996. [Google Scholar]
  31. Ruiz-Padial, E.; Sollers, J.J., III; Vila, J.; Thayer, J.F. The rhythm of the heart in the blink of an eye: Emotion-modulated startle magnitude covaries with heart rate variability. Psychophysiology 2003, 40, 306–313. [Google Scholar] [CrossRef] [PubMed]
  32. Zentner, M.; Grandjean, D.; Scherer, K.R. Emotions evoked by the sound of music: Characterization, classification, and measurement. Emotion 2008, 8, 494–521. [Google Scholar] [CrossRef] [PubMed]
  33. Schellenberg, E.G.; Krysciak, A.M.; Campbell, R.J. Perceiving emotion in melody: Interactive effects of pitch and rhythm. Music. Percept. 2000, 18, 155–171. [Google Scholar] [CrossRef]
  34. Thayer, R.E. The Biopsychology of Mood and Arousal; Oxford University Press: New York, NY, USA, 1989. [Google Scholar]
  35. Juslin, P.N. Perceived emotional expression in synthesized performances of a short melody: Capturing the listener’s judgment policy. Music. Sci. 1997, 1, 225–256. [Google Scholar] [CrossRef]
  36. Picard, R.W. Affective Computing; MIT Press: Cambridge, MA, USA, 1995. [Google Scholar]
  37. Picard, R.W.; Klein, J. Computers That Recognise and Respond to User Emotion: Theoretical and Practical Implications. Interact. Comput. 2002, 14, 141–169. [Google Scholar] [CrossRef]
  38. Reynolds, C.; Picard, R.W. Designing for affective interactions. In Proceedings of the 9th International Conference on Human–Computer Interaction (HCI International 2001), New Orleans, LA, USA, 5–10 August 2001; Kumar, S., Ed.; Lawrence Erlbaum Associates: Mahwah, NJ, USA, 2001; pp. 1–6. [Google Scholar]
  39. Soleymani, M.; Pantic, M.; Pun, T. Multimodal Emotion Recognition in Response to Videos. IEEE Trans. Affect. Comput. 2012, 3, 211–223. [Google Scholar] [CrossRef]
  40. Wu, D.; Li, C.; Yin, Y.; Zhou, C.; Yao, D. Music Composition from the Brain Signal: Representing the Mental State by Music. Comput. Intell. Neurosci. 2010, 2010, 267671. [Google Scholar] [CrossRef]
  41. Valenza, G.; Lanata, A.; Scilingo, E.P. The Role of Nonlinear Dynamics in Affective Valence and Arousal Recognition. IEEE Trans. Affect. Comput. 2012, 3, 237–249. [Google Scholar] [CrossRef]
  42. Berger, H. Über das Elektrenkephalogramm des Menschen. Arch. Psychiatr. Nervenkr. 1929, 87, 527–570. [Google Scholar] [CrossRef]
  43. Berger, H. Über das Elektrenkephalogramm des Menschen. Arch. Psychiatr. Nervenkr. 1931, 94, 16–60. [Google Scholar] [CrossRef]
  44. Sammler, D.; Grigutsch, M.; Fritz, T.; Koelsch, S. Music and Emotion: Electrophysiological Correlates of the Processing of Pleasant and Unpleasant Music. Psychophysiology 2007, 44, 293–304. [Google Scholar] [CrossRef]
  45. Petrantonakis, P.; Hadjileontiadis, L. Emotion Recognition from Brain Signals Using Hybrid Adaptive Filtering and Higher Order Crossings Analysis. IEEE Trans. Affect. Comput. 2010, 1, 81–97. [Google Scholar] [CrossRef]
  46. NeuroSky, Inc. MindWave Mobile: User Guide; NeuroSky, Inc.: San Jose, CA, USA, 2015; Available online: https://download.neurosky.com/support_page_files/MindWaveMobile/docs/mindwave_mobile_plus_user_guide.pdf (accessed on 6 May 2025).
  47. Neurowear. Neurocam. Available online: http://neurowear.com/projects_detail/neurocam.html (accessed on 6 May 2025).
  48. Park, L. Beautiful Mind “Eunoia”. Available online: https://www.thelisapark.com/work/eunoia (accessed on 6 May 2025).
  49. Tsai, J.-P. Meditation Practice and Interactive Installation Integrated with Wearable EEG Device. 2015. Available online: https://www.youtube.com/watch?v=_dOlTg3iEsY (accessed on 6 May 2025).
  50. Li, T.-Y. Mind-Reading Tree. 2015. Available online: https://digitalartfestival.tw/daf15/zh/award-4-0103.html (accessed on 6 May 2025).
  51. Muse. Muse 2. 2018. Available online: https://choosemuse.com/ (accessed on 6 May 2025).
  52. Kantowitz, B.H.; Sorkin, R.D. Human Factors: Understanding People-System Relationships; Wiley: New York, NY, USA, 1983. [Google Scholar]
  53. Buxton, W.A.S.; Baecker, R.M. Readings in Human-Computer Interaction: A Multidisciplinary Approach; Morgan Kaufmann: Los Altos, CA, USA, 1987. [Google Scholar]
  54. Preece, J.; Rogers, Y.; Sharp, H.; Benyon, D.; Holland, S.; Carey, T. Human-Computer Interaction; Addison-Wesley: Wokingham, UK, 1994. [Google Scholar]
  55. Rogers, Y.; Sharp, H.; Preece, J. Interaction Design: Beyond Human–Computer Interaction, 3rd ed.; Wiley: Chichester, UK, 2011. [Google Scholar]
  56. McDonald’s Nederland. McTrax. 2016. Available online: https://www.youtube.com/watch?v=X6zPbogDPgU (accessed on 6 May 2025).
  57. Dvořák, T.; Gregor, D. Archifon. 2016. Available online: https://www.youtube.com/watch?v=NF8S0p1bu7s (accessed on 6 May 2025).
  58. Lind, A. LINES—An Interactive Sound Art Exhibition. 2016. Available online: https://www.youtube.com/watch?v=hP36xoPXDnM (accessed on 6 May 2025).
  59. Lien, M. Embracing X Surroundings. Available online: https://matthewlien.com/embracing-x-surroundings-16-channel-outdoor-sound-sculpture/ (accessed on 6 May 2025).
  60. Chen, Y.-S. Immersing in the Wonderland that We Create: The Music Interaction Device. Master’s Thesis, National Cheng Kung University, Tainan, Taiwan, 2022. [Google Scholar]
  61. Bunt, L.; Stige, B. Music Therapy: An Art Beyond Words, 2nd ed.; Routledge: London, UK, 2014. [Google Scholar]
  62. Russell, J.A. A Circumplex Model of Affect. J. Personal. Soc. Psychol. 1980, 39, 1161–1178. [Google Scholar] [CrossRef]
  63. Yeh, C.-C.; Yeh, L.-C. Research Methods and Thesis Writing; Shang-Ding Culture: Taipei, Taiwan, 2011. (In Chinese) [Google Scholar]
  64. Flick, U. An Introduction to Qualitative Research, 6th ed.; SAGE Publications: London, UK, 2018. [Google Scholar]
  65. Chin, J.P.; Diehl, V.A.; Norman, K.L. Development of an Instrument Measuring User Satisfaction of the Human–Computer Interface. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’88), Washington, DC, USA, 15–19 May 1988; pp. 213–218. [Google Scholar] [CrossRef]
  66. Likert, R. A Technique for the Measurement of Attitudes. Arch. Psychol. 1932, 140, 1–55. [Google Scholar]
  67. Smith, J.C. Smith Relaxation States Inventory 3 (SRSI3). Available online: https://bpb-us-e1.wpmucdn.com/blogs.roosevelt.edu/dist/9/20/files/2016/09/SRSI3.pdf (accessed on 6 May 2025).
  68. IBM KMO and Bartlett’s Test. Available online: https://www.ibm.com/docs/en/spss-statistics/28.0.0?topic=detection-kmo-bartletts-test (accessed on 10 May 2023).
  69. Trochim, W.M.K. Hosted by Conjointly. Research Methods Knowledge Base. Available online: https://conjointly.com/kb/theory-of-reliability/ (accessed on 10 May 2023).
  70. Taber, K.S. The use of Cronbach’s alpha when developing and reporting research instruments in science education. Res. Sci. Educ. 2018, 48, 1273–1296. [Google Scholar] [CrossRef]
  71. Guilford, J.P. Psychometric Methods, 2nd ed.; McGraw-Hill: New York, NY, USA, 1954. [Google Scholar]
  72. Hu, L.T.; Bentler, P.M. Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Struct. Equ. Model. 1999, 6, 1–55. [Google Scholar] [CrossRef]
Figure 1. The research process of this study. (The process is divided into five stages, as indicated by the notations on the right side).
Figure 2. Emotion model proposed by Thayer [34] (redrawn in this study).
Figure 3. System architecture of the proposed “Serenity Island”.
Figure 4. The diagram of the process of the proposed system “Serenity Island”. The blue area indicates that four scenes (spring, summer, autumn, and winter) were presented to the participant.
Figure 5. The emotion-sensing flowchart used in this study. The total music playback time is 5 min, divided into two segments: 2.5 min with eyes closed and 2.5 min with eyes open.
Figure 6. A diagram of the emotional coordinate system used in the proposed system.
Figure 7. Atmospheres of the stages of the formal experiment procedure. (a) Stage I: growing and stress phases—the tree and leaves grow in response to EEG input. (b) Stage II: music listening and viewing phases—the participant opens their eyes to observe visualizations reflecting emotional states. (c) Stage III: session end—the participant views their relaxation score and emotional coordinate map based on attention and meditation data.
Figure 8. Emotional coordinate scatter plot (vertical: attention; horizontal: meditation).
Figure 9. Confirmatory factor analysis (CFA) results obtained using AMOS: (a) the structural model of the “user interaction experience” scale and (b) that of the “relaxation level” scale.
Table 1. Classification of index values of attention and meditation yielded by the eSense algorithm [46].
Value | Index Range | Meditation Level Status | Attention Level Status
81–100 | elevated | very relaxed | highly focused
61–80 | slightly elevated | slightly relaxed | slightly focused
41–60 | neutral | neutral | neutral
21–40 | reduced | slightly tense | slightly distracted
0–20 | strongly lowered | very tense | highly distracted
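The band boundaries in Table 1 translate directly into a lookup rule. The sketch below is illustrative (the function name is ours; the boundaries at 21, 41, 61, and 81 and the level labels come from the table) and shows one way to label a raw 0–100 eSense index:

```python
def esense_level(value: int) -> str:
    """Label a raw eSense index (0-100) using the bands in Table 1."""
    if not 0 <= value <= 100:
        raise ValueError("eSense indices lie in the range 0-100")
    if value >= 81:
        return "elevated"
    if value >= 61:
        return "slightly elevated"
    if value >= 41:
        return "neutral"
    if value >= 21:
        return "reduced"
    return "strongly lowered"
```

The same bands apply to both the attention and meditation meters; only the qualitative wording (“highly focused” vs. “very relaxed”) differs.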
Table 2. A summary of five cases of brainwave-related applications in daily life.
Work Title | Interactive Feedback | Data Source | Purpose
Neurocam [47] | Memory images | Focus, thinking, relaxation | Entertainment
Eunoia [48] | Size of water ripples | Focus, relaxation | Interactive installation, emotional observation
Meditation Interaction Device [49] | Degree of lotus bloom | Relaxation, focus | Interactive installation, emotional observation
Mind-Reading Tree [50] | Visual animation | Relaxation | Entertainment, interactive installation
Muse 2 [51] | Physiological data displayed via smartphone charts | Relaxation, focus, heart rate, breathing | Meditation and relaxation, self-observation
Table 3. A summary of five case studies of applying music to interactive technology.
Work Title | Content | Music Type
McTrax [56] | The placemat connects to a smartphone via Bluetooth and enables song creation using conductive ink. | Energetic, electronic music, percussion
Archifon [57] | Music is triggered by sensors when activated with a laser pointer. | Nature sounds, crystal music
LINES [58] | Pitch varies depending on the participant’s distance from the sensors. | Electronic tones, cheerful
Embracing X Surroundings [59] | Sound plays when participants hug the interactive installation. | Nature sounds, relaxing
Dream Sound [60] | Visual changes are triggered by detecting the viewers, with layered effects created through image overlay. | Relaxing, crystal music
Table 4. Illustrations of the process of system experience.
Phase | Step | Diagram | Description
(1) Initial setup | (i) User seated | Electronics 14 03087 i001 | The user is guided to sit beside the system and put on the brainwave headset, and the experimental procedure is explained.
(1) Initial setup | (ii) Login | Electronics 14 03087 i002 | A login screen is displayed, prompting a researcher to start the system.
(1) Initial setup | (iii) Device connection | Electronics 14 03087 i003 | A connection is established between the system and the EEG headset.
(1) Initial setup | (iv) Standby screen | Electronics 14 03087 i004 | The island in the visual display appears barren, with no tree present; an audio prompt instructs the user to focus intently on the island.
(2) Growing phase | (i) Tree growth initiated | Electronics 14 03087 i005 | Tree growth on the island begins when the user’s real-time EEG attention value surpasses a designated threshold.
(2) Growing phase | (ii) Tree growth accelerated | Electronics 14 03087 i006 | Higher attention levels accelerate tree growth, while higher meditation levels increase the density of the island’s vegetation.
(3) Stress phase | (i) Increasing pressure | Electronics 14 03087 i007 | After tree growth finishes, the ocean wave parameters are raised to create intense waves, heightening the sense of psychological pressure.
(3) Stress phase | (ii) Screen darkened | Electronics 14 03087 i008 | The system gradually darkens the scene to create an immersive atmosphere of pressure.
(4) Music listening phase | (i) Immersion in the music | Electronics 14 03087 i009 | The system prompts the user to relax, close their eyes, and immerse themselves fully in the music for 2.5 min; the user is then prompted to open their eyes.
(4) Music listening phase | (ii) Visual scene changes: (a1) spring | Electronics 14 03087 i010 | Seasonal visual scenes are displayed to guide the user through different emotional states: (a1) spring.
(4) Music listening phase | (ii) Visual scene changes: (a2) summer | Electronics 14 03087 i011 | Seasonal visual scenes are displayed to guide the user through different emotional states: (a2) summer.
(4) Music listening phase | (ii) Visual scene changes: (a3) autumn | Electronics 14 03087 i012 | Seasonal visual scenes are displayed to guide the user through different emotional states: (a3) autumn.
(4) Music listening phase | (ii) Visual scene changes: (a4) winter | Electronics 14 03087 i013 | Seasonal visual scenes are displayed to guide the user through different emotional states: (a4) winter.
(5) Viewing phase | (i) Emotion calming | Electronics 14 03087 i014 | The user continues listening to the music for an additional 2.5 min to settle their emotions.
(5) Viewing phase | (ii) Emotion analysis: (a) anxiety | Electronics 14 03087 i015 | The user’s brainwaves are read to determine an emotion type, and the corresponding visual display is presented: anxiety.
(5) Viewing phase | (ii) Emotion analysis: (b) tension | Electronics 14 03087 i016 | The user’s brainwaves are read to determine an emotion type, and the corresponding visual display is presented: tension.
(5) Viewing phase | (ii) Emotion analysis: (c) blankness | Electronics 14 03087 i017 | The user’s brainwaves are read to determine an emotion type, and the corresponding visual display is presented: blankness.
(5) Viewing phase | (ii) Emotion analysis: (d) relaxation | Electronics 14 03087 i018 | The user’s brainwaves are read to determine an emotion type, and the corresponding visual display is presented: relaxation.
(6) Session end | (i) Showing emotion diagram | Electronics 14 03087 i019 | When the music ends, an emotion diagram, including the user’s relaxation level and corresponding emotional state, is generated from all the emotion data and presented on the screen.
(6) Session end | (ii) Screen darkened | Electronics 14 03087 i020 | When the time is up, a notification indicating the end of detection is displayed, the screen gradually darkens, and the data are saved.
(6) Session end | (iii) Detection ended | Electronics 14 03087 i021 | A message appears, instructing the participant to remove the brainwave headset.
Table 5. Ocean settings and visual effects.
Class | Parameter | Visual Scene
Wave crest | High | Electronics 14 03087 i022
Wave crest | Low | Electronics 14 03087 i023
Speed | Quick | Electronics 14 03087 i024
Speed | Slow | Electronics 14 03087 i025
Sea | Above surface | Electronics 14 03087 i026
Sea | Underwater | Electronics 14 03087 i027
Table 6. Mist settings and visual effects.
Class | Parameter | Visual Scene
Mist intensity | Dense | Electronics 14 03087 i028
Mist intensity | Light | Electronics 14 03087 i029
Mist coverage | High | Electronics 14 03087 i030
Mist coverage | Low | Electronics 14 03087 i031
Mist color (adjusted through three color palettes) | Defined freely | Electronics 14 03087 i032
Table 7. Island settings and visual effects.
Class | Parameter | Visual Scene
Seasons | Spring | Electronics 14 03087 i033
Seasons | Summer | Electronics 14 03087 i034
Seasons | Autumn | Electronics 14 03087 i035
Seasons | Winter | Electronics 14 03087 i036
Plant size and leaf density | Sparse | Electronics 14 03087 i037
Plant size and leaf density | Dense | Electronics 14 03087 i038
Wind | No wind | Electronics 14 03087 i039
Wind | Fierce wind | Electronics 14 03087 i040
Rain | No rain | Electronics 14 03087 i041
Rain | Heavy rain | Electronics 14 03087 i042
Table 8. Emotional state mapping used in the proposed system “Serenity Island”.
Visual Element | Anxiety | Tension | Blankness | Relaxation
Season | winter | winter | autumn | summer
Wind | fierce | moderate | slight | slight
Rain | heavy | moderate | slight | none
Wave patterns (ocean) | high | moderate | slow | moderate
Overall color tone (mist) | black | red | purple | blue-green
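Table 8 amounts to a small lookup table from the detected emotion to the scene parameters described in Tables 5–7. A minimal transcription (the dictionary name, key spellings, and helper function are our own; the values come from the table) could be:

```python
# Scene parameters per detected emotional state, transcribed from Table 8.
EMOTION_VISUALS = {
    "anxiety":    {"season": "winter", "wind": "fierce",   "rain": "heavy",
                   "wave": "high",     "tone": "black"},
    "tension":    {"season": "winter", "wind": "moderate", "rain": "moderate",
                   "wave": "moderate", "tone": "red"},
    "blankness":  {"season": "autumn", "wind": "slight",   "rain": "slight",
                   "wave": "slow",     "tone": "purple"},
    "relaxation": {"season": "summer", "wind": "slight",   "rain": "none",
                   "wave": "moderate", "tone": "blue-green"},
}

def scene_for(emotion: str) -> dict:
    """Return the visual parameter set for a detected emotional state."""
    return EMOTION_VISUALS[emotion]
```

Keeping the mapping in a single structure like this makes it easy to adjust the visual vocabulary (as was done after the expert interviews, Table 16) without touching the rendering logic.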
Table 9. Emotion-to-visual mapping for anxiety, tension, blankness, and relaxation.
Emotional State | Visual Scene
Anxiety | Electronics 14 03087 i043
Tension | Electronics 14 03087 i044
Blankness | Electronics 14 03087 i045
Relaxation | Electronics 14 03087 i046
Table 10. List of questions asked in the user interviews.
PerspectivesQuestions
Theme and stress relief
  • What methods do you typically use to relax?
  • What are your impressions of the system’s design and theme in supporting stress relief?
  • How effective was the guided narration in helping you relax?
Visuals and emotions
  • After the session, did you notice any visual changes that reflected your emotional state?
  • Were the colors used in the visuals appropriate, or did they feel too intense or chaotic?
  • Did the visuals align with your personal image of tension?
  • Did they reflect your perception of anxiety?
  • Did they represent a state of mental blankness?
  • Did they match your idea of relaxation?
Table 11. Information of four relaxation music tracks used in the preliminary experiment.
Round | Music Type | Instruments | Tempo | Melody Clarity | Features
T1 | meditation | electronic sounds | slow | most blurred | small pitch range
T2 | crystal music | electronic, percussion | slow | fairly blurred | α waves, water-flow sounds
T3 | symphony | piano, string instruments | lively | clear | piano, clear melody
T4 | nature sounds | piano | lively | very clear | water flow, insect chirps, bird calls
Table 12. Statistics of attention values of the four music playbacks (avg = average).
Music | T1 | T1 | T2 | T2 | T3 | T3 | T4 | T4
Phase | Tree_A | Music_A | Tree_A | Music_A | Tree_A | Music_A | Tree_A | Music_A
A | 56.33 | 56.3 | 34.33 | 40.53 | 45.44 | 53.71 | 38.48 | 21.4
B | 45.12 | 43.61 | 39.40 | 48.08 | 46.84 | 46.76 | 39.92 | 33.33
C | 42.53 | 44.77 | 36.82 | 38.70 | 53.30 | 52.22 | 46.90 | 25.83
D | 49.73 | 48.03 | 41.88 | 45.52 | 45.94 | 48.93 | 46.51 | 21.17
E | 57.26 | 41.17 | 41.01 | 42.33 | 51.71 | 46.16 | 38.04 | 19.66
F | 57.96 | 43.80 | 47.57 | 49.92 | 61.67 | 53.51 | 37.45 | 32.27
G | 46.40 | 56.20 | 34.22 | 39.50 | 53.94 | 47.65 | 38.66 | 24.90
H | 57.04 | 51.34 | 45.53 | 39.84 | 49.81 | 40.11 | 48.83 | 32.12
I | 51.82 | 41.06 | 35.72 | 41.65 | 60.38 | 43.62 | 45.79 | 27.77
J | 56.88 | 47.72 | 38.67 | 46.56 | 45.06 | 51.53 | 37.27 | 34.53
K | 48.90 | 45.67 | 35.77 | 38.44 | 59.30 | 45.20 | 30.10 | 22.41
Avg | 51.81 | 47.24 | 39.17 | 42.82 | 52.12 | 48.12 | 40.72 | 26.85
Table 13. Statistics of meditation values of the four music playbacks (avg = average).
Music | T1 | T1 | T2 | T2 | T3 | T3 | T4 | T4
Phase | Tree_A | Music_A | Tree_A | Music_A | Tree_A | Music_A | Tree_A | Music_A
A | 42.60 | 67.75 | 33.43 | 53.14 | 29.17 | 56.34 | 39.11 | 43.11
B | 41.58 | 61.08 | 39.85 | 66.50 | 39.84 | 62.71 | 25.20 | 60.84
C | 37.24 | 67.03 | 41.67 | 51.23 | 36.81 | 53.01 | 27.83 | 59.15
D | 31.20 | 47.18 | 34.45 | 63.82 | 35.26 | 59.68 | 37.09 | 54.09
E | 33.30 | 64.28 | 42.54 | 68.03 | 33.92 | 66.35 | 39.23 | 65.16
F | 29.81 | 63.43 | 57.33 | 53.33 | 34.07 | 56.85 | 25.91 | 53.31
G | 31.97 | 62.71 | 36.63 | 67.55 | 30.50 | 51.37 | 36.70 | 42.29
H | 32.58 | 55.12 | 34.08 | 51.92 | 35.57 | 53.45 | 40.35 | 47.44
I | 36.46 | 49.02 | 35.13 | 54.12 | 41.70 | 53.09 | 29.51 | 51.95
J | 34.04 | 64.37 | 44.81 | 63.26 | 31.24 | 54.83 | 26.04 | 55.35
K | 31.59 | 68.63 | 57.38 | 60.20 | 37.58 | 59.17 | 38.62 | 52.45
Avg | 34.76 | 60.09 | 41.57 | 59.37 | 35.06 | 48.12 | 33.23 | 53.19
Table 14. List of invited experts.
Label | Affiliation | Profession | Expertise
A | National university | Full-time counseling psychologist | Emotional regulation, stress management, individual counseling
B | National university | Resource teacher guidance counselor | Psychological counseling for students with special needs, individual counseling
C | National university | Intern counseling psychologist | Art therapy, self-exploration, individual counseling
Table 15. List of questions asked in the expert interviews.
PerspectivesQuestions
Feasibility of the project concept
  • What is your view on using nonverbal communication in counseling?
  • How do you feel about the project’s flow—what stood out most, and what could be improved?
  • What is your opinion on using emotional sensing in interactive installations to reflect users’ emotional states?
Elements and important details of the work
  • How effective are ocean and island elements in encouraging relaxation?
  • Are the emotional categories and their visual representations appropriate?
  • Which music types are commonly used for relaxation or therapeutic purposes?
  • Do the color schemes suitably reflect each emotional state?
  • What settings and lighting best support stress relief?
Table 16. Before-and-after comparison of the visual scenes of the four emotions.
Class | Before Modification | After Modification
Anxiety | Electronics 14 03087 i047 | Electronics 14 03087 i048
Tension | Electronics 14 03087 i049 | Electronics 14 03087 i050
Blankness | Electronics 14 03087 i051 | Electronics 14 03087 i052
Relaxation | Electronics 14 03087 i053 | Electronics 14 03087 i054
Table 17. Modification of experimental environment.
View | Before Improvement | After Improvement
Illustration of environment | Electronics 14 03087 i055 | Electronics 14 03087 i056
Real environment | Electronics 14 03087 i057 | Electronics 14 03087 i058
Table 18. The first part of the questionnaire about the scale of “user interaction experience”.
Item | Question
T1 | I find the interactive format engaging.
T2 | I think the experience flow is smooth.
T3 | I find the system operates smoothly.
T4 | I think the device is appealing.
T5 | I felt relaxed during the experience.
T6 | I feel the colors on the screen are comfortable.
T7 | I find the overall visual design of the system calming.
T8 | I find the ambient sounds in the system relaxing.
T9 | I think the system’s ambient sounds are rich and immersive.
T10 | I find the animation design engaging.
T11 | The auditory feedback captured my attention.
T12 | The device helped me better understand emotional changes.
T13 | This device is useful in my daily life.
T14 | The system’s stress-relief elements are relevant to my life experiences.
T15 | After the experience, I felt emotionally uplifted.
T16 | The experience boosted my willingness to join similar activities.
Table 19. Statistics of the questionnaire data about the scale of “user interaction experience”.
Item No. | Min. | Max. | Avg. | S.D. | Strongly Agree (5) | Agree (4) | Neutral (3) | Disagree (2) | Strongly Disagree (1) | Agree + Strongly Agree
T1 | 2 | 5 | 4.17 | 0.81 | 38.5% | 44.2% | 13.5% | 3.8% | 0% | 82.7%
T2 | 3 | 5 | 4.40 | 0.64 | 50.0% | 40.4% | 9.6% | 0% | 0% | 90.4%
T3 | 3 | 5 | 4.38 | 0.69 | 50.0% | 38.5% | 11.5% | 0% | 0% | 88.5%
T4 | 1 | 5 | 4.23 | 0.80 | 38.5% | 51.9% | 5.8% | 1.9% | 1.9% | 90.4%
T5 | 3 | 5 | 4.33 | 0.73 | 48.1% | 36.5% | 15.4% | 0% | 0% | 84.6%
T6 | 3 | 5 | 4.31 | 0.70 | 44.2% | 42.3% | 13.5% | 0% | 0% | 86.5%
T7 | 3 | 5 | 4.27 | 0.63 | 36.5% | 53.8% | 9.6% | 0% | 0% | 90.3%
T8 | 1 | 5 | 4.27 | 0.77 | 40.4% | 50.0% | 7.7% | 0% | 1.9% | 90.4%
T9 | 3 | 5 | 4.29 | 0.72 | 44.2% | 40.4% | 15.4% | 0% | 0% | 84.6%
T10 | 2 | 5 | 4.21 | 0.75 | 38.5% | 46.2% | 13.5% | 1.9% | 0% | 84.7%
T11 | 3 | 5 | 4.23 | 0.73 | 40.4% | 42.3% | 17.3% | 0% | 0% | 82.7%
T12 | 1 | 5 | 4.02 | 0.93 | 34.6% | 40.4% | 19.2% | 3.8% | 1.9% | 75.0%
T13 | 1 | 5 | 3.81 | 0.95 | 25.0% | 40.4% | 26.9% | 5.8% | 1.9% | 65.4%
T14 | 2 | 5 | 4.00 | 0.84 | 28.8% | 48.1% | 17.3% | 5.8% | 0% | 76.9%
T15 | 3 | 5 | 4.23 | 0.67 | 44.2% | 44.2% | 11.5% | 0% | 0% | 88.4%
T16 | 2 | 5 | 4.38 | 0.74 | 51.9% | 36.5% | 9.6% | 1.9% | 0% | 88.4%
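As a consistency check, each item mean in Table 19 can be recovered from its response-percentage breakdown as the percentage-weighted average of the five Likert scores. A sketch using rows T1 and T2 (the helper name is ours):

```python
def likert_mean(pcts: dict[int, float]) -> float:
    """Mean Likert score from a {score: percentage} breakdown."""
    return sum(score * pct / 100.0 for score, pct in pcts.items())

# Item T1: 38.5% (5), 44.2% (4), 13.5% (3), 3.8% (2), 0% (1) -> mean 4.17
t1 = {5: 38.5, 4: 44.2, 3: 13.5, 2: 3.8, 1: 0.0}
```

The same check applies to the per-dimension breakdowns in Tables 31–35, which reuse these rows.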
Table 20. The second part of the questionnaire about the scale of “relaxation level”.
Item | Question
S1 | I felt a sense of peace.
S2 | My muscles were deeply relaxed.
S3 | My heart felt calm and unburdened.
S4 | I felt that all my worries were cast aside.
S5 | My hands, arms, or legs felt loose and tension-free.
S6 | The things I cared about no longer seemed important.
S7 | I felt rested.
S8 | I felt physically and mentally refreshed.
S9 | My body felt at ease.
Table 21. Statistics of the questionnaire data about the scale of “relaxation level”.
Item No. | Min. | Max. | Avg. | S.D. | Strongly Agree (5) | Agree (4) | Neutral (3) | Disagree (2) | Strongly Disagree (1) | Agree + Strongly Agree
S1 | 3 | 5 | 4.40 | 0.64 | 50.0% | 40.4% | 9.6% | 0% | 0% | 90.4%
S2 | 1 | 5 | 3.94 | 0.97 | 30.8% | 44.2% | 15.4% | 7.7% | 1.9% | 75.0%
S3 | 1 | 5 | 4.02 | 0.93 | 34.6% | 40.4% | 19.2% | 3.8% | 1.9% | 75.0%
S4 | 1 | 5 | 3.67 | 1.00 | 23.1% | 34.6% | 30.8% | 9.6% | 1.9% | 57.7%
S5 | 1 | 5 | 4.02 | 0.85 | 28.8% | 50.0% | 17.3% | 1.9% | 1.9% | 78.8%
S6 | 1 | 5 | 3.50 | 1.18 | 26.9% | 21.2% | 30.8% | 17.3% | 3.8% | 48.1%
S7 | 2 | 5 | 4.17 | 0.81 | 38.5% | 44.2% | 13.5% | 3.8% | 0% | 82.7%
S8 | 3 | 5 | 4.38 | 0.69 | 50.0% | 38.5% | 11.5% | 0% | 0% | 88.5%
S9 | 2 | 5 | 4.31 | 0.70 | 42.3% | 48.1% | 7.7% | 1.9% | 0% | 90.4%
Table 22. The measured values of the KMO test and the significance values of Bartlett’s test of the questionnaire data of the two scales listed in Table 19 and Table 21.
Scale | Name of Measure or Test | Value
User interaction experience | KMO measure of sampling adequacy | 0.717
User interaction experience | Bartlett test of sphericity: approx. chi-square | 288.846
User interaction experience | Bartlett test of sphericity: degrees of freedom | 120
User interaction experience | Bartlett test of sphericity: significance | 0.000
Relaxation level | KMO measure of sampling adequacy | 0.668
Relaxation level | Bartlett test of sphericity: approx. chi-square | 166.827
Relaxation level | Bartlett test of sphericity: degrees of freedom | 36
Relaxation level | Bartlett test of sphericity: significance | 0.000
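The Bartlett figures in Table 22 follow the standard sphericity statistic χ² = −(n − 1 − (2p + 5)/6) · ln|R|, where R is the p × p item-correlation matrix, with df = p(p − 1)/2; the degrees of freedom of 120 and 36 correspond to the 16-item and 9-item scales. A minimal sketch (the function name is ours; the raw correlation matrices are not reproduced in the paper):

```python
import numpy as np

def bartlett_sphericity(corr: np.ndarray, n: int) -> tuple[float, int]:
    """Bartlett's test of sphericity from a p x p item-correlation matrix.

    n is the number of respondents (52 in the formal experiment).
    Returns the chi-square statistic and its degrees of freedom.
    """
    p = corr.shape[0]
    chi2 = -(n - 1 - (2 * p + 5) / 6.0) * np.log(np.linalg.det(corr))
    df = p * (p - 1) // 2
    return chi2, df
```

For an identity correlation matrix (fully uncorrelated items) the statistic is 0, which is the null hypothesis the reported significance values of 0.000 reject.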
Table 23. Rotated component matrix of the first scale “user interaction experience”.
Question No. | Dimension 1 | Dimension 2 | Dimension 3
T13 | 0.760 | −0.181 | 0.015
T8 | 0.670 | 0.139 | 0.095
T4 | 0.670 | 0.473 | 0.028
T15 | 0.657 | 0.349 | −0.020
T5 | 0.614 | 0.110 | 0.320
T12 | 0.585 | 0.133 | 0.056
T1 | 0.512 | 0.511 | 0.073
T2 | 0.017 | 0.772 | 0.162
T3 | −0.116 | 0.765 | 0.283
T6 | 0.227 | 0.642 | 0.054
T14 | 0.367 | 0.562 | −0.048
T16 | 0.154 | 0.545 | 0.135
T7 | −0.063 | 0.081 | 0.782
T11 | 0.044 | 0.101 | 0.726
T10 | 0.143 | 0.240 | 0.711
T9 | 0.441 | 0.053 | 0.607
Table 24. Rotated component matrix of the second scale, “relaxation level”.
Question No. | Dimension 1 | Dimension 2
S4 | 0.812 | −0.044
S6 | 0.778 | −0.159
S2 | 0.759 | 0.353
S5 | 0.746 | 0.147
S9 | 0.711 | 0.193
S3 | 0.119 | 0.759
S7 | 0.005 | 0.726
S1 | −0.038 | 0.706
S8 | 0.333 | 0.500
Table 25. Collection of questions of the five latent dimensions of the two scales.
Indicator | Question Dimension | Group of Related Questions
User interaction experience | Emotional experience (Group RA1) | RA1 = (T13, T8, T4, T15, T5, T12, T1)
User interaction experience | Device experience (Group RA2) | RA2 = (T2, T3, T6, T14, T16)
User interaction experience | Interface design and perception (Group RA3) | RA3 = (T7, T11, T10, T9)
Relaxation level | Physiological feedback (Group RB1) | RB1 = (S4, S6, S2, S5, S9)
Relaxation level | Emotional feedback (Group RB2) | RB2 = (S3, S7, S1, S8)
Table 26. Questionnaire about the scale of “user interaction experience”.
Dimension | Item | Question
Emotional experience | T13 | This device is useful in my daily life.
Emotional experience | T8 | I find the ambient sounds in the system relaxing.
Emotional experience | T4 | I think the device is appealing.
Emotional experience | T15 | After the experience, I felt emotionally uplifted.
Emotional experience | T5 | I felt relaxed during the experience.
Emotional experience | T12 | The device helped me better understand emotional changes.
Emotional experience | T1 | I find the interactive format engaging.
Device experience | T2 | I think the experience flow is smooth.
Device experience | T3 | I find the system operates smoothly.
Device experience | T6 | I feel the colors on the screen are comfortable.
Device experience | T14 | The system’s stress-relief elements are relevant to my life experiences.
Device experience | T16 | The experience boosted my willingness to join similar activities.
Interface design and perception | T7 | I find the overall visual design of the system calming.
Interface design and perception | T11 | The auditory feedback captured my attention.
Interface design and perception | T10 | I find the animation design engaging.
Interface design and perception | T9 | I think the system’s ambient sounds are rich and immersive.
Table 27. Questionnaire about the aspect of “relaxation level”.
Dimension | Item | Question
Physiological feedback | S4 | I felt that all my worries were cast aside.
Physiological feedback | S6 | The things I cared about no longer seemed important.
Physiological feedback | S2 | My muscles were deeply relaxed.
Physiological feedback | S5 | My hands, arms, or legs felt loose and tension-free.
Physiological feedback | S9 | My body felt at ease.
Emotional feedback | S3 | My heart felt calm and unburdened.
Emotional feedback | S7 | I felt rested.
Emotional feedback | S1 | I felt a sense of peace.
Emotional feedback | S8 | I felt physically and mentally refreshed.
Table 28. The scales and latent dimensions as well as the corresponding Cronbach’s α coefficients.
Indicator | Question Dimension (Q.D.) | Cronbach’s α Coeff. of Q.D. | Cronbach’s α Coeff. of Indicator
User interaction experience | Emotional experience (Group RA1) | 0.723 | 0.717
User interaction experience | Device experience (Group RA2) | 0.716 | 0.717
User interaction experience | Interface design and perception (Group RA3) | 0.741 | 0.717
Relaxation level | Physiological feedback (Group RB1) | 0.726 | 0.668
Relaxation level | Emotional feedback (Group RB2) | 0.649 | 0.668
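The reliability coefficients in Table 28 use the standard Cronbach’s α formula, α = k/(k − 1) · (1 − Σ item variances / variance of the summed score). A compact sketch (the function name is ours; the raw response matrices are not reproduced in the paper):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x k_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of summed scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)
```

Perfectly parallel items yield α = 1, and values around 0.7, as reported here, are conventionally taken as acceptable internal consistency [70,71].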
Table 29. Fitness indexes of the structural models of the two indicators of “user interaction experience” and “relaxation level” generated through CFA.
Scale | df | χ2 | χ2/df | cfi | RMSEA | RMSEA 90% CI (LO–HI)
User interaction experience | 101 | 121.151 | 1.20 | 0.903 | 0.063 | 0.000–0.100
Relaxation level | 25 | 36.617 | 1.465 | 0.920 | 0.095 | 0.000–0.158
Meanings of symbols—df: degree of freedom; cfi: comparative fit index; RMSEA: root mean square error of approximation; CI: confidence interval; LO: low; HI: high.
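With N = 52 participants, the RMSEA point estimates in Table 29 can be reproduced from the reported χ² and df values via the standard formula √(max(χ² − df, 0) / (df · (N − 1))); the function name below is ours:

```python
import math

def rmsea(chi2: float, df: int, n: int) -> float:
    """RMSEA point estimate from the model chi-square, df, and sample size."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))
```

Here rmsea(121.151, 101, 52) ≈ 0.063 and rmsea(36.617, 25, 52) ≈ 0.095, matching the table.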
Table 30. The construct validity values of the latent dimension of the two scales “user interaction experience” and “relaxation level” generated through CFA.
Indicator | Question Dimension | Group of Related Questions | Construct Validity Value
User interaction experience | Emotional experience (Group RA1) | RA1 = (T13, T8, T4, T15, T5, T12, T1) | 0.811
User interaction experience | Device experience (Group RA2) | RA2 = (T2, T3, T6, T14, T16) | 0.749
User interaction experience | Interface design and perception (Group RA3) | RA3 = (T7, T11, T10, T9) | 0.726
Relaxation level | Physiological feedback (Group RB1) | RB1 = (S4, S6, S2, S5, S9) | 0.802
Relaxation level | Emotional feedback (Group RB2) | RB2 = (S3, S7, S1, S8) | 0.652
Table 31. Analysis of responses to questions on the latent dimension of “emotional experience”.
No. | Item | Question | Min | Max | Mean | S.D. | Strongly Agree (A) | Agree (B) | No Opinion (C) | Disagree (D) | Strongly Disagree (E) | Strongly Agree + Agree (F = A + B)
1 | T13 | This device is useful in my daily life. | 1 | 5 | 3.81 | 0.95 | 25.0% | 40.4% | 26.9% | 5.8% | 1.9% | 65.4%
2 | T8 | I find the ambient sounds in the system relaxing. | 1 | 5 | 4.27 | 0.77 | 40.4% | 50.0% | 7.7% | 0% | 1.9% | 90.4%
3 | T4 | I think the device is appealing. | 1 | 5 | 4.23 | 0.80 | 38.5% | 51.9% | 5.8% | 1.9% | 1.9% | 90.4%
4 | T15 | After the experience, I felt emotionally uplifted. | 3 | 5 | 4.23 | 0.67 | 44.2% | 44.2% | 11.5% | 0% | 0% | 88.4%
5 | T5 | I felt relaxed during the experience. | 3 | 5 | 4.33 | 0.73 | 48.1% | 36.5% | 15.4% | 0% | 0% | 84.6%
6 | T12 | The device helped me better understand emotional changes. | 1 | 5 | 4.02 | 0.93 | 34.6% | 40.4% | 19.2% | 3.8% | 1.9% | 75.0%
7 | T1 | I find the interactive format engaging. | 2 | 5 | 4.17 | 0.81 | 38.5% | 44.2% | 13.5% | 3.8% | 0% | 82.7%
Table 32. Analysis of responses to questions on the latent dimension of “device experience”.
No. | Item | Question | Min | Max | Mean | S.D. | Strongly Agree (A) | Agree (B) | No Opinion (C) | Disagree (D) | Strongly Disagree (E) | Strongly Agree + Agree (F = A + B)
8 | T2 | I think the experience flow is smooth. | 3 | 5 | 4.40 | 0.64 | 50.0% | 40.4% | 9.6% | 0% | 0% | 90.4%
9 | T3 | I find the system operates smoothly. | 3 | 5 | 4.38 | 0.69 | 50.0% | 38.5% | 11.5% | 0% | 0% | 88.5%
10 | T6 | I feel the colors on the screen are comfortable. | 3 | 5 | 4.31 | 0.70 | 44.2% | 42.3% | 13.5% | 0% | 0% | 86.5%
11 | T14 | The system’s stress-relief elements are relevant to my life experiences. | 2 | 5 | 4.00 | 0.84 | 28.8% | 48.1% | 17.3% | 5.8% | 0% | 76.9%
12 | T16 | The experience boosted my willingness to join similar activities. | 2 | 5 | 4.38 | 0.74 | 51.9% | 36.5% | 9.6% | 1.9% | 0% | 88.4%
Table 33. Analysis of responses to questions on the latent dimension of “interface design and perception”.
No. | Item | Question | Min | Max | Mean | S.D. | Strongly Agree (A) | Agree (B) | No Opinion (C) | Disagree (D) | Strongly Disagree (E) | Strongly Agree + Agree (F = A + B)
13 | T7 | I find the overall visual design of the system calming. | 3 | 5 | 4.27 | 0.63 | 36.5% | 53.8% | 9.6% | 0% | 0% | 90.3%
14 | T11 | The auditory feedback captured my attention. | 3 | 5 | 4.23 | 0.73 | 40.4% | 42.3% | 17.3% | 0% | 0% | 82.7%
15 | T10 | I find the animation design engaging. | 2 | 5 | 4.21 | 0.75 | 38.5% | 46.2% | 13.5% | 1.9% | 0% | 84.7%
16 | T9 | I think the system’s ambient sounds are rich and immersive. | 3 | 5 | 4.29 | 0.72 | 44.2% | 40.4% | 15.4% | 0% | 0% | 84.6%
Table 34. Analysis of responses to questions on the latent dimension of “physiological feedback”.
No. | Item | Question | Min | Max | Mean | S.D. | Strongly Agree (A) | Agree (B) | No Opinion (C) | Disagree (D) | Strongly Disagree (E) | Strongly Agree + Agree (F = A + B)
1 | S4 | I felt that all my worries were cast aside. | 1 | 5 | 3.67 | 1.00 | 23.1% | 34.6% | 30.8% | 9.6% | 1.9% | 57.7%
2 | S6 | The things I cared about no longer seemed important. | 1 | 5 | 3.50 | 1.18 | 26.9% | 21.2% | 30.8% | 17.3% | 3.8% | 48.1%
3 | S2 | My muscles were deeply relaxed. | 1 | 5 | 3.94 | 0.97 | 30.8% | 44.2% | 15.4% | 7.7% | 1.9% | 75.0%
4 | S5 | My hands, arms, or legs felt loose and tension-free. | 1 | 5 | 4.02 | 0.85 | 28.8% | 50.0% | 17.3% | 1.9% | 1.9% | 78.8%
5 | S9 | My body felt at ease. | 2 | 5 | 4.31 | 0.70 | 42.3% | 48.1% | 7.7% | 1.9% | 0% | 90.4%
Table 35. Analysis of responses to questions on the latent dimension of “emotional feedback”.

| Item No. | Code | Question | Min | Max | Mean | S.D. | Strongly Agree (A) | Agree (B) | No Opinion (C) | Disagree (D) | Strongly Disagree (E) | Strongly Agree + Agree (F = A + B) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 6 | S3 | My heart felt calm and unburdened. | 2 | 5 | 4.19 | 0.68 | 32.7% | 55.8% | 9.6% | 1.9% | 0% | 88.5% |
| 7 | S7 | I felt rested. | 2 | 5 | 4.19 | 0.86 | 42.3% | 40.4% | 11.5% | 5.8% | 0% | 82.7% |
| 8 | S1 | I felt a sense of peace. | 3 | 5 | 4.40 | 0.66 | 50.0% | 40.4% | 9.6% | 0% | 0% | 90.4% |
| 9 | S8 | I felt physically and mentally refreshed. | 2 | 5 | 4.17 | 0.81 | 38.5% | 44.2% | 13.5% | 3.8% | 0% | 82.7% |
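The descriptive statistics reported in Tables 32–35 follow directly from the raw 5-point Likert responses: each item's Min, Max, Mean, and S.D. are computed over all respondents, and the “Strongly Agree + Agree” column is simply F = A + B, the combined share of the two highest response levels. A minimal sketch of this computation, using hypothetical responses rather than the study's actual data (the function name and sample values are illustrative, not from the paper):

```python
# Sketch: computing the per-item statistics used in Tables 32-35
# from raw 5-point Likert responses, where 5 = Strongly Agree.
from statistics import mean, pstdev

def likert_summary(responses):
    """Summarize one questionnaire item: range, mean, S.D., and percentages."""
    n = len(responses)
    # Percentage of respondents choosing each level, 5 (Strongly Agree) down to 1.
    pct = {level: 100 * responses.count(level) / n for level in range(5, 0, -1)}
    return {
        "min": min(responses),
        "max": max(responses),
        "mean": round(mean(responses), 2),
        "sd": round(pstdev(responses), 2),  # population standard deviation
        "pct": {level: round(p, 1) for level, p in pct.items()},
        # "Strongly Agree + Agree" column: F = A + B
        "top2": round(pct[5] + pct[4], 1),
    }

# Hypothetical item with 10 respondents:
summary = likert_summary([5, 5, 4, 4, 4, 4, 3, 3, 2, 5])
print(summary["mean"], summary["sd"], summary["top2"])  # 3.9 0.94 70.0
```

Note that `pstdev` treats the respondents as the whole population; if the paper's S.D. values were instead computed with the sample formula (`statistics.stdev`), they would differ slightly.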
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Wang, C.-M.; Lin, C.-H. Design of an Interactive System by Combining Affective Computing Technology with Music for Stress Relief. Electronics 2025, 14, 3087. https://doi.org/10.3390/electronics14153087
