Article

Adaptive Model for Biofeedback Data Flows Management in the Design of Interactive Immersive Environments

1
LabRP-CIR, Psychosocial Rehabilitation Laboratory, Center for Rehabilitation Research, School of Health, Polytechnic Institute of Porto, 4200-072 Porto, Portugal
2
LabRP-CIR, Psychosocial Rehabilitation Laboratory, School of Media Arts and Design, Polytechnic Institute of Porto, 4480-876 Vila do Conde, Portugal
3
CITIC, Research Center of Information and Communication Technologies, Talionis Research Group, Universidade da Coruña, 15071 A Coruña, Spain
*
Author to whom correspondence should be addressed.
Appl. Sci. 2021, 11(11), 5067; https://doi.org/10.3390/app11115067
Submission received: 2 May 2021 / Revised: 24 May 2021 / Accepted: 27 May 2021 / Published: 30 May 2021
(This article belongs to the Special Issue Advances in Information and Communication Technologies (ICT))

Featured Application

The integration of biofeedback systems in Emotionally Adaptive Immersive Environments contributes to increasing their interactivity, allowing them to be used in therapeutic programs.

Abstract

The interactivity of an immersive environment arises from the relationship established between the user and the system. This relationship results in a set of data exchanges between human and technological actors. Real-time biofeedback devices collect the biodata generated by the user during the exposure. Analyzing, processing and converting these biodata into multimodal data makes it possible to relate the stimuli to the emotions they trigger. This work describes an adaptive model for biofeedback data flows management used in the design of interactive immersive systems. An affective algorithm identifies the types of emotions felt by the user and their respective intensities. The mapping between stimuli and emotions creates a set of biodata that can be used as elements of interaction to readjust the stimuli generated by the system. The real-time interaction between the evolution of the user's emotional state and the stimuli generated by the system allows users to adapt their attitudes and behaviors to the situations they face.

1. Introduction

1.1. Emotions in the Empathy Construct

Emotions are related to the stimuli that triggered them within a given context [1]. Emotions are part of the empathy process, which reflects the ability to share another person’s affective state [2,3]. Exposure to immersive environments provides impactful experiences capable of generating different types and intensities of emotions [4]. The emotions generated by an immersive environment designed with a specific purpose can contribute to increasing the degree of empathy. Emotions arise through the feeling of realism and presence that involves the user during the exposure.
The emotional state created by the immersive environment increases receptivity and promotes reflection. A more engaging environment will have a greater influence on the user’s perception during the exposure, awakening their senses, generating emotional reactions to the triggered events and creating a sense of realism and presence [4].
The sensation of immersion is caused by a set of factors such as the appearance and graphic quality, the narrative, the sound design and the interactivity. The interactivity of an immersive system is a process triggered by the emission of different types of stimuli to induce reactions in the user. The stimuli explore the user’s senses; the most common are visual, auditory and tactile. Traditionally, these types of stimuli cause voluntary and involuntary reactions (Figure 1).
In voluntary reactions, the user responds more or less consciously to stimuli in order to adapt to and influence the environment. The speed and type of response depend on the user’s attention and physical and intellectual abilities. Involuntary reactions cause unintended physical reflexes, which the user can to some extent control or inhibit, and also induce biological changes over which the individual has no control (unless they have specific training that can influence those changes). Some of the relevant biological changes are reflected in respiratory rate, heart rate and intensity, skin conductivity and brain waves.
The interactivity of a system is characterized by the type of interaction between the system and the user. The stimuli sent by the system or by the user trigger responses that induce actions that can produce new stimuli.
A truly immersive system explores the potential of voluntary and involuntary reactions. Voluntary reactions can be captured using buttons, movements or eye-tracking devices, but to capture certain involuntary reactions, it is necessary to use biofeedback devices. Real-time biofeedback makes it possible to capture and evaluate the user’s reactions to stimuli. The use of real-time biofeedback devices should, as far as possible, be non-invasive to avoid interfering with or limiting the immersive experience.
The use of real-time biofeedback mechanisms can add significant value to the interaction process, consequently increasing the feeling of immersion. In this way, an affective algorithm can be used for emotion recognition, identifying and quantifying the emotions felt by the user.
The data collected directly from each biofeedback device do not, by themselves, add value to the interaction process if they are not interpreted. These unimodal data undergo a first interpretation and conversion and, after this phase, are combined with the other data and transformed into multimodal data. This process is carried out by an affective algorithm, whose function is to collect, interpret, convert and combine the biodata from the different devices. The affective algorithm is also responsible for analyzing the multimodal data, recognizing emotions, comparing the values with the defined objectives and determining the configuration parameters, defining the types and intensities of new stimuli to be emitted, so that the system can automatically generate interactivity.
In the interactive process of a truly immersive system, the user can respond voluntarily to stimuli, when his responses and choices are made consciously, but can also interact unconsciously, having no control over the response to stimuli, which is determined through the biodata collected in real time.
The fact that self-control of a person’s biological state can be developed with time and experience [5] opens the door to applying this type of system in immersive environments for therapeutic purposes, such as situations in which the user is meant to train self-regulatory mechanisms when facing concrete scenarios (phobias, vertigo, anxiety, crisis and catastrophe, etc.).
The objective of this work is to describe an adaptive model for managing data flows generated by bio stimuli for affective algorithms in the design of interactive emotionally adaptive immersive environments, analyzing the interactive biofeedback process as a key element for the development of truly immersive environments.
The Model presented in this work aims to facilitate the planning, design and development of truly immersive interactive environments. Its conceptual base allows its adaptation to different immersive environments (Virtual Reality, Mixed Reality, 360° Video), with different application purposes (therapeutic, self-regulation, training).

1.2. Interactivity as One of the Essential Factors for Creating Immersive Environments

To consider the importance of interactivity mechanisms in virtual reality systems and their role in the sense of presence and emotions, it is first necessary to understand what interactivity means. The term originates from interaction, the relation between human beings [6]. Michael Jäckel explains the sociological concept of interaction: “The basic model on which the sociological concept of interaction is based is the relationship between two or more people who orientate themselves in their behavior and can perceive each other” [7].
Computer science adopted the term interaction to describe the use of computer systems by users. Starting with very basic interfaces for communication between computers and humans, complexity increased as technology advanced. Human–Computer Interaction (HCI) examines the structure of user interfaces and the ways of communication using hardware and software. Reference [8] presents three criteria to distinguish the terms interaction and interactivity. First, “interactivity” demands real and observable interactions among humans via a machine or between human and machine, implying real human behavior. Second, interactivity depends on a technical component that occupies a key position within the communication process. Third, no change of devices is necessary for interactive communication.
In a general sense, interactivity describes an active relationship between two persons or objects [9]. The term gained relevance in the late 1980s and 1990s with the increasing importance of multimedia, and several authors reflected on it and tried to construct a definition. Reference [10] defines interactivity as the “extent to which users can participate in modifying the form and content of a mediated environment in real time”. Reference [11] addresses the question of whether an interactive artwork can be immersive (narrative as virtual reality).
Sherman and Craig [12] defined the four dimensions they consider relevant for a VR experience: consumer inhabitation, feelings of immersion, sensory feedback and interactivity. According to [13,14], interaction mechanisms play a significant role in increasing the level of telepresence by providing social, cognitive and physical engagement.

1.3. Neuro and Biofeedback Systems

Emotions are present in humans’ daily life; they are crucial to communication and to everyday interpersonal events [15,16]. Emotions are human affective states that arise in response to an external or internal stimulus, such as a situation, an object or the memory of a past emotional event [16,17,18,19]. Humans use emotions to communicate with each other, and it is important to point out that a person’s emotional state can affect several daily activities, such as learning, decision making and memory [15,17].
Emotion is an episode of interrelated and synchronized change in five subsystems (Figure 2): cognitive processing, subjective feeling, action tendencies, physiological changes and motor expression [16,18].
Emotions comprise complex mental activities, although there is no consensus on a precise and complete definition of emotion [16,18]. Despite this difficulty, emotions can be described from two perspectives, the dimensional model and the discrete model (Figure 3). The dimensional model requires two dimensions to describe an emotion: valence (ranging from unpleasant to pleasant) and arousal (ranging from not aroused to excited). In the discrete model, emotions can be classified as basic emotions or complex emotions, the latter being combinations of basic ones. The most frequently cited basic emotions are happiness, sadness, anger, disgust, fear and surprise [16,20,21].
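As an illustration of how these two perspectives can be combined in software, the following minimal Python sketch stores a dimensional valence–arousal pair alongside a discrete label derived from it. The class name, quadrant rule and thresholds are assumptions made for illustration only, not the paper’s implementation.

```python
from dataclasses import dataclass

@dataclass
class EmotionEstimate:
    """Hypothetical container combining both models: a discrete label plus its dimensional coordinates."""
    label: str      # discrete model: basic emotion
    valence: float  # dimensional model: unpleasant (-1) .. pleasant (+1)
    arousal: float  # dimensional model: not aroused (0) .. excited (1)

def classify_quadrant(valence: float, arousal: float) -> str:
    """Illustrative rule: pick a basic emotion from the valence-arousal quadrant."""
    if valence >= 0:
        return "happiness" if arousal >= 0.5 else "calm/contentment"
    return "anger/fear" if arousal >= 0.5 else "sadness"

estimate = EmotionEstimate(classify_quadrant(-0.6, 0.8), valence=-0.6, arousal=0.8)
print(estimate)
```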
Therefore, nowadays, the ability to recognize emotions is a very important skill for achieving intelligent and effective social communication [22]. The recognition of different emotional states is very important in several areas of activity, such as medicine, education, intelligent systems and human–computer interaction, among others [15,22]. Since self-reporting of emotions is unreliable, other ways to recognize them are needed. It is possible to identify emotions through facial expressions and voice intonation, and also to detect them via physiological signals, such as the electroencephalogram (EEG), cardiovascular changes (heart rate, blood volume pressure and peripheral vascular resistance), respiration patterns, galvanic skin response, skin temperature and body language [15,16,21,22].
The processing of emotions involves different parts of the nervous system that interact with each other, especially the central and the autonomic systems. There are brain pathways between various cortical areas with connections that process emotions [23]. For example, the hypothalamus controls the part of the autonomic nervous system responsible for changes such as heart rate, skin temperature and respiration rate. The hypothalamus also controls parts of the amygdala, where feedback from the external environment is processed and emotional responses are mediated [21,22]. EEG can be used to measure and monitor changes in the brain, with the advantage that physiological signals are difficult to tamper with. For that reason, the use of EEG to recognize emotions has been receiving attention from researchers in this area [15]. In EEG analysis, it is necessary to consider characteristics such as the time domain, the frequency domain and the time-frequency domain of the EEG, in order to correlate the information between the different EEG channels and find more reliable information; however, each of these domains can also provide information about emotion separately [15,17]. The time domain captures temporal characteristics and supports EEG statistics such as the mean, power and standard deviation. The frequency domain is used in techniques such as power spectral density [15].
EEG quality depends on the disturbances that occur during the recording process. It is important to know the characteristics of each artefact in order to reduce it, either by removing the source of the disturbance during the recording or by applying signal filters during the analysis [24,25].
After being detected and acquired, the data need to be correctly archived so that they can be processed and analyzed afterwards. It is important to store this information in a general format that can be read by different signal-processing software [24].
EEG signals are also used in brain–computer interfaces (BCI), which enable communication between a computer and the individual, allowing the human brain to control external equipment. BCI technology can be used in different areas, such as psychology, medicine and neurogaming [26,27]. In neurogaming, the user can control the video game with brain waves. These kinds of games can have therapeutic aims, for example, improving working memory. There are studies that combine BCI based on non-invasive EEG with virtual and augmented reality. Using these technologies together gives the participant a greater sense of immersion [26,28].
Although there is not yet agreement on the patterns and brain regions responsible for emotions, regardless of the individual, it is believed that the amygdala is either responsible for fear or is one of the most important brain areas for processing this emotion. This part of the brain is most likely to be active when the person cannot predict what sensations mean, what to do about them or whether they are valuable in that specific situation [21,29]. Brain activation is indicated by a decrease in alpha EEG power [30]. One study found evidence that the posterior cingulate cortex, precuneus and medial prefrontal cortex represent emotions when stimuli convey emotional information or memories of an emotional stimulus. The angular gyrus is an important component in emotional memory retrieval [19]. Lindquist et al. (2012) found that the brain regions involved in basic psychological operations, whether emotional or not, are active during the experience of emotion [29].
Cortical electrical activity is produced by the activation of cortical neurons and is recorded on the scalp above the regions where those neurons are located. This activity is complex and variable, transmitting irregular signals to the computer screen. Despite this irregularity and variability, it is possible to recognize patterns that can be divided into several waves, identified by frequency (Hertz, Hz) and amplitude (microvolts, μV): Alpha (8–13 Hz), Beta (13–30 Hz), Theta (4–8 Hz), Delta (<4 Hz) and Gamma (30–100 Hz). These frequencies also reflect different physiological functions; for example, delta waves are seen during sleep, while theta waves appear during drowsiness. When a person is relaxed but awake, alpha waves are seen; when that person is alert, the EEG shows beta waves; and during problem solving, gamma waves are observed [25,31,32].
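A common way to obtain these band measures in practice is to integrate a power spectral density estimate over each frequency range. The sketch below (Python with NumPy/SciPy) follows that standard approach; the band boundaries come from the text above, while the function name and the synthetic input signal are assumptions for illustration only.

```python
import numpy as np
from scipy.signal import welch

# Conventional EEG bands from the text (Hz); exact boundaries vary slightly across authors.
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 100)}

def band_powers(signal: np.ndarray, fs: float = 500.0) -> dict:
    """Estimate absolute power per band for one EEG channel using Welch's PSD."""
    freqs, psd = welch(signal, fs=fs, nperseg=int(2 * fs))
    powers = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        powers[name] = np.trapz(psd[mask], freqs[mask])  # integrate the PSD over the band
    return powers

# Example with synthetic data: 10 s of noise sampled at 500 Hz (the rate listed in Table 1).
rng = np.random.default_rng(0)
print(band_powers(rng.standard_normal(5000)))
```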
Other EEG features that are very important in emotion recognition are frontal asymmetry and midline power. Midline power can be associated with emotional processing; in particular, frontal midline theta has been associated with relaxation from anxiety and with positive emotional events [16].
Asymmetric brain activity in the frontal area between the right and left hemispheres is responsible for frontal asymmetry. Frontal asymmetry is normally seen in the alpha (8–13 Hz) and theta (4–8 Hz) rhythms and is a potential tool to recognize discrete emotions. For example, anger can provoke higher levels of left frontal activity and lower right frontal activity, but frontal asymmetry is also related to valence, arousal and self-reported dominance [16]. Another way to determine the asymmetry is the spectral asymmetry index [33]. Orgo et al. (2015) found that negative stimuli increased the spectral asymmetry index in frontocentral, central, centroparietal, parietal and occipital areas, compared to neutral stimuli. They also found evidence of a significant decrease of the spectral asymmetry index in left temporal, centroparietal, parietal and occipital areas when the person was subjected to positive stimuli [33]. It was also found that the right frontal hemisphere is more active in response to negative emotions, while activity in the left frontal hemisphere increases with positive emotions [30,34]. There are two hypotheses to explain these findings. The first holds that the right hemisphere is dominant in processing emotions regardless of affective valence. The second holds that the right hemisphere is responsible for processing negative emotions, while the left hemisphere is specialized in positive ones [23].
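As a concrete example of the asymmetry measures discussed above, a widely used frontal asymmetry score is the difference of log-transformed alpha power between homologous right and left frontal electrodes. The sketch below is a generic illustration of that convention, not the metric used in this project; the input values are arbitrary.

```python
import numpy as np

def frontal_alpha_asymmetry(left_alpha_power: float, right_alpha_power: float) -> float:
    """
    Classic frontal asymmetry score: ln(right) - ln(left) alpha power
    (e.g., F4 vs. F3). Because alpha power is inversely related to activation,
    a positive score suggests relatively greater LEFT frontal activation,
    which the literature cited here associates with positive/approach emotions.
    """
    return float(np.log(right_alpha_power) - np.log(left_alpha_power))

# Illustrative values only (arbitrary units).
print(frontal_alpha_asymmetry(left_alpha_power=2.1, right_alpha_power=3.4))  # > 0
```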
Frontal power asymmetry is a measure frequently used in neurofeedback to train emotional self-regulation [30]. Neurofeedback is a non-invasive technique, an application of the brain–computer interface, used in the treatment of mental disorders as well as in enhancing brain performance, such as behavioral, cognitive and emotional processes, in real time [30,35]. Neurofeedback has been gaining attention because, in addition to its known benefits to the patient, it also guides mental activity from the brain area under study through real-time feedback [30,35,36].
The subject can learn how to improve his or her brain activity; besides the usual 2D applications, there are studies that use neurofeedback in 3D immersive environments [37,38]. 3D environments offer users more interaction with the world around them, which helps in the simulation of real-world tasks [38]. The characteristics inherent to immersive environments make users feel that they are really there and give them a sense of presence, since it is possible to create complex and realistic situations and/or environments [39,40].
Due to the sense of presence that this technology gives the user, it has been indicated as a tool for provoking emotions in laboratory settings. Several studies show that immersive environments can provoke different emotional states, such as anxiety, relaxation and different moods in social environments featuring avatars [38,39]. Immersive environments are also a tool for behavioral research in psychological assessment [39].
Electroencephalography is a painless and non-invasive technique for acquiring electrical activity from the brain, using electrodes in standard positions on the head that conduct the electrocortical activity to the amplification equipment; amplification is necessary because this cerebral cortex activity is measured in microvolts and needs to be amplified to be displayed on a computer [24,25,41]. EEG offers several advantages for the diagnosis of different pathologies, such as its sensitivity, almost real-time recording, accessibility and moderately low cost [41,42].
Besides other applications of 3D environments, such as education or architecture, they have proven effective for therapeutic applications, since 3D environments have stronger effects than 2D stimulation [39,43].

2. Biofeedback for Self-Regulation Training in Adaptive Environments Model

Exposure to immersive environments affects the user by creating sensations that arouse emotions, causing involuntary reactions such as changes in heart rate and intensity, respiratory rate, brain activity, skin conductivity and eye movements [44]. There are biofeedback devices capable of recording these changes during exposure. This biofeedback allows the analysis of the impact that the immersive environment had on the user, but it is possible to go even further: real-time biofeedback can be used as an element of interactivity.
The biodata captured by biofeedback devices make it possible to determine how the stimuli affect the user; the conversion of the unimodal data obtained by each device into multimodal data makes it possible, through an affective algorithm, to understand how mental changes influence the user’s emotional state.
The multimodal data are interpreted by the affective algorithm. In a first phase, it uses the discrete model to identify the emotions felt by the user, and in a second phase, it uses the dimensional model to determine the valence and arousal of the emotions under analysis. The combination of these two models allows the user’s emotional state to be determined through the emotion-intensity binomial.
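One possible reading of this two-phase pipeline is sketched below in Python: phase one picks a discrete emotion from multimodal features, phase two grades its intensity from the dimensional (arousal) axis. The feature names, thresholds and the way intensity is derived are assumptions for illustration, not the project’s affective algorithm.

```python
from typing import Dict, Tuple

def emotion_intensity(features: Dict[str, float]) -> Tuple[str, float]:
    """Illustrative two-phase estimate of the emotion-intensity binomial."""
    valence, arousal = features["valence"], features["arousal"]

    # Phase 1 - discrete model: choose the dominant basic emotion (illustrative thresholds).
    if valence < -0.3:
        label = "fear" if arousal > 0.5 else "sadness"
    elif valence > 0.3:
        label = "happiness"
    else:
        label = "neutral"

    # Phase 2 - dimensional model: intensity follows arousal, clipped to [0, 1].
    intensity = max(0.0, min(1.0, arousal))
    return label, intensity

print(emotion_intensity({"valence": -0.5, "arousal": 0.8}))  # ('fear', 0.8)
```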
The affective algorithm identifies the user’s emotional state in real time and can use two interactivity strategies, a passive interactivity strategy (PIS) or an active interactivity strategy (AIS). The difference between these two strategic approaches is in the information that is provided in real time to the user (Figure 4).
The passive interactivity strategy, after identifying emotions and quantifying their intensity, uses these values so that the system generates new stimuli that can lead the user to the intended emotional state. In PIS, the user does not have access to information about his own emotional state, and the interactivity is entirely controlled by the system.
The active interactivity strategy provides the user with real-time information about his own emotional state, allowing him to regulate his emotions to control the stimuli sent by the immersive environment. In AIS, the user is aware that his behavior influences the system, and through self-regulation processes, he can train which responses are most appropriate to the stimuli generated by the system.
Both the passive and the active interactivity strategies, through the measurement of the user’s emotional data, allow the creation of an Emotionally Adaptive Immersive Environment, which continuously adapts the stimuli to the user’s emotional state. This continuous adaptation makes each exposure a unique and personalized experience. During the exposure, adaptive behavior influences the user’s attitudes and his response to stimuli (Figure 5).
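The difference between the two strategies can be expressed compactly as a dispatch on what is fed back to the user; the Python sketch below, with hypothetical names and a purely illustrative adaptation rule, is one way to model it and is not taken from the paper.

```python
from enum import Enum

class Strategy(Enum):
    PIS = "passive"  # system-controlled: the user never sees their own emotional state
    AIS = "active"   # the user sees their state and self-regulates to drive the system

def interaction_step(strategy: Strategy, emotion: str, intensity: float) -> dict:
    """
    Hypothetical dispatch of one interaction cycle. In both strategies the system
    adapts the stimuli; only the feedback shown to the user differs.
    """
    stimulus_update = {"intensity": round(1.0 - intensity, 2)}  # illustrative adaptation rule
    feedback_to_user = None if strategy is Strategy.PIS else {"emotion": emotion, "intensity": intensity}
    return {"stimulus_update": stimulus_update, "feedback_to_user": feedback_to_user}

print(interaction_step(Strategy.AIS, "anxiety", 0.7))
```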
When the participant receives information in real time about a certain aspect of his physiology during an exposure, or a sequence of exposures, to an immersive environment, he may realize that changes in his mental state can influence the environment. This progressive awareness influences and transforms the participant’s attitude (Figure 5).
A biofeedback system applied in an immersive environment transforms it into an emotionally adaptable and immersive environment, where the user’s emotional state can optimize the experience through the continuous adaptation of stimuli to his own emotional state.
The Emotionally Adaptive Immersive Environment uses the affective algorithm to relate stimuli to emotions (Figure 6). The system can generate different stimuli (visual, auditory or tactile); their impact depends on their frequency, intensity and duration.
The stimuli trigger voluntary and involuntary reactions in the user. Voluntary reactions reflect a conscious response; the user chooses the attitude considered most appropriate to respond to the stimulus, having control over his emotional state.
Involuntary reactions are manifested through physical reflexes and biological changes. Eye movements and dodging or shrinking of the body are examples of physical reflexes that naturally occur in unexpected situations. These movements can be captured through sensors that inform the system of the position the user assumes, and they are also important elements in the interactivity process.
Biological changes such as respiratory rate, heart rate and intensity, skin conductivity and brain waves can be captured in real time using biofeedback devices. These biological changes are involuntary, and their intensity may vary from person to person. Each of them generates data, the unimodal data. Unimodal data are extremely relevant, as they characterize the biological response that the human body gives to each stimulus received. The analysis, processing and interpretation of the collected unimodal data allow their conversion into multimodal data (Figure 7).
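One minimal way to picture this conversion step is to normalize each unimodal stream against an assumed range and concatenate the results into a single feature vector. The Python sketch below does exactly that; the channel names and ranges are placeholders, not the calibration used in the project or the IMO protocol itself.

```python
import numpy as np

def to_multimodal(unimodal: dict) -> np.ndarray:
    """
    Minimal fusion sketch: normalize each unimodal stream to [0, 1] against
    assumed per-channel ranges, then concatenate into one multimodal feature vector.
    The ranges below are placeholders, not calibrated values.
    """
    ranges = {"heart_rate": (50, 150),      # beats per minute
              "respiration": (8, 30),       # breaths per minute
              "eda": (0.0, 20.0),           # microsiemens
              "alpha_power": (0.0, 50.0)}   # arbitrary units
    features = []
    for channel, value in unimodal.items():
        lo, hi = ranges[channel]
        features.append(np.clip((value - lo) / (hi - lo), 0.0, 1.0))
    return np.array(features)

print(to_multimodal({"heart_rate": 95, "respiration": 18, "eda": 6.2, "alpha_power": 12.0}))
```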
The affective algorithm uses multimodal data to recognize emotions. This process makes it possible to determine the user’s emotional state during exposure to stimuli and is a key element of the affective algorithm’s interaction process in emotionally adaptive immersive environments. The use of multimodal data makes it possible to assess the type and intensity of the emotions felt with a greater degree of reliability.
The affective algorithm, in addition to analyzing and interpreting the multimodal data to determine the user’s mental state, compares these values with standard values and calculates new values for the frequency, intensity and duration of the stimuli that allow the intended objectives to be reached. In this way, the affective algorithm has a real-time influence on the exposure, transforming the narrative itself into an interactive narrative.
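This closed loop can be sketched as a simple proportional adjustment: compare the measured emotional intensity with a target and rescale the stimulus parameters accordingly. The snippet below (Python) is an illustrative control rule; the gain, bounds and even the direction of the adjustment depend on the goal of each exposure and are not values from the paper.

```python
def next_stimulus_parameters(current: dict, measured_arousal: float,
                             target_arousal: float = 0.4, gain: float = 0.5) -> dict:
    """
    Proportional adjustment sketch: compare the measured emotional intensity with
    the target defined for the session and rescale the frequency, intensity and
    duration of the next stimuli. Gains and bounds are illustrative.
    """
    error = measured_arousal - target_arousal            # > 0: user more aroused than intended
    scale = max(0.2, min(2.0, 1.0 - gain * error))       # here: shrink stimuli when over target
    return {"frequency_hz": current["frequency_hz"] * scale,
            "intensity": min(1.0, current["intensity"] * scale),
            "duration_s": current["duration_s"] * scale}

print(next_stimulus_parameters({"frequency_hz": 0.5, "intensity": 0.8, "duration_s": 4.0},
                               measured_arousal=0.7))
```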

3. Materials and Methods

3.1. Materials

Non-invasive electroencephalogram devices (Looxid Link™ Mask for VIVE) can be used coupled with VR headsets such as the HTC Vive Pro™ or the HTC Vive Pro Eye™ with precision eye-tracking capabilities (Table 1). The Looxid Link™ Mask for VIVE is compatible with those VR headsets, and its use does not interfere with the immersive experience because the participant does not notice that he is wearing it.
Other data acquisition and analysis systems, such as the Biopac™ MP160 (Table 2), capture biological signals (cardiovascular changes, respiration patterns, galvanic skin response) in a non-invasive way and without disturbing the sensation of immersion, maintaining the participant’s freedom of movement, since they can transmit data wirelessly, which avoids the discomfort and inconvenience of cable connections.

3.2. Methods

Interactivity, as one of the factors that trigger the sensation of immersion, must be considered in the development process of immersive environments. The interaction generated by involuntary reactions must be carefully studied, as well as the interaction that uses voluntary reactions.
The authors propose an adaptive model for biofeedback data flows management, which handles the data generated by the participant’s involuntary reactions and is incorporated in the requirements specification during the design of interactive immersive systems.

3.2.1. Biodata as Elements of Interactivity

To select which involuntary responses to stimuli are most appropriate for use as an interaction element, several aspects must be considered:
  • The purpose of exposure to the immersive environment.
  • The role of the participant.
  • Identifying the possible types of reactions.
  • The mobility allowed to the participant during the exhibition.
The selection of the biodata used as an interaction element must be aligned with the purpose of the exposure. Brain activity, cardiovascular changes, galvanic skin response, skin temperature, respiratory rate and eye movements result from exposure to stimuli and are generators of biodata that can be used as an interaction element. The selection of the types of biodata to be used is decisive for the choice of the most appropriate equipment.
A truly immersive system uses an adaptive environment and explores the participant’s involuntary reactions as an element of interaction. The participant can have a more active or more observant role.
The identification of possible reactions makes it possible to divide the biodata value scales into intervals. The parameterization of the affective algorithm defines the number of intervals on each scale and the range of values for each interval, assigning the degree of sensitivity of the system to the reactions of the participant.
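A minimal sketch of this parameterization, in Python, is shown below: the value scale of a biodata channel is split into a configurable number of intervals, and each incoming measurement is mapped to one of them. The channel, range and number of intervals are illustrative choices, not the project’s actual configuration.

```python
import bisect

def make_intervals(scale_min: float, scale_max: float, n_intervals: int) -> list:
    """Split a biodata value scale into equal-width intervals (the system's sensitivity)."""
    step = (scale_max - scale_min) / n_intervals
    return [scale_min + i * step for i in range(1, n_intervals)]  # interior boundaries

def classify(value: float, boundaries: list) -> int:
    """Return the 0-based index of the interval the measured value falls into."""
    return bisect.bisect_right(boundaries, value)

# Illustrative parameterization: heart rate split into 5 intervals between 50 and 150 bpm.
hr_boundaries = make_intervals(50, 150, 5)   # [70, 90, 110, 130]
print(classify(97, hr_boundaries))           # 2 -> third interval (90-110 bpm)
```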
One of the factors to consider during the exposure is the mobility of the participant: the participant’s position (standing, sitting), the need to move in space, the possibility of performing sudden movements and the proximity to electrical or magnetic equipment that may interfere with the devices that collect the biodata.

3.2.2. Interactive Immersive Environments

To test the model presented, the authors developed two immersive environments as similar as possible and able to use the same core engine. The first immersive environment, in virtual reality, was designed to reproduce, using computer-generated scenarios, the same room used in the second immersive environment, which was developed using 360° video.
The same 360° narrative was applied to both environments. Each exposure lasts approximately seven minutes and simulates the completion of a Stroop test that is part of a job recruitment selection process.
During the exposure, the immersive environment generates a series of visual and sound stimuli, which simulate the hallucinations that a person with schizophrenia experiences during a psychotic episode.
The participant must stay focused to perform the Stroop test despite the stimuli to which he is being exposed. At the same time, the participant’s emotional state influences the amount and intensity of stimuli generated by the system. Anxiety and excitement revealed by the participant generate more distracting stimuli, while concentration and focus on the task decrease the emission and intensity of the stimuli generated by the system, inducing self-regulating behavior.
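As an illustration of this scenario-specific rule, the sketch below (Python) maps normalized attention and relaxation indexes, such as the mind indexes listed in Table 1, to the intensity and rate of the distracting stimuli; the coefficients and output ranges are assumptions for illustration only, not the project’s tuning.

```python
def hallucination_stimuli(attention: float, relaxation: float) -> dict:
    """
    Scenario rule sketch for the Stroop-test exposure: higher anxiety/excitement
    (low relaxation) raises the number and intensity of distracting stimuli,
    while sustained attention lowers them, rewarding self-regulation.
    Inputs are assumed to be normalized indexes in [0, 1]; coefficients are illustrative.
    """
    anxiety = 1.0 - relaxation
    intensity = max(0.0, min(1.0, 0.7 * anxiety + 0.3 * (1.0 - attention)))
    count_per_minute = round(2 + 8 * intensity)   # between 2 and 10 distractors per minute
    return {"intensity": round(intensity, 2), "count_per_minute": count_per_minute}

print(hallucination_stimuli(attention=0.35, relaxation=0.4))
```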
This exposure lets the participant experience, for a few minutes, the same sensations that a person with schizophrenia feels during a psychotic episode, and understand the effect that this type of episode can have on the performance of daily tasks.
Biodata selected as interaction elements for this experience:
  • Brain activity.
  • Cardiovascular changes.
  • Galvanic skin response.
Participant role during the experience:
  • Active role.
Possible reactions:
  • During the experiment, an expert monitors the reactions of the participant.
  • The experience can be interrupted at any time by the participant’s initiative.
  • The expert may interrupt the experiment if he considers that the participant’s reactions endanger safety.
Position and mobility of the participant:
  • Sitting on a chair.
  • Despite being in a fixed position, the participant can freely move his arms, torso and head in order to explore the surrounding environment.
The system control Core Engine is based on the real-time Unity platform. The Core Engine (Figure 6) receives the biodata from the devices connected to the participant and analyzes, processes and interprets the values, converting the unimodal data into multimodal data through the IMO protocol (Figure 7). During the exposure, the system readjusts the stimuli in real time according to the values returned by the affective algorithm. The same core was applied to the two immersive environments, with exactly the same narrative, timeline, stimuli and intensities.
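The overall cycle can be summarized, independently of the engine, as read, fuse, recognize and readjust. The sketch below expresses that loop in Python for clarity only; the actual Core Engine runs on Unity, and the device, algorithm and controller interfaces shown here are hypothetical.

```python
import time

def core_engine_cycle(devices, affective_algorithm, stimulus_controller, period_s: float = 0.1):
    """
    Platform-agnostic sketch of the real-time loop described in the text
    (the actual Core Engine runs on Unity). Each cycle: read unimodal biodata,
    fuse it into multimodal features, estimate the emotion-intensity pair and
    push the recalculated stimulus parameters back to the environment.
    `devices`, `affective_algorithm` and `stimulus_controller` are hypothetical
    interfaces, not the project's actual APIs.
    """
    while stimulus_controller.running():
        unimodal = {name: device.read() for name, device in devices.items()}  # EEG, ECG, EDA...
        multimodal = affective_algorithm.fuse(unimodal)                       # IMO-style conversion
        emotion, intensity = affective_algorithm.recognize(multimodal)
        stimulus_controller.update(affective_algorithm.next_parameters(emotion, intensity))
        time.sleep(period_s)   # keep the loop close to the device streaming rate
```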

4. Results

Empathy VR-Schizophrenia is a project, within the scope of promoting mental health and wellness literacy, which aims to study the impact of using immersive environments to create empathy towards people with schizophrenia.
The Adaptive Model for Biofeedback Data Flows Management was applied in the design of two interactive immersive environments. To test the model’s implementation in real cases, the authors developed two different types of immersive environments (Figure 8): a virtual reality environment (virtual computer graphics scenarios) and an immersive environment using 360° video.
To verify the applicability of this model, the authors chose to develop two different immersive environments with similar characteristics, sharing the same purpose and the same narrative. The application of the same model in the development of the two environments made their performance and their dynamic and adaptive behavior similar. The two environments generate the same stimuli and intensities for the same reactions of the participants.

5. Discussion

The application of the Adaptive Model for Biofeedback Data Flows Management made it possible to conceptualize, design, develop and test two different types of immersive environments.
The conceptualization of an immersive environment requires a deep reflection on several factors that can determine its success. The engineering process to develop this type of system must consider as requirements, in addition to the socio-technical aspects, a strong human–computer interaction component.
One of the important factors is the use of 360° narratives, in which it is not possible to control where and when the participant will direct his gaze. The autonomy granted to the participant to freely explore the surrounding environment means that each participant can have a different experience. As it is not possible to “force” the participant to look at a certain point, the emission of stimuli is the way to get their attention.
The use of equipment to collect biodata implies mobility restrictions, and these restrictions must consider the movement through space and the performance of sudden movements. The ranges of values collected by the biofeedback devices must consider their sensitivity.
The developed environments had the purpose of creating empathy towards people with schizophrenia; the application of this adaptive model made it possible to develop two immersive environments using different technologies while ensuring that both met the defined objectives and requirements.

6. Conclusions

The reaction to stimuli can vary from person to person, and the same person may react differently to the same stimulus in different situations. Exposure to immersive environments contributes to increasing the receptivity for the construction of emotions. There is a relationship between emotions and the stimuli that triggered them. The emotional state influences the type of reaction and the ability to respond to stimuli.
The emotions that the participant feels during exposure to immersive environments trigger voluntary and involuntary responses, conditioning their actions and attitudes. Faced with a stimulus, the participant may have adaptive, reactive, or self-regulating attitudes.
During exposure to immersive environments, biofeedback devices can be used for monitoring and observation. The collection and processing of biodata generated by biofeedback devices in real time allows the use of biodata as an element that generates interactivity.
An Emotionally Adaptive Immersive Environment can use an affective algorithm to convert unimodal data into multimodal data, recognize emotions and relate them to the stimuli that caused them. The mapping between stimuli and emotions made by the affective algorithm raises the interactivity of the system to another level.
The type of interactivity that real-time biofeedback systems provide allows their use in therapeutic programs, among which those that promote adaptation behaviors or self-regulation in the face of situations that are uncomfortable for the user stand out. This investigation opens the possibility of using this type of Emotionally Adaptive Immersive Environment in different areas; it can be integrated into therapeutic programs related to the treatment of some types of phobias (for example, arachnophobia or social phobia) and into programs for the self-regulation of behaviors (anxiety control, improving the ability to concentrate).

Author Contributions

Conceptualization, P.V.G.; investigation, P.V.G., A.M., J.D., C.S. and A.C.; methodology, P.V.G. and A.M.; project administration, P.V.G.; supervision, A.M. and J.P.; validation, P.V.G., A.M. and J.P.; visualization, P.V.G. and A.C.; writing—original draft, P.V.G., J.D. and C.S.; writing—review and editing, P.V.G., A.M. and J.P. All authors have read and agreed to the published version of the manuscript.

Funding

This publication cost was funded by CITIC, Research Center of Information and Communication Technologies, University of A Coruña.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Acknowledgments

This research was carried out and used the equipment of the LabRP-CIR, Psychosocial Rehabilitation Laboratory, Center for Rehabilitation Research, School of Health, Polytechnic Institute of Porto. This work had scientific support from CITIC, Research Center of Information and Communication Technologies, University of A Coruña.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. da Fonseca, V. Importância das emoções na aprendizagem: Uma abordagem neuropsicopedagógica. Rev. Psicopedag. 2016, 102, 365–384. [Google Scholar]
  2. de Tommaso, M.; Ricci, K.; Conca, G.; Vecchio, E.; Delussi, M.; Invitto, S. Empathy for pain in fibromyalgia patients: An EEG study. Int. J. Psychophysiol. 2019, 146, 43–53. [Google Scholar] [CrossRef]
  3. Santamaría-García, H.; Baez, S.; García, A.M.; Flichtentrei, D.; Prats, M.; Mastandueno, R.; Sigman, M.; Matallana, D.; Cetkovich, M.; Ibáñez, A. Empathy for others’ suffering and its mediators in mental health professionals. Sci. Rep. 2017, 7, 1–13. [Google Scholar] [CrossRef]
  4. Gomes, P.V.; Donga, J.; Marques, A.; Azevedo, J.; Pereira, J. Analysis and definition of data flows generated by bio stimuli in the design of interactive immersive environments. Proceedings 2020, 54, 26. [Google Scholar] [CrossRef]
  5. Bersak, D.; McDarby, G.; Augenblick, N.; McDarby, P.; McDonnell, D.; McDonald, B.; Karkun, R. Intelligent biofeedback using an immersive competitive environment. Most 2001. [Google Scholar]
  6. Blumer, H. Symbolic Interactionism: Perspective and Method; University of California Press: Berkeley, CA, USA, 1986. [Google Scholar]
  7. Jäckel, M. Interaktion. Soziologische Anmerkungen zu einem Begriff. Rundfunk und Fernsehen 1995, 43, 463–476. [Google Scholar]
  8. Quiring, O.; Schweiger, W. Interactivity: A review of the concept and a framework for analysis. Communications 2008, 33, 147–167. [Google Scholar] [CrossRef] [Green Version]
  9. Mechant, P.; Van Looy, J. Interactivity. In The Johns Hopkins Guide to Digital Media; Ryan, M.-L., Emerson, L., Robertson, B., Eds.; Johns Hopkins University Press: Baltimore, MD, USA, 2014; pp. 302–305. [Google Scholar]
  10. Steuer, J. Defining Virtual Reality: Dimensions Determining Telepresence. J. Commun. 1992, 42, 73–93. [Google Scholar] [CrossRef]
  11. Ryan, M. Immersion vs. interactivity: Virtual reality and literary theory. Substance 1999, 2, 110–137. [Google Scholar] [CrossRef]
  12. Sherman, W.R.; Craig, A.B. Understanding Virtual Reality; Elsevier: Cambridge, MA, USA, 2019. [Google Scholar]
  13. Baker, S.; Waycott, J.; Robertson, E.; Carrasco, R.; Neves, B.B.; Hampson, R.; Vetere, F. Evaluating the use of interactive virtual reality technology with older adults living in residential aged care. Inf. Process. Manag. 2020, 57, 102105. [Google Scholar] [CrossRef]
  14. Carrasco, R.; Baker, S.; Waycott, J.; Vetere, F. Negotiating stereotypes of older adults through avatars. In Proceedings of the 29th Australian Conference on Computer-Human Interaction, Brisbane, Australia, 28 November—1 December 2017; pp. 218–227. [Google Scholar] [CrossRef]
  15. Chao, H.; Zhi, H.; Dong, L.; Liu, Y. Recognition of emotions using multichannel EEG data and DBN-GC-based ensemble deep learning framework. Comput. Intell. Neurosci. 2018, 2018, 1–11. [Google Scholar] [CrossRef]
  16. Zhao, G.; Zhang, Y.; Ge, Y. Frontal EEG Asymmetry and middle line power difference in discrete emotions. Front. Behav. Neurosci. 2018, 12, 1–14. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  17. Sá, C.; Gomes, P.V.; Marques, A.; Correia, A. The use of portable EEG devices in development of immersive virtual reality environments for converting emotional states into specific commands. Proceedings 2020, 54, 43. [Google Scholar] [CrossRef]
  18. Scherer, K.R. What are emotions? And how can they be measured? Soc. Sci. Inf. 2005, 44, 695–729. [Google Scholar] [CrossRef]
  19. Kim, J.; Schultz, J.; Rohe, T.; Wallraven, C.; Lee, S.-W.; Bülthoff, H.H. Abstract representations of associated emotions in the human brain. J. Neurosci. 2015, 35, 5655–5663. [Google Scholar] [CrossRef] [Green Version]
  20. Hamann, S. Mapping discrete and dimensional emotions onto the brain: Controversies and consensus. Trends Cogn. Sci. 2012, 16, 458–466. [Google Scholar] [CrossRef]
  21. Val-Calvo, M.; Álvarez-Sánchez, J.R.; Ferrández-Vicente, J.M.; Díaz-Morcillo, A.; Fernández-Jover, E. Real-time multi-modal estimation of dynamically evoked emotions using EEG, heart rate and galvanic skin response. Int. J. Neural Syst. 2020, 4, 1–17. [Google Scholar] [CrossRef] [Green Version]
  22. Alazrai, R.; Homoud, R.; Alwanni, H.; Daoud, M.I. EEG-based emotion recognition using quadratic time-frequency distribution. Sens. Switz. 2018, 8, 2739. [Google Scholar] [CrossRef] [Green Version]
  23. Cao, R.; Shi, H.; Wang, X.; Huo, S.; Hao, Y.; Wang, B.; Guo, H.; Xiang, J. Hemispheric asymmetry of functional brain networks under different emotions using EEG data. Entropy 2020, 22, 939. [Google Scholar] [CrossRef]
  24. Paszkiel, S.; Szpulak, P. Methods of acquisition, archiving and biomedical data analysis of brain functioning. In Biomedical Engineering and Neuroscience. BCI 2018. Advances in Intelligent Systems and Computing; Hunek, W., Paszkiel, S., Eds.; Springer: Cham, Switzerland, 2018; Volume 720. [Google Scholar] [CrossRef]
  25. Marcuse, L.; Fields, M.; Yoo, Y.J. Rowan’s Primer of EEG E-Book, 2nd ed.; Elsevier: Amsterdam, The Netherlands, 2016. [Google Scholar]
  26. Paszkiel, S. Using BCI and VR technology in neurogaming. In Econometrics for Financial Applications; Springer Science and Business Media LLC: Cham, Switzerland, 2019; Volume 852, pp. 93–99. [Google Scholar]
  27. Zhao, M.; Gao, H.; Wang, W.; Qu, J. Research on human-computer interaction intention recognition based on EEG and eye movement. IEEE Access 2020, 8, 145824–145832. [Google Scholar] [CrossRef]
  28. Paszkiel, S. Augmented reality of technological environment in correlation with brain computer interfaces for control processes. In Recent Advances in Automation, Robotics and Measuring Techniques. Advances in Intelligent Systems and Computing; Szewczyk, R., Zieliński, C., Kaliczyńska, M., Eds.; Springer: Cham, Switzerland, 2014; Volume 267, pp. 197–203. [Google Scholar]
  29. Lindquist, K.A.; Wager, T.D.; Kober, H.; Bliss-Moreau, E.; Barrett, L.F. The brain basis of emotion: A meta-analytic review. Behav. Brain Sci. 2012, 35, 121–143. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  30. Zotev, V.; Phillips, R.; Yuan, H.; Misaki, M.; Bodurka, J. Self-regulation of human brain activity using simultaneous real-time fMRI and EEG neurofeedback. NeuroImage 2014, 85, 985–995. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  31. Rabcan, J.; Levashenko, V.; Zaitseva, E.; Kvassay, M. Review of methods for EEG signal classification and development of new fuzzy classification-based approach. IEEE Access 2020, 8, 189720–189734. [Google Scholar] [CrossRef]
  32. Marzbani, H.; Marateb, H.R.; Mansourian, M. Methodological note: Neurofeedback: A comprehensive review on system design, methodology and clinical applications. Basic Clin. Neurosci. J. 2016, 7, 143–158. [Google Scholar] [CrossRef] [Green Version]
  33. Orgo, L.; Bachmann, M.; Lass, J.; Hinrikus, H. Effect of negative and positive emotions on EEG spectral asymmetry. In Proceedings of the 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015; pp. 8107–8110. [Google Scholar] [CrossRef]
  34. Poole, B.D.; Gable, P.A. Affective motivational direction drives asymmetric frontal hemisphere activation. Exp. Brain Res. 2014, 232, 2121–2130. [Google Scholar] [CrossRef] [PubMed]
  35. Lorenzetti, V.; Melo, B.; Basílio, R.; Suo, C.; Yücel, M.; Tierra-Criollo, C.J.; Moll, J. Emotion regulation using virtual environments and real-time fMRI neurofeedback. Front. Neurol. 2018, 9, 390. [Google Scholar] [CrossRef] [PubMed]
  36. Herwig, U.; Lutz, J.; Scherpiet, S.; Scheerer, H.; Kohlberg, J.; Opialla, S.; Preuss, A.; Steiger, V.; Sulzer, J.; Weidt, S.; et al. Training emotion regulation through real-time fMRI neurofeedback of amygdala activity. NeuroImage 2019, 184, 687–696. [Google Scholar] [CrossRef]
  37. Batail, J.-M.; Bioulac, S.; Cabestaing, F.; Daudet, C.; Drapier, D.; Fouillen, M.; Fovet, T.; Hakoun, A.; Jardri, R.; Jeunet, C.; et al. EEG neurofeedback research: A fertile ground for psychiatry? Encéphale 2019, 45, 245–255. [Google Scholar] [CrossRef] [PubMed]
  38. Marín-Morales, J.; Llinares, C.; Guixeres, J.; Alcañiz, M. Emotion recognition in immersive virtual reality: From statistics to affective computing. Sensors 2020, 20, 5163. [Google Scholar] [CrossRef] [PubMed]
  39. Valenza, G. Interactive Storytelling in a Mixed Reality Environment: The Effects of Interactivity. Sci. Rep. 2018, 8, 1–15. [Google Scholar]
  40. Baños, R.M.; Botella, C.; Rubió, I.; Quero, S.; García-Palacios, A.; Alcañiz, M. Presence and emotions in virtual environments: The influence of stereoscopy. Cyberpsychol. Behav. 2008, 1, 1–8. [Google Scholar] [CrossRef]
  41. Peng, H.; Chen, X.; Wang, Z.; Zhu, J.; Zhang, X.; Sun, S.; Li, J.; Huo, X.; Li, X. Multivariate pattern analysis of EEG-based functional connectivity: A study on the identification of depression. IEEE Access 2019, 7, 92630–92641. [Google Scholar] [CrossRef]
  42. Jebelli, H.; Khalili, M.M.; Lee, S. A continuously updated, computationally efficient stress recognition framework using electroencephalogram (EEG) by applying online multitask learning algorithms (OMTL). IEEE J. Biomed. Heal. Inform. 2019, 23, 1928–1939. [Google Scholar] [CrossRef] [PubMed]
  43. Schubring, D.; Kraus, M.; Stolz, C.; Weiler, N.; Keim, D.A.; Schupp, H. Virtual reality potentiates emotion and task effects of alpha/beta brain oscillations. Brain Sci. 2020, 10, 537. [Google Scholar] [CrossRef] [PubMed]
  44. Veloso Gomes, P.; Donga, J.; Sá, V.J. Software requirements definition processes in gamification development for Immersive environments. In Handbook of Research on Solving Modern Healthcare Challenges with Gamification; de Queirós, R.A.P., Marques, A.J., Eds.; IGI Global: Hershey, PA, USA, 2021; pp. 68–78. [Google Scholar]
Figure 1. Voluntary and involuntary reactions to stimuli in immersive environments.
Figure 2. Subsystems of interrelated and synchronized change in an Emotion episode.
Figure 3. The dimensional model and the discrete model in emotions definition.
Figure 4. Passive and Active Interactivity strategies used by Affective Algorithms in immersive environments.
Figure 5. Behavioral change during continuous exposure to stimuli generated in immersive environments.
Figure 6. Biofeedback mechanisms to identify emotions and generate interactivity in adaptive immersive environments conceptual protocol model.
Figure 7. Affective algorithm in the conversion from unimodal data to multimodal data after application of the IMO protocol.
Figure 8. Empathy VR—Schizophrenia Project Scenarios: (a) virtual reality—virtual computer graphics scenario; (b) 360° video—real scenario.
Table 1. EEG equipment specification applied to Emotionally Adaptive Immersive Environment.

Equipment: Computer
  CPU: Intel® Core™ i7-9700K (3.60 GHz–4.90 GHz)
  Graphics card: NVIDIA® GeForce® RTX 2080 Ti
  Memory: 64 GB RAM

Equipment: VR Headset (HTC Vive Pro™ or Vive Pro Eye™)
  High-resolution dual AMOLED 3.5″ diagonal screens
  1440 × 1600 pixels per eye (2880 × 1600 pixels combined)
  Refresh rate: 90 Hz
  Field of view: 110 degrees
  Integrated microphones with 3D spatial audio
  Four SteamVR Base Station 2.0: 10 m × 10 m
  VIVE Wireless Adapter

Equipment: Looxid Link™ Mask for VIVE
  EEG sensors with Looxid Link Hub
  6 channels: AF3, AF4, AF7, AF8, Fp1, Fp2
  1 reference: FPz at the extended 10-10 system
  Dry electrodes on flexible PCB
  Sampling rate: 500 Hz
  Resolution: 24 bits per channel (with 1 LSB = 0.27 μV)
  Filtering: digital notch filters at 50 Hz and 60 Hz, 1–50 Hz digital bandpass
  Real-time data access:
    Raw EEG data: 500 Hz (with/without filter options)
    Feature indexes (alpha, beta, gamma, theta, delta): 10 Hz
    Mind indexes (attention, relaxation, balance): 10 Hz
Table 2. ECG, plethysmography and galvanic skin response equipment specification applied to Emotionally Adaptive Immersive Environment.

Equipment: BIOPAC™ MP160

ECG:
  Number of channels: 16
  Absolute maximum input: ±15 V
  Operational input voltage: ±10 V
  A/D resolution: 16 bits
  Accuracy (% of FSR): ±0.003
  Input impedance: 1.0 MΩ
  Amplifier module isolation: provided by the MP unit, isolated clean power
  CE marking: EC Low Voltage and EMC Directives
  Leakage current: <8 µA (normal), <400 µA (single fault)
  Fuse: 2 A (fast blow)

ECG and respiratory amplifier:
  Transmitter: ultra-low-power 2.4 GHz bi-directional digital RF transmitter
  Rate: 2 kHz, maximum
  Screen: color, 6 cm diagonal
  RF reception range: 1 m (line of sight, approx.)
  Memory: 32 GB
  Built-in accelerometer: X, Y, Z axes; rate 100–400 Hz; range: ±2–16 G

Plethysmography and galvanic skin response:
  Signal type: PPG plus EDA
  Resolution: PPG: FSR/4096 (4.88 mV); EDA: 0.012 μS (min step)
  Operational range: 10 m
  Transmitter: ultra-low-power 2.4 GHz bi-directional digital RF transmitter; rate: 2000 Hz (between transmitter and receiver)
