Sensors
  • Review
  • Open Access

9 June 2019

Towards the Recognition of the Emotions of People with Visual Disabilities through Brain–Computer Interfaces

Jesús Leonardo López-Hernández, Israel González-Carrasco, José Luis López-Cuadrado and Belén Ruiz-Mezcua
Computer Science Department, Universidad Carlos III de Madrid, Av. Universidad 30, 28911 Leganés, Madrid, Spain
This article belongs to the Section Intelligent Sensors

Abstract

A brain–computer interface is an alternative channel for communication between people and computers, based on the acquisition and analysis of brain signals. Research in this field has focused on serving people with different types of motor, visual or auditory disabilities. Affective computing, in turn, studies and extracts information about the emotional state of a person in certain situations, an important aspect of the interaction between people and computers. In particular, this manuscript considers people with visual disabilities and their need for personalized systems that take into account their disability and the degree to which it affects them. This article presents a review of the state of the art, discussing the importance of studying the emotions of people with visual disabilities and the possibility of representing those emotions through a brain–computer interface and affective computing. Finally, the authors propose a framework to study and evaluate the possibility of representing and interpreting the emotions of people with visual disabilities, in order to improve their experience with the use of technology and their integration into today’s society.

1. Introduction

In essence, a brain–computer interface (BCI) is a system that aims to read the activity of the human brain, considered the most complex biological system in the world [1]. Through a BCI, an individual with a disability can have effective control over devices and computers, speech synthesizers, assistance applications and neural prostheses [2]. Currently, several research studies focus on brain signals as a central element for assisting people with disabilities, considering it viable to analyze brain signals and convert them into instructions executed by external devices, or to interpret people’s emotions [3]. Assistive technologies for people with disabilities are of great interest, and in fact, there are numerous studies aimed at improving their quality of life. However, access to assistive technologies is still difficult, because they are usually focused on only one type of disability [4,5]. Most BCI research relies on mental tasks and paradigms based on visual stimulation, after which the resulting brain signals are analyzed [6].
This work evaluates research from the last decade (2009–2018) on the application of BCIs for people with disabilities. The main search included BCIs for people with visual or motor disabilities and the detection of affective states or emotions in people with visual disabilities. The research identifies an area that remains open to exploration, which is discussed throughout this manuscript.
To carry out this work, the authors followed the recommendations and guidelines for systematic literature reviews described by Brereton et al. in [7] and proposed by Kitchenham et al. in [8].
In Figure 1, the different areas of interest for this research are presented: BCI, affective computing (AC) and visual disability. As shown in Figure 1, all these areas together are relevant for acquiring brain signals through a BCI in order to recognize the affective states of people with a visual disability using AC.
Figure 1. Topics evaluated in this work.
The following questions are a fundamental part of this research:
  • Why is it important to study the emotions of a person with a visual disability?
  • Can artificial intelligence through affective computing obtain information of interest to represent the emotions of a person with a visual disability?
These questions are answered in the Discussion section.
The most common way to identify an emotion is through facial expressions and speech, although these expressions are not available in all situations; in some cases, bio-signals are required to examine the emotional state [9]. Accordingly, analyzing the emotions of a person with a visual disability through a BCI and current technological tools could improve their quality of life and integration into society.
In [10], a pilot system for communication between the brain and a computer was proposed, based on evoked potentials (EP), which served as the basis for the BCI. In recent years, the study of BCIs has grown exponentially [11], with the main objective of providing a communication channel between an individual and a computer through the analysis of brain signals.
Current data show that these efforts have materialized in the implementation of BCI systems that seek to improve people’s quality of life. In [2,12], BCIs are classified into seven groups, according to the neural mechanisms and recording technology used. The continuous advancement of technology and its inclusion in people’s lives is resulting in improved access to technologies and also in new forms of communication between people and things [13].
The goal of this work is twofold: firstly, the evaluation of BCI systems and their impact on the lives of people with a disability; and secondly, the integration of BCI and AC in the detection of emotions in people with visual disabilities. The results obtained from the analysis are detailed in the Discussion section.
The text is organized as follows: Section 2 reviews the technologies for the implementation of BCI systems and those related to AC and visual disabilities. Section 3 presents an analysis of research related to this work. Section 4 compares the different studies presented in Section 3, and Section 5 presents possible challenges in the research of systems that integrate BCI and AC for people with visual disabilities. Finally, Section 6 presents conclusions and future work.

2. Perspective

In this section a general description of brain–computer interfaces, affective computing and visual disability is provided.

2.1. Brain–Computer Interface

A BCI connects the brain and a device, enabling a communication channel between the brain and an externally controlled object, as described by Pattnaik and Sarraf [14]. Their study describes the implementation of the classification of left- and right-hand movement through a BCI.
Lebedev and Nicolelis [15] proposed a two-fold classification of BCIs: invasive and non-invasive. The first type is implanted at the brain level (intracranially), with the goal of obtaining signals of the highest quality. In contrast, non-invasive BCIs are placed on the scalp and are based on electroencephalogram (EEG) recordings taken from the surface of the head.
In [16], Minguillon et al. determined that EEG recordings are generally contaminated with noise generated during the acquisition of the signals, which can be caused by endogenous reasons (physiological sources such as eye, muscle and/or cardiac activity) or exogenous ones (non-physiological sources, such as impedance mismatch, coupling of power lines, etc.).
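As an illustration of how exogenous noise of this kind is typically suppressed, the following is a minimal sketch assuming a 250 Hz sampling rate and 50 Hz power-line interference; the sampling rate, filter parameters and the single synthetic channel are assumptions for the example, and endogenous artifacts (ocular, muscular, cardiac) usually require additional techniques such as ICA or regression.

```python
# A minimal sketch of exogenous-artifact suppression on a raw EEG trace.
import numpy as np
from scipy.signal import butter, iirnotch, filtfilt

FS = 250.0  # sampling rate in Hz (assumption)

def remove_line_noise(eeg, line_freq=50.0, quality=30.0):
    """Notch out power-line coupling (exogenous noise)."""
    b, a = iirnotch(line_freq, quality, fs=FS)
    return filtfilt(b, a, eeg)

def bandpass(eeg, low=0.5, high=40.0, order=4):
    """Keep the band where most EEG/ERP activity lives."""
    b, a = butter(order, [low, high], btype="band", fs=FS)
    return filtfilt(b, a, eeg)

# Usage on a synthetic one-channel trace: 10 Hz alpha plus 50 Hz mains hum.
t = np.arange(0, 2.0, 1.0 / FS)
raw = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 50 * t)
clean = bandpass(remove_line_noise(raw))
```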
A method for the extraction of characteristics without noise is proposed in [17]. The results obtained by Jafarifarmand et al. demonstrate its effectiveness in extracting the desired EEG characteristics.
In [18], Arvaneh et al. proposed an algorithm for EEG channel selection. The proposed algorithm is formulated as an optimization problem to select the least number of channels within a constraint of classification accuracy. As such, the proposed approach can be customized to yield the best classification accuracy by removing the noisy and irrelevant channels.
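The sketch below illustrates the general idea of channel selection under an accuracy constraint, using a simple greedy backward elimination on synthetic data; it is an illustrative stand-in rather than the exact optimization formulation of [18], and the names, tolerance value and data are assumptions.

```python
# Greedy channel elimination: drop channels while cross-validated
# accuracy stays within a tolerance of the full-montage baseline.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_channels = 120, 8
X = rng.standard_normal((n_trials, n_channels))  # one feature per channel (assumption)
y = rng.integers(0, 2, n_trials)
X[y == 1, :3] += 1.0  # only the first 3 channels carry class information

def accuracy(channels):
    clf = LinearDiscriminantAnalysis()
    return cross_val_score(clf, X[:, channels], y, cv=5).mean()

selected = list(range(n_channels))
baseline = accuracy(selected)
tolerance = 0.02  # allowed accuracy drop (assumption)
improved = True
while improved and len(selected) > 1:
    improved = False
    for ch in list(selected):
        candidate = [c for c in selected if c != ch]
        if accuracy(candidate) >= baseline - tolerance:
            selected = candidate  # channel ch is noisy/irrelevant: drop it
            improved = True
            break
print("kept channels:", selected)
```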
In [19], Kübler et al. exemplified the objective of a BCI, providing people a tool to interact with a computer without the need for any muscular participation.
Evoked potential stimuli are related to electrophysiological measurements of the processes associated with certain cognitive functions of the brain (e.g., the attention given when performing an activity) [20].
Evoked potentials (EP) are identified as fluctuations of the electrical potentials of the brain during a cognitive process, caused either by the occurrence of an event or by the presentation of a stimulus [21]. With reference to polarity, the components of an EP can be of two types: P (positive polarity) and N (negative polarity). The P300 is a positive EP that appears approximately 300 ms after the presentation of a stimulus or event, and it has proven to be one of the main BCI approaches for providing an effective communication channel [22].
The P300 evoked potential occurs after the onset of the stimulus, which can be physical, visual or auditory. EEG and EP techniques have been used to evaluate brain activity (brain functions) and sensory function; however, event-related potentials have not been used regularly [23].
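The following sketch shows why stimulus-locked averaging makes the P300 usable as a control signal: averaging many epochs attenuates the background EEG while the positive deflection near 300 ms remains. The sampling rate, amplitudes and synthetic data are assumptions for illustration only.

```python
# Averaging stimulus-locked epochs to reveal a P300-like peak.
import numpy as np

FS = 250.0                       # sampling rate in Hz (assumption)
t = np.arange(0, 0.8, 1.0 / FS)  # 800 ms epoch, stimulus onset at t = 0

rng = np.random.default_rng(1)
p300 = 5e-6 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))  # positive peak near 300 ms
epochs = p300 + 10e-6 * rng.standard_normal((40, t.size))  # 40 noisy single trials

erp = epochs.mean(axis=0)        # grand average across trials
peak_latency_ms = 1000 * t[np.argmax(erp)]
print(f"P300 peak detected at ~{peak_latency_ms:.0f} ms")  # close to 300 ms
```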
In [24], electrodes that do not require gel, or even direct contact with the scalp, were considered for BCI use. This study compares wet electrodes with dry and non-contact electrodes within a BCI paradigm of visual evoked potentials. The authors present a new contactless capacitive electrode that uses a custom integrated high-impedance analog front-end. Data from the contactless electrodes, which operate on top of the hair, show 100% accuracy compared to wet electrodes.

2.2. Affective Computing

Part of human interaction involves expressing emotions, whether through speech or facial expressions [25]. AC is considered a discipline of artificial intelligence that seeks to develop computational methods for recognizing human emotions, in addition to generating artificial emotions using computers. Emotions are a psychophysical response to an external stimulus, and people express them in communication with other people [26]. In an attempt to capture a person’s emotions through a computer, and given the need to improve the interaction between people and computers, Picard [27] established the main concepts of affective computing and its relationship with people with disabilities.
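As a hedged illustration of this idea applied to EEG signals, the sketch below extracts band-power features from single-channel trials and trains a simple classifier; the frequency bands, labels and synthetic data are assumptions for the example, not a validated emotion-recognition model.

```python
# Band-power features from EEG trials fed to a simple affective classifier.
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

FS = 128.0  # sampling rate in Hz (assumption)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(trial):
    """Mean spectral power per band for one single-channel trial."""
    freqs, psd = welch(trial, fs=FS, nperseg=256)
    return [psd[(freqs >= lo) & (freqs < hi)].mean() for lo, hi in BANDS.values()]

rng = np.random.default_rng(2)
n_samples = int(8 * FS)                          # 8 s trials
trials = rng.standard_normal((60, n_samples))
labels = rng.integers(0, 2, 60)                  # e.g. low vs. high valence (assumption)
# Give one class extra alpha (10 Hz) activity so the example is learnable.
trials[labels == 1] += 0.3 * np.sin(2 * np.pi * 10 * np.arange(n_samples) / FS)

X = np.array([band_powers(tr) for tr in trials])
print("CV accuracy:", cross_val_score(SVC(), X, labels, cv=5).mean())
```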

2.3. Visual Disability

A visual disability is a condition that directly affects the perception of images, whether partially or totally. Vision is a global sense that allows us to identify objects at a distance and all at once. A visual disability is related to visual acuity and to the visual field; the term is used when there is a significant decrease in visual acuity even with the use of glasses, or a significant reduction of the visual field. People with some degree of visual disability must make a greater effort to interact with the world around them and thereby achieve social inclusion [28].

4. Results

In this section, a comparative analysis of the reviewed works is presented. A trend towards the creation of EEG-based BCI systems as support technology for people with disabilities can be observed. In addition, the combination of BCI and AC systems can be observed, and the results of this analysis indicate that this combination is feasible. The authors also analyze general-purpose studies, that is, BCIs for experimental research and their behavior with other technologies.
The results of the analysis of the integration of support technologies for people with visual disabilities are shown in Table 1. The features considered show that the basis of these systems is a BCI and an EEG. The type of evoked-potential stimulus, visual or auditory, depends on the disability in question. Finally, AC was considered as a field that allows the affective state of a person to be identified.
Table 1. Brain–computer interface and affective computing for people with a visual disability.
Figure 2 summarizes the works reviewed in this field of research. The results show that, despite the efforts made to implement BCI systems, detecting the affective state of a person with a visual disability still requires further work.
Figure 2. Trends in BCI and AC for people with disabilities.

5. Discussion

The answer to the first proposed question is discussed below: Why is it important to study the emotions of a person with a visual disability? Emotions are the way in which a person expresses their feelings—joy, anger, sadness, pleasure, etc.—in response to a certain situation or stimulus. However, this is difficult for individuals with a disability because they are not always able to interact naturally. People with visual disabilities commonly require an intermediary that allows them to recognize and interact with the environment around them.
Affective computing has been shown to be applicable in the treatment of disorders such as autism, Asperger syndrome or depression, as well as in the recognition and mitigation of stress. The study of the affective states of a person with a visual disability could support a virtual assistant that allows them to express, recognize and interpret their emotions, improving their interaction with the environment without needing to depend on someone else.
Although there are several ways of recognizing a person’s emotions, whether through facial expressions, speech or bio-sensors, the study of brain signals by means of affective computing and a BCI is the main object of investigation of this work.
As indicated by Pantic and Rothkrantz in [59], human–computer interaction should include the ability to recognize the affective states of users, to make systems more human, effective and efficient.
Regarding the second question stated in this paper (“Can artificial intelligence, through affective computing, obtain information of interest to represent the emotions of a person with a visual disability?”): to the best of our knowledge, the results do not show evidence of the integration of BCI and AC for the detection of emotions in people with visual disabilities. However, affective computing plays an important role in improving the efficiency of human–computer interaction; it could provide people with visual disabilities a new experience with the use of technology through the detection of their emotions. Therefore, the authors identify that there is still motivation to continue exploring areas that integrate affective computing, BCI systems and visual disabilities.
Based on the related research and on the results reported and analyzed, this manuscript shows that a BCI gives people, with or without disabilities, the opportunity to communicate and interact with their environment through the interpretation of their brain signals. Under this approach, BCI systems widely used in the interpretation of brain signals often rely on a visual stimulus as a trigger; however, for people with visual disabilities, a BCI based on visual information is not entirely useful, making it necessary to move from visual stimuli to auditory stimuli in order to adjust the system to the needs of these people.
Emotions represent the affective state of a person and are expressions of mental states, given as a response to the stimuli produced in the environment. In addition, emotions influence the perception, communication and decision-making of people with or without disabilities. People with visual disabilities require auditory stimuli due to their condition. There are works related to the field of AC and BCI, which have positively reported the possibility of recognizing the affective states of a person who has been stimulated in an auditory way.
Based on the observed lack of related works and trends on the integration of BCI systems for detecting the affective states of a person with a visual disability, the authors propose a framework to cover this gap. Our proposal is shown in Figure 3.
Figure 3. Integration of a BCI and AC for the detection of emotions in people with a visual disability.
The modules that compose our proposal are: (1) auditory stimulation of a person with visual disabilities; (2) use of a BCI to obtain the brain signals given by the evoked potentials; (3) an offline module to analyze the data set of brain signals; (4) techniques for the extraction and classification of emotions; and finally, (5) the identification of the affective state of the person with a visual disability.
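A minimal skeleton of how these five modules could be organized in software is sketched below; all class and method names are hypothetical, and each stage would wrap the concrete hardware and signal-processing choices discussed above.

```python
# Hypothetical skeleton of the five-module framework in Figure 3.
# Every name here is illustrative; no real acquisition hardware is assumed.

class AffectiveBCIPipeline:
    """Modules (1)-(5) of the proposed framework."""

    def present_auditory_stimulus(self, audio_clip):
        """(1) Play an auditory stimulus to the person with a visual disability."""
        raise NotImplementedError

    def acquire_signals(self, duration_s):
        """(2) Record the evoked-potential brain signals through the BCI (EEG)."""
        raise NotImplementedError

    def analyze_offline(self, recordings):
        """(3) Offline analysis of the recorded data set (cleaning, epoching)."""
        raise NotImplementedError

    def extract_and_classify(self, epochs):
        """(4) Extract features and classify the emotion they express."""
        raise NotImplementedError

    def identify_affective_state(self, prediction):
        """(5) Map the classifier output to an affective state label."""
        raise NotImplementedError
```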
There is also research that identifies people’s emotions through physiological signals, as proposed by Healey and Picard [60]. In this case, the authors mention that it is possible to recognize emotions using heart rate or muscle activity.
In [61], Hamdi et al. evaluated the emotional states of a person using human–machine interfaces and biofeedback sensors. Their work evaluated the data in real time through a behavioral engine that allows a realistic multimodal dialogue between an embodied conversational agent and the person.
Kousarrizi et al. mentioned that detecting artifacts produced in electroencephalography data by muscle activity, eye blinks and electrical noise is a common and important problem in electroencephalography research [62]. Also, researchers and scientists must consider the needs of users during the design and testing of BCI systems [63].
In BCI systems, users explicitly manipulate their brain activity instead of using motor movements to produce signals that can be used to control computers or communication devices [64].
A BCI offers people the opportunity to increase their capacities by providing a new link of interaction with the outside world, and it is especially relevant as an aid for paralyzed people [65]. A BCI system provides people with visual, motor or severe motor disabilities with basic communication abilities: the ability to express their wishes and emotions, to communicate, and even to operate external devices [66].

6. Conclusions

The results of this review show that the efforts made in this area have implemented BCIs based on auditory stimulation for people with visual disabilities. On the other hand, affective computing has been used to detect emotional states in people who do not have a visual disability; however, the implementation of a BCI using auditory stimuli in conjunction with affective computing, for the detection of emotions in people with a visual disability, has still not been proven. Therefore, the authors consider that the use of a BCI and AC for such individuals should be evaluated, and they propose a framework architecture for integrating these areas.
A BCI provides another method of communication for those who have difficulty communicating with the outside world [67]; researchers have used BCI technology to build systems that allow communication between the brain and the computer through brain signals.
The construction of robust and useful BCI models from accumulated biological knowledge and available data is a major challenge, as are the associated technical problems [68]. The needs of people with visual disabilities are greater every day. Technology will continue to make an impact on the lives of people with visual disabilities in ways that were not possible before [69]. In this sense, future research is needed in several areas, in addition to developing high performance BCI systems to allow people with needs to perform activities of daily living [70].
Therefore, research that is functional, and not just experimental, is a priority for people with visual disabilities, as it could enable them to live a new experience with the use of technology. The priority goal is to improve the experience with technology and to promote the integration of people with visual disabilities into society.
In future work, the authors aim to implement the proposed framework in order to test its impact on people with visual disabilities. In addition, other future lines of research should focus on the effect of audiovisual stimulation in healthy people and of auditory stimulation in people with visual disabilities, in order to offer a similar experience with the use of technology. Likewise, further research is required on the evoked-potential responses of people with visual disabilities to auditory stimulation. It is necessary to continue evaluating the effects that occur during these types of experiments in people with visual disabilities and to compare the results with those obtained with other types of stimuli presented to people without disabilities; that is, systems that adapt to the degree of stimulation required by people with visual disabilities. Another aspect that could be evaluated is the generation of adaptive recommendation systems for people with visual disabilities, which would allow them to select audio according to their emotional state in real time. Finally, another future line would be the creation of an emotional virtual assistant for people with visual disabilities that identifies their emotions according to the environment in which they interact, offering them alternative communication and an improvement in their affective state.

Author Contributions

Writing—original draft preparation, J.L.L.-H., I.G.-C. and J.L.L.-C.; writing—review and editing, J.L.L.-H., I.G.-C., J.L.L.-C. and B.R.-M.; supervision and funding acquisition, I.G.-C. and B.R.-M.

Funding

This work was supported by the Consejo Nacional de Ciencia y Tecnología (CONACyT), through grant number 709656, and by the Research Program of the Ministry of Economy and Competitiveness, Government of Spain (DeepEMR project TIN2017-87548-C2-1-R).

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Gao, S.; Wang, Y.; Gao, X.; Hong, B. Visual and auditory brain-computer interfaces. IEEE Trans. Biomed. Eng. 2014, 61, 1436–1447. [Google Scholar] [PubMed]
  2. Bashashati, A.; Fatourechi, M.; Ward, R.K.; Birch, G.E. A survey of signal processing algorithms in brain-computer interfaces based on electrical brain signals. J. Neural Eng. 2007, 4. [Google Scholar] [CrossRef] [PubMed]
  3. Domingo, M.C. An overview of the Internet of Things for people with disabilities. J. Netw. Comput. Appl. 2012, 35, 584–596. [Google Scholar] [CrossRef]
  4. Millán, J.D.R.; Rupp, R.; Müller-Putz, G.R.; Murray-Smith, R.; Giugliemma, C.; Tangermann, M.; Vidaurre, C.; Cincotti, F.; Kübler, A.; Leeb, R.; et al. Combining brain-computer interfaces and assistive technologies: State-of-the-art and challenges. Front. Neurosci. 2010, 4, 1–15. [Google Scholar] [CrossRef] [PubMed]
  5. Deng, J.; Yao, J.; Dewald, J.P.A. Classification of the intention to generate a shoulder versus elbow torque by means of a time-frequency synthesized spatial patterns BCI algorithm. J. Neural Eng. 2005, 2, 131–138. [Google Scholar] [CrossRef] [PubMed]
  6. Riccio, A.; Mattia, D.; Simione, L.; Olivetti, M.; Cincotti, F. Eye-gaze independent EEG-based brain-computer interfaces for communication. J. Neural Eng. 2012, 9. [Google Scholar] [CrossRef] [PubMed]
  7. Brereton, P.; Kitchenham, B.A.; Budgen, D.; Turner, M.; Khalil, M. Lessons from applying the systematic literature review process within the software engineering domain. J. Syst. Softw. 2007, 80, 571–583. [Google Scholar] [CrossRef]
  8. Kitchenham, B.; Charters, S. Procedures for Performing Systematic Literature Reviews in Software Engineering; Technical Report; Durham University: Durham, UK, 2007. [Google Scholar]
  9. Hamdi, H.; Richard, P.; Suteau, A.; Allain, P. Emotion assessment for affective computing based on physiological responses. In Proceedings of the 2012 IEEE International Conference on Fuzzy Systems, Brisbane, QLD, Australia, 10–15 June 2012. [Google Scholar]
  10. Vidal, J.J. Toward Direct Brain-Computer Communication. Annu. Rev. Biophys. Bioeng. 1973, 2, 157–180. [Google Scholar] [CrossRef] [PubMed]
  11. Jeanmonod, D.J.; Suzuki, K. Control of a Proportional Hydraulic System. IntechOpen 2018, 2, 64. [Google Scholar]
  12. Blankertz, B.; Curio, G.; Vaughan, T.M.; Schalk, G.; Wolpaw, J.R.; Neuper, C.; Pfurtscheller, G.; Hinterberger, T.; Birbaumer, N. The BCI Competition 2003: Progress and Perspectives in Detection and Discrimination of EEG Single Trials. IEEE Trans. Biomed. Eng. 2004, 51, 1044–1051. [Google Scholar] [CrossRef]
  13. Khan, R. Future Internet: The Internet of Things Architecture, Possible Applications and Key Challenges. In Proceedings of the 2012 10th International Conference on Frontiers of Information Technology, Islamabad, Pakistan, 17–19 December 2012; pp. 257–260. [Google Scholar]
  14. Pattnaik, P.K.; Sarraf, J. Brain Computer Interface issues on hand movement. J. King Saud. Univ. Comput. Inf. Sci. 2018, 30, 18–24. [Google Scholar] [CrossRef]
  15. Lebedev, M.A.; Nicolelis, M.A.L. Brain-machine interfaces: past, present and future. Trends Neurosci. 2006, 29, 536–546. [Google Scholar] [CrossRef] [PubMed]
  16. Minguillon, J.; Lopez-Gordo, M.A.; Pelayo, F. Trends in EEG-BCI for daily-life: Requirements for artifact removal. Biomed. Signal Process. Control 2017, 31, 407–418. [Google Scholar] [CrossRef]
  17. Jafarifarmand, A.; Badamchizadeh, M.A. Artifacts removal in EEG signal using a new neural network enhanced adaptive filter. Neurocomputing 2013, 103, 222–231. [Google Scholar] [CrossRef]
  18. Arvaneh, M.; Guan, C.; Ang, K.K.; Quek, C. Optimizing the channel selection and classification accuracy in EEG-based BCI. IEEE Trans. Biomed. Eng. 2011, 58, 1865–1873. [Google Scholar] [CrossRef]
  19. Kübler, A.; Furdea, A.; Halder, S.; Hammer, E.M.; Nijboer, F.; Kotchoubey, B. A brain-computer interface controlled auditory event-related potential (p300) spelling system for locked-in patients. Ann. N. Y. Acad. Sci. 2009, 1157, 90–100. [Google Scholar] [CrossRef]
  20. Shukla, R.; Trivedi, J.K.; Singh, R.; Singh, Y.; Chakravorty, P. P300 event related potential in normal healthy controls of different age groups. Indian J. Psychiatry 2000, 42, 397–401. [Google Scholar]
  21. Núñez-Peña, M.I.; Corral, M.J.; Escera, C. Potenciales evocados cerebrales en el contexto de la investigación psicológica: Una actualización. Anu. Psicol. 2004, 35, 3–21. [Google Scholar]
  22. Utsumi, K.; Takano, K.; Okahara, Y.; Komori, T.; Onodera, O.; Kansaku, K. Operation of a P300-based brain–computer interface in patients with Duchenne muscular dystrophy. Sci. Rep. 2018, 8, 1753. [Google Scholar] [CrossRef]
  23. Polich, J. Clinical application of the P300 event-related brain potential. Phys. Med. Rehabil. Clin. N. Am. 2004, 15, 133–161. [Google Scholar] [CrossRef]
  24. Chi, Y.M.; Wang, Y.T.; Wang, Y.; Maier, C.; Jung, T.P.; Cauwenberghs, G. Dry and noncontact EEG sensors for mobile brain-computer interfaces. IEEE Trans. Neural Syst. Rehabil. Eng. 2012, 20, 228–235. [Google Scholar] [CrossRef] [PubMed]
  25. Terven, J.R.; Salas, J.; Raducanu, B. New Opportunities for computer vision-based assistive technology systems for the visually impaired. Computer (Long. Beach. Calif.) 2014, 47, 52–58. [Google Scholar]
  26. Tsai, T.-W.; Lo, H.Y.; Chen, K.-S. An affective computing approach to develop the game-based adaptive learning material for the elementary students. In Proceedings of the 2012 Joint International Conference on Human-Centered Computer Environments (HCCE ’12), New York, NY, USA, 8–13 March 2012. [Google Scholar]
  27. Picard, R.W. Affective Computing; MIT Press: Cambridge, MA, USA, 1997. [Google Scholar]
  28. Lobera, J.; Mondragón, V.; Contreras, B. Guía Didáctica para la Inclusión en Educación Inicial y Básica; Technical Report; Secretaría de Educación Pública: Mexico City, Mexico, 2010.
  29. Sarwar, S.Z.; Aslam, M.S.; Manarvi, I.; Ishaque, A.; Azeem, M. Noninvasive imaging system for visually impaired people. In Proceedings of the 2010 3rd International Conference on Computer Science and Information Technology (ICCSIT 2010), Chengdu, China, 9–11 July 2010. [Google Scholar]
  30. Guo, J.; Gao, S.; Hong, B. An Auditory Brain–Computer Interface. IEEE Trans. Neural Syst. Rehabil. Eng. 2010, 18, 230–235. [Google Scholar] [PubMed]
  31. Hinterberger, T.; Hill, J.; Birbaumer, N. Auditory brain-computer communication device. In Proceedings of the IEEE International Workshop on Biomedical Circuits and Systems, Singapore, 1–3 December 2004. [Google Scholar]
  32. Nijboer, F.; Furdea, A.; Gunst, I.; Mellinger, J.; McFarland, D.J.; Birbaumer, N.; Kübler, A. An auditory brain-computer interface (BCI). J. Neurosci. Methods 2008, 167, 43–50. [Google Scholar] [CrossRef] [PubMed]
  33. Klobassa, D.S.; Vaughan, T.M.; Brunner, P.; Schwartz, N.E.; Wolpaw, J.R.; Neuper, C.; Sellers, E.W. Toward a high-throughput auditory P300-based brain-computer interface. Clin. Neurophysiol. 2009, 120, 1252–1261. [Google Scholar] [CrossRef] [PubMed]
  34. Sellers, E.W.; Ryan, D.B.; Hauser, C.K. Noninvasive brain-computer interface enables communication after brainstem stroke. Sci. Transl. Med. 2014, 6. [Google Scholar] [CrossRef] [PubMed]
  35. Okahara, Y.; Takano, K.; Komori, T.; Nagao, M.; Iwadate, Y.; Kansaku, K. Operation of a P300-based brain-computer interface by patients with spinocerebellar ataxia. Clin. Neurophysiol. Pract. 2017, 2, 147–153. [Google Scholar] [CrossRef] [PubMed]
  36. Millán, J.d.R.; Renkens, F.; Mouriño, J.; Gerstner, W. Brain-actuated interaction. Artif. Intell. 2004, 159, 241–259. [Google Scholar] [CrossRef]
  37. Sirvent Blasco, J.L.; Iáñez, E.; Úbeda, A.; Azorín, J.M. Visual evoked potential-based brain-machine interface applications to assist disabled people. Expert Syst. Appl. 2012, 39, 7908–7918. [Google Scholar] [CrossRef]
  38. Hill, N.J.; Lal, T.N.; Bierig, K.; Birbaumer, N.; Scholkopf, B. Attentional modulation of auditory event-related potentials in a brain-computer interface. In Proceedings of the IEEE International Workshop on Biomedical Circuits and Systems, Singapore, 1–3 December 2004. [Google Scholar]
  39. Hill, N.J.; Schölkopf, B. An online brain-computer interface based on shifting attention to concurrent streams of auditory stimuli. J. Neural Eng. 2012, 9, 1–23. [Google Scholar] [CrossRef]
  40. Suwa, S.; Yin, Y.; Cui, G.; Tanaka, T.; Cao, J. A design method of an auditory P300 with P100 brain computer interface system. In Proceedings of the 2012 IEEE 11th International Conference on Signal Processing, Beijing, China, 21–25 October 2012. [Google Scholar]
  41. Yin, E.; Zeyl, T.; Saab, R.; Hu, D.; Zhou, Z.; Chau, T. An Auditory-Tactile Visual Saccade-Independent P300 Brain–Computer Interface. Int. J. Neural Syst. 2015, 26, 1650001. [Google Scholar] [CrossRef] [PubMed]
  42. Collinger, J.L.; Wodlinger, B.; Downey, J.E.; Wang, W.; Tyler-Kabara, E.C.; Weber, D.J.; McMorland, A.J.C.; Velliste, M.; Boninger, M.L.; Schwartz, A.B. High-performance neuroprosthetic control by an individual with tetraplegia. Lancet 2013, 381, 557–564. [Google Scholar] [CrossRef]
  43. Wang, Y.T.; Wang, Y.; Jung, T.P. A cell-phone based brain-computer interface for communication in daily life. J. Neural Eng. 2011, 8, 025018. [Google Scholar] [CrossRef] [PubMed]
  44. Daly, I.; Williams, D.; Malik, A.; Weaver, J.; Kirke, A.; Hwang, F.; Miranda, E.; Nasuto, S.J. Personalised, Multi-modal, Affective State Detection for Hybrid Brain-Computer Music Interfacing. IEEE Trans. Affect. Comput. 2018, 3045, 1–14. [Google Scholar] [CrossRef]
  45. Williams, D.; Kirke, A.; Miranda, E.; Daly, I.; Hwang, F.; Weaver, J.; Nasuto, S. Affective calibration of musical feature sets in an emotionally intelligent music composition system. ACM Trans. Appl. Percept. 2017, 14, 1–13. [Google Scholar] [CrossRef]
  46. Murugappan, M.; Nagarajan, R.; Yaacob, S. Combining spatial filtering and wavelet transform for classifying human emotions using EEG Signals. J. Med. Biol. Eng. 2011, 31, 45–51. [Google Scholar] [CrossRef]
  47. Miranda, E.R.; Durrant, S.; Anders, T. Towards brain-computer music interfaces: Progress and challenges. In Proceedings of the 2008 First International Symposium on Applied Sciences on Biomedical and Communication Technologies (ISABEL 2008), Aalborg, Denmark, 25–28 October 2008. [Google Scholar]
  48. Khosrowabadi, R.; Quek, H.C.; Wahab, A.; Ang, K.K. EEG-based emotion recognition using self-organizing map for boundary detection. In Proceedings of the 2010 20th International Conference on Pattern Recognition, Istanbul, Turkey, 23–26 August 2010. [Google Scholar]
  49. Mühl, C.; Brouwer, A.M.; van Wouwe, N.; van den Broek, E.; Nijboer, F.; Heylen, D.K.J. Modality-specific affective responses and their implications for affective BCI. In Proceedings of the Fifth International Brain-Computer Interface Conference 2011, Graz, Austria, 22–24 September 2011. [Google Scholar]
  50. Petrantonakis, P.C.; Hadjileontiadis, L.J. Emotion Recognition From EEG Using Higher Order Crossings. IEEE Trans. Inf. Technol. Biomed. 2010, 14, 186–197. [Google Scholar] [CrossRef]
  51. Nie, D.; Wang, X.W.; Shi, L.C.; Lu, B.L. EEG-based emotion recognition during watching movies. In Proceedings of the 2011 5th International IEEE/EMBS Conference on Neural Engineering, Cancun, Mexico, 27 April–1 May 2011. [Google Scholar]
  52. Hsu, J.L.; Zhen, Y.L.; Lin, T.C.; Chiu, Y.S. Personalized music emotion recognition using electroencephalography (EEG). In Proceedings of the 2014 IEEE International Symposium on Multimedia, Taichung, Taiwan, 10–12 December 2014. [Google Scholar]
  53. Byun, S.W.; Lee, S.P.; Han, H.S. Feature selection and comparison for the emotion recognition according to music listening. In Proceedings of the 2017 International Conference on Robotics and Automation Sciences (ICRAS), Hong Kong, China, 26–29 August 2017. [Google Scholar]
  54. Sourina, O.; Liu, Y. EEG-enabled affective applications. In Proceedings of the 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction, Geneva, Switzerland, 2–5 September 2013. [Google Scholar]
  55. Diesner, J.; Evans, C.S. Little Bad Concerns: Using Sentiment Analysis to Assess Structural Balance in Communication Networks. In Proceedings of the 2015 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining (ASONAM), Paris, France, 25–28 August 2015. [Google Scholar]
  56. Tseng, K.C.; Lin, B.S.; Wong, A.M.K.; Lin, B.S. Design of a mobile brain computer interface-based smart multimedia controller. Sensors 2015, 15, 5518–5530. [Google Scholar] [CrossRef]
  57. Xu, H.; Zhang, D.; Ouyang, M.; Hong, B. Employing an active mental task to enhance the performance of auditory attention-based brain-computer interfaces. Clin. Neurophysiol. 2013, 124, 83–90. [Google Scholar] [CrossRef]
  58. Koelstra, S.; Mühl, C.; Soleymani, M.; Lee, J.S.; Yazdani, A.; Ebrahimi, T.; Pun, T.; Nijholt, A.; Patras, I. DEAP: A database for emotion analysis; Using physiological signals. IEEE Trans. Affect. Comput. 2012, 3, 18–31. [Google Scholar] [CrossRef]
  59. Pantic, M.; Rothkrantz, L.J.M. Toward an affect-sensitive multimodal human-computer interaction. Proc. IEEE 2003, 91, 1370. [Google Scholar] [CrossRef]
  60. Healey, J.; Picard, R. SmartCar: Detecting driver stress. In Proceedings of the 15th International Conference on Pattern Recognition (ICPR-2000), Barcelona, Spain, 3–7 September 2000. [Google Scholar]
  61. Hamdi, H.; Richard, P.; Suteau, A.; Saleh, M. A Multi-Modal Virtual Environment To Train for Job Interview. In Proceedings of the 1st International Conference on Pervasive and Embedded Computing and Communication Systems (PECCS 2011), Algarve, Portugal, 5–7 March 2011. [Google Scholar]
  62. Kousarrizi, M.R.N.; Ghanbari, A.A.; Teshnehlab, M.; Aliyari, M.; Gharaviri, A. Feature extraction and classification of EEG signals using wavelet transform, SVM and artificial neural networks for brain computer interfaces. In Proceedings of the 2009 International Joint Conference on Bioinformatics, Systems Biology and Intelligent Computing, Shanghai, China, 3–5 August 2009. [Google Scholar]
  63. Padfield, N.; Zabalza, J.; Zhao, H.; Masero, V.; Ren, J. EEG-Based Brain-Computer Interfaces Using Motor-Imagery: Techniques and Challenges. Sensors 2019, 19, 1423. [Google Scholar] [CrossRef] [PubMed]
  64. Allison, B.Z.; Neuper, C. Could Anyone Use a BCI? In Brain-Computer Interfaces; Tan, D., Nijholt, A., Eds.; Springer: London, UK, 2010. [Google Scholar]
  65. De Negueruela, C.; Broschart, M.; Menon, C.; Del R. Millán, J. Brain-computer interfaces for space applications. Pers. Ubiquitous Comput. 2011, 15, 527–537. [Google Scholar] [CrossRef]
  66. Wolpaw, J.R.; Birbaumer, N.; McFarland, D.J.; Pfurtscheller, G.; Vaughan, T.M. Brain-computer interfaces for communication and control. Clin. Neurophysiol. 2002, 113, 767–791. [Google Scholar] [CrossRef]
  67. Suefusa, K.; Tanaka, T. A comparison study of visually stimulated brain-computer and eye-tracking interfaces. J. Neural Eng. 2017, 14. [Google Scholar] [CrossRef] [PubMed]
  68. Makeig, S.; Kothe, C.; Mullen, T.; Bigdely-Shamlo, N.; Zhang, Z.; Kreutz-Delgado, K. Evolving Signal Processing for Brain–Computer Interfaces. Proc. IEEE 2012, 100, 1567–1584. [Google Scholar] [CrossRef]
  69. Bhowmick, A.; Hazarika, S.M. An insight into assistive technology for the visually impaired and blind people: state-of-the-art and future trends. J. Multimodal User Interfaces 2017, 11, 149–172. [Google Scholar] [CrossRef]
  70. Yuan, H.; He, B. Brain–Computer Interfaces Using Sensorimotor Rhythms: Current State and Future Perspectives. IEEE Trans. Biomed. Eng. 2014, 61, 1425–1435. [Google Scholar] [CrossRef] [PubMed]
