Article

User’s Emotions and Usability Study of a Brain-Computer Interface Applied to People with Cerebral Palsy †

1 Department of Applied Computing, Universidade do Vale de Itajaí, Itajaí 88302-901, Santa Catarina, Brazil
2 Assistive Technology Center, Fundação Catarinense de Educação Especial, São José 88108-900, Santa Catarina, Brazil
3 Department of Production Engineering, Universidade Federal de Santa Catarina, Florianópolis 88035-001, Santa Catarina, Brazil
* Authors to whom correspondence should be addressed.
This paper is an extended version of our paper published in Proceedings of the 10th International Conference on PErvasive Technologies Related to Assistive Environments (PETRA’17), Island of Rhodes, Greece, 21–23 June 2017.
Technologies 2018, 6(1), 28; https://doi.org/10.3390/technologies6010028
Submission received: 15 November 2017 / Revised: 22 February 2018 / Accepted: 23 February 2018 / Published: 28 February 2018

Abstract: People with motor and communication disorders face serious challenges in interacting with computers. To enhance this functionality, new human-computer interfaces are being studied. In this work, a brain-computer interface based on the Emotiv Epoc is used to analyze human-computer interactions in cases of cerebral palsy. The Phrase-Composer software was developed to interact with the brain-computer interface. A system usability evaluation was carried out with the participation of three specialists from the Fundação Catarinense de Educação Especial (FCEE) and four cerebral palsy volunteers. Even though the System Usability Scale (SUS) score was acceptable, several challenges remain. Raw electroencephalography (EEG) data were also analyzed in order to assess the user's emotions during their interaction with the communication device. This study brings new evidence about human-computer interaction for individuals with cerebral palsy.

1. Introduction

Cerebral palsy is the term used to describe a group of sensorimotor and posture developmental disorders, involving changes in muscle tone, characterized as secondary, non-progressive motor developmental syndromes [1]. Lesions or abnormalities in the brain are the source of these disorders in infancy or early childhood, causing complex symptoms and various sorts and degrees of motor involvement that permanently affect body movement and muscle coordination [2]. The prevalence of cerebral palsy is two to three per 1000 births [3]. In Brazil, epidemiological studies report about 30,000 new cases annually [1].
People with severe motor disorders, such as those caused by cerebral palsy, communicate through eye movements or blinking, as well as through extremely limited finger, foot, and toe movements [4]. In severe cases, traditional interfaces and even adapted mouse and keyboards remain a barrier to promoting interaction with computers [5].
Individuals with cerebral palsy present motor and speech disorders that constrain the use of traditional communication media, demanding great effort and skill for them to participate in education and social life [6]. Augmentative and Alternative Communication (AAC) strategies can be employed to promote social inclusion in these complex cases [7]. AAC can supplement, or even completely replace, speech and writing [2].
Nowadays, advanced AAC technologies are offering new ways of interaction with computers. High-tech AAC solutions involve software and electronic components designed for standard computers or mobile devices [2]. Human-computer interfaces are examples of those high-tech solutions.
Research on brain-computer interfaces (BCI) with humans began in the 1970s at the University of California, Los Angeles (UCLA), under a grant from the National Science Foundation. This was followed by a contract from the Defense Advanced Research Projects Agency (DARPA) [8].
A brain-computer interface is a kind of human-computer interface. This terminology is also known as brain-machine interface (BMI) in the scientific literature. A BCI can be defined as a computational system that is able to read, analyze, and interpret brain signals to trigger a desired action [9].
The research and development in the field of BCI has focused primarily on neuroprosthetics applications to restore hearing, sight, and movement [8]. Recent research addressed the use of brain-computer interfaces in medicine for the treatment of post-stroke conditions as well as in other applications [10].
Brain-computer interfaces can be classified into three categories, according to how a particular equipment acquires brain signals. These categories are invasive, partially invasive, and noninvasive [11]. Measuring electroencephalographic signals (EEG) is a simple noninvasive technique to monitor electrical brain activity. Small signal amplitudes and noisy measurements [12] characterize this technique.
In this research, a usability study of a noninvasive brain-computer interface, based on the Emotiv Epoc [13], was performed with spastic quadriplegic cerebral palsy individuals. Software named Phrase-Composer was designed to interact with the Emotiv system; it allows the user to write single words and phrases [14]. User experience was preliminarily evaluated by adapting the System Usability Scale tool [15]. In addition, data were analyzed in order to assess the user's emotions during interactions with the communication device.
The study was developed in partnership with the Fundação Catarinense de Educação Especial (FCEE), in Santa Catarina, Brazil. Two pedagogues, a therapist, and four cerebral palsy volunteers participated in the study. It is expected that this study will add new evidence to the design of accessible brain-computer applications for the target audience.
This work is an extended version of the paper submitted to the 10th International Conference on Pervasive Technologies Related to Assistive Environments (PETRA). In this version, the usability study was extended to eight cerebral palsy users, but for the reasons explained in this paper, only four qualified for the next steps. In addition, a preliminary study of emotion with one of the users was added.

2. Related Works

With the advent of BCIs, the idea of affective interfaces arose, enabling affect detection from brain signals [16]. The authors show that there is growing interest in researching the capabilities, limitations, and challenges of detecting affective neurophysiological activity. They also present a survey of noninvasive brain-computer interfaces, covering their principles, state of the art, and challenges.
More recently, new types of devices are being developed to enable disabled people to interact with their environment. Studies on the uses of brain-computer interfaces to benefit individuals’ cognitive and perceptual skills development are underway [17].
In Reference [18], the authors compared the usability of three electroencephalogram systems in a classical BCI paradigm. They concluded that appearance and ease of setup are more important than comfort and, without any information about effectiveness, research participants preferred the Emotiv Epoc.
Non-invasive EEG-based brain-computer interfaces have been proposed as potential assistive devices for individuals with cerebral palsy [19]. According to the authors, there are several challenges to be overcome because the translation of EEG signals may be unreliable and requires long periods of training. In addition, individuals with cerebral palsy may exhibit high levels of spontaneous and uncontrolled movement, which impact EEG signal quality.
In Reference [20], a new EEG-based row-column scanning communication board was developed for individuals with cerebral palsy. The results suggest that users performed better than chance and were consequently able to communicate by using the developed system.
In Reference [21], a case study describes how an individual with spastic quadriplegic cerebral palsy was trained to use a commercial EEG-based brain-computer interface. The participant played a game based on EEG feedback, training left- and right-arm motor imagery, and showed improvement in the production of two distinct EEG patterns.
In Reference [22], disabled students controlled a brain-computer interface to type words onto a computer screen via the Dasher text entry system. The study suggests that BCI technology shows great potential as a viable text entry alternative for people with cerebral palsy.
Recently, a training and testing protocol for individuals with spastic quadriplegic cerebral palsy was evaluated using a commercial electroencephalography (EEG)-based BCI [23]. The study used EEG feedback of left- and right-arm motor imagery. The results concerning the ability to produce two distinct EEG patterns were inconclusive.
Scientific literature highlights that brain-computer interfaces are still premature as a form of assistive technology for people with cerebral palsy [23]. It is also necessary to create suitable training environments for the use of brain-computer interfaces [21]. In addition, it has been shown that the performance of participants is often influenced by mood, motivation, physical illness, fatigue, and concentration. It can be inferred that usability studies are necessary to guide the design of BCI to better meet the needs of specific user groups, such as disabled people, as well as common users.

3. Materials and Methods

In this research, the Emotiv Epoc [13] was used, as shown in Figure 1. This BCI is a commercial system that captures and analyzes electrophysiological and non-physiological signals. It combines physiological (EEG and myoelectric signals) and non-physiological (head movements) interaction. It can perform cognitive analysis and detect facial expressions and movements of the head, eyes, and eyebrows. By using the software development kit (SDK), it is possible to convert the physiological and non-physiological signals into commands of traditional peripherals, such as keyboard and mouse, thereby emulating specific actions.
This system allows:
  • Capture of head movements by using acceleration sensors;
  • Capture of eye blinks, eye movements, and different facial expressions through electroencephalography (EEG);
  • Affective detection.
Command recognition can occur in two ways: directly, through the control panel of the SDK, or by connecting the SDK to another application. It is important to note that this interaction requires driver installation and prior configuration of the SDK commands. The SDK is available in free and commercial versions. The free version, Emotiv Control Panel—Lite, enables tests without the headset (using simulations) and can also process real signals obtained from the sensors.
On the other hand, the commercial version, Emotiv Pure—EEG Raw, features advanced detection and processing mechanisms, visualizing the physiological signals in real-time, as pictured in Figure 2, in addition to advanced configuration options. In this figure, FFT refers to the Fast Fourier Transform [12].
Figure 2 shows time-domain EEG signals registering a simultaneous eye blink, detected by the sensors located at AF3, F7, AF4, and F8 (note the first two traces at the top and the last two at the bottom of the screen), according to the International 10–20 System for EEG electrode placement [13].
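The SDK performs this signal-to-command mapping internally. As a rough illustration only (this is not the Emotiv SDK API; the channel handling, the 80 µV threshold, and the function names are assumptions), the Python sketch below detects a simultaneous blink as a synchronized amplitude peak on the four frontal channels and maps it to an application command:

```python
import numpy as np

FRONTAL_CHANNELS = ["AF3", "F7", "AF4", "F8"]  # electrodes where blink artifacts dominate
BLINK_THRESHOLD_UV = 80.0                      # assumed peak-amplitude threshold (microvolts)
SAMPLING_RATE_HZ = 128                         # nominal Emotiv Epoc output rate

def detect_simultaneous_blink(window):
    """Return True if every frontal channel exceeds the threshold in this window."""
    return all(np.max(np.abs(window[ch])) > BLINK_THRESHOLD_UV for ch in FRONTAL_CHANNELS)

def confirm_selection():
    # Placeholder for the command the application maps to a blink,
    # e.g., emulating the key press that confirms the current letter.
    print("blink detected -> confirm selection")

# Usage with a synthetic 0.25 s window (32 samples at 128 Hz) containing a blink-like peak.
rng = np.random.default_rng(0)
window = {ch: rng.normal(0.0, 10.0, SAMPLING_RATE_HZ // 4) + 120.0 for ch in FRONTAL_CHANNELS}
if detect_simultaneous_blink(window):
    confirm_selection()
```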
Figure 3 shows the main screen of the Phrase-Composer software. It was designed to mediate the communication between the Emotiv system and the user. The software allows the user to select words and phrases through a row-column scanning technique. The red box indicates that a particular set of letters is active; any letter inside this group can then be selected using brain signals, as sketched below.
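The scanning logic itself can be summarized in a few lines. The sketch below is an illustrative reconstruction of a row-column scanner, not the Phrase-Composer source code; the letter grid and function names are assumptions, and the trigger (a detected blink or head movement) is represented simply by the number of highlight steps the user waits before acting:

```python
import itertools

# Illustrative letter grid; the actual Phrase-Composer layout may differ.
GRID = [
    list("ABCDEF"),
    list("GHIJKL"),
    list("MNOPQR"),
    list("STUVWX"),
    list("YZ_.,?"),
]

def scan_select(trigger_row_step, trigger_col_step):
    """Cycle the highlight over rows, then over the letters of the chosen row."""
    # Phase 1: rows are highlighted one after another until the trigger fires.
    for step, row_idx in enumerate(itertools.cycle(range(len(GRID)))):
        if step == trigger_row_step:
            chosen_row = GRID[row_idx]
            break
    # Phase 2: letters in the chosen row are highlighted until the trigger fires again.
    for step, col_idx in enumerate(itertools.cycle(range(len(chosen_row)))):
        if step == trigger_col_step:
            return chosen_row[col_idx]

# Example: the trigger fires on the second highlighted row and the third letter in it.
print(scan_select(trigger_row_step=1, trigger_col_step=2))  # -> "I"
```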
Methodological procedures were based on bibliographical, documentary, and experimental research. The bibliographic research addressed alternative and augmentative communication solutions in cases of severe motor disorders and lack of speech, with emphasis on brain-computer interfaces. The documentary research focused on the work of professionals and on the physical characteristics, facial movements, and skills of the cerebral palsy volunteers who were involved in this study. The experimental research aimed at evaluating user interaction with a brain-computer interface through a participatory design.
Eight users were available to participate in the experiments, but only four completed all of the phases. It was not possible to obtain the electrical signals from one of the volunteers (adult, male). A child (male) turned out to be very anxious. A third volunteer (adult, female) experienced vertigo and fatigue. A fourth subject (adult, female) became ill. The therapist involved in this research suggested that those users be withdrawn from the experiment. The remaining four participants went through four 30-min sessions of exploring the system on different days (these sessions were also used to customize the system for each user), in addition to three or four sessions of using the system. After configuration and training, the participants could control the BCI and used it to type alphabet letters with the Phrase-Composer application. It is important to note that no complaints about the BCI were registered during the experiments, probably because the participants were motivated by using such advanced technology for the first time.
Although four is a modest number of individuals to carry out statistical inferences on the results, the difficulties in recruiting individuals with disabilities for usability evaluations should be taken into account, as discussed in Reference [24]. In addition, one should consider that, according to Reference [25], four individuals are sufficient to reveal more than 65% of the usability problems found in experiments with a much higher number of participants.
The usability study was conducted over six months. The study involved the following activities:
  • Participatory design with students and professionals;
  • Phrase-Composer software development and improvement;
  • System evaluation by two pedagogues and one therapist.
This research was conducted with full ethics approval for human research, granted by the CEP-Universidade do Vale de Itajaí (Univali) under code CAAE 08390412.3.0000.0120.

4. Experiment

The first part of the experiment consisted of four sessions devoted to exploring the system and familiarizing the users and professionals with it. This procedure was also used to customize the software for each participant. Table 1 shows the types of user interactions that were studied after configuring the physiological and non-physiological signals. The letter Y means that it was possible to recognize a given signal, and the letter N that the signal was unreliable. It is important to remark that the Emotiv Epoc can also capture facial expressions and process cognitive commands; however, those features were not addressed in this work.
All four users are adult males with spastic quadriplegic cerebral palsy. User A, aged 40–50 years, uses a wheelchair; his speech is not understandable; he performs head movements of low amplitude and has low visual acuity. This user responded quite well to EEG signals. However, the eye movements sometimes overlapped with head movements, making it very difficult to distinguish between them. After the analysis, eye blink and head movement were selected to interact with the system.
User B, aged 30–40 years, has moderate mental deficiency. He uses a wheelchair and has limitations in the upper limbs. His speech is understandable, and he has low-amplitude head movements and visual impairment, compensated by wearing glasses. Eye blink and eye movement were selected to interact with the system in this case.
User C, aged 30–40 years, uses a wheelchair, and moves the upper limbs with some difficulty. His speech is not understandable. He also performs head movements of low amplitude and has low visual acuity. Additional tests were performed to identify facial expressions with this user. After the analysis, eye and head movements were selected to interact with the system.
User D, aged 30–40 years, uses a wheelchair; his speech is not understandable and he performs random movement of the head. After the analysis, simultaneous blink and eye movement were selected to interact with the system.
The last part of the experiment was performed in three or four sessions of using the system. It consisted of requesting the selection of specific letters of the alphabet and measuring the time spent, the errors, and the correct letter selections. In addition, one of the pedagogues executed the same procedures, generating a reference profile of system usability when no motor or communication disorders are present.
The therapist and a second pedagogue conducted the experiments and decided when a session should be stopped, for example, in case of fatigue.
Table 2 shows the performance of User A, who was requested to select the first three letters of his name. The time between selections was set to 1 s for this user, and there was a five-minute break between the four sessions.
Action time represents the time (in seconds) needed to select the first letter; it took 12 s in the first session. The user selected the letters that were requested. However, a slight decrease in the number of selections and an increase in the total time can be noted from one session to the next, probably caused by fatigue or loss of concentration. On the other hand, the time required to select the first letter was reduced in the last sessions: this procedure took 12–13 s in the first two sessions and 7 s in the third and fourth sessions. Another important feature is that this user did not make any mistakes.
Figure 4 shows User A interacting with the system. Letter selection was performed with the "yes" (nodding) head movement, and the eye blink was used to delete an incorrect selection.
Table 3 illustrates the pedagogue’s performance. Blinking and eye movement were selected to interact with the system. This participant spent one half-hour session exploring the system and three sessions using it. After configuring and training, the participant could control the BCI and used it to type the alphabet letters that were requested.
The time between selections was set to 1 s; however, this participant could have used a shorter time, because of his skills. The pedagogue selected the first letter in 12 s. The pedagogue also selected a higher number of letters in each experiment, compared to the user with cerebral palsy. In the last trial, two letters of the alphabet were erased and rectified (because they were incorrectly typed). It could be inferred that fatigue or loss of concentration caused the decreasing number of selections and the increasing total time spent through the sessions.
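To compare the two performance profiles on a common footing, the average time per selected letter can be computed from the values reported in Tables 2 and 3 (total session time divided by the number of selections in that session). The short sketch below performs this calculation:

```python
# Average time per selected letter, from the values in Tables 2 and 3.
user_a    = {"selections": [3, 2, 2, 1], "total_time_s": [41, 22, 30, 39]}
pedagogue = {"selections": [10, 7, 8],   "total_time_s": [59, 81, 78]}

def seconds_per_selection(data):
    return [round(t / n, 1) for n, t in zip(data["selections"], data["total_time_s"])]

print(seconds_per_selection(user_a))     # [13.7, 11.0, 15.0, 39.0]
print(seconds_per_selection(pedagogue))  # [5.9, 11.6, 9.8]
```

On average, User A needed noticeably more time per letter than the pedagogue, and the gap widened in the later sessions.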
Figure 5 shows the results of all users. There is a downward trend in the number of selections through the sessions.

Usability

Qualitative assessments also supplied important feedback. In this work, the System Usability Scale (SUS) tool [15] was adapted to reflect the perspective of the professionals who participated in the experiments. The questionnaire was composed of 10 items evaluated on a Likert scale [26] with the following values: strongly disagree (value 1), disagree (value 2), neutral (value 3), agree (value 4), and strongly agree (value 5). There was a balance between positive and negative statements, avoiding bias. The items of the questionnaire were:
  • I think the user would like to use this system frequently.
  • Computer-user interaction through the Emotiv Epoc is too complex.
  • The use of the Phrase-Composer software is intuitive.
  • I think the user would need more technical support to use the system.
  • I think the user learned to use the system quickly.
  • The system does not allow proper configuration.
  • I would imagine that most people, under similar conditions as the user, would learn to use this system.
  • I believe that this system would not meet the communication needs in cases of cerebral palsy users.
  • I felt that the user was very confident using the system.
  • The user needs previous knowledge before using this system.
The SUS score was calculated using the following equation:
score = 2.5 × [Σ(q_pos − 1) + Σ(5 − q_neg)]
where q_pos and q_neg are the values of the positive and negative items, respectively, and the sums run over the five items of each type. According to Reference [27], an SUS score above 68 indicates acceptable usability of the system or product, and an SUS score below 50 indicates unacceptable usability. Table A1 in Appendix A shows the SUS results.
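As an illustrative check (a minimal sketch, not part of the original analysis), the equation can be applied to the mean item values reported in Table A1, which reproduces the overall score of 67.5:

```python
# SUS score from the mean item values in Table A1: positive items (1, 3, 5, 7, 9)
# contribute (value - 1); negative items (2, 4, 6, 8, 10) contribute (5 - value);
# the sum is multiplied by 2.5.
positive = [4, 3, 4, 4, 4]
negative = [3, 4, 1, 3, 1]

score = 2.5 * (sum(q - 1 for q in positive) + sum(5 - q for q in negative))
print(score)  # 67.5, matching the overall evaluation reported in Section 5
```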

5. Results

The overall evaluation was 67.5, showing an acceptable usability. Among the questions, the user's desire to use the system stood out positively (agree). In addition, other characteristics were also evaluated positively (agree), such as: "I think the user learned to use the system quickly", "I would imagine that most people, under similar conditions as the user, would learn to use this system", and "I felt that the user was very confident using the system". Moreover, for the negative statements "The system does not allow proper configuration" and "The user needs previous knowledge before using this system", the answers were strongly disagree. Other features were evaluated as neutral, such as "Computer-user interaction through the Emotiv Epoc is too complex", "The use of the Phrase-Composer software is intuitive", and "I believe that this system would not meet the communication needs in cases of cerebral palsy users". Among the negative aspects, the answer to "I think the user would need more technical support to use the system" was agree.

Affective Study

A preliminary analysis of the raw EEG data was also conducted in order to assess the users’ emotions during their interaction with the communication device. According to Reference [28], positive and negative emotions are associated with alpha frequencies, from 8 Hz to 13 Hz, at points F3 and F4 in the International 10–20 System for EEG electrode placement, respectively.
The raw data were analyzed with the EEGLab software [29], filtering out noise from blinking. Figure 6 and Figure 7 show the brain activity in a range from 7 Hz to 30 Hz (from alpha to beta; bootstrap method, 0.1%). In the figures, ERSP stands for event-related spectral perturbation, ITC for inter-trial coherence, and ERP for event-related potential.
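For readers who wish to reproduce a simplified version of this analysis, the sketch below estimates alpha-band (8–13 Hz) power at F3 and F4 with SciPy. It is not the EEGLab pipeline used in the study; the synthetic signals and the 128 Hz sampling rate are assumptions for illustration:

```python
import numpy as np
from scipy.signal import welch

FS = 128  # Hz, nominal Emotiv Epoc sampling rate

def alpha_power(signal, fs=FS):
    """Estimate power in the 8-13 Hz alpha band of a single-channel signal."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)   # 0.5 Hz frequency resolution
    band = (freqs >= 8) & (freqs <= 13)
    return float(np.sum(psd[band]) * (freqs[1] - freqs[0]))

# Synthetic 10 s signals: "F3" carries an added 10 Hz (alpha) oscillation, "F4" does not.
rng = np.random.default_rng(42)
t = np.arange(FS * 10) / FS
f3 = rng.normal(0.0, 1.0, t.size) + np.sin(2 * np.pi * 10 * t)
f4 = rng.normal(0.0, 1.0, t.size)

print(alpha_power(f3) > alpha_power(f4))  # True: higher alpha power in the "F3" signal
```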
By correlating the brain activity data at F3 (alpha frequencies) with the videos showing the user interacting with the device, we inferred that positive emotions were associated with moments at which the user interacted with the device without disturbance. On the other hand, by relating the brain activity data at F4 (alpha frequencies) to the same videos, we inferred that negative emotions were associated with moments when the user made mistakes or was interrupted to restart the program.
From this preliminary analysis, EEG seems to have the potential to be used to identify, in an objective way, events (negative or positive) that usually pass unnoticed by system developers. However, we still need to deepen our research on the use of EEG in the analysis of affective interaction aspects between users with cerebral palsy and Augmentative and Alternative Communication (AAC) devices. A link to this research is included in the Supplementary Materials (Video S1).

6. Discussion

Non-invasive electroencephalogram (EEG)-based brain-computer interfaces (BCIs) have been proposed as potential assistive devices for impaired individuals. In this research, a BCI technique based on the Emotiv Epoc was studied to obtain new evidence about human-computer interaction for cerebral palsy individuals.
A therapist and a pedagogue from the Fundação Catarinense de Educação Especial (FCEE) conducted the experiments with the participation of four cerebral palsy volunteers and one pedagogue. We considered the specialists' point of view on each user to complete the SUS questionnaire, because the scientific literature reports the difficulty of evaluating usability with cerebral palsy individuals due to the severity of the users' conditions and their limited communication skills. In fact, the scientific literature highlights that brain-computer interfaces are still premature as a form of assistive technology for people with cerebral palsy.
Quantitative and qualitative assessments produced new evidence about the design of augmentative and alternative communication approaches for the target audience. From the quantitative point of view, we observed that the time required to select the first letter was reduced in all cases as the experiment was repeated. However, there was a trend toward a decreasing number of selections over time.
According to the qualitative evaluation, the overall score was 67.5, showing an acceptable usability. Despite the fact that the SUS score was acceptable, several challenges remain. In fact, the recognition of EEG signals may be unreliable, requires multiple training sessions, and calls for the exploration of new processing algorithms [30]. In addition, individuals with cerebral palsy may exhibit spontaneous and uncontrolled movements, which also affect EEG signal quality. Moreover, motivation, fatigue, and concentration can also influence performance [23].
It is important to remark that the number of participants is usually low for this target audience. Though four is a modest number of individuals for carrying out statistical inferences on the results, it is not considerably lower than in other works with similar user groups. For example, in Reference [20], only seven participants qualified for the study; in Reference [21], only one user participated; in Reference [22], seven; and in Reference [23], six. Moreover, the difficulties in recruiting individuals with disabilities for usability evaluations should be taken into account, as discussed in Reference [24]. In addition, according to Reference [25], four individuals are sufficient to reveal more than 65% of the usability problems found in studies with a much higher number of participants.
The EEG technique also allowed identifying events (affective analysis) that could go unnoticed by the system developer, as well as identifying weaknesses with potential for improvement. Preliminary results show that positive emotions were associated with moments at which the user's interaction with the device did not suffer any disturbance. On the other hand, it could be inferred that negative emotions were associated with moments when the user made mistakes or was interrupted to restart the test.
This is an experimental study in its early stages, still with several gaps. However, it is important to remark that a standard approach to evaluate the usability of BCI for cerebral palsy individuals has not yet been developed and the most appropriate approach may depend on the objectives of the evaluation.
In future works, the study of facial expressions and cognitive commands will be addressed. In addition, open hardware and software solutions will be explored, given the cost of current solutions and the commercial secrecy of EEG processing algorithms.

Supplementary Materials

The following are available online at https://www.mdpi.com/2227-7080/6/1/28/s1, Video S1: A pioneering study at Fundação Catarinense de Educação Especial (FCEE).

Acknowledgments

Our thanks to the National Council for Scientific and Technological Development (CNPq), process 309429/2015-3, and to the Fundação de Amparo à Pesquisa e Inovação do Estado de Santa Catarina (FAPESC), grant 2015TR300.

Author Contributions

Alejandro Rafael García Ramírez and Ana Carolina Rodrigues Savall conceived and designed the experiments; Jéferson Fernandes da Silva developed the Phrase-Composer software; Jéferson Fernandes da Silva, Ana Carolina Rodrigues Savall, and Tiago Catecati performed the experiments; Tiago Catecati and Marcelo Gitirana Gomes Ferreira analyzed the usability and affective data; Alejandro Rafael García Ramírez and Marcelo Gitirana Gomes Ferreira wrote the paper. All authors have read and approved the final manuscript.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. The System Usability Scale (SUS) results.

Question   Positive/Negative   User A   User B   User C   User D   Pedagogue   Mean
1          P                   5        5        5        3        3           4
2          N                   3        4        3        2        2           3
3          P                   3        3        3        3        3           3
4          N                   4        4        4        2        2           4
5          P                   4        4        3        5        5           4
6          N                   1        1        1        1        1           1
7          P                   4        4        4        4        4           4
8          N                   3        3        3        3        3           3
9          P                   4        4        4        3        3           4
10         N                   1        1        1        1        1           1
SCORE      -                   -        -        -        -        -           67.5

References

  1. Rothstein, J.R.; Beltrame, T.S. Características motoras e biopsicossociais de crianças com paralisia cerebral. Revista Brasileira de Ciência & Movimento 2013, 21, 118–126. (In Portuguese) [Google Scholar] [CrossRef]
  2. Cook, A.M.; Polgar, J.M. Cook & Hussey’s, Assistive Technologies: Principles and Practice, 4th ed.; Mosby Elsevier: St. Louis, MO, USA, 2015. [Google Scholar]
  3. Bayón, L.C.; Ramírez, O.; Serrano, J.I.; del Castillo, M.D.; Pérez-Somarriba, A.; Belda-Loisb, J.M.; Martínez-Caballero, I.; Lerma-Lara, S.; Cifuentes, C.; Frizera, A.; et al. Development and evaluation of a novel robotic platform for gait rehabilitation in patients with Cerebral Palsy: CPWalker. Robot. Auton. Syst. 2017, 91, 101–114. [Google Scholar] [CrossRef]
  4. Mariano, D.T.G.; Freitas, A.M.; Luiz, L.M.D.; Silva, A.N.; Pierre, P.; Naves, E.L.M. An accelerometer-based human computer interface driving an alternative communication system. In Proceedings of the 5th ISSNIP-IEEE Biosignals and Biorobotics Conference: Biosignals and Robotics for Better and Safer Living (BRC), Salvador, Brazil, 26–28 May 2014. [Google Scholar]
  5. Caltenco, H.A.; Struijk, L.N.S.A.; Breidegard, B. Tonguewise: Tongue-Computer Interface Software for People with Tetraplegia. In Proceedings of the Annual International Conference of the Engineering in Medicine and Biology Society, Buenos Aires, Argentina, 31 August–4 September 2010. [Google Scholar]
  6. Perini, E.; Soria, S.; Prati, A.; Cucchiara, R. Facemouse: A human–computer interface for tetraplegic people. In Proceedings of the ECCV Workshop on HCI, Graz, Austria, 13 May 2006. [Google Scholar]
  7. Ann, O.C.; Theng, L.B. Biometrics based assistive communication tool for children with special needs. In Proceedings of the 7th International Conference on Information Technology in Asia, Kuching, Malaysia, 12–14 July 2011. [Google Scholar]
  8. Vidal, J.J. Toward direct brain-computer communication. Annu. Rev. Biophys. Bioeng. 1973, 2, 157–180. [Google Scholar] [CrossRef] [PubMed]
  9. Krucoff, M.O.; Rahimpour, S.; Slutzky, M.W.; Edgerton, V.R.; Turner, D.A. Enhancing Nervous System Recovery through Neurobiologics, Neural Interface Training, and Neurorehabilitation. Neuroprosthetics 2016, 10, 584. [Google Scholar] [CrossRef] [PubMed]
  10. Chun, R.; Shon, Y. Processos de significação de afásicos usuários de comunicação suplementar e/ou alternativa. Revista da Sociedade Brasileira de Fonoaudiologia 2010, 15, 598–603. (In Portuguese) [Google Scholar] [CrossRef]
  11. Ramadan, R.A.; Refat, S.; Elshahed, M.A.; Ali, R.A. Basics of Brain Computer Interface. In Brain-Computer Interfaces: Current Trends and Applications; Hassanien, A.E., Azar, A.T., Eds.; Springer International Publishing: Cham, Switzerland, 2015; pp. 31–50. [Google Scholar]
  12. Wolpaw, J.R.; McFarland, D.J. Control of a Two-Dimensional Movement Signal by a Noninvasive Brain-Computer Interface in Humans. Proc. Natl. Acad. Sci. USA 2004, 101, 17849–17854. [Google Scholar] [CrossRef] [PubMed]
  13. Emotiv Epoc. 2015. Available online: http://emotiv.com/epoc/ (accessed on 25 February 2018).
  14. Saturno, C.E.; Farhat, M.; Conte, M.J.; Piucco, E.C.; Ramirez, A.R.G. An augmentative and alternative communication tool for children and adolescents with cerebral palsy. Behav. Inf. Technol. 2015, 34, 632–645. [Google Scholar] [CrossRef]
  15. Lewis, J.R. IBM computer usability satisfaction questionnaires: Psychometric evaluation and instructions for use. Int. J. Hum. Comput. Interact. 1995, 7, 57–58. [Google Scholar] [CrossRef]
  16. Mühl, C.; Allison, B.; Nijholt, A.; Chanel, G. A survey of affective brain computer interfaces: Principles, state of the art, and challenges. Brain-Comput. Interfaces 2014, 1, 66–84. [Google Scholar] [CrossRef]
  17. Sakamaki, I.; del Campo, C.E.P.; Wiebe, S.A.; Tavakoli, M.; Adams, K. Assistive Technology Design and Preliminary Testing of a Robot Platform Based on Movement Intention using Low-Cost Brain Computer Interface. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (SMC), Banff Center, Banff, AB, Canada, 5–8 October 2017. [Google Scholar]
  18. Nijboer, F.; Van De Laar, B.; Gerritsen, S.; Nijholt, A.; Poel, M. Usability of three electroencephalogram headsets for brain–computer interfaces: A within subject comparison. Interact. Comput. 2015, 27, 500–511. [Google Scholar] [CrossRef]
  19. Daly, I.; Billinger, M.; Laparra-Hernandez, J.; Aloise, F.; García, M.L.; Faller, J.; Scherer, R.; Müller-Putz, G. On the control of brain–computer interfaces by users with cerebral palsy. Clin. Neurophysiol. 2013, 124, 1787–1797. [Google Scholar] [CrossRef] [PubMed]
  20. Scherer, R.; Billinger, M.; Wagner, J.; Schwarz, A.; Hettich, D.T.; Bolinger, E.; Lloria Garcia, M.; Navarro, J.; Müller-Putz, G. Thought-based row-column scanning communication board for individuals with cerebral palsy. Ann. Phys. Rehabil. Med. 2015, 58, 14–22. [Google Scholar] [CrossRef] [PubMed]
  21. Taherian, S.; Selitskiy, D.; Pau, J.; Davies, T.C.; Owens, R.G. Training to use a commercial brain-computer interface as access technology: A case study. Disabil. Rehabil. Assist. Technol. 2016, 11, 345–350. [Google Scholar] [CrossRef] [PubMed]
  22. Welton, T.; Brown, D.J.; Evett, L.; Sherkat, N. A brain–computer interface for the Dasher alternative text entry system. Univers. Access Inf. Soc. 2016, 15, 77–83. [Google Scholar] [CrossRef]
  23. Taherian, S.; Selitskiy, D.; Pau, J.; Davies, T.C. Are we there yet? Evaluating commercial grade brain–computer interface for control of computer applications by individuals with cerebral palsy. Disabil. Rehabil. Assist. Technol. 2017, 12, 165–174. [Google Scholar] [CrossRef] [PubMed]
  24. Rubin, J.; Chisnell, D. Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests, 2nd ed.; Wiley Publishing, Inc.: Hoboken, NJ, USA, 2008. [Google Scholar]
  25. Nielsen, J. Usability Engineering; Morgan Kaufmann: Burlington, MA, USA, 1993. [Google Scholar]
  26. Likert, R. A Technique for the Measurement of Attitudes. Arch. Psychol. 1932, 140, 1–55. [Google Scholar]
  27. Tullis, T.; Albert, B. Measuring the User Experience: Collecting, Analyzing and Presenting Usability Metrics; MK Elsevier: London, UK, 2013. [Google Scholar]
  28. Hagemann, D.; Naumann, E.; Thayer, J.F.; Bartussek, D. Does resting electroencephalograph asymmetry reflect a trait? An application of latent state-trait theory. J. Pers. Soc. Psychol. 2002, 82, 619–641. [Google Scholar] [CrossRef] [PubMed]
  29. Delorme, A.; Makeig, S. EEGLAB: An open source toolbox for analysis of single-trial EEG dynamics including independent component analysis. J. Neurosci. Methods 2004, 134, 9–21. [Google Scholar] [CrossRef] [PubMed]
  30. Scherer, R.; Wagner, J.; Billinger, M.; Müller-Putz, G.; Raya, R.; Rocón, E.; Hettich, D.T.; Bolinger, E.; Iosa, M.; Cincotti, F.; et al. Augmenting communication, emotion expression and interaction capabilities of individuals with cerebral palsy. In Proceedings of the 6th International Brain-Computer Interface Conference, Graz, Austria, 16–19 September 2014; pp. 312–315. [Google Scholar]
Figure 1. The Emotiv Epoc Headset [13].
Figure 2. Emotiv Pure—Electroencephalography (EEG) raw interface.
Figure 3. Phrase-Composer main interface.
Figure 4. User A interacting with the system. Credit: Record News Santa Catarina.
Figure 5. Users interacting with the system. (The y-axis represents the number of letters that each user selected in each experiment. The x-axis represents the particular session.)
Figure 6. Brain activity of User A at point F3.
Figure 7. Brain activity of User A at point F4.
Table 1. Exploring the system.

User/Signals   Gender   Eye Blink   Simultaneous Blink   Eye Movement   Head
User A         M        Y           Y                    Y              Y
User B         M        Y           N                    Y              Y
User C         M        N           N                    Y              Y
User D         M        Y           Y                    Y              N
Pedagogue      M        Y           Y                    Y              Y
Table 2. Performance of User A.

Action/Experiment   Session 1   Session 2   Session 3   Session 4
Selecting           3           2           2           1
Errors              0           0           0           0
Time (s)            12          13          7           7
Total time (s)      41          22          30          39
Table 3. Performance of the pedagogue.

Action/Experiment   Session 1   Session 2   Session 3
Selecting           10          7           8
Erasing             0           0           2
Time (s)            12          12          12
Total time (s)      59          81          78
