Cognitive and Affective Assessment of Navigation and Mobility Tasks for the Visually Impaired via Electroencephalography and Behavioral Signals
Abstract
1. Introduction
2. Overview of Mobility Assistive Aids
2.1. Auditory Vision Sensory Substitution
2.2. Tactile Visual Sensory Substitution
2.3. Auditory Tactile Visual Substitution Devices
3. Biophysical Signals and Cognitive Load
3.1. Electroencephalography
3.2. Electrodermal Activity and Heart Rate
4. The Sound of Vision Device
4.1. Technical Description
The Sound of Vision hardware consists of:

- a headgear, including a 3D acquisition unit (a depth camera for indoor or low-light outdoor conditions, a stereo camera for outdoor or bright-light conditions, and a head and body inertial measurement unit (IMU) for body orientation) and a head-mounted audio rendering unit;
- a haptic belt with a matrix of 60 vibrating motors (six rows and 10 columns), worn on the abdomen (an illustrative depth-to-belt mapping sketch follows this list);
- a processing unit: a small laptop with powerful CPU and GPU units, carried in a backpack;
- a wireless remote control, carried in a pocket.
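To illustrate how the 3D acquisition unit can drive the 60-motor belt, the sketch below tiles a depth frame onto the 6 x 10 motor grid and maps proximity to vibration intensity. This is a minimal sketch under stated assumptions, not the device's actual encoding scheme: the grid tiling, the 5 m range cap, and the function name `depth_to_belt` are all illustrative.

```python
import numpy as np

# Hypothetical sketch of a depth-to-belt mapping: tile a depth frame onto
# the 6 x 10 motor matrix and vibrate harder for nearer obstacles. This is
# NOT the Sound of Vision encoding scheme, only an illustration of the idea.

BELT_ROWS, BELT_COLS = 6, 10   # motor matrix from the device description
MAX_DEPTH_M = 5.0              # assumed maximum range of interest

def depth_to_belt(depth_frame: np.ndarray) -> np.ndarray:
    """Map an (H, W) depth image in metres to per-motor intensities in [0, 1]."""
    h, w = depth_frame.shape
    # Crop so the frame divides evenly into a BELT_ROWS x BELT_COLS grid.
    frame = depth_frame[: h - h % BELT_ROWS, : w - w % BELT_COLS]
    tiles = frame.reshape(BELT_ROWS, frame.shape[0] // BELT_ROWS,
                          BELT_COLS, frame.shape[1] // BELT_COLS)
    nearest = tiles.min(axis=(1, 3))  # closest obstacle seen by each motor's tile
    # Nearer obstacle -> stronger vibration; beyond MAX_DEPTH_M -> motor off.
    return 1.0 - np.clip(nearest / MAX_DEPTH_M, 0.0, 1.0)

# Example: synthetic 4 m background with an obstacle about 1 m away on the left.
frame = np.full((480, 640), 4.0)
frame[:, :160] = 1.0
print(depth_to_belt(frame).round(2))   # left columns at 0.8, the rest at 0.2
```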
4.2. The Focus of the Study
The study assessed users' cognitive load and emotional state:

- when using the Sound of Vision device with audio vs. haptic vs. multimodal feedback;
- when using the Sound of Vision device vs. the white cane during a navigation task in a real-world environment.
5. Materials and Methods
5.1. Experimental Setup
5.2. Data Collection
Data were collected with the following equipment (a stream-synchronization sketch follows this list):

- a Brain Products V-Amp 16 amplifier and an EasyCap helmet with 19 sintered Ag/AgCl miniaturized passive electrodes, for EEG acquisition at a sampling rate of 512 Hz;
- a Shimmer3 GSR+ unit for measuring electrodermal activity/galvanic skin response (EDA/GSR) and continuous heart rate (HR);
- a video camera or smartphone for real-time video recording.
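Multimodal recordings like these are only useful if the EEG, EDA/HR, and video streams share a common clock. The sketch below shows one plausible acquisition loop using pylsl, the Python bindings for Lab Streaming Layer (LSL); the stream types ('EEG', 'GSR') and the assumption that both devices expose LSL outlets are ours, not a statement of the study's actual software stack.

```python
from pylsl import StreamInlet, resolve_stream

# Assumption-laden sketch: both acquisition devices are assumed to expose
# LSL outlets; the stream types 'EEG' and 'GSR' are illustrative.

def open_inlet(stream_type: str) -> StreamInlet:
    streams = resolve_stream('type', stream_type)  # blocks until a stream is found
    return StreamInlet(streams[0])

eeg = open_inlet('EEG')   # 19-channel EEG sampled at 512 Hz (see the setup above)
gsr = open_inlet('GSR')   # Shimmer3 GSR+ EDA/HR stream

for _ in range(512):                  # roughly one second of EEG
    sample, timestamp = eeg.pull_sample()
    # All LSL timestamps come from a shared clock, so the EEG, EDA/HR and
    # video-marker streams can be aligned offline without resynchronisation.
    print(timestamp, sample[:3])
```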
5.3. Data Acquisition and Preprocessing
5.4. Data Analysis
6. Results and Discussion
6.1. Navigation Metrics Analysis
6.2. Cognitive Load Analysis
6.3. Brain Activity Analysis
6.4. Visual Cortex Activation Analysis
6.5. Emotions Assessment During Real-World Navigation
6.6. Limitation of This Study
7. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
Navigation metrics per output codification and scenario type:

| Codification | Scenario Type | Total Number of Collisions | Total Path Distance (m) | Total Time (s) |
|---|---|---|---|---|
| Audio | A | 12 | 33.69 | 293 |
| Audio | B | 10 | 44.36 | 299 |
| Audio | C | 22 | 42.80 | 261 |
| Audio | D | 18 | 41.80 | 306 |
| Audio | E | 20 | 62.05 | 409 |
| Haptic | A | 13 | 28.30 | 179 |
| Haptic | B | 11 | 39.10 | 221 |
| Haptic | C | 17 | 41.00 | 261 |
| Haptic | D | 20 | 44.10 | 255 |
| Haptic | E | 34 | 54.00 | 440 |
| Audio and Haptic | A | 7 | 33.35 | 236 |
| Audio and Haptic | B | 14 | 39.00 | 267 |
| Audio and Haptic | C | 15 | 47.30 | 327 |
| Audio and Haptic | D | 12 | 51.60 | 271 |
| Audio and Haptic | E | 24 | 56.50 | 345 |
| White cane | A | 4 | 23.27 | 204 |
| White cane | B | 5 | 28.80 | 205 |
| White cane | C | 6 | 27.80 | 213 |
| White cane | D | 2 | 30.16 | 206 |
| White cane | E | 6 | 37.77 | 255 |
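A quick way to read this table is to average each metric per codification. The pandas snippet below recomputes those means from the values above; it only transcribes and summarises the reported data, adding nothing new.

```python
import pandas as pd

# Per-modality averages recomputed from the table above (values transcribed
# verbatim; this only summarises the reported data).
rows = [
    ("Audio", "A", 12, 33.69, 293), ("Audio", "B", 10, 44.36, 299),
    ("Audio", "C", 22, 42.80, 261), ("Audio", "D", 18, 41.80, 306),
    ("Audio", "E", 20, 62.05, 409),
    ("Haptic", "A", 13, 28.30, 179), ("Haptic", "B", 11, 39.10, 221),
    ("Haptic", "C", 17, 41.00, 261), ("Haptic", "D", 20, 44.10, 255),
    ("Haptic", "E", 34, 54.00, 440),
    ("Audio and Haptic", "A", 7, 33.35, 236), ("Audio and Haptic", "B", 14, 39.00, 267),
    ("Audio and Haptic", "C", 15, 47.30, 327), ("Audio and Haptic", "D", 12, 51.60, 271),
    ("Audio and Haptic", "E", 24, 56.50, 345),
    ("White cane", "A", 4, 23.27, 204), ("White cane", "B", 5, 28.80, 205),
    ("White cane", "C", 6, 27.80, 213), ("White cane", "D", 2, 30.16, 206),
    ("White cane", "E", 6, 37.77, 255),
]
df = pd.DataFrame(rows, columns=["codification", "scenario",
                                 "collisions", "distance_m", "time_s"])
print(df.groupby("codification")[["collisions", "distance_m", "time_s"]].mean())
# Collisions average 4.6 per scenario with the white cane versus 16.4 (audio),
# 19.0 (haptic) and 14.4 (audio and haptic) with the device.
```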