Wearable Hearing Assist System to Provide Hearing-Dog Functionality
Abstract
1. Introduction
2. Algorithm for Identifying ITDs
2.1. Ambiguity of Phase Difference of Sound Pressure
2.2. Identification of ITDs
3. Humanoid Robot Turning Its Head Toward a Sound Source
3.1. System Configuration and Flowchart of Robot Head Motion
3.2. Experimental Results for Robot Head Motion
3.2.1. Relation between ITDs and Source Directions
3.2.2. Robot Head Motion
3.2.3. On the Effect of Sound Quality
3.2.4. On the Effect of Distance to Sound Source
4. Hearing Assist System for Turning Subject Toward a Sound Source
4.1. System Configuration and Flowchart of Subject’s Motion
4.2. Experimental Results for Subject’s Motion Wearing the Assist System
4.2.1. Relation between ITDs and Source Directions
4.2.2. On the Effect of Interval between Ears
4.2.3. On the Effect of Vision
4.2.4. On the Effect of Sound Quality
5. Discussion
6. Conclusions
- The proposed system, equipped with only two microphones, can distinguish a front source from a rear source because it continues to re-track the target in real time whenever the first tracking motion fails. The system continuously checks the subject's head direction using ITDs, guiding the subject toward the source much as a hearing dog would, and it could probably also track moving sound sources. (A minimal sketch of this tracking loop appears after this list.)
- The absolute values of the ITDs were larger for wider intervals between the two ears, so the sound source direction is probably overestimated when the interval between the ears is smaller. For smaller source direction angles, however, there were no significant differences in ITDs among the intervals. (See the worked far-field example following the ear-interval table below.)
- When the subject can use vision, it may help in tracking the location of the target sound source, especially once the target comes into view, and it may shorten the tracking period.
- To check the suitability of the proposed system for intermittent sounds, measurements were obtained while the subject heard a name called twice at 2-s intervals. The subject could orient himself correctly toward the target source within approximately 2.8 s, even though the vibrators were activated only intermittently.
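The repeated estimate-and-turn cycle described in the first item can be sketched in a few lines of Python. This is a minimal illustration, not the author's implementation: the cross-correlation ITD estimator, the sample rate, the convergence threshold, and the `capture`/`turn_subject` callbacks are all assumptions made for the example.

```python
import numpy as np

FS = 16_000           # sample rate [Hz] (assumed)
MIC_DISTANCE = 0.179  # interval between the ears [m] (Human A in the table below)
SOUND_SPEED = 340.0   # speed of sound [m/s]


def estimate_itd(left: np.ndarray, right: np.ndarray, fs: int = FS) -> float:
    """Estimate the ITD [s] from one frame of the two microphone signals.

    The cross-correlation peak search is limited to the physically possible
    lag range +/- MIC_DISTANCE / SOUND_SPEED. With this ordering of the
    arguments, a positive value means the sound reached the right
    microphone first (source on the subject's right).
    """
    max_lag = int(np.ceil(MIC_DISTANCE / SOUND_SPEED * fs))
    corr = np.correlate(left, right, mode="full")
    mid = len(corr) // 2  # zero-lag index for equal-length frames
    window = corr[mid - max_lag: mid + max_lag + 1]
    lag = int(np.argmax(window)) - max_lag
    return lag / fs


def track_source(capture, turn_subject, itd_threshold=25e-6, max_steps=20):
    """Repeat the estimate-and-turn cycle until the subject faces the source.

    `capture()` returns one (left, right) frame; `turn_subject(itd)` drives
    the vibrotactile cue toward the side the ITD indicates. Because the ITD
    is re-checked after every turn, a failed first motion (e.g., a
    front-back confusion) is corrected on later steps.
    """
    for _ in range(max_steps):
        left, right = capture()
        itd = estimate_itd(left, right)
        if abs(itd) < itd_threshold:  # ITD near zero: subject faces the source
            return True
        turn_subject(itd)
    return False
```

Note that a source directly behind the subject also yields a near-zero ITD; the actual system resolves this ambiguity by continuing to monitor the ITD as the subject moves, which this sketch omits.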
Funding
Acknowledgments
Conflicts of Interest
References
| Subject | Interval Between the Ears [m] |
|---|---|
| Humanoid robot | 0.108 |
| Head and torso simulator | 0.129 |
| Human A | 0.179 |
| Human B | 0.197 |
| Human C | 0.23 |
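To see why wider ear intervals give larger ITDs, the common far-field approximation ITD ≈ d·sin(θ)/c (with interval d, source direction θ, and speed of sound c) can be evaluated for the intervals in the table. This simple model ignores diffraction around the head and torso, so it only approximates the measured relation, and the value of 340 m/s for the speed of sound is an assumption.

```python
import math

SOUND_SPEED = 340.0  # speed of sound [m/s] (assumed)


def farfield_itd(interval_m: float, angle_deg: float) -> float:
    """Far-field ITD model: ITD = d * sin(theta) / c  [s]."""
    return interval_m * math.sin(math.radians(angle_deg)) / SOUND_SPEED


# Maximum ITD (source at 90 degrees) for each subject in the table:
for subject, d in [("Humanoid robot", 0.108),
                   ("Head and torso simulator", 0.129),
                   ("Human A", 0.179),
                   ("Human B", 0.197),
                   ("Human C", 0.23)]:
    itd_us = farfield_itd(d, 90.0) * 1e6
    print(f"{subject:25s} d = {d:.3f} m -> max ITD ~ {itd_us:5.0f} us")
```

Under this model the maximum ITD grows from roughly 318 µs for the humanoid robot (d = 0.108 m) to roughly 676 µs for Human C (d = 0.23 m), consistent with the larger ITD magnitudes for wider intervals. For small θ, the absolute ITD differences among the intervals shrink toward zero, which is consistent with the lack of significant differences observed at small source direction angles.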
© 2019 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).