Open Access Article
Brain Sci. 2017, 7(6), 60; doi:10.3390/brainsci7060060

Electrophysiological Indices of Audiovisual Speech Perception in the Broader Autism Phenotype

1 Haskins Laboratories, New Haven, CT 06511, USA
2 Department of Psychology, Southern Connecticut State University, New Haven, CT 06515, USA
3 Department of Communication Disorders, Southern Connecticut State University, New Haven, CT 06515, USA
4 Psychological Sciences, University of Connecticut, Storrs, CT 06269, USA
* Authors to whom correspondence should be addressed.
Academic Editor: Heather Bortfeld
Received: 19 February 2017 / Revised: 16 May 2017 / Accepted: 26 May 2017 / Published: 2 June 2017
(This article belongs to the Special Issue Audiovisual Integration in Early Language Development)

Abstract

When a speaker talks, the consequences can be both heard (audio) and seen (visual). A novel visual phonemic restoration task was used to assess behavioral discrimination and neural signatures (event-related potentials, or ERPs) of audiovisual processing in typically developing children with a range of social and communicative skills, assessed using the Social Responsiveness Scale, a measure of traits associated with autism. An auditory oddball design presented two types of stimuli to the listener: a clear exemplar of an auditory consonant–vowel syllable /ba/ (the more frequently occurring standard stimulus), and a syllable in which the auditory cues for the consonant were substantially weakened, creating a stimulus more like /a/ (the infrequently presented deviant stimulus). All speech tokens were paired with a face producing /ba/ or a face with a pixelated mouth containing motion but no visual speech. In this paradigm, the visual /ba/ should cause the auditory /a/ to be perceived as /ba/, attenuating the oddball response; in contrast, a pixelated video (without articulatory information) should not have this effect. Behaviorally, participants showed visual phonemic restoration (reduced accuracy in detecting the deviant /a/) in the presence of a speaking face. In addition, ERPs were observed in both an early time window (N100) and a later time window (P300) that were sensitive to speech context (/ba/ or /a/) and modulated by face context (speaking face with visible articulation or with a pixelated mouth). Specifically, the N100 and P300 oddball responses were attenuated in the presence of a face producing /ba/ relative to a pixelated face, representing a possible neural correlate of the phonemic restoration effect. Notably, individuals with more traits associated with autism (yet still in the non-clinical range) had smaller P300 responses overall, regardless of face context, suggesting generally reduced phonemic discrimination.
Keywords: audiovisual speech perception; development; broader autism phenotype; ERP
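
For illustration only, the sketch below shows one common way to quantify the kind of oddball ERP effect described in the abstract (a deviant-minus-standard difference in mean amplitude over N100- and P300-range windows) using MNE-Python. This is not the authors' analysis pipeline: the file name, event codes, electrode (Cz), and time windows are hypothetical assumptions for the example.

import mne

def mean_amplitude(evoked, ch_name, tmin, tmax):
    # Mean amplitude of one channel within a time window, in microvolts.
    ev = evoked.copy().crop(tmin=tmin, tmax=tmax)
    idx = ev.ch_names.index(ch_name)
    return ev.data[idx].mean() * 1e6  # MNE stores EEG data in volts

# 'subject01_raw.fif' and the event codes below are hypothetical placeholders.
raw = mne.io.read_raw_fif("subject01_raw.fif", preload=True)
events = mne.find_events(raw)
epochs = mne.Epochs(raw, events,
                    event_id={"standard_ba": 1, "deviant_a": 2},
                    tmin=-0.1, tmax=0.8, baseline=(None, 0), preload=True)

standard = epochs["standard_ba"].average()   # ERP to the frequent /ba/
deviant = epochs["deviant_a"].average()      # ERP to the infrequent /a/

# Oddball effect = deviant minus standard, summarized in two windows
# chosen to roughly bracket typical N100 and P300 latencies (assumed values).
windows = {"N100": (0.08, 0.14), "P300": (0.30, 0.50)}
for name, (t0, t1) in windows.items():
    effect = (mean_amplitude(deviant, "Cz", t0, t1)
              - mean_amplitude(standard, "Cz", t0, t1))
    print(f"{name} oddball effect at Cz: {effect:.2f} microvolts")

In a paradigm like the one reported here, a smaller deviant-minus-standard effect in the audiovisual /ba/ condition than in the pixelated-mouth condition would correspond to the attenuation described in the abstract.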

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Irwin, J.; Avery, T.; Turcios, J.; Brancazio, L.; Cook, B.; Landi, N. Electrophysiological Indices of Audiovisual Speech Perception in the Broader Autism Phenotype. Brain Sci. 2017, 7, 60.

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
