Visual-Tactile Speech Perception and the Autism Quotient

dc.contributor.author: Derrick, Donald
dc.contributor.author: Bicevskis, K.
dc.contributor.author: Gick, B.
dc.date.accessioned: 2019-11-27T22:38:29Z
dc.date.available: 2019-11-27T22:38:29Z
dc.date.issued: 2019
dc.date.updated: 2019-11-21T09:25:49Z
dc.description.abstract: Multisensory information is integrated asymmetrically in speech perception: an audio signal can follow video by 240 ms, but can precede video by only 60 ms, without disrupting the sense of synchronicity (Munhall et al., 1996). Similarly, air flow can follow either audio (Gick et al., 2010) or video (Bicevskis et al., 2016) by a much larger margin than it can precede either while remaining perceptually synchronous. These asymmetric windows of integration have been attributed to the physical properties of the signals: light travels faster than sound (Munhall et al., 1996), and sound travels faster than air flow (Gick et al., 2010). Perceptual windows of integration narrow during development (Hillock-Dunn and Wallace, 2012), but remain wider among people with autism (Wallace and Stevenson, 2014). Here we show that, even among neurotypical adult perceivers, visual-tactile windows of integration are wider and flatter the higher the participant's Autism Quotient (AQ) (Baron-Cohen et al., 2001), a self-report measure of autistic traits. As "pa" is produced with a tiny burst of aspiration (Derrick et al., 2009), we applied light and inaudible air puffs to participants' necks while they watched silent videos of a person saying "ba" or "pa," with puffs presented both synchronously and at varying degrees of asynchrony relative to the recorded plosive release burst, which itself is time-aligned to visible lip opening. All syllables seen along with cutaneous air puffs were more likely to be perceived as "pa." Syllables were perceived as "pa" most often when the air puff occurred 50–100 ms after lip opening, with decaying probability as asynchrony increased. Integration was less dependent on time-alignment the higher the participant's AQ. Perceivers integrate event-relevant tactile information in visual speech perception with greater reliance upon event-related accuracy the more they self-describe as neurotypical, supporting the Happé and Frith (2006) weak coherence account of autism spectrum disorder (ASD).
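As an illustration of the pattern reported in the abstract, the sketch below (Python; every parameter value is a hypothetical illustration, not a value fitted to the study's data) models the probability of a "pa" percept as a Gaussian window over air-puff asynchrony whose width grows with AQ, so that higher-AQ perceivers show a wider, flatter dependence on time-alignment.

import numpy as np

def p_pa(asynchrony_ms, aq, base=0.5, gain=0.4,
         peak_ms=75.0, base_width_ms=120.0, width_per_aq=6.0):
    """Toy model of the reported pattern. asynchrony_ms is air-puff timing
    relative to visible lip opening (positive = puff lags the video); aq is
    the perceiver's Autism Quotient score. All constants are hypothetical
    illustrations, not values fitted to the study's data."""
    # Integration window peaks when the puff lags lip opening (~50-100 ms)
    # and widens (flattens) with AQ, weakening dependence on alignment.
    width = base_width_ms + width_per_aq * aq
    window = np.exp(-0.5 * ((asynchrony_ms - peak_ms) / width) ** 2)
    # Puffs raise "pa" responses overall; the window adds the timing effect.
    return base + gain * window

lags = np.array([-300.0, -100.0, 0.0, 75.0, 100.0, 300.0])
for aq in (10, 35):  # low vs. high self-reported AQ (illustrative scores)
    probs = p_pa(lags, aq)
    print(f"AQ={aq}: " + ", ".join(f"{lag:+.0f} ms -> {p:.2f}"
                                   for lag, p in zip(lags, probs)))

Running this prints "pa"-response probabilities across asynchronies for a low and a high AQ score: both peak at the same lag, but the high-AQ curve decays more slowly toward large asynchronies, mirroring the flattened windows described above.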
dc.identifier.citation: Derrick, D., Bicevskis, K., & Gick, B. (2019). Visual-Tactile Speech Perception and the Autism Quotient. Frontiers in Communication, 3.
dc.identifier.doi: https://doi.org/10.3389/fcomm.2018.00061
dc.identifier.issn: 2297-900X
dc.identifier.uri: http://hdl.handle.net/10092/17683
dc.language.iso: en
dc.publisher: Frontiers Media SA
dc.subject: speech perception
dc.subject: multisensory speech perception
dc.subject: multimodal speech perception
dc.subject: audio-tactile perception
dc.subject: autism spectrum disorders
dc.subject.anzsrc: Field of Research::17 - Psychology and Cognitive Sciences::1702 - Cognitive Science::170204 - Linguistic Processes (incl. Speech Production and Comprehension)
dc.title: Visual-Tactile Speech Perception and the Autism Quotient
dc.type: Journal Article
Files
Original bundle
Name: fcomm-03-00061.pdf
Size: 2.5 MB
Format: Adobe Portable Document Format
Description: Published version