Visual-Tactile Speech Perception and the Autism Quotient

Type of content
Journal Article
Publisher
Frontiers Media SA
Journal Title
Frontiers in Communication
Language
English
Date
2019
Authors
Derrick, Donald
Bicevskis, Katie
Gick, Bryan
Abstract

Multisensory information is integrated asymmetrically in speech perception: An audio signal can follow video by 240 ms, but can precede video by only 60 ms, without disrupting the sense of synchronicity (Munhall et al., 1996). Similarly, air flow can follow either audio (Gick et al., 2010) or video (Bicevskis et al., 2016) by a much larger margin than it can precede either while remaining perceptually synchronous. These asymmetric windows of integration have been attributed to the physical properties of the signals; light travels faster than sound (Munhall et al., 1996), and sound travels faster than air flow (Gick et al., 2010). Perceptual windows of integration narrow during development (Hillock-Dunn and Wallace, 2012), but remain wider among people with autism (Wallace and Stevenson, 2014). Here we show that, even among neurotypical adult perceivers, visual-tactile windows of integration are wider and flatter the higher the participant’s Autism Quotient (AQ) (Baron-Cohen et al., 2001), a self-report measure of autistic traits. As “pa” is produced with a tiny burst of aspiration (Derrick et al., 2009), we applied light and inaudible air puffs to participants’ necks while they watched silent videos of a person saying “ba” or “pa,” with puffs presented both synchronously and at varying degrees of asynchrony relative to the recorded plosive release burst, which itself is time-aligned to visible lip opening. All syllables seen along with cutaneous air puffs were more likely to be perceived as “pa.” Syllables were perceived as “pa” most often when the air puff occurred 50–100 ms after lip opening, with decaying probability as asynchrony increased. Integration was less dependent on time-alignment the higher the participant’s AQ. Perceivers integrate event-relevant tactile information in visual speech perception with greater reliance upon event-related accuracy the more they self-describe as neurotypical, supporting the Happé and Frith (2006) weak coherence account of autism spectrum disorder (ASD).
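As an illustration only (the functional form, parameter values, and AQ scaling below are assumptions for exposition, not fitted results from this study), the reported pattern, a "pa" response probability peaking when the puff lags lip opening by roughly 50–100 ms, decaying with asynchrony, and flattening with higher AQ, can be sketched as an asymmetric window function:

    import math

    def p_pa(soa_ms, aq, aq_max=50.0):
        """Hypothetical sketch of P("pa" percept) as a function of puff asynchrony.

        soa_ms: puff onset minus lip opening, in ms (positive = puff lags video).
        aq:     Autism Quotient score (the AQ instrument scores 0-50).
        All constants below are illustrative assumptions, not study estimates.
        """
        peak = 75.0  # assumed peak inside the reported 50-100 ms window
        # Asymmetry: a lagging puff is tolerated over a wider range than a
        # leading one, mirroring the finding that air flow may follow, more
        # than precede, the visual signal while remaining synchronous.
        sigma = 60.0 if soa_ms < peak else 180.0
        window = math.exp(-(soa_ms - peak) ** 2 / (2.0 * sigma ** 2))
        flatten = min(max(aq / aq_max, 0.0), 1.0)  # higher AQ -> flatter window
        tuned = (1.0 - flatten) * window + flatten * 0.5
        baseline = 0.5  # puffs bias percepts toward "pa" at all asynchronies
        return baseline + (1.0 - baseline) * tuned

    # A low-AQ perceiver (sharply tuned) vs. a high-AQ perceiver (flatter window):
    for soa in (-300, 0, 75, 300):
        print(soa, round(p_pa(soa, aq=10), 2), round(p_pa(soa, aq=40), 2))

The asymmetric width term encodes the direction of the reported integration window; the flattening term encodes the weaker dependence on time-alignment at higher AQ.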

Citation
Derrick D, Bicevskis K, Gick B (2019) Visual-Tactile Speech Perception and the Autism Quotient. Frontiers in Communication 3.
Keywords
speech perception, multisensory speech perception, multimodal speech perception, audio-tactile perception, autism spectrum disorders
ANZSRC fields of research
Field of Research::17 - Psychology and Cognitive Sciences::1702 - Cognitive Science::170204 - Linguistic Processes (incl. Speech Production and Comprehension)