    Visual-Tactile Speech Perception and the Autism Quotient (2019)

    View/Open
    Published version (2.501 MB)
    Type of Content
    Journal Article
    UC Permalink
    http://hdl.handle.net/10092/17683
    
    Publisher's DOI/URI
    https://doi.org/10.3389/fcomm.2018.00061
    
    Publisher
    Frontiers Media SA
    ISSN
    2297-900X
    Collections
    • NZILBB: Journal Articles [18]
    Authors
    Derrick, Donald
    Bicevskis, K.
    Gick, B.
    Abstract

    Multisensory information is integrated asymmetrically in speech perception: An audio signal can follow video by 240 ms, but can precede video by only 60 ms, without disrupting the sense of synchronicity (Munhall et al., 1996). Similarly, air flow can follow either audio (Gick et al., 2010) or video (Bicevskis et al., 2016) by a much larger margin than it can precede either while remaining perceptually synchronous. These asymmetric windows of integration have been attributed to the physical properties of the signals; light travels faster than sound (Munhall et al., 1996), and sound travels faster than air flow (Gick et al., 2010). Perceptual windows of integration narrow during development (Hillock-Dunn and Wallace, 2012), but remain wider among people with autism (Wallace and Stevenson, 2014). Here we show that, even among neurotypical adult perceivers, visual-tactile windows of integration are wider and flatter the higher the participant’s Autism Quotient (AQ) (Baron-Cohen et al., 2001), a self-report measure of autistic traits. As “pa” is produced with a tiny burst of aspiration (Derrick et al., 2009), we applied light and inaudible air puffs to participants’ necks while they watched silent videos of a person saying “ba” or “pa,” with puffs presented both synchronously and at varying degrees of asynchrony relative to the recorded plosive release burst, which itself is time-aligned to visible lip opening. All syllables seen along with cutaneous air puffs were more likely to be perceived as “pa.” Syllables were perceived as “pa” most often when the air puff occurred 50–100 ms after lip opening, with decaying probability as asynchrony increased. Integration was less dependent on time-alignment the higher the participant’s AQ. Perceivers integrate event-relevant tactile information in visual speech perception with greater reliance upon event-related accuracy the more they self-describe as neurotypical, supporting the Happé and Frith (2006) weak coherence account of autism spectrum disorder (ASD).
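
    The asymmetric window lends itself to a compact statement. Below is a minimal illustrative sketch (not from the paper; the function name and sign convention are assumptions) encoding the audio-visual bounds the abstract cites from Munhall et al. (1996): audio may lag the video by up to 240 ms but lead it by no more than 60 ms.

        # Minimal sketch of the asymmetric audio-visual integration window.
        # Bounds are the Munhall et al. (1996) figures quoted in the abstract;
        # the name and sign convention (negative = audio leads) are hypothetical.
        def is_perceptually_synchronous(audio_offset_ms: float,
                                        max_lead_ms: float = 60.0,
                                        max_lag_ms: float = 240.0) -> bool:
            """True if the audio-video offset stays within the asymmetric window."""
            return -max_lead_ms <= audio_offset_ms <= max_lag_ms

        # Audio leading by 100 ms breaks synchrony; lagging by 240 ms does not.
        for offset_ms in (-100, -60, 0, 240, 300):
            print(offset_ms, is_perceptually_synchronous(offset_ms))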

    Citation
    Derrick D, Bicevskis K, Gick B (2019) Visual-Tactile Speech Perception and the Autism Quotient. Frontiers in Communication, 3.
    This citation is automatically generated and may be unreliable. Use as a guide only.
    Keywords
    speech perception; multisensory speech perception; multimodal speech perception; audio-tactile perception; autism spectrum disorders
    ANZSRC Fields of Research
    17 - Psychology and Cognitive Sciences::1702 - Cognitive Science::170204 - Linguistic Processes (incl. Speech Production and Comprehension)

    Related items

    Showing items related by title, author, creator and subject.

    • Visual-tactile integration in speech perception : evidence for modality neutral speech primitives. 

      Bicevskis K; Derrick, Donald; Gick B (Acoustical Society of America (ASA), 2016)
      © 2016 Acoustical Society of America. Audio-visual [McGurk and MacDonald (1976). Nature 264, 746-748] and audio-tactile [Gick and Derrick (2009). Nature 462(7272), 502-504] speech stimuli enhance speech perception over ...
    • Tri-modal Speech: Audio-visual-tactile Integration in Speech Perception 

      Derrick, Donald; Hansmann D; Theys C (2019)
      Speech perception is a multi-sensory experience. Visual information enhances [Sumby and Pollack (1954). J. Acoust. Soc. Am. 25, 212–215] and interferes [McGurk and MacDonald (1976). Nature 264, 746–748] with speech perception. ...
    • Recording and reproducing speech airflow outside the mouth 

      Derrick, Donald; De Rybel, T.; Fiasson, R. (University of Canterbury. New Zealand Institute of Language, Brain & Behaviour, 2015)