UC Research Repository
    Visual-tactile integration in speech perception: Evidence for modality neutral speech primitives (2016)

    View/Open
    Published version (857.3Kb)
    Type of Content
    Journal Article
    UC Permalink
    http://hdl.handle.net/10092/17682
    
    Publisher's DOI/URI
    https://doi.org/10.1121/1.4965968
    
    Publisher
    Acoustical Society of America (ASA)
    ISSN
    0001-4966
    1520-8524
    Language
    English
    Collections
    • NZILBB: Journal Articles [18]
    Authors
    Bicevskis, K.
    Derrick, Donald
    Gick, B.
    Abstract

    © 2016 Acoustical Society of America. Audio-visual [McGurk and MacDonald (1976). Nature 264, 746-748] and audio-tactile [Gick and Derrick (2009). Nature 462(7272), 502-504] speech stimuli enhance speech perception over audio stimuli alone. In addition, multimodal speech stimuli form an asymmetric window of integration that is consistent with the relative speeds of the various signals [Munhall, Gribble, Sacco, and Ward (1996). Percept. Psychophys. 58(3), 351-362; Gick, Ikegami, and Derrick (2010). J. Acoust. Soc. Am. 128(5), EL342-EL346]. In this experiment, participants were presented with video of faces producing /pa/ and /ba/ syllables, both alone and with air puffs occurring synchronously and at timings up to 300 ms before and after the stop release. Perceivers were asked to identify the syllable they perceived, and were more likely to respond /pa/ when air puffs were present, with an asymmetrical preference for puffs following the video signal, consistent with the relative speeds of the visual and air puff signals. The results demonstrate that visual-tactile integration in speech perception occurs much as it does with audio-visual and audio-tactile stimuli. This finding contributes to the understanding of multimodal speech perception, lending support to the idea that speech is not perceived as an audio signal supplemented by information from other modes, but rather that the primitives of speech perception are, in principle, modality neutral.
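
    The asymmetric window reported above invites a simple summary analysis: group responses by the air puff's offset from the visual stop release and compare /pa/-response rates on either side of synchrony. Below is a minimal sketch in Python using hypothetical trial records; the offsets, responses, and tallying approach are illustrative only, not the authors' data or analysis.

        from collections import defaultdict

        # Hypothetical (offset_ms, response) trials: negative offsets mean the
        # air puff preceded the visual stop release, positive that it followed.
        trials = [
            (-300, "ba"), (-200, "ba"), (-100, "pa"), (-50, "pa"), (0, "pa"),
            (50, "pa"), (100, "pa"), (200, "pa"), (300, "ba"), (0, "pa"),
            (-100, "ba"), (100, "pa"), (200, "ba"), (-200, "ba"), (50, "pa"),
        ]

        # Tally /pa/ responses per offset.
        counts = defaultdict(lambda: [0, 0])  # offset -> [n_pa, n_total]
        for offset, response in trials:
            counts[offset][1] += 1
            if response == "pa":
                counts[offset][0] += 1

        # Print the proportion of /pa/ responses at each offset.
        for offset in sorted(counts):
            n_pa, n_total = counts[offset]
            print(f"{offset:+5d} ms: P(/pa/) = {n_pa / n_total:.2f} (n={n_total})")

    An asymmetric window of the kind described would show elevated /pa/ rates extending further for positive offsets (puff after the release) than for negative ones.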

    Citation
    Bicevskis, K., Derrick, D., & Gick, B. (2016). Visual-tactile integration in speech perception: Evidence for modality neutral speech primitives. Journal of the Acoustical Society of America, 140(5), 3531-3539.
    This citation is automatically generated and may be unreliable. Use as a guide only.
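
    Because the generated citation carries that caveat, the DOI above can also be resolved directly to machine-readable metadata through standard DOI content negotiation. A minimal sketch in Python follows, assuming network access; the CSL JSON media type shown is the standard one served by the doi.org resolver.

        import json
        import urllib.request

        # Ask the DOI resolver for Citation Style Language JSON metadata.
        req = urllib.request.Request(
            "https://doi.org/10.1121/1.4965968",
            headers={"Accept": "application/vnd.citationstyles.csl+json"},
        )
        with urllib.request.urlopen(req) as resp:
            meta = json.load(resp)

        print(meta["title"])
        print(meta.get("container-title"), meta.get("volume"), meta.get("page"))

    The same request with the Accept header set to "text/x-bibliography" returns a formatted reference string instead of JSON.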
    ANZSRC Fields of Research
    17 - Psychology and Cognitive Sciences::1702 - Cognitive Science::170204 - Linguistic Processes (incl. Speech Production and Comprehension)
    47 - Language, communication and culture::4704 - Linguistics::470410 - Phonetics and speech science

    Related items

    Showing items related by title, author, creator and subject.

    • Tri-modal Speech: Audio-visual-tactile Integration in Speech Perception 

      Derrick, Donald; Hansmann, D.; Theys, C. (2019)
      Speech perception is a multi-sensory experience. Visual information enhances [Sumby and Pollack (1954). J. Acoust. Soc. Am. 25, 212–215] and interferes [McGurk and MacDonald (1976). Nature 264, 746–748] with speech perception. ...
    • Visual-Tactile Speech Perception and the Autism Quotient 

      Derrick, Donald; Bicevskis K; Gick B (Frontiers Media SA, 2019)
      Multisensory information is integrated asymmetrically in speech perception: an audio signal can follow video by 240 ms, but can precede video by only 60 ms, without disrupting the sense of synchronicity (Munhall et al., ...
    • Comparing virtual patients with synthesized and natural speech 

      Heitz, A.; Dünser, A.; Seaton, P.; Seaton, L.; Basu, A. (University of Canterbury. Human Interface Technology Laboratory; University of Canterbury. School of Health Sciences, 2012)
      Virtual Patient (VP) simulations are often designed to use pre-recorded speech in order to provide more realism and immersion. However, using actors for recording these utterances has certain downsides. It can add to the ...