UC Research Repository

Visual-tactile integration in speech perception: Evidence for modality neutral speech primitives

Authors
    Bicevskis K
    Derrick D
    Gick B
    Date
    2016
    Permanent Link
    http://hdl.handle.net/10092/17682

© 2016 Acoustical Society of America. Audio-visual [McGurk and MacDonald (1976). Nature 264, 746-748] and audio-tactile [Gick and Derrick (2009). Nature 462(7272), 502-504] speech stimuli enhance speech perception over audio stimuli alone. In addition, multimodal speech stimuli form an asymmetric window of integration that is consistent with the relative speeds of the various signals [Munhall, Gribble, Sacco, and Ward (1996). Percept. Psychophys. 58(3), 351-362; Gick, Ikegami, and Derrick (2010). J. Acoust. Soc. Am. 128(5), EL342-EL346]. In this experiment, participants were presented with video of faces producing /pa/ and /ba/ syllables, both alone and with air puffs occurring synchronously and at different timings up to 300 ms before and after the stop release. Perceivers were asked to identify the syllable they perceived, and were more likely to respond that they perceived /pa/ when air puffs were present, with an asymmetrical preference for puffs following the video signal, consistent with the relative speeds of the visual and air puff signals. The results demonstrate that visual-tactile integration in speech perception occurs much as it does with audio-visual and audio-tactile stimuli. This finding contributes to the understanding of multimodal speech perception, lending support to the idea that speech is not perceived as an audio signal supplemented by information from other modes, but rather that the primitives of speech perception are, in principle, modality neutral.
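The timing manipulation described above (air puffs from 300 ms before to 300 ms after the visual stop release) lends itself to a simple tabulation. The sketch below is a minimal, hypothetical Python example, not the authors' analysis code: it computes the proportion of /pa/ responses at each puff-video offset and compares puff-leading with puff-following trials. All trial values are invented placeholders for illustration only.

    # Hypothetical sketch of tabulating /pa/ responses by air-puff timing offset.
    # This is not the study's analysis code; all data below are invented.
    from collections import defaultdict

    # (offset_ms, response): offset < 0 means the air puff preceded the visual
    # stop release, offset > 0 means it followed it.
    trials = [
        (-300, "ba"), (-300, "ba"), (-100, "pa"), (-100, "ba"),
        (0, "pa"), (0, "pa"), (100, "pa"), (100, "pa"),
        (300, "pa"), (300, "ba"),
    ]  # placeholder trials for illustration only

    counts = defaultdict(lambda: [0, 0])  # offset -> [n_pa, n_total]
    for offset, response in trials:
        counts[offset][1] += 1
        if response == "pa":
            counts[offset][0] += 1

    print("offset_ms  P(/pa/ response)")
    for offset in sorted(counts):
        n_pa, n_total = counts[offset]
        print(f"{offset:>9}  {n_pa / n_total:.2f}")

    # Crude asymmetry check: mean /pa/ rate when the puff follows the video
    # versus when it leads it. The abstract reports an asymmetrical preference
    # for puffs that follow the video signal.
    def pa_rate(pred):
        selected = [r for o, r in trials if pred(o)]
        return sum(r == "pa" for r in selected) / len(selected)

    print("puff leads video  :", round(pa_rate(lambda o: o < 0), 2))
    print("puff follows video:", round(pa_rate(lambda o: o > 0), 2))

In a real analysis one would fit the response proportions across offsets (for example with a logistic model) to estimate the width and asymmetry of the integration window, but the tabulation above captures the basic comparison the abstract describes.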

    Subjects
    Field of Research::17 - Psychology and Cognitive Sciences::1702 - Cognitive Science::170204 - Linguistic Processes (incl. Speech Production and Comprehension)
     
    Field of Research::20 - Language, Communication and Culture::2004 - Linguistics::200404 - Laboratory Phonetics and Speech Science
    Collections
    • Science: Journal Articles [912]
