Visual-tactile integration in speech perception: Evidence for modality neutral speech primitives (2016)

Type of Content
Journal Article
Publisher
Acoustical Society of America (ASA)
ISSN
0001-4966, 1520-8524
Language
English
Abstract
Audio-visual [McGurk and MacDonald (1976). Nature 264, 746-748] and audio-tactile [Gick and Derrick (2009). Nature 462(7272), 502-504] speech stimuli enhance speech perception over audio stimuli alone. In addition, multimodal speech stimuli form an asymmetric window of integration that is consistent with the relative speeds of the various signals [Munhall, Gribble, Sacco, and Ward (1996). Percept. Psychophys. 58(3), 351-362; Gick, Ikegami, and Derrick (2010). J. Acoust. Soc. Am. 128(5), EL342-EL346]. In this experiment, participants were presented with video of faces producing /pa/ and /ba/ syllables, both alone and with air puffs occurring synchronously and at different timings up to 300 ms before and after the stop release. Perceivers were asked to identify the syllable they perceived, and were more likely to respond that they perceived /pa/ when air puffs were present, with an asymmetrical preference for puffs following the video signal, consistent with the relative speeds of visual and air puff signals. The results demonstrate that visual-tactile integration in speech perception occurs much as it does with audio-visual and audio-tactile stimuli. This finding contributes to the understanding of multimodal speech perception, lending support to the idea that speech is not perceived as an audio signal supplemented by information from other modes, but rather that the primitives of speech perception are, in principle, modality neutral.
Citation
Bicevskis, K., Derrick, D., & Gick, B. (2016). Visual-tactile integration in speech perception: Evidence for modality neutral speech primitives. Journal of the Acoustical Society of America, 140(5), 3531-3539.
ANZSRC Fields of Research
17 - Psychology and Cognitive Sciences::1702 - Cognitive Science::170204 - Linguistic Processes (incl. Speech Production and Comprehension)
47 - Language, communication and culture::4704 - Linguistics::470410 - Phonetics and speech science
Related items
- Tri-modal Speech: Audio-visual-tactile Integration in Speech Perception
  Derrick, Donald; Hansmann, D.; Theys, C. (2019) Speech perception is a multi-sensory experience. Visual information enhances [Sumby and Pollack (1954). J. Acoust. Soc. Am. 25, 212–215] and interferes [McGurk and MacDonald (1976). Nature 264, 746–748] with speech perception. ...
- Visual-Tactile Speech Perception and the Autism Quotient
  Derrick, Donald; Bicevskis, K.; Gick, B. (Frontiers Media SA, 2019) Multisensory information is integrated asymmetrically in speech perception: An audio signal can follow video by 240 ms, but can precede video by only 60 ms, without disrupting the sense of synchronicity (Munhall et al., ...
- Comparing virtual patients with synthesized and natural speech
  Heitz, A.; Dünser, A.; Seaton, P.; Seaton, L.; Basu, A. (University of Canterbury. Human Interface Technology Laboratory; University of Canterbury. School of Health Sciences, 2012) Virtual Patient (VP) simulations are often designed to use pre-recorded speech in order to provide more realism and immersion. However, using actors for recording these utterances has certain downsides. It can add to the ...