Tri-modal Speech: Audio-visual-tactile Integration in Speech Perception (2019)
Type of Content: Journal Article
Speech perception is a multi-sensory experience. Visual information enhances [Sumby and Pollack (1954). J. Acoust. Soc. Am. 25, 212–215] and interferes [McGurk and MacDonald (1976). Nature 264, 746–748] with speech perception. Similarly, tactile information, transmitted by puffs of air arriving at the skin and aligned with speech audio, alters [Gick and Derrick (2009). Nature 462, 502–504] auditory speech perception in noise. It has also been shown that aero-tactile information influences visual speech perception when an auditory signal is absent [Derrick, Bicevskis, and Gick (2019a). Front. Commun. Lang. Sci. 3(61), 1–11]. However, researchers have not yet identified the combined influence of aero-tactile, visual, and auditory information on speech perception. The effects of matching and mismatching visual and tactile speech on two-way forced-choice auditory syllable-in-noise classification tasks were tested. The results showed that both visual and tactile information altered the signal-to-noise ratio (SNR) threshold for accurate identification of auditory signals. Consistent with previous studies, the visual component had a strong influence on auditory syllable-in-noise identification, as evidenced by a 28.04 dB improvement in SNR between matching and mismatching visual stimulus presentations. In comparison, the tactile component had a small influence, resulting in a 1.58 dB SNR match-mismatch range. The effects of both the audio and tactile information were shown to be additive.
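The additivity finding can be illustrated with the two match-mismatch ranges reported above. This is a minimal arithmetic sketch, not the authors' analysis code: the `combined_shift_db` helper and the framing of the joint benefit as a simple sum are assumptions made here to unpack what "additive" means for the reported dB values.

```python
# SNR match-mismatch ranges reported in the abstract (in dB).
VISUAL_RANGE_DB = 28.04   # matching vs. mismatching visual stimuli
TACTILE_RANGE_DB = 1.58   # matching vs. mismatching tactile stimuli

def combined_shift_db(visual_db: float, tactile_db: float) -> float:
    """Under an additive-effects reading, the joint visual + tactile
    match-mismatch range is the sum of the individual ranges (in dB).
    This helper is illustrative, not from the paper."""
    return visual_db + tactile_db

print(f"{combined_shift_db(VISUAL_RANGE_DB, TACTILE_RANGE_DB):.2f} dB")  # prints "29.62 dB"
```

The sketch also makes the relative magnitudes concrete: the tactile contribution is roughly 5% of the visual one, yet under additivity it still shifts the combined threshold.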
Citation: Donald Derrick, Doreen Hansmann, Catherine Theys. (2019) Tri-modal speech: Audio-visual-tactile integration in speech perception. The Journal of the Acoustical Society of America 146:5, 3495–3504.
ANZSRC Fields of Research: 17 - Psychology and Cognitive Sciences :: 1702 - Cognitive Science :: 170204 - Linguistic Processes (incl. Speech Production and Comprehension)
Related items (by title, author, creator and subject):
Visual-tactile integration in speech perception: evidence for modality neutral speech primitives. Bicevskis K; Derrick D; Gick B (Acoustical Society of America (ASA), 2016). © 2016 Acoustical Society of America. Audio-visual [McGurk and MacDonald (1976). Nature 264, 746–748] and audio-tactile [Gick and Derrick (2009). Nature 462(7272), 502–504] speech stimuli enhance speech perception over ...
Derrick D; Bicevskis K; Gick B (Frontiers Media SA, 2019). Multisensory information is integrated asymmetrically in speech perception: An audio signal can follow video by 240 ms, but can precede video by only 60 ms, without disrupting the sense of synchronicity (Munhall et al., ...
Speech-language pathology student participation in verbal reflective practice groups: perceptions of development, value and group condition differences. Tillard G; Cook KJ; Gerhard D; Keast L; McAuliffe M (2018). In summary, there is widespread support for the use of group discussion in reflective practice (Caty et al., 2015). The interaction with peers allows for the exchange and comparison of beliefs and behaviours, perspectives ...