Audio-Visual-Tactile integration in speech perception.

Type of content
Conference Contributions - Other
Date
2018
Authors
Derrick, Donald
Hansmann, D.
Haws, Z.
Theys, C.
Abstract

Behavioural audio-visual research has shown both enhancement [1] and interference [2] in speech perception, as has behavioural audio-tactile research [3]. However, to date, we have not encountered any experimental behavioural research into tri-modal integration in speech perception (but see Alcorn [4] for clinical techniques incorporating both vision and touch). Based on the relative influence of visual and aero-tactile stimuli, we expect cumulative effects of both, with the greatest influence from auditory information, then visual information [1], and lastly airflow [3]. Here we present a two-way forced-choice study of tri-modal integration in speech perception, showing the effects of both congruent and incongruent stimuli on accurate identification of auditory speech-in-noise.

Citation
Derrick D, Hansmann D, Haws Z, Theys C (2018). Audio-Visual-Tactile integration in speech perception. Lisbon, Portugal: LabPhon16, 19/06/2018.
ANZSRC fields of research
Fields of Research::47 - Language, communication and culture::4704 - Linguistics::470410 - Phonetics and speech science