Audio-Visual-Tactile integration in speech perception. (2018)
Authors: Derrick D, Hansmann D, Haws Z, Theys C
Behavioural audio-visual research has shown enhancement [1] and interference [2] in speech perception, as has behavioural audio-tactile research [3]. However, to date, we have not encountered any experimental behavioural research into tri-modal integration in speech perception. (But see Alcorn [4] for clinical techniques incorporating both vision and touch.) Based on the relative influence of visual and aero-tactile stimuli, we expect cumulative effects of both, with the most influence from auditory information, then visual information [1], and lastly airflow [3]. Here we present a two-way forced-choice study of tri-modal integration in speech perception, showing the effects of both congruent and incongruent stimuli on accurate identification of auditory speech-in-noise.
Citation: Derrick D, Hansmann D, Haws Z, Theys C (2018). Audio-Visual-Tactile integration in speech perception. LabPhon16, Lisbon, Portugal. 19/06/2018.