Audio-Visual-Tactile integration in speech perception.

dc.contributor.author: Derrick, Donald
dc.contributor.author: Hansmann, D.
dc.contributor.author: Haws, Z.
dc.contributor.author: Theys, C.
dc.date.accessioned: 2020-01-24T01:33:41Z
dc.date.available: 2020-01-24T01:33:41Z
dc.date.issued: 2018
dc.date.updated: 2019-07-01T07:47:02Z
dc.description.abstract: Behavioural audio-visual research has shown both enhancement [1] and interference [2] in speech perception, as has behavioural audio-tactile research [3]. However, to date, we have not encountered any experimental behavioural research into tri-modal integration in speech perception (but see Alcorn [4] for clinical techniques incorporating both vision and touch). Based on the relative influence of visual and aero-tactile stimuli, we expect cumulative effects of both, with the most influence from auditory information, then visual information [1], and lastly airflow [3]. Here we present a two-way forced-choice study of tri-modal integration in speech perception, showing the effects of both congruent and incongruent stimuli on accurate identification of auditory speech-in-noise.
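As an illustration only (this is not the authors' analysis code), a minimal sketch of how responses from a two-way forced-choice speech-in-noise task like the one described might be scored: proportion correct per stimulus condition, plus a signal-detection d′ sensitivity estimate. The trial records, condition labels, and rates below are hypothetical placeholders.

```python
# Hypothetical scoring sketch for a two-way forced-choice task.
# All trial data and condition names are invented for illustration.
from statistics import NormalDist

# Each trial: (condition, presented syllable, listener's response)
trials = [
    ("congruent AVT", "pa", "pa"),
    ("congruent AVT", "ba", "ba"),
    ("incongruent AVT", "pa", "ba"),
    ("audio only", "ba", "ba"),
]

def accuracy_by_condition(trials):
    """Return {condition: proportion of correct identifications}."""
    totals, correct = {}, {}
    for condition, presented, response in trials:
        totals[condition] = totals.get(condition, 0) + 1
        if presented == response:
            correct[condition] = correct.get(condition, 0) + 1
    return {c: correct.get(c, 0) / n for c, n in totals.items()}

def d_prime(hit_rate, false_alarm_rate):
    """Sensitivity d' = z(hit rate) - z(false-alarm rate)."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(false_alarm_rate)

print(accuracy_by_condition(trials))
print(d_prime(0.85, 0.20))  # hypothetical hit/false-alarm rates
```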
dc.identifier.citation: Derrick D, Hansmann D, Haws Z, Theys C (2018). Audio-Visual-Tactile integration in speech perception. Lisbon, Portugal: LabPhon16, 19/06/2018.
dc.identifier.uri: http://hdl.handle.net/10092/17895
dc.language.iso: en
dc.subject.anzsrc: Fields of Research::47 - Language, communication and culture::4704 - Linguistics::470410 - Phonetics and speech science
dc.title: Audio-Visual-Tactile integration in speech perception.
dc.type: Conference Contributions - Other
Files
Original bundle
Name: Audio-Visual-Tactile integration in Speech Perception_V2.pdf
Size: 248.24 KB
Format: Adobe Portable Document Format
Description: Accepted version