Audio-Visual-Tactile integration in speech perception.
dc.contributor.author | Derrick, Donald | |
dc.contributor.author | Hansmann D | |
dc.contributor.author | Haws Z | |
dc.contributor.author | Theys C | |
dc.date.accessioned | 2020-01-24T01:33:41Z | |
dc.date.available | 2020-01-24T01:33:41Z | |
dc.date.issued | 2018 | en |
dc.date.updated | 2019-07-01T07:47:02Z | |
dc.description.abstract | Behavioural audio-visual research has shown enhancement[1] and interference[2] in speech perception, as has behavioural audio-tactile research[3]. However, to date, we have not encountered any experimental behavioural research into tri-modal integration in speech perception. (But see Alcorn[4] for clinical techniques incorporating both vision and touch.) Based on the relative influence of visual and aero-tactile stimuli, we expect cumulative effects of both, with the most influence from auditory information, then visual information[1], and lastly airflow[3]. Here we present a two-way forced-choice study of tri-modal integration in speech perception, showing the effects of both congruent and incongruent stimuli on accurate identification of auditory speech-in-noise. | en |
dc.identifier.citation | Derrick D, Hansmann D, Haws Z, Theys C (2018). Audio-Visual-Tactile integration in speech perception. Lisbon, Portugal: LabPhon16. 19/06/2018. | en |
dc.identifier.uri | http://hdl.handle.net/10092/17895 | |
dc.language.iso | en | |
dc.subject.anzsrc | Fields of Research::47 - Language, communication and culture::4704 - Linguistics::470410 - Phonetics and speech science | en |
dc.title | Audio-Visual-Tactile integration in speech perception. | en |
dc.type | Conference Contributions - Other | en |
Files
Original bundle
- Name: Audio-Visual-Tactile integration in Speech Perception_V2.pdf
- Size: 248.24 KB
- Format: Adobe Portable Document Format
- Description: Accepted version