Learning effects in multimodal perception with real and simulated faces (2019)
Type of Content: Conference Contributions - Published
Publisher: Australian Speech Science and Technology Association Inc.
Editors: Calhoun S; Escudero P; Tabain M; Warren P
We have all learned to associate real voices with animated faces since childhood. Researchers exploit this association, employing virtual faces in audiovisual speech perception tasks. However, we do not know whether perceivers treat those virtual faces the same as real faces, or whether integration of speech cues from new virtual faces must instead be learned at the time of contact. We test this possibility using speech information that perceivers have never had a chance to associate with simulated faces: aerotactile somatosensation. With human faces, silent bilabial articulations (“ba” and “pa”) accompanied by synchronous cutaneous airflow shift perceptual bias towards “pa”. If visual-tactile integration is unaffected by the ecological origin of the visual stimuli, results with virtual faces should be similar. Contra previous reports, our results show that perceivers do treat computer-generated faces and human faces in a similar fashion: visually aligned cutaneous airflow shifts perceptual bias towards “pa” equally well with virtual and real faces.
Citation: Keough M, Derrick D, Taylor RC, Gick B (2019). Learning effects in multimodal perception with real and simulated faces. Melbourne: International Congress of Phonetic Sciences 2019. 05/08/2019-09/08/2019. Proceedings of the 19th International Congress of Phonetic Sciences, Melbourne, Australia 2019. 1189-1192.
This citation is automatically generated and may be unreliable. Use as a guide only.
Keywords: Speech Perception; Speech Acoustics; Multimodal Phonetics
ANZSRC Fields of Research: 47 - Language, communication and culture::4704 - Linguistics
Related items (showing items related by title, author, creator and subject):
Aero-tactile integration during speech perception: Effect of response and stimulus characteristics on syllable identification. Derrick D; Madappallimattam J; Theys C (Acoustical Society of America (ASA), 2019). Integration of auditory and aero-tactile information during speech perception has been documented during two-way closed-choice syllable classification tasks [Gick and Derrick (2009). Nature 462, 502–504], but not during ...
Visual-tactile integration in speech perception: Evidence for modality neutral speech primitives. Bicevskis K; Derrick D; Gick B (Acoustical Society of America (ASA), 2016). Audio-visual [McGurk and MacDonald (1976). Nature 264, 746-748] and audio-tactile [Gick and Derrick (2009). Nature 462(7272), 502-504] speech stimuli enhance speech perception over ...
Derrick D; Hansmann D; Haws Z; Theys C (2018). Behavioural audio-visual research has shown enhancement [1] and interference [2] in speech perception, as has behavioural audio-tactile research [3]. However, to date, we have not encountered any experimental behavioural research ...