Aero-tactile integration in fricatives: Converting audio to air flow information for speech perception enhancement (2014)
Editors: Li H; Meng HM; Ma B; Chng E; Xie L
We follow up on research demonstrating that aero-tactile information can enhance or interfere with accurate auditory perception among uninformed and untrained perceivers [1, 2, 3]. We computationally extract aperiodic information from auditory recordings of speech, which represents turbulent air flow produced at the lips [4, 5]. This extracted signal is used to drive a piezoelectric air pump that directs air flow at the right temple simultaneously with the presentation of the auditory recordings. Using forced-choice experiments, we replicate previous results with stops, finding enhanced perception of /pa/ in /pa/ vs. /ba/ pairs, and /ta/ in /ta/ vs. /da/ pairs [1, 6, 2, 3]. We also found enhanced perception of /fa/ in /ba/ vs. /fa/ pairs, and /sha/ in /da/ vs. /sha/ pairs, demonstrating that air flow contacting the skin during fricative production can also enhance speech perception. The results show that aero-tactile information can be extracted from the audio signal and used to enhance perception of a large class of speech sounds found in many languages of the world.
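The abstract does not specify how the aperiodic component is extracted, so the following is only a minimal sketch of the general idea: isolate the high-frequency, turbulence-dominated band of the signal and compute a frame-wise amplitude envelope that could drive an air pump. The cutoff frequency, frame length, and function name are illustrative assumptions, not the authors' actual method or parameters.

```python
import numpy as np

def aperiodic_envelope(signal, sr, cutoff_hz=3000.0, frame_ms=10.0):
    """Sketch: high-pass the signal to favor turbulent (aperiodic) energy,
    then return a frame-wise RMS envelope. Cutoff and frame length are
    illustrative assumptions only."""
    # FFT-based high-pass filter: zero out bins below the cutoff.
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sr)
    spectrum[freqs < cutoff_hz] = 0.0
    highband = np.fft.irfft(spectrum, n=len(signal))
    # Frame-wise RMS envelope (could be mapped to pump drive voltage).
    hop = int(sr * frame_ms / 1000.0)
    n_frames = len(highband) // hop
    frames = highband[: n_frames * hop].reshape(n_frames, hop)
    return np.sqrt(np.mean(frames ** 2, axis=1))

# Synthetic demo: a 100 Hz "voiced" tone vs. broadband "frication-like" noise.
sr = 16000
t = np.arange(sr) / sr
voiced = np.sin(2 * np.pi * 100 * t)
noise = np.random.default_rng(0).standard_normal(sr)
env_voiced = aperiodic_envelope(voiced, sr)
env_noise = aperiodic_envelope(noise, sr)
print(env_noise.mean() > env_voiced.mean())
```

As expected under this crude scheme, broadband noise yields a much larger high-band envelope than a low-frequency periodic tone, which is the distinction the pump signal needs to capture.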
Citation: Derrick D, O'Beirne G, De Rybel T, Hay J (2014). Aero-tactile integration in fricatives: Converting audio to air flow information for speech perception enhancement. INTERSPEECH 2014, 15th Annual Conference of the International Speech Communication Association, Singapore, 14/09/2014-18/09/2014, 2580-2584.
Keywords: speech perception; aero-tactile integration; embodiment theory; audio perception enhancement
ANZSRC Fields of Research: 47 - Language, communication and culture :: 4704 - Linguistics
Related items (by title, author, creator and subject):
Aero-tactile integration during speech perception: Effect of response and stimulus characteristics on syllable identification. Derrick D; Madappallimattam J; Theys C (Acoustical Society of America (ASA), 2019). Integration of auditory and aero-tactile information during speech perception has been documented during two-way closed-choice syllable classification tasks [Gick and Derrick (2009). Nature 462, 502–504], but not during ...
Derrick D; Hansmann D; Haws Z; Theys C (2018). Behavioural audio-visual research has shown enhancement [1] and interference [2] in speech perception, as has behavioural audio-tactile research [3]. However, to date, we have not encountered any experimental behavioural research ...
Gick B; Ikegami Y; Derrick D (Acoustical Society of America (ASA), 2010). Asynchronous cross-modal information is integrated asymmetrically in audio-visual perception. To test whether this asymmetry generalizes across modalities, auditory (aspirated "pa" and unaspirated "ba" stops) and tactile ...