Listening to disordered speech results in early modulations of auditory event-related potentials (2017)
In the last decade, research on motor speech disorders has increasingly taken into account the bidirectionality between speaker and listener. Listening to disordered speech (e.g., dysarthria) may result in substantial phonemic uncertainty. In turn, a larger set of potential word target candidates may be activated, contributing to intelligibility deficits. A combination of bottom-up and top-down processes is thought to resolve this uncertainty (Liss, 2007). The goal of the present study was to investigate the contribution of these processes by analysing listeners' neurophysiological responses to dysarthric speech. Thirty healthy native English speakers (12 males, 18-44 years) participated in a speech perception experiment while undergoing 32-channel EEG recording. Following Obleser and Kotz (2011), we focused on the auditory N100 as a marker of early sensory processing and on the N400-like peak as a marker of later cognitive-linguistic processing. Participants listened to 55 sentences of moderate hypokinetic dysarthric speech and 55 control sentences. The experiment was repeated one week later to investigate the effects of repeated exposure to disordered speech. The amplitudes and latencies of the event-related potentials over Cz were analysed. A repeated-measures GLM on the N100, with speech type (dysarthria vs. control) and test session as independent variables, showed a main effect of speech type: amplitude increased (F_ampl(28) = 12.18, p < .01) and latency decreased (F_lat(28) = 6.77, p = .02) when listening to dysarthric versus control speech. There was no significant main effect of test session and no interaction effect. In contrast, no significant effects of speech type or test session were observed on the amplitude of the N400-like peak.
For N400 latency, only a significant interaction effect was present (F_lat(28) = 4.16, p = .05): latency was shorter for dysarthric sentences during the first test session and longer during the second. The N100 results show that the quality of the auditory signal in naturally degraded dysarthric speech influences early sensory auditory processing, indicating an increase in the initial allocation of neurophysiological resources (Obleser & Kotz, 2011). The N400 latency results show that later, more cognitive-linguistic processes are influenced not only by the degradation of the signal itself but also by the amount of exposure to that signal, a finding consistent with previous behavioural research on dysarthric speech (Borrie et al., 2012).
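The two-factor repeated-measures design described above (speech type x test session, within subjects) can be sketched in code. This is a hypothetical illustration on simulated N100 amplitude data, not the authors' actual data or analysis scripts; `AnovaRM` from statsmodels is used as a stand-in for the repeated-measures GLM reported in the abstract, and all variable names and effect sizes are invented.

```python
# Hypothetical sketch of a 2x2 repeated-measures analysis
# (speech type x test session) on simulated N100 amplitude data.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
n_subjects = 30  # 30 listeners, as in the study

rows = []
for subj in range(n_subjects):
    for speech in ("dysarthric", "control"):
        for session in ("session1", "session2"):
            # Simulated mean N100 amplitude at Cz (arbitrary units);
            # dysarthric speech is given a larger amplitude to mimic
            # a main effect of speech type.
            base = -4.0 if speech == "dysarthric" else -3.0
            rows.append({
                "subject": subj,
                "speech_type": speech,
                "session": session,
                "n100_amp": base + rng.normal(0, 0.8),
            })

df = pd.DataFrame(rows)  # long format: one row per subject x condition
res = AnovaRM(df, depvar="n100_amp", subject="subject",
              within=["speech_type", "session"]).fit()
print(res)  # F tests for speech_type, session, and their interaction
```

With real data, each row would instead hold a subject's mean ERP amplitude (or latency) in that condition; the same call then yields the main effects and interaction reported in the abstract.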
Citation: Theys C, McAuliffe MJ (2017). Listening to disordered speech results in early modulations of auditory event-related potentials. 7th International Conference on Speech Motor Control, Groningen, The Netherlands, 5/7/2017-8/7/2017. Stem-, Spraak- en Taalpathologie.
ANZSRC Fields of Research:
47 - Language, communication and culture :: 4704 - Linguistics :: 470410 - Phonetics and speech science
17 - Psychology and Cognitive Sciences :: 1702 - Cognitive Science :: 170204 - Linguistic Processes (incl. Speech Production and Comprehension)