LING/COGS Colloquium: Audio-visual alignment in speech perception: Long- and short-term experience

Abstract

It is well established that, in audio-visual speech, visual information facilitates speech perception. In natural speech, articulation initiates the acoustic signal, such that visible articulation may be predictive of the auditory speech that follows. Sensitivity to audio-visual alignment nevertheless varies extensively within and across populations. In a series of behavioral and EEG experiments, we consider how individual differences in audio-visual synchrony sensitivity in speech perception may arise through experience, drawing on diverse long-term experience as well as new findings on short-term experience (i.e., fast recalibration). In the course of carrying out this research, methodological challenges have arisen. These challenges have also offered long- and short-term experience, in this case to our humbled research group. In retrospect, these issues have given insight into both the sensitivity and the adaptability of audio-visual synchrony perception, and they will be taken up in the course of presenting the research itself.