The LPL is pleased to announce that the project "From lip- to script-reading: An integrative view of audio-visual associations in language processing" (AVA), submitted by LPL researcher Chotiga Pattamadilok, has been selected for funding in the latest ANR PRC call.
Early exposure to speech and to speakers' articulatory gestures forms the basis of language acquisition and is a hallmark of audiovisual association learning. Is this initial ability to associate speech sounds with visual input a precursor of children's later reading ability? Answering this question requires a good understanding of the cognitive and neural bases of both language abilities, and of whether they interact within the language system. The project will conduct studies comparing task performance and the spatio-temporal dynamics of brain activity associated with these abilities. At the theoretical level, the outcome should lead to a unified framework explaining how multimodal inputs jointly contribute to forming a coherent language representation. At the practical level, the proposed link between the developmental trajectories of "lip-reading" and "script-reading" should inform language learning and facilitate the early detection and remediation of reading deficits.
Laboratoire Parole et Langage, Aix-Marseille Univ. (coordinator)
Laboratoire d'Étude des Mécanismes Cognitifs, Univ. Lyon 2
Laboratoire de Psychologie et NeuroCognition, Univ. Grenoble Alpes
SFR Santé Lyon-Est, Univ. Lyon 1