Listening to or watching each other speak

Marc Sato, CNRS researcher at LPL, has just published an article in the journal Cortex on the distinct influences of motor and visual predictive processes on auditory cortical processing during speech production and perception.

Reference: Marc Sato. Motor and visual influences on auditory neural processing during speaking and listening. Cortex, 2022, 152, 21-35 (https://doi.org/10.1016/j.cortex.2022.03.013)

The full text of the article is available via this direct link or through the AMU search interface.


Photo credit: Antoine Doinel

How does the brain process visual information associated with speech sounds?

We are pleased to announce the publication of the latest article by Chotiga Pattamadilok and Marc Sato, CNRS researchers at LPL, entitled “How are visemes and graphemes integrated with speech sounds during spoken word recognition? ERP evidence for supra-additive responses during audiovisual compared to auditory speech processing” in the journal Brain and Language.

Reference:
Chotiga Pattamadilok, Marc Sato. How are visemes and graphemes integrated with speech sounds during spoken word recognition? ERP evidence for supra-additive responses during audiovisual compared to auditory speech processing. Brain and Language, 2022, 225, 105058 (https://doi.org/10.1016/j.bandl.2021.105058)

Full text on open science database HAL: https://hal.archives-ouvertes.fr/hal-03472191v2

Contact: chotiga.pattamadilok@lpl-aix.fr

Can we predict what is happening in the brain while we are speaking?

Youssef Hmamouche (LPL post-doc) and Laurent Prévot (AMU professor and director of the LPL) - in collaboration with Magalie Ochs (LIS) and Thierry Chaminade (INS) - have just published an article about the BrainPredict tool, which aims to predict and visualize brain activity during human-human or human-robot conversations. The first experiments were carried out with 24 adult participants engaging in natural conversations lasting approximately 30 minutes. These first promising results open the way for future studies integrating, for example, other sociolinguistic parameters or aspects linked to certain language pathologies.

European project COBRA – Call for 15 PhD projects now open!

As part of the European COBRA project, a call for applications is open for 15 doctoral contracts. Applications must be submitted before March 31, 2020 via the website http://conversationalbrains.eu.

COBRA (Conversational Brains) is a project carried out within the framework of the European Marie Skłodowska-Curie Innovative Training Networks program. It brings together 14 partners in 10 countries (France, Great Britain, Italy, Slovakia, Belgium, Germany, Sweden, the Netherlands, Finland, Hong Kong), including 10 academic partners and 4 industrial partners. COBRA is a continuation of the European MULTI project previously carried out by the LPL and is closely linked to the ILCB Institute. It aims to develop research and advanced training on the relationship between the brain and language in human-human and human-machine conversational interactions, across a wide variety of languages. COBRA is coordinated by Noël Nguyen.