Augmented reality at the service of deaf people

We are pleased to announce that Brigitte Bigi, CNRS researcher at the LPL, has been awarded funding for the project "Seeing sounds with automated 'Cued Speech': augmented reality at the service of deaf people".

The project, funded to the tune of €50,000 by the International Foundation for Applied Research on Disability (FIRAH), is being developed in partnership with the Datha Association (parents of deaf children and friends of deaf people) and the International Academy Supporting Cued Adaptations (AISAC).

The LPL project team includes the following members: Brigitte Bigi (project manager), Núria Gala, Michel Pitermann and Carine André.

More information:
Elaboration of the first LPL Cued Speech corpus (2021):
Link to the Cued Speech corpus (CLeLfPC):

Credits: 2021 B. Bigi and M. Zimmermann

Corpus release about French Cued Speech

Brigitte Bigi, CNRS researcher at the LPL, has just deposited a corpus recorded in August 2021 during a training course organized by the French National Cued Speech Association (ALPC):

Brigitte Bigi, Maryvonne Zimmermann (2021). CLeLfPC [Corpus]. ORTOLANG (Open Resources and TOols for LANGuage), v1.

Produced in collaboration with Maryvonne Zimmermann (ALPC-Datha) and Carine André (LPL-CNRS), the corpus contains audio/video recordings and annotations of speech read aloud and simultaneously coded in French Cued Speech (LfPC). LfPC coding consists of hand movements that accompany speech; it is intended to facilitate lip reading for deaf people by making phonemes easier to distinguish, and thus to give them access to spoken French.

Link to the corpus CLeLfPC:

Credits: 2021 B. Bigi and M. Zimmermann