EARS

 

According to the WHO’s World Report on Hearing (2021), over 1.5 billion people currently live with some degree of hearing loss, a figure projected to reach 2.5 billion by 2050. However, nearly all hearing aids on the market are designed for single-talker speech environments and perform poorly in multi-talker situations, where several people speak concurrently. This reduces speech comprehension and causes listening fatigue for hearing-impaired users. Our EEG-Assisted heaRing aidS (EARS) use EEG signals to decode selective attention from the listener's brain and use that information to guide the hearing aid's voice amplification in multi-talker environments, thus enhancing the hearing experience.
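As a rough illustration of the idea, the sketch below shows one common approach to EEG-based auditory attention decoding from the research literature, known as stimulus reconstruction: a pre-trained linear decoder maps EEG to an estimate of the attended speech envelope, the estimate is correlated with each talker's envelope, and the best-matching talker is amplified. All names, the single-lag linear decoder, and the gain values here are illustrative assumptions, not the team's actual algorithm.

import numpy as np

def decode_attention(eeg, envelopes, decoder):
    """Pick the attended talker via stimulus reconstruction.

    eeg:       (n_samples, n_channels) EEG segment
    envelopes: list of (n_samples,) speech envelopes, one per talker
    decoder:   (n_channels,) pre-trained linear backward model mapping
               EEG to an estimate of the attended speech envelope
               (real systems use time-lagged regression, e.g. an mTRF;
               a single lag is used here for brevity)
    """
    reconstructed = eeg @ decoder  # EEG-based envelope estimate
    # The talker whose envelope best correlates with the
    # reconstruction is taken to be the attended one.
    scores = [np.corrcoef(reconstructed, env)[0, 1] for env in envelopes]
    return int(np.argmax(scores))

def apply_gains(signals, attended_idx, boost_db=6.0, cut_db=-6.0):
    """Boost the attended talker's audio stream and attenuate the rest
    (gain values are illustrative, not the product's settings)."""
    return [s * 10 ** ((boost_db if i == attended_idx else cut_db) / 20)
            for i, s in enumerate(signals)]

# Toy usage with random data, purely to show the shapes involved.
rng = np.random.default_rng(0)
eeg = rng.standard_normal((1000, 32))                  # 1000 samples, 32 channels
envelopes = [rng.standard_normal(1000) for _ in range(2)]
audio = [rng.standard_normal(1000) for _ in range(2)]  # separated talker streams
attended = decode_attention(eeg, envelopes, decoder=rng.standard_normal(32))
output = apply_gains(audio, attended)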

 

Team member(s)

Mr Qing Hongbin* (PhD student, Department of Linguistics and Translation, City University of Hong Kong)
Dr Wang Qixuan (Shanghai Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine)
Mr Zhou Qian (Shanghai Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine)
Miss Wang Chengcheng (Alumna, Department of Linguistics and Translation, City University of Hong Kong)
Mr Liu Honghao (Research Assistant, Department of Linguistics and Translation, City University of Hong Kong)
Miss Ma Zhengwu (Alumna, Department of Linguistics and Translation, City University of Hong Kong)
Miss Wu Shuyi (Research Assistant, Department of Linguistics and Translation, City University of Hong Kong)

* Person-in-charge
(Info based on the team's application form)

 

Achievement(s)
  1. CityU HK Tech 300 Seed Fund (2023)
  2. CityU Lau Tat Chuen Outstanding Startup Idea Award