KAIST professor Lee Keon-jae (KAIST)
South Korean researchers have developed an artificial intelligence-based acoustic sensor that resembles the human cochlea, the Korea Advanced Institute of Science and Technology said Monday.
The research team, led by KAIST professors Lee Keon-jae and Wang Hee-seung, said they used an ultrathin piezoelectric membrane to mimic the basilar membrane in the cochlea, which acts as an analyzer that separates sounds by frequency.
Equipped with machine learning-based speaker recognition, the new acoustic sensor reduced the error rate by 75 percent compared to a conventional condenser-type acoustic sensor.
The research team expects the latest product to be applied in an array of AI-based systems that use voice user interfaces, including virtual assistants, smart home appliances, mobile devices, and other solutions using biometrics.
A prototype sensor developed by the research team was first introduced at the 2020 Consumer Electronics Show.
Fronics, a company founded by professor Lee, has also been collaborating with Silicon Valley-based companies to commercialize the sensor. Lee added that the company is looking to begin mass production of the acoustic sensor soon.
The research was funded by the National Research Foundation of Korea. The team’s report was published in the journal Science Advances last week.
By Shim Woo-hyun (email@example.com)