Ever since Leonardo da Vinci sketched his mechanical knight around 1495, often cited as one of the earliest robot designs (the word 'robot' itself would not be coined until centuries later), these machines have steadily taken over everyday mundane tasks. Fast forward to today, and robots come in different forms, sizes, and, increasingly, hearing abilities!

The likes of Siri and Alexa, for example, are testaments to the great advances robotics and voice technology have made toward sounding human-like. But when it comes to a robot's ability to hear and respond the way humans do in conversation, the picture is a little different.

Siri, Alexa, and Google's voice assistant are essentially pre-programmed 'fill-in-the-blanks' voice systems, wired to match what a speaker says against patterns they already know and return scripted answers. In short, they are not really hearing the speaker and, to an extent, not really talking to them either.

Researchers at Carnegie Mellon University's (CMU) Robotics Institute argue that robot perception can be made more reliable by adding another sense to the mix: hearing.

To study this, the researchers attached a square tray to the arm of a Sawyer robot, creating an apparatus known as Tilt-Bot. By tilting the tray, the robot slid objects around inside it and captured each interaction, including the sounds the objects made as they moved and struck the tray's walls.
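To make the idea concrete, here is a minimal Python sketch (not the CMU team's code) of the kind of learning problem such recordings enable: summarize each interaction's audio as a spectrogram-based feature vector, then train a simple classifier to guess which object made the sound. The file names, labels, and feature pipeline below are illustrative assumptions.

```python
# Illustrative sketch: identify an object from the sound of its tray interaction.
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

def audio_features(wav_path):
    """Load one recorded interaction and summarize it as a fixed-length vector."""
    rate, samples = wavfile.read(wav_path)
    if samples.ndim > 1:                      # mix stereo down to mono
        samples = samples.mean(axis=1)
    _, _, spec = spectrogram(samples, fs=rate, nperseg=1024)
    log_spec = np.log1p(spec)
    # Average each frequency band over time -> one value per band.
    return log_spec.mean(axis=1)

# Hypothetical dataset: (path to interaction audio, object label) pairs.
clips = [
    ("interactions/clip_0001.wav", "metal_cup"),
    ("interactions/clip_0002.wav", "wooden_block"),
    # ... one entry per recorded tray interaction ...
]

X = np.stack([audio_features(path) for path, _ in clips])
y = [label for _, label in clips]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```

The point of the sketch is simply that sound carries enough information about an object's material and motion for a classifier to pick up on, which is the intuition behind giving robots hearing in the first place.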
