Robotics is no longer a ‘thing of the future.’ Artificial intelligence has been incorporated into our day-to-day lives, whether in simple consumer technologies or in actual factory robots; there is no denying the reality of machines.

However, they remain just that: machines. Made of alloys and metals, they look the part, and the ‘humane’ aspect is what remains fantasy. An android that looks and feels much like a human being is, to this day, still a thing of the imagination.

Patricia Xu, a Ph.D. student in the Organic Robotics Lab at Cornell University, has taken a step toward bringing humane qualities to robots with a new synthetic material that creates a linked sensory network similar to a biological nervous system. The network could enable soft robots to sense how they interact with their environment and adjust their actions accordingly.

“We want to have a way to measure stresses and strains for highly deformable objects, and we want to do it using the hardware itself, not vision,” said lab director Rob Shepherd, associate professor of mechanical and aerospace engineering and the paper’s senior author. “A good way to think about it is from a biological perspective. A blind person can still feel because they have sensors in their fingers that deform when their finger deforms. Robots don’t have that right now.”

Although this is still far from the goal of creating a fully feeling artificial intelligence, it is nonetheless a step in that direction.