Researchers give robots a sense of touch by 'listening' to vibrations, allowing them to identify materials, understand shapes and recognize objects much as human hands do. The ability to interpret the world through acoustic vibrations emanating from an object, like shaking a cup to hear how much soda is left or tapping on a desk to check whether it's made of real wood, is something humans do without thinking. And it's an ability that researchers are on the cusp of bringing to robots to augment their rapidly growing set of sensing abilities.
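As a rough illustration of the idea, and not the researchers' actual system, the following sketch guesses a material by comparing the frequency spectrum of a tap-induced vibration against reference spectra. The sample rate, reference materials, and toy signals are all assumptions made up for demonstration.

```python
import numpy as np

# Hypothetical sketch: identify a material from the spectrum of a tap-induced
# vibration. Reference spectra and sample rate are invented for illustration.

SAMPLE_RATE = 16_000  # Hz, assumed contact-microphone sampling rate


def vibration_spectrum(signal: np.ndarray) -> np.ndarray:
    """Return a normalized magnitude spectrum of a recorded vibration."""
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    return spectrum / (np.linalg.norm(spectrum) + 1e-12)


def identify_material(signal: np.ndarray, references: dict) -> str:
    """Match a tap response against reference spectra by cosine similarity."""
    query = vibration_spectrum(signal)
    return max(references, key=lambda name: float(query @ references[name]))


if __name__ == "__main__":
    t = np.linspace(0, 0.1, int(SAMPLE_RATE * 0.1), endpoint=False)
    decay = np.exp(-40 * t)
    # Toy "tap" responses: wood rings at a lower dominant frequency than metal.
    references = {
        "wood": vibration_spectrum(np.sin(2 * np.pi * 400 * t) * decay),
        "metal": vibration_spectrum(np.sin(2 * np.pi * 2000 * t) * decay),
    }
    tap = np.sin(2 * np.pi * 395 * t) * decay + 0.05 * np.random.randn(t.size)
    print(identify_material(tap, references))  # expected output: "wood"
```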
Researchers have developed a novel framework named WildFusion that fuses vision, vibration and touch to enable robots to 'sense' and navigate complex outdoor environments much like humans do.
Researchers have developed an L3 F-TOUCH sensor to enhance tactile capabilities in robots, allowing them to 'feel' objects and adjust their grip accordingly.
Exploring a new way to teach robots, Princeton researchers have found that human-language descriptions of tools can accelerate learning for a simulated robotic arm as it lifts and uses a variety of tools.