A new approach to improve robot navigation in crowded environments
While robots have become increasingly capable in recent years, most still cannot reliably navigate very crowded spaces, such as public areas or roads in urban environments. To be deployed at large scale in the smart cities of the future, however, robots will need to navigate these environments both reliably and safely, without colliding with humans or nearby objects.
To move through their surroundings, robots rely on algorithms that process data collected by sensors and cameras and plan future actions accordingly.
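To make that sense-plan-act loop concrete, here is a minimal sketch of one planning step: the robot heads toward its goal but is pushed away from any sensed obstacle that comes within a safety radius. This is a generic potential-field-style illustration, not the method developed by the researchers; the function name and parameters (`plan_step`, `safe_dist`) are hypothetical.

```python
import math

def plan_step(robot_pos, goal, obstacles, safe_dist=1.0):
    """One toy planning step (hypothetical API): return a unit heading
    that points toward the goal while steering clear of nearby obstacles."""
    # Attractive component: unit vector from the robot toward the goal.
    gx, gy = goal[0] - robot_pos[0], goal[1] - robot_pos[1]
    norm = math.hypot(gx, gy) or 1.0
    vx, vy = gx / norm, gy / norm
    # Repulsive component: each obstacle inside safe_dist pushes the
    # robot away, with a weight that grows as the obstacle gets closer.
    for ox, oy in obstacles:
        dx, dy = robot_pos[0] - ox, robot_pos[1] - oy
        d = math.hypot(dx, dy)
        if 0 < d < safe_dist:
            w = (safe_dist - d) / safe_dist
            vx += w * dx / d
            vy += w * dy / d
    # Normalize the combined direction into a unit heading.
    norm = math.hypot(vx, vy) or 1.0
    return vx / norm, vy / norm
```

With no obstacles nearby, the step heads straight for the goal; an obstacle just above the robot's path deflects the heading downward. Real navigation stacks add velocity limits, prediction of moving agents, and global path planning on top of such a local step.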