Performing human-like motions that involve multiple contacts is challenging for robots. In this regard, a researcher has envisioned an interactive cyber-physical human (iCPH) platform with complementary humanoid (physical twin) and simulation (digital twin) elements. iCPH combines human measurement data, musculoskeletal analysis, and machine learning for data collection and augmentation. As a result, iCPH can understand, predict, and synthesize whole-body contact motions.