Crossing the Uncanny Valley: Breakthrough in technology for lifelike facial expressions in androids
Even highly realistic androids can cause unease when their facial expressions lack emotional consistency. Traditionally, facial movements have been produced with a ‘patchwork method’ that stitches together pre-prepared action scenarios, which limits expressiveness in practice. A research team developed a new approach using superimposed ‘waveform movements’ to generate complex expressions in real time without unnatural transitions. Because the system reflects the android's internal state, it enhances emotional communication between robots and humans, potentially making androids feel more humanlike.
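The idea behind waveform movements can be sketched as overlaying smooth periodic components (e.g., breathing, blinking) whose amplitudes are scaled by an internal-state parameter, so actuator commands never jump abruptly the way switched pre-recorded scenarios can. The function names, frequencies, and the single `arousal` parameter below are illustrative assumptions for this sketch, not the researchers' actual implementation.

```python
import math

def waveform(t, freq, amp, phase=0.0):
    """One periodic component of a facial movement, ranging over [0, amp]."""
    return amp * 0.5 * (1.0 + math.sin(2.0 * math.pi * freq * t + phase))

def expression(t, arousal):
    """Blend waveform components; 'arousal' (0..1) is a hypothetical
    internal-state value that modulates each component's amplitude."""
    breathing = waveform(t, freq=0.25, amp=0.2 + 0.3 * arousal)
    blinking = waveform(t, freq=0.1, amp=1.0, phase=math.pi / 2)
    mouth = waveform(t, freq=0.05 + 0.5 * arousal, amp=0.4 * arousal)
    # Summing smooth waves keeps every actuator command continuous in time,
    # avoiding the visible seams of switching between canned scenarios.
    return {
        "eyelid": min(1.0, blinking),
        "chest": min(1.0, breathing),
        "mouth": min(1.0, mouth),
    }
```

Calling `expression(t, arousal)` at each control tick yields normalized actuator targets in [0, 1]; raising `arousal` smoothly increases the amplitude and tempo of the movements rather than swapping in a different animation.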
A team of researchers has developed a computer program that creates realistic videos reflecting the facial expressions and head movements of the person speaking, requiring only an audio clip and a face photo. DIverse yet Realistic Facial Animations, or DIRFA, is an artificial intelligence-based program that takes audio…
Cornell University researchers have developed two technologies that track a person's gaze and facial expressions through sonar-like sensing. The technology is small enough to fit on commercial smartglasses or virtual and augmented reality headsets, yet consumes significantly less power than comparable camera-based tools.
Large neural networks, a form of artificial intelligence, can generate thousands of jokes along the lines of "Why did the chicken cross the road?" But do they understand why they're funny?