An artificial intelligence model can make virtual avatars gesture naturally to match spoken words – possibly paving the way for AI-generated newsreaders or influencers that move more realistically as they speak.
When we talk, we gesture to help convey our meaning. But when video game characters or digital avatars attempt similar behaviour, they often make generic movements regardless of what they are actually saying. To make virtual figures gesture more realistically, researchers first had to teach an AI model the connection between speech and body language – and the emotions…