Many comparable systems project faces onto the front of a mask – following the same concept as cinema projection. "Walt Disney was a pioneer in this field back in the 1960s," explains Kuratate. "He made the installations in his Haunted Mansion by projecting the faces of grimacing actors onto busts." Whereas Walt Disney projected images from the front, the makers of Mask-bot use on-board rear projection to ensure a seamless face-to-face interaction.
This means that only twelve centimeters separate the face mask from the high-compression, x0.25 fish-eye lens fitted with a macro adapter. The CoTeSys team therefore had to ensure that an entire face could be beamed onto the mask at such a short distance. Mask-bot is also bright enough to function in daylight, thanks to a particularly powerful, compact projector and a coating of luminous paint sprayed onto the inside of the plastic mask. "You don't have to keep Mask-bot behind closed curtains," laughs Kuratate.
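The geometry behind that twelve-centimeter gap can be sketched with the standard throw-ratio relation (image width = throw distance / throw ratio). Only the 12 cm gap and the x0.25 wide-angle factor come from the text; the projector's native throw ratio below is an illustrative assumption.

```python
# Hedged sketch: how wide an image a short-throw setup can paint on the
# mask. Only GAP_CM and FISHEYE_FACTOR come from the article; the
# projector's native throw ratio is an assumed, typical value.

def image_width_cm(throw_distance_cm: float, throw_ratio: float) -> float:
    """Throw-ratio relation: image width = distance / throw ratio."""
    return throw_distance_cm / throw_ratio

NATIVE_THROW_RATIO = 1.4   # assumption: typical small-projector value
FISHEYE_FACTOR = 0.25      # x0.25 fish-eye converter (from the text)
GAP_CM = 12.0              # lens-to-mask distance (from the text)

effective_ratio = NATIVE_THROW_RATIO * FISHEYE_FACTOR
width = image_width_cm(GAP_CM, effective_ratio)
print(f"Projected width at {GAP_CM} cm: {width:.1f} cm")
```

Under these assumed numbers the x0.25 converter turns a projector that would normally need well over a meter of distance into one that covers a face-sized area from 12 cm away.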
This part of the new system could soon be deployed in video conferencing. "Usually, participants are shown on a screen. With Mask-bot, however, you can create a realistic replica of a person who actually sits and speaks with you at the conference table. You can use a generic male or female mask, or provide a custom-made mask for each person," explains Takaaki Kuratate.
To be used as a robot face, Mask-bot must be able to function without a live video feed of the person speaking. A new program already enables the system to convert an ordinary two-dimensional photograph into a correctly proportioned projection for the three-dimensional mask. Further algorithms supply the facial expressions and the voice.
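The conversion step described above amounts to pre-distorting the flat photo so that, after passing through the projector and landing on the curved mask, it looks correctly proportioned. A minimal sketch of that idea, assuming a simple pinhole-projector model and invented landmark names (the actual Mask-bot software is not public):

```python
# Hedged sketch of pre-distorting a 2-D photo for rear projection onto a
# 3-D mask. The pinhole model and all sample points are illustrative
# assumptions, not the actual Mask-bot algorithm.

def project_to_mask(point_3d, focal=1.0):
    """Pinhole model: map a 3-D mask-surface point (x, y, z), given in
    the projector's coordinate frame, to normalized projector coords."""
    x, y, z = point_3d
    return (focal * x / z, focal * y / z)

def build_warp_table(mask_points):
    """For each mask-surface sample carrying the photo texture
    coordinate (u, v) it should display, record which projector pixel
    must emit it. Rendering the photo through this table pre-distorts
    it so that it appears correctly proportioned on the curved mask."""
    return {uv: project_to_mask(xyz) for xyz, uv in mask_points}

# Three illustrative surface samples: ((x, y, z) in meters, (u, v) in photo)
samples = [
    ((0.00, 0.00, 0.12), (0.5, 0.5)),    # nose tip, photo center
    ((0.04, 0.05, 0.15), (0.8, 0.2)),    # right brow
    ((-0.04, -0.06, 0.14), (0.2, 0.9)),  # left cheek
]
table = build_warp_table(samples)
print(table[(0.5, 0.5)])  # → (0.0, 0.0): photo center lands on-axis
```

A real implementation would sample the whole mask surface densely and interpolate, but the core mapping from texture coordinate to projector pixel is the same.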
To replicate facial expressions, Takaaki Kuratate developed a talking head animation engine. The system filters an extensive set of facial motion data, collected from people with a motion-capture system, and selects the expressions that best match a specific speech sound, or phoneme, as it is spoken. From each of these expressions the computer extracts a set of facial coordinates, which it can then assign to any new face, bringing it to life. Emotion synthesis software adds the visible emotional nuances that indicate, for example, whether someone is happy, sad or angry.
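The two steps above, picking the best-matching expression per phoneme and transferring its coordinates to a new face, can be sketched as follows. The corpus, scoring, and landmark layout are all illustrative assumptions, not Kuratate's actual engine:

```python
# Hedged sketch of the phoneme-matching step: from a corpus of
# motion-capture frames, keep the facial-coordinate set that best
# represents each phoneme, then retarget it to a new face. All data
# and the fit scores below are invented for illustration.

def select_expressions(corpus):
    """corpus: list of (phoneme, landmark_coords, fit_score) tuples,
    where fit_score says how well that frame represents the phoneme.
    Returns the best-scoring landmark set for each phoneme."""
    best = {}
    for phoneme, coords, score in corpus:
        if phoneme not in best or score > best[phoneme][1]:
            best[phoneme] = (coords, score)
    return {p: coords for p, (coords, _) in best.items()}

def retarget(coords, neutral_src, neutral_dst):
    """Transfer an expression to a new face by applying the source
    face's displacements from its neutral pose onto the target's
    neutral pose (a simple delta-transfer assumption)."""
    return [(nd[0] + c[0] - ns[0], nd[1] + c[1] - ns[1])
            for c, ns, nd in zip(coords, neutral_src, neutral_dst)]

corpus = [
    ("aa", [(0.0, -0.30)], 0.7),   # open jaw, mediocre take
    ("aa", [(0.0, -0.35)], 0.9),   # open jaw, better take
    ("m",  [(0.0, -0.02)], 0.8),   # closed lips
]
best = select_expressions(corpus)
mouth = retarget(best["aa"], neutral_src=[(0.0, 0.0)],
                 neutral_dst=[(0.1, 0.05)])
print(mouth)  # target-face mouth landmark for phoneme "aa"
```

In a full system each phoneme would carry dozens of landmarks and transitions would be interpolated over time, but the selection-then-retargeting structure is the essence of what the paragraph describes.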
Science-fiction fans may be reminded of the robots in the 2004 film I, Robot, directed by Alex Proyas and based on Isaac Asimov's eponymous short-story collection – above all Sonny, the most expressive of those robots.