A Keio University group led by Associate Professor Yasue Mitsukura has developed a system for tracking rapid changes in facial expression and head position using just an ordinary computer's USB camera.
The system uses time-series signal processing to find the position of the eyes, nose and mouth in real time with remarkable precision.
(Real-Time Avatars Mirror Your Expression)
"We think this system could be used by CG animation hobbyists, in Web dialog systems that show a character instead of the person's face, and for making characters move in real time at events. Because the system uses just one PC and one camera, it can be applied in many situations very easily."
"We're using an algorithm that gets updated in line with the motion of the face. So it can track the face very fast, with very high precision. That's the basic technology for this avatar system."
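The quote describes an algorithm whose search is updated in line with the face's motion. As a rough illustration of that idea (not Mitsukura's actual method, whose details aren't given here), the sketch below tracks a single 1D feature coordinate with a constant-velocity prediction and a search window that widens with the estimated speed, so fast motion stays inside the window. All names and numbers are hypothetical.

```python
# Hypothetical sketch of motion-adaptive tracking: the search window for the
# next frame is re-centered on a constant-velocity prediction and widened in
# proportion to the observed speed, so rapid head motion stays in the window.
# This is illustrative only, not the Keio group's algorithm.

def track(positions, base_radius=2):
    """Track a 1D feature (e.g. an eye's x-coordinate) across frames.

    `positions` holds the true per-frame position. A real system would
    search image pixels; here "searching" just means checking whether the
    true position falls inside the predicted window.
    """
    estimates = [positions[0]]      # assume the first frame is detected
    velocity = 0
    for true_pos in positions[1:]:
        predicted = estimates[-1] + velocity     # constant-velocity prediction
        radius = base_radius + abs(velocity)     # widen window with speed
        if abs(true_pos - predicted) <= radius:  # feature found in window
            estimates.append(true_pos)
        else:                                    # lost: keep the prediction
            estimates.append(predicted)
        velocity = estimates[-1] - estimates[-2] # update motion estimate
    return estimates

# A feature accelerating across the frame is still tracked because the
# window grows with the estimated velocity.
truth = [10, 12, 15, 19, 24, 30]
print(track(truth))  # → [10, 12, 15, 19, 24, 30]
```

With a fixed-size window, the same accelerating feature would fall outside the search region after a few frames; adapting the window to the motion is what keeps tracking both fast and precise.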
This system, which apparently can be implemented inexpensively, would enable far more realistic avatars in virtual worlds like Second Life or in gaming systems.
Although the ancient Sanskrit word "avatar" had been used in multi-user domains years earlier, many science fiction readers recognize it from Neal Stephenson's 1992 novel Snow Crash:
As Hiro approaches the street, he sees two couples probably using their parents' computer for a double date in the Metaverse. He's not seeing real people, of course. It's all part of a moving illustration created by his computer from specifications coming down the fiber optic cable. These people are pieces of software called avatars.
(Read more about avatar)
Via DigInfo; thanks to BajaB for the tip and a reference on this story.
(Story submitted 6/12/2012)