Computer games of tomorrow are going to feature even more lifelike characters if Nvidia has anything to do with it.

Hot on the heels of Quantic Dream's impressive demo at the PlayStation 4 launch in February, Nvidia has now shown off its own demonstration of how it is working to make game characters even more lifelike.

The software, dubbed FaceWorks, was shown off by Jen-Hsun Huang, the company's CEO, at its GPU Technology Conference. The demo features a bald-headed man called Ira rendered in real time under numerous lighting effects, showing off detail right down to the pores in his skin and the wrinkles in his forehead.

"To create ‘Ira’ we partnered with Dr. Paul Debevec of the Institute for Creative Technology (ICT) at the University of Southern California," explains Nivida. "He and his team have been building next-generation systems that can capture facial data to within a tenth of a millimeter without the need for special make-up, awkward markers or specialised cameras."

ICT’s "light stage" technology uses photographic techniques to derive not only the 3D shape of an actor’s face, but the critical elements needed to represent human skin properly. These include light reflection and transmission through the skin, reflectivity from oils in the skin, and the nearly microscopic lines and bumps in the skin surface - oh, and he talks too.

According to Nvidia, those complex skin shaders execute more than 8,000 instructions per pixel to allow realistic rendering of the face in any lighting situation, which, as you can imagine, demands a serious amount of computation.

"At full HD resolution that’s 82 billion floating point operations per second (FLOPs) per frame – or 4.9 trillion FLOPs per second at 60 frames per second. And all that doesn’t include the 161 filtered texture fetches per pixel."

There's no word, however, on when we can expect to see this in new games, but Nvidia hopes the technology will be used not only for gaming but also for other applications such as avatars.