To make a really functional, believable simulation you have to incorporate an enormous number of functions: physics; destructible objects (from buildings to chairs and bottles); global illumination that casts shadows and reflects light off objects, creating still more shadows and illumination (colored by the object); hair, one of the most difficult and complex elements; smoke, fire, rain, water, and fog; particle systems for sparks; fabric (clothing, flags, and curtains); motion blur; tessellation (fewer triangles in the distance than up close, dynamically adjusted); physically correct rendering using ray tracing and ambient occlusion; and skin with subsurface-scattered reflections and coloring. And that's the short list.
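To give a feel for one item on that list, here is a minimal sketch of distance-based tessellation: the farther a mesh is from the camera, the lower its subdivision factor. The linear falloff and the `near`/`far`/`max_factor` parameters are illustrative assumptions, not any engine's actual API; real GPU tessellation runs in hull/domain shaders and often uses screen-space edge length rather than raw distance.

```python
def tessellation_factor(distance, near=10.0, far=200.0,
                        max_factor=64, min_factor=1):
    """Map camera distance to a triangle-subdivision factor:
    surfaces near the camera get many triangles, distant ones few.
    Parameters are hypothetical; chosen only for illustration."""
    if distance <= near:
        return max_factor
    if distance >= far:
        return min_factor
    # Linear falloff between the near and far thresholds.
    t = (distance - near) / (far - near)
    return max(min_factor, round(max_factor * (1.0 - t)))

# A chair right in front of the camera subdivides fully; a building
# on the horizon gets almost no extra triangles.
print(tessellation_factor(5.0))    # close-up
print(tessellation_factor(105.0))  # mid-range
print(tessellation_factor(300.0))  # distant
```

The same idea scales to the whole scene: the engine re-evaluates the factor every frame as objects move, which is what "dynamically adjusted" means in practice.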
Nvidia has been collecting and developing software to do all of these things and more. It has acquired several leading CG software firms and a few startups, and over the years it has supported game-engine developers such as Ubisoft, Unity, and Epic in adding realistic simulation capabilities to their engines. At the same time, Nvidia has been working with game developers to help them incorporate those functions in their games, in many cases actually writing the code to exploit such features. AMD offers similar services to game developers, and even Intel has contributed to game development and simulation.
But Nvidia had, and has, the biggest library and the biggest team of CG software engineers. Last year Nvidia pulled together everything it had in physics (its proprietary GPU PhysX software, which also runs on CPUs), its ray tracing and other rendering tools from the OptiX and Mental Images groups, and its VisualFX algorithmic work, and bundled it all up as GameWorks. The company first announced it somewhat unceremoniously at GDC and showed it again this year. At its own GTC conference in San Jose this week, the company gave full-scale demonstrations of, and classes on, the middleware tools and libraries.
The real-time, 30 to 60 fps, HD+ demonstrations included simulated smoke; a car driving down a slick, wet street and skidding; and a giant blue whale bursting out of a luminous, plankton-rich ocean, setting off enormous wave reactions while stars twinkled in the background. That last was a scene from Life of Pi that originally took days to render, and Nvidia was doing it in real time -- an amazing demonstration.
The kicker was Epic's latest demo of a future soldier fighting in a subway hall. There was lighting, shadows, water and reflections, skin, destruction of wall tiles (due to heads being banged against them) -- and it was gripping.
The president of VMware said after seeing it (and not knowing what he was seeing), "Wow, what movie is that?" And that's what it's all about -- suspension of disbelief. You've heard me talk about this before, and we're almost there. I famously predicted at a prestigious event three years ago that by 2015 there would be no more human actors; it would be all CG. Well, I may end up being 52% or better right (phew).
Games are now going to be full-fledged, total simulation systems with real-time everything, and the tools to get them there are largely going to come from Nvidia.
And... that's 2014 and 2015. By 2016 to 2017, they will run on our tablets and phones.
— Jon Peddie is president of Jon Peddie Research.