Nvidia has pulled together its seemingly disparate software tools and technology and bundled them into a product it is calling GameWorks. It's a somewhat modest name for such extraordinary technology, but easy to remember.
GameWorks is the Industrial Light & Magic (ILM) of the game development industry. It consists of a huge library of middleware tools and algorithms for simulation, special effects, and rendering. Much of it is free and open, some of it is licensable, and all of it is impressive.
In the past, game developers would hire artists to paint images of things like smoke, water, and of course faces and buildings. Those images would be modulated to give the appearance of movement and, in some cases, destruction. Such fakery was stiff and always played out in the same sequence. And there's no insult in that; almost all computer graphics (CG) is fakery, trying to create an impression of the real world. The approach was a consequence of limited resources and the extraordinary calculations needed -- 30 to 60 times a second -- to simulate simple-looking things like a bouncing ball, the light reflecting off it, and the shadow(s) it casts.
In the movies, these problems were tackled by big studios like DreamWorks, Industrial Light & Magic, Pixar/Disney, and others, which could use supercomputers and spend hours to days on a single frame. The results were -- and are -- spectacular and worth the effort. The computer and CG scientists, engineers, and programmers at those studios have also led the industry in developing and discovering advanced algorithms, and they author a huge number of the papers delivered at SIGGRAPH, FMX, VIEW, and other CG conferences.
The second group of CG scientists, engineers, and programmers deeply involved in such work are the developers of simulation systems used by the military, airframe companies like Boeing and Lockheed, and leading-edge scientists in nuclear energy (and weapons), oil and gas exploration, and esoteric projects using finite element and well analysis. These folks are delighted if they can get three to five frames a second, and long considered 20 fps at any decent resolution to be science fiction.
But time moves on. The algorithmic engineers in the movie and simulation industries keep tweaking their math and discovering clever tricks and shortcuts, while at the same time semiconductor processes and processor designs have become more powerful, smaller, cheaper, and less power-hungry. Just five years ago, simulating fire and smoke for a five-second clip at HD resolution took hours of computation on a machine costing $100,000 or more, consuming kilowatts of power.
Today, with a modern GPU, you can generate physically accurate smoke and run it indefinitely at 30+ fps on a machine costing less than $2,000 and drawing less than 300 watts. And that's just one example.