The problem is further complicated by the change in image intensity from frame to frame. The liquid crystals in a display have hysteresis, and if a pixel is at, or near, maximum brightness in one scene and should be dark or black in the next, getting that pixel's crystal to go from fully on to fully off requires overdriving it to overcome the hysteresis. How much overdrive is needed is also a function of how long the pixel has been in one state. An LCD is like a sample-and-hold switch: the amount of time the pixel has been in one state also affects its color shift.
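Display controllers typically handle this with response-time compensation: a lookup table, indexed by a pixel's previous and target levels, whose entries overshoot the requested value to push the crystal past its hysteresis. The C sketch below is a minimal illustration of that idea; the table values, bin count, and function names are my own assumptions, not taken from any real controller's firmware.

```c
/* Minimal sketch of response-time compensation ("overdrive") as used in
 * monitor controllers. Table values and names are illustrative assumptions. */
#include <stdint.h>

#define OD_BINS 5  /* coarse grid; real controllers use and interpolate finer tables */

/* od_table[prev][target]: value actually driven onto the panel.
 * Off-diagonal entries overshoot the target to push the crystal past its
 * hysteresis; diagonal entries (no change) pass the target through. */
static const uint8_t od_table[OD_BINS][OD_BINS] = {
    /* target:  0   64  128  192  255                                  */
    {   0,  80, 150, 210, 255 },  /* prev   0: overshoot rising pixels  */
    {   0,  64, 140, 205, 255 },  /* prev  64                           */
    {   0,  48, 128, 200, 255 },  /* prev 128                           */
    {   0,  40, 110, 192, 255 },  /* prev 192                           */
    {   0,  30, 100, 180, 255 },  /* prev 255: undershoot falling pixels */
};

/* Map an 8-bit level to the nearest table bin. */
static int od_bin(uint8_t level) {
    return (level * (OD_BINS - 1) + 127) / 255;
}

/* Given last frame's level and this frame's target, return the value
 * to drive onto the pixel for one frame. */
uint8_t overdrive(uint8_t prev, uint8_t target) {
    return od_table[od_bin(prev)][od_bin(target)];
}
```

A real controller interpolates much finer tables calibrated per panel, and some track more than one previous frame to account for how long a pixel has held a state, which is the time dependence noted above.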
With the advent of gaming monitors moving from simple HD (1920 x 1080) to WQHD (2560 x 1440), and in a few years UHD (3840 x 2160), the need for an intelligent monitor controller is compounded. Gaming monitors are also transitioning from simple 21-inch to 24- and 32-inch screens, and, as I am fond of saying, the more you can see, the more you can do. It's also true that the more you can see, the more you will see, and it had better look good or you won't be a happy camper.
Nvidia's G-Sync enables monitors to drive these higher resolutions, and it will be needed to make them look good and correct in a game.
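The core of G-Sync is variable refresh: instead of scanning out on a fixed clock, which forces a late frame to wait for the next scan (stutter) or to be swapped mid-scan (tearing), the monitor refreshes when the GPU finishes a frame. The toy C simulation below is my own illustration rather than anything from Nvidia; it contrasts when frames appear on a fixed 60 Hz display versus a variable-refresh one, assuming made-up render times.

```c
/* Toy timing comparison: fixed-refresh display with vsync versus a
 * variable-refresh (G-Sync-style) display. Numbers are assumptions. */
#include <stdio.h>

int main(void) {
    /* Simulated GPU render times in ms for six frames of varying load. */
    double render_ms[] = { 12.0, 19.0, 15.0, 24.0, 14.0, 17.0 };
    int n = sizeof render_ms / sizeof render_ms[0];
    const double fixed_period = 16.7;  /* 60 Hz scan interval */

    double t = 0.0;
    for (int i = 0; i < n; i++) {
        t += render_ms[i];  /* time at which frame i finishes rendering */

        /* Fixed refresh with vsync: the frame is held until the next
         * scan boundary, so a frame that misses one scan waits a full
         * period, which is the stutter gamers complain about. */
        double fixed_display = ((int)(t / fixed_period) + 1) * fixed_period;

        /* Variable refresh: the monitor scans when the frame is ready. */
        double vrr_display = t;

        printf("frame %d ready %6.1f ms  fixed shows %6.1f  vrr shows %6.1f\n",
               i, t, fixed_display, vrr_display);
    }
    return 0;
}
```

On the fixed display, any frame that misses a scan boundary is held up to a full 16.7 ms; on the variable-refresh display it appears the moment it is done, which is why judder and tearing disappear.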
Nvidia has a bunch of partners (Asus, BenQ, ViewSonic, Philips, and AOC) that will bring out gaming monitors of various sizes and resolutions in the second half of the year. The monitors will be slightly more expensive (~$120), but given that a gamer buys a new monitor maybe every four years, that's a reasonably cheap uplift to get rid of the most annoying problems in the display. The monitors will only support DisplayPort. All the new AIBs come with DP, but if you have an older AIB there is no cost-effective way to get from DVI to DP: the adapters that are available only work from a DP computer to a VGA, DVI, or HDMI monitor, not the other way around.
This is a solution for a niche market: high-end gamers. However, the monitor and PC suppliers are recognizing that with the slowdown in the PC market, they are going to have to appeal more to niche users than before and stop trying to shove one-size-fits-all solutions on consumers.
The problem with stutter/frame-drop and tearing isn't just experienced in games. Real-time CAD animations and simulators also suffer from it. So the monitor and Nvidia folks should be able to get a pretty good ROI.