Comments
Denis.Giri
User Rank
Manager
that makes sense
Denis.Giri   2/3/2014 3:47:28 AM
NO RATINGS
Thanks, your point on stutter & harmonics is really helpful.

mdrejhon
User Rank
Rookie
Re: waste of money
mdrejhon   1/29/2014 5:46:42 PM
NO RATINGS
It does not appear to be a waste of money: I have a G-SYNC monitor here.  My experience is that G-SYNC makes stutter-free 40fps look better than stuttery 75fps -- for certain use cases -- so the G-SYNC upgrade ends up being cheaper than doubling up with a second GPU in SLI, at least when playing certain games that benefit a lot from G-SYNC, such as Battlefield 4 or Crysis 3.

Random/fluctuating framerates look perfectly consistent on G-SYNC, which is one of its more interesting behaviors, and it is explained by some animations at http://www.blurbusters.com/gsync/preview/

G-SYNC also reduces input lag without needing VSYNC OFF, and some high-speed video input-latency tests have confirmed this: http://www.blurbusters.com/gsync/preview2/

Not all software and games benefit from GSYNC, but the benefits appear to be very real.

In addition, all GSYNC monitors include extra non-GSYNC modes that enhance games.  One is a fixed-refresh-rate low-persistence mode via an optional strobe backlight.  This motion-blur-reducing mode is a LightBoost sequel called ULMB (Ultra Low Motion Blur), which strobes the backlight once per refresh, like a CRT.  It is a highly efficient form of black frame insertion for people who prefer CRT motion clarity on an LCD, and it gives high-end users an easy pushbutton method of enabling/disabling the strobe backlight.  Strobe backlights have been all the rage in 2014-model gaming monitors (NVIDIA LightBoost, NVIDIA ULMB, EIZO Turbo240, and BENQ Blur Reduction), as they have 80-95% less motion blur than a common 60Hz LCD monitor, and less motion blur than most plasmas (something of a minor miracle for an LCD).  At a 90% reduction in blur, where there was 10 pixels of display-induced motion blurring there is now 1 pixel worth.  (The improvement is very clearly seen when games don't interfere by adding software-based motion blur to the mix.)  If you are not familiar with the newly invented forms of gaming-monitor LCD motion blur reduction, it's worth reading up.  Unlike previous years, the efficiency of the newer strobe backlights that have just arrived on desktop monitors has become staggeringly high, with measurements showing that some new strobe-backlight LCD panels now have less motion blur than certain CRTs (such as the Sony GDM-W900).  It's been a popular feature in the gaming-monitor niche market.
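
To show where that 10-pixel-to-1-pixel figure comes from, here is a rough back-of-the-envelope sketch (my own illustration in Python, not anything from a monitor vendor): on a sample-and-hold display, the blur trail length is roughly the frame persistence multiplied by the on-screen motion speed.

```python
# Rough sample-and-hold motion blur estimate (illustration only;
# real displays add pixel-response blur on top of this).
# blur_px ~= persistence_seconds * motion_speed_px_per_second

def motion_blur_px(persistence_ms, speed_px_per_s):
    """Approximate display-induced blur trail length in pixels."""
    return (persistence_ms / 1000.0) * speed_px_per_s

speed = 1000  # pixels/second panning speed

# Full-persistence sample visible for ~10 ms (matches the example above):
print(motion_blur_px(10.0, speed))  # 10 px of display-induced blur

# ULMB-style strobe flashing the backlight ~1 ms per refresh:
print(motion_blur_px(1.0, speed))   # 1 px, i.e. the 90% reduction above
```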

Also, triple buffering is still prone to microstutters, for those people who are sensitive to them.  This is because of the varying time between the most recent back buffer and the vsync interval: sometimes the most recent frame is done half a frametime before the vsync, sometimes a quarter frametime, sometimes most of a frametime.  Whenever framerate does not match refresh rate, there are stutter harmonics involved -- even with triple buffering.  Mathematically, the microstutter amplitude is (1/fps)th of the motion speed -- e.g. at 100 frames per second during 1000 pixels/second panning/strafing/turning motion, you get microstutters of (1000/100) = 10 pixels of vibrating-edge stutter amplitude.  This assumes the game and the mouse aren't throwing in additional stutters.  A good test case is running a good smooth gaming mouse (500Hz/1000Hz) in an old game engine, or using a very powerful graphics card (with a game engine that isn't buggy with stutters).  When you do that, you've eliminated the other stutter weak links, and the remaining stutter weak link is mostly isolated down to the harmonics between frame rate and refresh rate (even when triple buffered).  The number of stutters per second is a function of the harmonic frequency between refresh rate and frame rate.  A good self-experiment is enabling triple buffering in an old game, then setting an exact framerate cap close to refresh (e.g. an "fps_max" setting): 62 fps at 60 Hz creates a harmonic of 2 stutters per second (the beat frequency).

GSYNC solves this stutter-harmonics problem, and it makes sense mathematically, according to my calculations.  When I received the monitor, my napkin math was proven correct, and my eyes saw 40-50fps motion look better than stuttery 70-75fps motion.  Even a fluctuating random (40fps to 70fps) framerate looked as smooth as perfect VSYNC ON 60fps@60Hz, because the randomization in the framerate was stutter-free: the display changed its refresh rate every single frame to keep frame render times exactly in sync with refresh times (frames hitting human eyes).  The framerate could vary smoothly (like a CVT transmission) with no noticeable transitions between framerate changes, which is a rather interesting visual property of GSYNC, or VRR (Variable Refresh Rate), behavior.
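
To make the napkin math concrete, here is a tiny sketch (my own illustration in Python) of the two quantities discussed above: microstutter amplitude as (1/fps)th of the motion speed, and the beat frequency between frame rate and refresh rate.

```python
# Microstutter amplitude and stutter beat frequency
# (illustration of the figures quoted above, not vendor code).

def microstutter_amplitude_px(fps, motion_speed_px_per_s):
    """Vibrating-edge stutter amplitude in pixels: (1/fps) of motion speed."""
    return motion_speed_px_per_s / fps

def stutter_beat_hz(fps, refresh_hz):
    """Approximate stutters per second when frame rate != refresh rate."""
    return abs(fps - refresh_hz)

print(microstutter_amplitude_px(100, 1000))  # 10.0 px, as in the example above
print(stutter_beat_hz(62, 60))               # 2 stutters per second
```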

Mark Rejhon
Display Researcher / Blur Busters

Denis.Giri
User Rank
Manager
Re: waste of money
Denis.Giri   1/29/2014 2:55:44 AM
NO RATINGS
There are 3 buffers: the one the GPU is drawing into (t), and the 2 previous ones, t-1 and t-2.

When the GPU finishes drawing (t), the system checks whether the display has finished with t-2 (and is now scanning out t-1) or is still on t-2.

If the LCD is on t-1, then buffer t-2 can immediately be reused for the GPU to start working on t+1.

If the LCD is still on t-2, then the LCD's bandwidth is indeed the limit and we might as well be working in VSYNC-ON (double buffered) mode. At this point you have to choose whether you want t-1 to be skipped entirely or whether you want to VSYNC and stall the GPU.
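
As a minimal sketch of that decision (my own illustration in Python; the buffer names follow the t / t-1 / t-2 convention above):

```python
# Triple-buffer handoff when the GPU finishes frame t
# (illustrative sketch, not any driver's actual code).

def next_gpu_buffer(display_reading):
    """display_reading: which buffer the display is scanning out ('t-1' or 't-2').

    Returns the buffer the GPU should render t+1 into, or None if the
    display is the bottleneck and the GPU must stall or drop t-1.
    """
    if display_reading == 't-1':
        # t-2 is no longer needed by anyone: recycle it immediately.
        return 't-2'
    # Display is still on t-2: this behaves like VSYNC-ON double buffering.
    return None

print(next_gpu_buffer('t-1'))  # 't-2' -> GPU keeps working, no stall
print(next_gpu_buffer('t-2'))  # None  -> stall the GPU or skip t-1
```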

Where's the added latency? At which point, in a use case where the LCD is not the limiting factor, will triple buffering stall the GPU or add latency?

Also, as we're already doing frame skipping/duplication all the time, shouldn't your point about variable FPS apply to that as well?

rfindley
User Rank
Rookie
Re: waste of money
rfindley   1/28/2014 3:04:09 PM
NO RATINGS
In the "old" days, a good graphics engine would monitor the time it takes to render each frame, and would make some dynamic adjustments to the rendering, so frames would (almost) always be completed before the next refresh.

Gsync lets you get away with not having to do such planning, but it still comes at a cost (in terms of quality).  Each frame you render represents a specific snapshot in time.  That snapshot should reflect what the scene would look like at the specific time that it hits the monitor.  But if you don't know how long it will take to render a frame, then what time 't' should the image represent!?

The result is motion jitter.  Sure, it's better than image tearing or duplicate frames, but still not the best solution -- which is to dynamically adjust the engine to always meet a static frame rendering interval.  It's difficult to do that, but what's NOT difficult about designing a graphics engine? :-)
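
A rough sketch of that dynamic adjustment idea (my own illustration in Python; "quality" stands in for whatever knobs a real engine exposes, such as LOD or resolution scale):

```python
# Adjust render quality so frames (almost) always finish before the
# next fixed refresh (illustrative sketch only).

TARGET_FRAME_TIME_MS = 1000.0 / 60.0   # fixed 60 Hz budget

def adjust_quality(quality, last_frame_time_ms, margin=0.9):
    """Shrink quality if the last frame missed the budget; raise it slowly
    when there is plenty of headroom."""
    budget = TARGET_FRAME_TIME_MS * margin
    if last_frame_time_ms > budget:
        quality *= budget / last_frame_time_ms
    elif last_frame_time_ms < 0.7 * budget:
        quality *= 1.05
    return min(max(quality, 0.25), 1.0)   # clamp to a sane range

quality = 1.0
for frame_ms in [14.0, 19.0, 22.0, 15.0, 12.0]:   # measured frame times
    quality = adjust_quality(quality, frame_ms)
    print(round(quality, 3))
```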

ost0
User Rank
CEO
Re: waste of money
ost0   1/28/2014 8:02:24 AM
NO RATINGS
Depending on what resolution you run, 60fps is slow for modern GPU hardware; the uber gamers prefer more. Triple buffering will only add latency, which seriously drags down the gaming experience. Also, triple buffering will stall the same way if the GPU is faster than the LCD's bandwidth: when the 3rd frame is drawn, it has to wait for the 1st to be sent out before it can start drawing a new one. It's like moving one car ahead of the queue in a traffic jam.

But variable FPS can make it more difficult for programmers to render correct object positions in a properly timed manner. The engine needs to know the approximate moving distance of all objects by the start of the next frame to get smooth motion.
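
The usual way engines handle this is delta-time integration: advance every object by the measured (or predicted) elapsed time rather than by a fixed per-frame step. A minimal sketch (my own illustration in Python):

```python
# Delta-time motion update: positions depend only on elapsed time,
# so fluctuating frame rates still produce correct motion
# (illustrative sketch, not any particular engine's code).

def update_position(pos_px, velocity_px_per_s, dt_s):
    return pos_px + velocity_px_per_s * dt_s

pos = 0.0
for dt in [1/60, 1/45, 1/72, 1/50]:   # varying frame times in seconds
    pos = update_position(pos, 1000.0, dt)
print(round(pos, 1))   # total distance covered matches elapsed time
```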

200Hz monitors are not a problem with modern LCDs, but the display interface can quickly become the bottleneck at higher resolutions. Even modern DP1.2/HDMI2.0 can only do 60Hz at QuadHD resolution, so it's always a tradeoff between resolution and FPS.
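
A quick bandwidth sanity check (my own arithmetic; it ignores blanking intervals and link-encoding overhead, so real requirements are somewhat higher, and I'm reading "QuadHD" here as 3840x2160):

```python
# Uncompressed video payload bandwidth (rough estimate, my own arithmetic;
# blanking and link-encoding overhead are ignored).

def video_gbps(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * refresh_hz * bits_per_pixel / 1e9

print(round(video_gbps(3840, 2160, 60), 1))    # ~11.9 Gbps: 3840x2160 @ 60 Hz
print(round(video_gbps(3840, 2160, 120), 1))   # ~23.9 Gbps: too much for DP1.2/HDMI2.0
print(round(video_gbps(2560, 1440, 144), 1))   # ~12.7 Gbps: 1440p @ 144 Hz
```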

Denis.Giri
User Rank
Manager
Re: waste of money
Denis.Giri   1/28/2014 4:22:28 AM
NO RATINGS
Let's assume a 100Hz standard monitor. On a modern AAA game, the GPU will get 60fps at best... Basically, on the games played by hard-core gamers (announced target of this product) and on highest detail (for highest "immersion"), the GPU will never be able to keep up with the monitor's framerate... Not unless you're using octo-AIB (or some crazy setup like that), in which case you're probably using 200Hz monitors... So, no, I can't agree that the LCD is the limiting factor.

And if you're running with triple buffering, the GPU will always be active generating the next frame (not blocked by VSYNC-ON mode), and there will never be any tearing, as you always have a valid frame to display (t-1 or t-2). At this point, G-sync becomes mostly useless.

After some reflection, the only benefit I can see to G-sync over triple buffering is latency (assuming a given G-sync-ready monitor working at a given framerate). And unless you're making AR glasses, you wouldn't care that much about latency.

ost0
User Rank
CEO
Re: waste of money
ost0   1/28/2014 3:54:45 AM
NO RATINGS
Triple buffering doesn't help you much here. The LCD is the bottleneck.

G-sync (*) helps you maximize the cable's bandwidth to get the maximum refresh rate at any time. Modern hardware has much more performance than can be shown due to vertical sync limitations. Working with projectors and monitors, this old CRT-compatible vsync and hsync domain can really be painful when doing framelocking across sources.

But IMHO NVIDIA should take this one step further. It would help bandwidth if only parts of the display could be updated on request. Why send the full frame every time, if only some of the data has changed?

(*) It is very confusing to me that NVIDIA picked this name. They already used it for their Quadro framelock hardware. That will likely confuse a few Googlers.

Denis.Giri
User Rank
Manager
waste of money
Denis.Giri   1/28/2014 3:39:01 AM
NO RATINGS
Has someone finally noticed that you don't have to refresh an LCD at a constant rate? Amazing! Unfortunately, you seem to need specific panels, today...

There's a better explanation of what G-sync is here: http://www.anandtech.com/show/7582/nvidia-gsync-review

But honestly, I don't see any huge improvement over "triple buffering" (and that could be done on any GPU/panel today; it only takes more GDDR).

I'd rather buy a second AIB and double my available GDDR (which should enable triple buffering on anything I already play) and not have to buy new panels.

ost0
User Rank
CEO
Re: Useful for 4k tvs
ost0   1/28/2014 2:56:46 AM
NO RATINGS
I disagree. The more pixels, the more you expand the possibilities. You are simply not forced to look straight at the screen, but can move your eyes to a corner to look at something else. There is no longer a need to get a full overview of what is happening on the full screen simultaneously. You may argue that when watching a movie you need to get most of the pixels into your field of view, but that is very content dependent, and isn't always true.

Taken to the extreme, we would really like active pixel-paint to coat our walls with, so we could choose where, how big, and in what orientation to watch whatever we wanted to watch.

Etmax
User Rank
Rookie
Re: Useful for 4k tvs
Etmax   1/27/2014 6:51:15 PM
NO RATINGS
I have to say I believe 4k is a waste of money. In fact HD is almost a waste of time.

The exceptions are perhaps gaming, where you sit very close to the screen, and projection systems with really large screens.

If you calculate how big a pixel is for a given screen, and then how close the screen has to be for the eye to resolve that pixel, it becomes evident that at normal viewing distances you actually can't see individual pixels.
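
Here is that calculation worked through (my own illustration in Python, assuming roughly 1 arcminute of visual acuity and a 16:9 panel; the numbers are approximate):

```python
import math

# Distance beyond which individual pixels can no longer be resolved,
# assuming ~1 arcminute of visual acuity (rule of thumb; illustration only).

ARCMIN_RAD = math.radians(1 / 60)

def max_resolvable_distance_m(diagonal_in, horizontal_pixels, aspect=16 / 9):
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pixel_pitch_m = (width_in / horizontal_pixels) * 0.0254
    return pixel_pitch_m / math.tan(ARCMIN_RAD)

print(round(max_resolvable_distance_m(55, 1920), 2))  # ~2.18 m for a 55" 1080p set
print(round(max_resolvable_distance_m(55, 3840), 2))  # ~1.09 m for a 55" 4K set
```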

The only reason HD is even worthwhile is that for a TV transmission the SD bit rate is so low that a mass of artifacts is evident that has nothing to do with the resolution.

There is also slightly better pixel-positioning accuracy with HD, due to the Nyquist sampling criterion.

But when it comes to 4K, you really can't tell the difference (at normal viewing distances).
