Nvidia has confirmed a bug in its G-Sync implementation for high-refresh-rate displays, one which causes the power draw and heat output of its cards to skyrocket - but the company promises a fix is incoming.
Launched by Nvidia back in 2013, G-Sync is designed to put an end to stutter and screen tearing in both media playback and gaming. Featuring a dedicated control module built into compatible monitors, G-Sync varies the physical refresh rate of the display to precisely match that of the incoming signal. It's a great selling point for Nvidia's high-end GPUs, but one which has hit a major roadblock: a massive and unexpected increase in power draw.
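For readers who want the mechanism in concrete terms, the sketch below models that behaviour: the display refreshes when a new frame arrives rather than on a fixed clock, clamped to the panel's supported range. It is a minimal illustration only - the constants, function names and timing loop are assumptions for clarity, not Nvidia's actual implementation.

```python
# Minimal, hypothetical sketch of the variable-refresh idea behind G-Sync:
# instead of redrawing on a fixed clock, the panel refreshes when a new
# frame arrives, clamped to the panel's supported refresh window.
# All names and timings here are illustrative assumptions.
import time
import random

PANEL_MIN_HZ = 30    # assumed panel minimum refresh rate
PANEL_MAX_HZ = 165   # assumed panel maximum refresh rate

def next_refresh_delay(frame_interval_s: float) -> float:
    """Clamp the time between refreshes to the panel's supported window."""
    return min(max(frame_interval_s, 1.0 / PANEL_MAX_HZ), 1.0 / PANEL_MIN_HZ)

def run_display(frames: int) -> None:
    last = time.monotonic()
    for _ in range(frames):
        # Stand-in for waiting on GPU frame completion: render times vary,
        # so the interval between refreshes varies too.
        time.sleep(random.uniform(0.005, 0.02))
        now = time.monotonic()
        delay = next_refresh_delay(now - last)
        print(f"refresh after {delay * 1000:.1f} ms (~{1.0 / delay:.0f}Hz)")
        last = now

if __name__ == "__main__":
    run_display(5)
```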
The problem was first spotted by PC Perspective: when pairing a GeForce GTX card with a high-refresh-rate G-Sync-compatible display, in this case an Asus model with a 165Hz refresh rate, the card's power draw jumped dramatically. The issue stems from the card's dynamic clock speed feature: at refresh rates of up to 120Hz, the maximum early G-Sync displays could support, the clock stays at 135MHz and system power draw sits at around 76 watts; above this, though, the clock speed jumps to 885MHz and power draw rises to 201W at the top 165Hz setting - even with the system sitting idle at the Windows desktop, doing absolutely nothing.
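PC Perspective's figures suggest a simple refresh-rate threshold in the clock-management logic, above which the card never drops back to its idle state. The hypothetical sketch below reproduces the shape of that behaviour using the measured numbers reported above; the function and its structure are illustrative assumptions, not Nvidia's actual driver code.

```python
# Hypothetical illustration of the behaviour the measurements suggest: a
# refresh-rate threshold that kicks the idle clock from its lowest power
# state to a much higher one. The 120Hz/165Hz, 135MHz/885MHz and 76W/201W
# figures come from the reported measurements; the function itself is an
# assumption for illustration only.

def idle_clock_mhz(refresh_hz: int) -> tuple[int, int]:
    """Return (core clock in MHz, approx. system power draw in watts)."""
    if refresh_hz <= 120:
        return 135, 76    # normal idle state, as measured at up to 120Hz
    # Bug territory: above 120Hz the card no longer drops to its idle
    # clock, so power draw soars even on an empty desktop.
    return 885, 201       # as measured at the top 165Hz setting

for hz in (60, 120, 165):
    clock, watts = idle_clock_mhz(hz)
    print(f"{hz:3d}Hz -> {clock}MHz idle clock, ~{watts}W system draw")
```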
PC Perspective's investigation pointed to a bug, and it's one Nvidia has now confirmed in an email to the site. 'You were right! That new monitor (or you) exposed a bug in the way our GPU was managing clocks for G-Sync and very high refresh rates,' an unnamed spokesperson for Nvidia explained. 'As a result of your findings, we are fixing the bug which will lower the operating point of our GPUs back to the same power level for other displays. We'll have this fixed in an upcoming driver.'
Nvidia has not offered a release date for the updated driver.