Nvidia G-SYNC

Mangler

Well-known member
In this time of technological marvels, there are few advances one can truly call “innovative”, or “revolutionary”. NVIDIA G-SYNC, however, is one of the few, revolutionizing outmoded monitor technology with a truly innovative, groundbreaking advancement that has never before been attempted.

G-SYNC’s elimination of input lag, tearing, and stutter delivers a stunning visual experience on any G-SYNC-enhanced monitor; one so stunning that you’ll never want to use a ‘normal’ monitor ever again. In addition to cutting-edge changes to the viewing experience, multiplayer gamers will receive a significant competitive advantage when G-SYNC is paired with a fast GeForce GTX GPU, and low-lag input devices, something that’ll surely pique the interest of shooter aficionados. For eSports players, NVIDIA G-SYNC is an essential upgrade. With G-SYNC’s removal of input lag, successes and failures are squarely in the hands of players, differentiating the pros from the amateurs.

When the biggest names in the business are blown away, and the architect of Unreal Engine calls G-SYNC “the biggest leap forward in gaming monitors since we went from standard definition to high-def”, you know that G-SYNC will raise the bar for display technology. If that testimony isn’t proof enough, keep your eyes peeled on your favorite hardware sites for hands-on impressions of NVIDIA G-SYNC monitors, which are currently being shown at a press event in Montreal.

The legendary John Carmack, architect of id Software’s engine, was similarly excited, saying “Once you play on a G-SYNC capable monitor, you’ll never go back.” Coming from a pioneer of the gaming industry, who’s also a bona fide rocket scientist, that’s high praise indeed.
http://www.geforce.com/whats-new/ar...evolutionary-ultra-smooth-stutter-free-gaming

The press members at the event seem to be pretty impressed too.

Seems pretty awesome; it's a shame that it will be Nvidia-only.

****, forgot to mention monitors in the title.
 
Last edited:
yes i'll rush right out and replace a 1200 buck monitor for this :rolleyes:
and I won't buy a monitor that would lock me into nv cards only

John Carmack:lol: fix rage legendary fail
 
Last edited:
Well, I hope someone makes a non-vendor locked version of it, since everyone who tried it at the event seemed to be impressed by it.
 
Well, I hope someone makes a non-vendor locked version of it, since everyone who tried it at the event seemed to be impressed by it.
We can only hope, or it will go nowhere.

and it looks to be only a modded VG248QE, a crap TN screen :nuts:
 
Yeah, but since variable refresh is useful at 30-1xxHz, maybe there will be some IPS monitors using it too.

They do mention 4K displays; I doubt those will be using TN panels.
 
Last edited:
From the anandtech live blog:

11:18AM EDT - It's seriously a huge difference

11:18AM EDT - The one on the right had gsync on, and they ran the same content to demonstrate the differences between traditional gaming PC monitors and g-sync displays

11:17AM EDT - I'm going to write up all of this in greater detail, but they had two identical systems/displays set up side by side



11:15AM EDT - The difference is pretty dramatic, seriously it has a tremendous impact on game smoothness

11:15AM EDT - Wow that was pretty awesome

10:53AM EDT - Brb clustering by demo stations

10:52AM EDT - We're about to go see a demo of this

It's a ****ing shame that this is vendor locked.
 
yes i'll rush right out and replace a 1200 buck monitor for this :rolleyes:
and I won't buy a monitor that would lock me into nv cards only

John Carmack:lol: fix rage legendary fail

How does this lock you into nv only? Either you use an nV card and can enable G-Sync, or you use AMD and don't.

It's like saying you can't buy a 3DVision monitor because you run an HD7970. :bleh:
 
No idea, they only mentioned that 144Hz ASUS monitor at the event.

From a nv rep at neogaf:

The kit is only compatible with the ASUS VG248QE at this time. I am asking engineering about 'unofficial' compatibility and kits for other monitors.
 
Last edited:
The first demo was a swinging pendulum. NVIDIA's demo harness allows you to set min/max frame times, and for the initial test case we saw both systems running at a fixed 60 fps. The performance on both systems was identical as was the visual experience. I noticed no stuttering, and since v-sync was on there was no visible tearing either. Then things got interesting.

NVIDIA then dropped the frame rate on both systems down to 50 fps, once again static. The traditional system started to exhibit stuttering as we saw the effects of having a mismatched GPU frame rate and monitor refresh rate. Since the case itself was pathological in nature (you don't always have a constant mismatch between the two), the stuttering was extremely pronounced. The same demo on the g-sync system? Flawless, smooth.

NVIDIA then dropped the frame rate even more, down to an average of around 45 fps but also introduced variability in frame times, making the demo even more realistic. Once again, the traditional setup with v-sync enabled was a stuttering mess while the G-Sync system didn't skip a beat.



Next up was disabling v-sync with hopes of reducing stuttering, resulting in both stuttering (still a refresh rate/fps mismatch) and now tearing. The G-Sync system, once again, handled the test case perfectly. It delivered the same smoothness and visual experience as if we were looking at a game rendering perfectly at a constant 60 fps. It's sort of ridiculous and completely changes the overall user experience. Drops in frame rate no longer have to be drops in smoothness. Game devs relying on the presence of G-Sync can throw higher quality effects at a scene since they don't need to be as afraid of frame rate excursions below 60 fps.

Switching gears NVIDIA also ran a real world demonstration by spinning the camera around Lara Croft in Tomb Raider. The stutter/tearing effects weren't as pronounced as in NVIDIA's test case, but they were both definitely present on the traditional system and completely absent on the G-Sync machine. I can't stress enough just how smooth the G-Sync experience was, it's a game changer.

http://www.anandtech.com/show/7436/nvidias-gsync-attempting-to-revolutionize-gaming-via-smoothness
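The judder Anand describes in the 50 fps case falls straight out of the timing math. Here's a minimal Python sketch of it; the 60 Hz / 50 fps numbers are just the demo's test case, and the frame-presentation model is a simplification, not anything NVIDIA published:

```python
# Why a constant 50 fps judders on a 60 Hz v-synced panel, while a
# variable-refresh display shows the same content perfectly evenly.
from fractions import Fraction  # exact arithmetic avoids float rounding
from math import ceil

REFRESH = Fraction(1000, 60)  # 60 Hz panel: ~16.67 ms per refresh tick
RENDER  = Fraction(1000, 50)  # 50 fps GPU: 20 ms per frame

def vsync_intervals(n):
    """With v-sync, each frame is scanned out at the first refresh tick
    at or after it is ready; return the on-screen duration of each frame."""
    shown = [ceil(i * RENDER / REFRESH) * REFRESH for i in range(n)]
    return [float(b - a) for a, b in zip(shown, shown[1:])]

def gsync_intervals(n):
    """With variable refresh, the panel refreshes the moment a frame is ready."""
    return [float(RENDER)] * (n - 1)

print([round(x, 2) for x in vsync_intervals(7)])
# [33.33, 16.67, 16.67, 16.67, 16.67, 33.33]
print([round(x, 2) for x in gsync_intervals(7)])
# [20.0, 20.0, 20.0, 20.0, 20.0, 20.0]
```

With v-sync, one frame in every five lingers on screen for two refreshes instead of one, so you get a visible hitch ten times a second even though the frame rate is perfectly constant. The variable-refresh column is uniform 20 ms, which is exactly the "flawless, smooth" result from the demo.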

As I said before, it is a ****ing shame that this is proprietary tech.
 
Great feature! Instead of having the monitor's refresh clock be free-running, it will now be tied to the graphics card's output. It basically solves almost all the problems related to v-sync.

This is what I consider a true adaptive v-sync, and the best part is that the adapting work is done by the monitor. Good job, nVidia!

The idea has been mentioned before, so I am pretty sure any patentable part of this tech is in the implementation and not the actual concept, which means this will soon be available on non-nVidia platforms too.
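One wrinkle with "the monitor adapts to the GPU" is that a panel can only hold its charge for so long, so the refresh interval has to stay inside a supported window. A tiny sketch of that constraint below; the 30 Hz floor and 144 Hz ceiling are taken from figures mentioned elsewhere in this thread, not from any official G-Sync spec, and real hardware re-scans the previous frame at the floor rather than simply clamping:

```python
# Sketch: a variable-refresh panel follows the GPU's frame time, but only
# within the refresh range the panel supports. Limits are illustrative.
MIN_HZ, MAX_HZ = 30.0, 144.0
MAX_INTERVAL = 1000.0 / MIN_HZ   # ~33.33 ms: panel must refresh by then
MIN_INTERVAL = 1000.0 / MAX_HZ   # ~6.94 ms: panel can't refresh faster

def refresh_interval(frame_time_ms):
    """Interval the panel actually holds for a frame that took frame_time_ms."""
    return min(max(frame_time_ms, MIN_INTERVAL), MAX_INTERVAL)

for ft in (5.0, 12.5, 22.0, 40.0):
    print(f"frame took {ft:5.1f} ms -> panel refreshes after {refresh_interval(ft):5.2f} ms")
```

Inside that window the refresh interval tracks the frame time 1:1, which is the whole trick; only outside it does the display fall back to fixed-rate behavior.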
 

It is a bit funny (ironic kind) that there's a lack of the "OMG, proprietary tech, fragmenting the market, usable only on one vendor" comments that many of the usual suspects clamored about in the recently announced close-to-metal API threads. Particularly since this has an added cost on top of being locked to one vendor, whereas the other solution has no added cost, is cross-platform, isn't a closed proprietary solution, supplements in-place APIs, and potentially speeds development... yeah, no bias there, nope.

edit: IIRC (been a while since I looked it up), but doesn't DisplayPort have the ability to do this natively (Direct Drive)?
edit of edit: sounds a bit more like Direct Drive using the (AUX) Auxiliary Channel:

DDM Displays do not have complex timing or internal display controllers but instead connect directly
to the graphics subsystems and convey native timing to the graphics subsystem for correct configuration

http://www.paradetech.com/2009/09/p...ct-drive-monitor-spec-at-4-mpixel-resolution/

Image quality is also improved since the display scaling function is done by the host graphic processor unit (GPU), which performs scaling with higher precision.

Sounds a lot like some of the features being touted,
particularly the "image enhancement without color loss" AND the display specifically mentioning DisplayPort only (no audio), which would seem to imply it is using (again) the AUX channel.
 
Last edited:
No idea, they only mentioned that 144Hz ASUS monitor at the event.

From a nv rep at neogaf:

The kit is only compatible with the ASUS VG248QE at this time. I am asking engineering about 'unofficial' compatibility and kits for other monitors.

Awesome news, I already own a VG248QE :D

This tech is a much needed step imo, tearing and judder will be a thing of the past iiuc.


Yeah, I don't see any reason for it to be either. If Nv were smart, they could allow it to be open and get royalties from AMD users too.
 
Last edited:
The VG248QE looks pretty good, I will be in the market for updating my shitty LG LCD by the time the G-Sync kits come out so this is perfect for me.
 
The sad thing is that nVidia is working themselves into a corner. Not sure they have much of an alternative, though, but basically they're setting themselves up to be a sole provider of niche products to an increasingly niche market.

Cool technology, sucks it is vendor locked though. I'm not a competitive MP gamer so this naturally doesn't hold as much appeal to me as it would for others.
 
Nvidia does have a choice, they can license out their tech instead of trying to make it a value add on. But that's not how Nvidia rolls.
 
Last edited:
No way Nvidia would license this to AMD when they just announced their own proprietary tech, Mantle. The timing is perfect to steal the spotlight from AMD.
 
Yeah, they're pissing on AMD's parade, that's for sure. New GPUs and now this. Extra performance from Mantle is nice, but this G-Sync looks like one of those things you can't live without after you try it. No input lag, no tearing, no judder, a perfect 1:1 frame-to-refresh ratio no matter your frame rate. Awesome.

Monitor compatibility is a weak point though, I wonder if people will hack it down the track.
 