Almost no game can render at exactly a target frame rate, and when one does, it usually means the game isn't pushing the GPU to its limit. For any graphically awesome game the frame rate will vary, and G-Sync should help in that case.
Yeah, a computer can easily render frames at the same rate the monitor accepts them while being horribly out of phase with the timing the monitor expects. Imagine you and a friend playing on a swingset: both of you are swinging at the same speed, but you're always at the top when your friend is at the bottom. That's likely the most common source of tearing: the frame rendered in time, but the framebuffer still got swapped too late, whether because of poorly configured graphics drivers, the game process being briefly swapped out for something else at the wrong moment, naive timing in the game's rendering loop, or something else.
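A toy model of that swingset analogy, assuming a 60 Hz display and a renderer that also finishes a frame every ~16.67 ms but half a refresh out of phase (the numbers and the 90% scanout fraction are made up for illustration; real scanout and swap behavior are messier):

```python
# Toy model: renderer and display both run at 60 Hz, but half a
# refresh out of phase. All times in milliseconds.

REFRESH_MS = 1000.0 / 60.0     # display refresh interval (~16.67 ms)
SCANOUT_FRACTION = 0.9         # hypothetical: 90% of each interval is scanout

def tears(swap_time_ms):
    """A swap 'tears' if it lands while the display is mid-scanout."""
    phase = swap_time_ms % REFRESH_MS
    return phase < REFRESH_MS * SCANOUT_FRACTION

# The renderer finishes frames at exactly the display's rate,
# but offset by half a refresh interval -- same speed, wrong phase.
swaps = [i * REFRESH_MS + REFRESH_MS / 2 for i in range(10)]
torn = sum(tears(t) for t in swaps)
print(f"{torn}/10 frames torn")   # every single swap lands mid-scanout
```

Matching rates isn't enough; the phase has to line up too, which is exactly what a multitasking OS makes hard to guarantee.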
The traditional way to solve this is with v-sync, which works great on realtime systems like game consoles, but on a multitasking operating system it's rather difficult to synchronize an application perfectly to the display's refresh rate. As such, every graphics card I've ever used adds a frame or two of buffering when v-sync is enabled, resulting in a buttery-smooth but very noticeably laggy display. It doesn't matter how intensive a scene you're rendering, either: I was playing a Quake source port the other day, a game released over 17 years ago, and even that was unplayable for me with v-sync enabled.
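Rough arithmetic on why that buffering is so noticeable, assuming the driver queues two extra frames behind the one being scanned out (the queue depth is an assumption; it varies by driver and API):

```python
# Extra input-to-photon latency added by v-sync frame queuing.
# A queue depth of 2 is a hypothetical figure; drivers differ.
def vsync_queue_latency_ms(refresh_hz, queued_frames):
    frame_ms = 1000.0 / refresh_hz
    return frame_ms * queued_frames

print(vsync_queue_latency_ms(60, 2))   # ~33 ms of added lag at 60 Hz
print(vsync_queue_latency_ms(120, 2))  # ~17 ms even at 120 Hz
```

Tens of milliseconds of extra lag is well within what people can feel in a fast game, which is why v-sync feels smooth but mushy regardless of how light the scene is.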
I'm very excited for this technology, because allowing programs to control both the frequency and phase of display updates has the potential to eliminate the vast majority of display artifacts I've experienced.
I thought I was pretty clear about that. Synchronization is relatively straightforward in a realtime system, where your program has total control over the timing of execution. On a desktop operating system, however, the graphics card and its drivers can lie, the operating system can lie, the operating system can swap your process out for something else whenever it wants, there's a delay associated with reading a PC's high-accuracy timer (so even the clock lies!), and on top of all that, everyone's PC is different. It's still possible to get some rough degree of synchronization going, but it's very difficult and imperfect.
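A sketch of how fast a naive render loop falls out of sync: if each "sleep one refresh interval" call wakes up a bit late, the error compounds every frame. The 2 ms overshoot here is a made-up figure standing in for real scheduler jitter, which varies by OS and load:

```python
REFRESH_MS = 1000.0 / 60.0   # 60 Hz display
SLEEP_OVERSHOOT_MS = 2.0     # hypothetical: the OS wakes us ~2 ms late

def naive_loop_phase_error(frames):
    """Phase error vs. the display after `frames` iterations of
    'render, then sleep for one refresh interval'. Each sleep runs
    long, and the error accumulates instead of averaging out."""
    t = 0.0
    for _ in range(frames):
        t += REFRESH_MS + SLEEP_OVERSHOOT_MS
    return t - frames * REFRESH_MS

print(naive_loop_phase_error(60))   # a full 120 ms behind after one second
```

The usual mitigation is to sleep until an absolute target time rather than for a fixed duration, but even then the wake-up itself is only as precise as the scheduler allows.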
If you still don't think I'm being serious about the timing stuff, consider that some folks still keep around old machines running DOS for real-time communication with microcontrollers.
Now this is interesting. What can operating systems, drivers, and GPU manufacturers do to restore a DOS-like real-time sync of frame generation/transfer/display?
Now that every gamer has a multi-core processor, couldn't you allocate n-1 cores to the game with guaranteed non-preemption, and leave one core where preemption can occur?
It's kind of a cool idea. I wonder if OSes would ever implement something like this. You'd probably need to put some iOS-like restrictions on it: only one app at a time, it must be running fullscreen and in focus, and it can't keep that kind of priority in the background. Like having a dedicated, built-in console system.
If the frame rate ever drops below 120, there will be a benefit.
Another way of putting it -- you can crank up graphics settings, because you no longer need to stay way above 120 just to maintain a minimum of 120; dipping to 90 for a second no longer produces a huge drop in quality.
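The arithmetic behind that, assuming a 120 Hz panel with classic v-sync: a frame that takes longer than one refresh (~8.33 ms) has to wait for the next vblank, so it occupies two refresh intervals and a 90 fps renderer displays at 60, while a variable-refresh panel just shows the frame when it's ready. (This sketch ignores the panel's minimum refresh rate.)

```python
import math

REFRESH_HZ = 120
REFRESH_MS = 1000.0 / REFRESH_HZ   # ~8.33 ms per refresh

def displayed_fps_vsync(render_ms):
    """With fixed-interval v-sync, each frame occupies a whole number
    of refresh intervals: round the render time up to a multiple."""
    intervals = math.ceil(render_ms / REFRESH_MS)
    return 1000.0 / (intervals * REFRESH_MS)

def displayed_fps_gsync(render_ms):
    """With variable refresh, the panel updates when the frame is
    ready (down to its minimum rate, ignored in this sketch)."""
    return 1000.0 / render_ms

render_ms = 1000.0 / 90                        # GPU dips to 90 fps
print(round(displayed_fps_vsync(render_ms)))   # 60 -- a visible stutter
print(round(displayed_fps_gsync(render_ms)))   # 90 -- just slightly slower
```

So the dip that used to cost you a third of your displayed frames now costs you only the dip itself.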