
ASUS ROG SWIFT PG278Q G-SYNC Monitor Review

ccokeman    -   August 11, 2014


ASUS ROG Swift PG278Q Closer Look:

The 2560x1440 resolution, ASUS-specific functionality, and 144Hz refresh rate all make this a really nice looking and functional gaming monitor on paper, but ASUS took it a step further by adding NVIDIA's G-Sync technology into the mix, providing the ultimate gaming experience. G-Sync, in layman's terms, is a technology whereby the NVIDIA GPU not only renders and outputs the finished frames to the display, but also manages the refresh cycle of the display so that it coincides with the delivery of those finished frames. As a result, you avoid the situations where frames arrive out of step with the display's scan: with v-sync disabled, frames delivered mid-scan cause screen tearing, while with v-sync enabled, frames that miss the monitor's scan sequence cause stuttering and input lag. G-Sync is enabled on monitors by the inclusion of a G-Sync module that allows the magic to happen.


To take a further dive into the technology, I'll let NVIDIA handle the description: "In the following chart we show the way frames are delivered to the monitor from the GPU in a traditional monitor with V-Sync Off. A frame is delivered to the monitor as soon as it is done rendering on the GPU. The next frame is then presented to the display, even if the display of the prior frame is not yet done. This process continues, with tearing appearing where every new frame begins. We see that 'Frame 1' is being rendered faster than the length of one display scan period (16ms), which means that 'Frame 2' will interrupt the display of 'Frame 1.' This results in a tear – the simultaneous display of more than one partial frame at a time. This process constantly repeats as the frames will never line up with the refresh rate of the monitor without synchronization."
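To make that timing concrete, here is a small Python sketch of the scenario NVIDIA describes. It is my own toy illustration, not NVIDIA's chart: the 60Hz scan period and the per-frame render times are assumed values, and every finished frame is handed to the display immediately, so a frame arriving partway through a scanout leaves a tear at that point.

```python
# Toy model of V-Sync Off (illustrative only): each finished frame is sent to the
# display immediately, so it replaces the image mid-scan and produces a tear line.
# The 60Hz scan period and the render times below are assumed values.

SCAN_PERIOD_MS = 1000 / 60                        # ~16.7 ms for one top-to-bottom scanout
render_times_ms = [9.0, 11.0, 20.0, 7.5, 14.0]    # hypothetical GPU render times

t = 0.0                                           # simulation clock in milliseconds
for frame, render in enumerate(render_times_ms, start=1):
    t += render                                   # the frame is finished and sent now
    phase = (t % SCAN_PERIOD_MS) / SCAN_PERIOD_MS # how far down the screen the scan is
    print(f"Frame {frame} arrives {phase:.0%} of the way through a scanout -> tear there")
```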


With v-sync on, we run into other issues: "Frame rates are rarely ever fixed at a consistent cadence. When playing a game in a given system, frame rates actually constantly vary due to the changing complexity of each scene, and also by the graphical settings selected by the user. When using V-Sync, the frames being produced by the GPU typically arrive either earlier or later than the V-Sync pulse, and are rarely if ever perfectly aligned, which causes stutter and/or input delay. This is shown by the chart below. In the chart below, we show how frames are delivered to the monitor with V-Sync enabled. As opposed to V-Sync Off, we can see that 'Frame 1' is rendered faster than 16ms, and instead of immediately displaying 'Frame 2', V-Sync makes the GPU wait to send 'Frame 2' until the vertical blanking period of the monitor occurs, and the prior frame is completely displayed. When 'Frame 2' takes longer than 16ms to render, 'Frame 1' is presented again (repeated) and 'Frame 2' is not displayed until the next vertical blanking period. This long period of displaying the same frame twice causes a delay that is displayed as an animation stutter. This also produces input lag because any new input data from the gamer that was meant for the previous frame is now also delayed. The gamer continues to see the input data from 'Frame 1,' and they won’t see the result of new input data until 'Frame 3.' For frames that cannot be rendered at a 60 FPS or higher rate (frame times greater than 16.7ms) and with V-Sync On, the GPU will need to wait at least until the next monitor refresh cycle to update the screen, sometimes even longer. The result is that the same frame is displayed multiple times for each refresh cycle. This not only affects the animation of the screen and producing stutter, it also increases the input latency because any new data from the gamer’s input takes longer and longer to reach the display."
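The same kind of toy timeline, again in Python with assumed render times (and simplified in that it ignores double-buffer back-pressure), shows why V-Sync trades tearing for stutter and lag: a frame that misses a vertical blank has to wait for the next one, the previous frame is repeated in the meantime, and the wait shows up as added input latency.

```python
# Toy timeline of V-Sync On (illustrative only): a finished frame can only be shown
# at the next vertical blank, so a slow frame forces the previous one to be repeated,
# and any time spent waiting for the blank is added input lag.
import math

REFRESH_MS = 1000 / 60                            # vertical blanks every ~16.7 ms
render_times_ms = [10.0, 22.0, 12.0, 30.0, 9.0]   # hypothetical GPU render times

done = 0.0
prev_shown = 0.0
for frame, render in enumerate(render_times_ms, start=1):
    done += render                                # GPU finishes the frame at this time
    next_blank = math.ceil(done / REFRESH_MS) * REFRESH_MS
    shown = max(next_blank, prev_shown + REFRESH_MS)   # only one new frame per blank
    repeats = 0 if frame == 1 else round((shown - prev_shown) / REFRESH_MS) - 1
    print(f"Frame {frame}: rendered by {done:5.1f} ms, shown at {shown:5.1f} ms "
          f"(+{shown - done:4.1f} ms lag, previous frame repeated {repeats}x)")
    prev_shown = shown
```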


Now with NVIDIA G-Sync enabled: "The GeForce GPU sends a signal to the G-SYNC controller built into the monitor and tells the monitor when to update the display. This allows the GPU to render the frame at whatever speed it requires, because the monitor will wait until the full frame is delivered. This guarantees no tearing. With G-Sync, the refresh rate of the monitor is determined by the frame delivery from the GPU. The G-SYNC monitor is capable of a variable, rather than fixed refresh rate. In the chart below, the GPU renders the first frame, and then sends it to the display, at which point the monitor delivers the frame to the screen. Instead of waiting on the vertical blanking period of the monitor, the GPU is free to send the next frame as soon as it’s available. This is because the monitor is always in direct communication with the GPU via the DisplayPort connector. This allows for an amazingly smooth and responsive experience that is truly the way that games are meant to be played."
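Sketching the variable-refresh case the same way makes the difference obvious. In this toy model the only constraint is the panel's 144Hz ceiling on how soon it can refresh again (the render times are the same hypothetical workload as above), so the display intervals simply track the GPU's frame delivery instead of snapping to a fixed 16.7ms grid.

```python
# Toy model of G-Sync style variable refresh (an illustrative sketch, not NVIDIA's
# implementation): the panel refreshes whenever a frame is ready, limited only by
# how quickly it can physically refresh again. The 144Hz ceiling matches the PG278Q.

MIN_INTERVAL_MS = 1000 / 144                      # the panel refreshes at most every ~6.9 ms
render_times_ms = [10.0, 22.0, 12.0, 30.0, 9.0]   # hypothetical GPU render times

done = 0.0
prev_shown = None
for frame, render in enumerate(render_times_ms, start=1):
    done += render                                # frame finishes rendering
    shown = done if prev_shown is None else max(done, prev_shown + MIN_INTERVAL_MS)
    gap = "first frame" if prev_shown is None else f"{shown - prev_shown:.1f} ms after the last refresh"
    print(f"Frame {frame}: displayed at {shown:5.1f} ms ({gap})")
    prev_shown = shown
```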


Another function that comes along for the ride with G-Sync technology is ULMB, or Ultra Low Motion Blur. This technology works to decrease the motion blur and ghosting commonly seen with fast-moving objects on screen. ULMB cannot be used at the same time as G-Sync, but it does offer improvements in fluidity even at refresh rates as high as 120Hz. Once again I'll let NVIDIA give you the nuts and bolts: "To understand how ULMB works, we must first understand what causes motion blur on LCD monitors. Slow pixel-to-pixel response times—the time it takes for a pixel to change values—is one factor in blurring. However, as monitors have increased response time speeds over the last few years, this is less of a factor. These days, most blurring comes from the 'hold' nature of LCD displays. As pixels are refreshed, the cell keeps emitting the same light value for the entire refresh rate, until it is addressed again. This persistent lit state causes our brains to perceive the pixel transition as blur. ULMB mode achieves its sharper image by forcing the backlight of the G-SYNC monitor to strobe in synchronized time with the monitor’s refresh rate. The backlight is flashed when new pixels are drawn. After this strobe, the backlight is then darkened so that pixels no longer hold, similar to the 'pulse' behavior in CRT monitors. By clearly delineating pixel transitions using the backlight strobe, ULMB creates distinct and sharp moving images. However, one drawback to darkening the backlight in between strobes is that overall monitor brightness is reduced. To compensate for this, ULMB mode automatically maximizes the monitor brightness level. Users may want to further modify their monitor options, such as gamma and color, to tweak the image quality to their liking."
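As a rough illustration of the brightness trade-off NVIDIA mentions, this short Python sketch shows how shortening the backlight pulse cuts persistence and, in proportion, the average light output. The strobe widths used here are assumptions for illustration, not measured values for this panel.

```python
# Back-of-the-envelope look at the ULMB trade-off: a shorter backlight pulse means
# each frame is "held" on screen for less time (less perceived blur), but average
# light output falls in proportion to the duty cycle. Pulse widths are assumed.

REFRESH_HZ = 120                                  # ULMB on this monitor runs at up to 120Hz
frame_ms = 1000 / REFRESH_HZ                      # ~8.3 ms per refresh

for pulse_ms in (frame_ms, 2.0, 1.0):             # always-on backlight vs. two strobe widths
    duty = pulse_ms / frame_ms
    print(f"strobe {pulse_ms:4.2f} ms -> persistence {pulse_ms:4.2f} ms, "
          f"relative brightness {duty:.0%} of a constantly lit backlight")
```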


Setting up and using G-Sync is as simple as opening the NVIDIA Control Panel and enabling the feature under the display settings. Once enabled, there is one more step: open the global 3D settings and, under Vertical Sync, choose the G-Sync option. With that done, you will be well on your way to enjoying your games in a whole new light.
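If you want to confirm the monitor really is running at 144Hz after setup, a quick sanity check is to ask Windows what refresh rate the desktop is currently using. This minimal Python sketch uses the standard Win32 GetDeviceCaps call via ctypes; it is Windows-only and reports the current desktop mode, not anything G-Sync-specific.

```python
# Minimal Windows-only sketch: query the primary display's current refresh rate.
import ctypes

user32 = ctypes.windll.user32
gdi32 = ctypes.windll.gdi32
user32.GetDC.restype = ctypes.c_void_p            # HDC is a pointer-sized handle
user32.ReleaseDC.argtypes = (ctypes.c_void_p, ctypes.c_void_p)
gdi32.GetDeviceCaps.argtypes = (ctypes.c_void_p, ctypes.c_int)

VREFRESH = 116                                    # GetDeviceCaps index for vertical refresh (Hz)
hdc = user32.GetDC(None)                          # device context for the primary display
try:
    print(f"Current desktop refresh rate: {gdi32.GetDeviceCaps(hdc, VREFRESH)} Hz")
finally:
    user32.ReleaseDC(None, hdc)
```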


All this sounds good on paper, but the key question is: does the technology really work as advertised? Having heard a lot of positive buzz about G-Sync, I am curious how it works in practice and whether there is any performance hit from the tech. If that hit turns out to be minimal, then G-Sync should be impressive, to say the least, on ASUS' new ROG Swift PG278Q 2560x1440 beauty.




  1. ASUS ROG Swift PG278Q: Introduction & Closer Look
  2. ASUS ROG Swift PG278Q: Closer Look (G-Sync)
  3. ASUS ROG Swift PG278Q: Specifications & Features
  4. ASUS ROG Swift PG278Q: Testing
  5. ASUS ROG Swift PG278Q: Conclusion