NVIDIA's Computex Announcements & The Test

Alongside the launch of the GTX 980 Ti, NVIDIA is also taking advantage of Computex to make a couple of other major technology announcements. Given their scope we're covering these announcements in separate articles, but we'll quickly go over the high points here as they pertain to the GTX 980 Ti.

G-Sync Variable Overdrive & Windowed Mode G-Sync

NVIDIA is announcing a slew of G-Sync products and technologies today, the most important of which is Mobile G-Sync for laptops. However, as part of that launch NVIDIA is also finally confirming that all G-Sync products, including existing desktop G-Sync monitors, support G-Sync variable overdrive. As the name implies, this is the ability to vary the amount of overdrive applied to a pixel based on a best-effort guess of when the next frame will arrive. This allows NVIDIA to continue using pixel overdrive on G-Sync monitors to improve pixel response times and reduce ghosting, at a slight cost to color accuracy in motion whenever those frame time predictions miss.
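
While NVIDIA hasn't published the algorithm, the basic idea is simple enough to sketch. The moving-average predictor and the gain mapping below are our own illustrative assumptions, not NVIDIA's implementation:

```cpp
#include <algorithm>
#include <cstdio>
#include <deque>
#include <numeric>

// Hypothetical sketch only: the actual G-Sync logic is proprietary and
// unpublished. We assume a simple moving average predicts the next frame
// interval, and that overdrive strength scales inversely with it: a short
// interval (high FPS) leaves less time for a pixel transition, so drive harder.
class OverdrivePredictor {
    std::deque<double> history_;          // recent frame intervals, in ms
    static constexpr size_t kWindow = 8;  // smoothing window (assumption)
public:
    void recordFrameInterval(double ms) {
        history_.push_back(ms);
        if (history_.size() > kWindow) history_.pop_front();
    }
    // Best-effort guess of when the next frame will arrive, in ms.
    double predictNextInterval() const {
        if (history_.empty()) return 16.7;  // assume ~60Hz until data arrives
        return std::accumulate(history_.begin(), history_.end(), 0.0) /
               history_.size();
    }
    // Map the prediction to a drive gain. If the real frame arrives later
    // than predicted, the pixel overshoots its target -- the color accuracy
    // cost the article describes.
    double overdriveGain() const {
        double gain = 16.7 / predictNextInterval();  // 1.0 at a steady 60Hz
        return std::clamp(gain, 0.25, 2.0);          // plausible panel limits
    }
};

int main() {
    OverdrivePredictor p;
    for (double ms : {12.0, 13.5, 11.8, 25.0, 24.2}) {  // variable frame times
        p.recordFrameInterval(ms);
        std::printf("predicted %.1f ms -> gain %.2f\n",
                    p.predictNextInterval(), p.overdriveGain());
    }
}
```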

Variable overdrive has been part of G-Sync since the start; however, until now NVIDIA had never confirmed its existence, presumably keeping quiet about it for trade secret purposes. Now that displays supporting AMD's Freesync implementation of DisplayPort Adaptive-Sync are on the market, NVIDIA is further clarifying how G-Sync works.

Meanwhile, freshly rolled out in NVIDIA's latest drivers is support for Windowed Mode G-Sync. Until now, running a game in windowed mode could cause stuttering and tearing because the output image is composited by the Desktop Window Manager (DWM) in Windows. Even if a game is rendering 200 frames per second, DWM will only refresh the screen on its own schedule; an application's off-screen buffer can be updated many times before DWM updates the actual image on the display.
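
A quick back-of-the-envelope simulation (the frame rates here are our own example numbers) shows just how many rendered frames a fixed-rate compositor can leave on the floor:

```cpp
#include <cstdio>

// Illustration with assumed numbers: a game renders at 200fps into an
// off-screen buffer, while a fixed-rate compositor such as DWM samples that
// buffer only 60 times a second. Each refresh shows whichever frame was most
// recently completed; every other frame is never displayed.
int main() {
    const double gameFrameMs = 1000.0 / 200.0;  // 5 ms per game frame
    const double dwmFrameMs  = 1000.0 / 60.0;   // ~16.7 ms per compositor pass
    int shown = 0, lastShownFrame = -1;
    for (double t = 0.0; t < 1000.0; t += dwmFrameMs) {
        int latestComplete = static_cast<int>(t / gameFrameMs);  // newest frame
        if (latestComplete != lastShownFrame) {
            ++shown;
            lastShownFrame = latestComplete;
        }
    }
    int rendered = static_cast<int>(1000.0 / gameFrameMs);
    std::printf("rendered %d frames, displayed %d (%.0f%% discarded)\n",
                rendered, shown, 100.0 * (rendered - shown) / rendered);
}
```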

NVIDIA is now changing this at the display driver level: when Windowed G-Sync is enabled, whichever window is currently active determines the refresh rate. That means if a game is the active window, G-Sync can be leveraged to reduce screen tearing and stuttering, but if you then click on your email application, the refresh rate switches back to whatever rate that application is using. Since this is not always going to be a perfect solution - without a fixed refresh rate, it's impossible to make every application perfectly line up with every other application - Windowed G-Sync can be enabled or disabled on a per-application basis, or simply toggled globally.

GameWorks VR & Multi-Res Shading

Also being announced at Computex is a combination of new functionality and an overall rebranding for NVIDIA's suite of VR technologies. First introduced as VR Direct alongside the GeForce GTX 980 last September, NVIDIA's VR technologies are being brought under the GameWorks umbrella of developer tools. The collection will now be called GameWorks VR, joining the already significant roster of GameWorks tools and libraries.

On the feature front, the newly minted GameWorks VR will be getting a new feature dubbed Multi-Resolution Shading, or Multi-Res Shading for short. With multi-res shading, NVIDIA is looking to leverage the Maxwell 2 architecture’s Multi-Projection Acceleration in order to increase rendering efficiency and ultimately the overall performance of their GPUs in VR situations.

By reducing the rendering resolution at the edges of the frame, where there is already the most optical distortion/compression and where the human eye is less sensitive, NVIDIA says multi-res shading can deliver a 1.3x to 2x increase in pixel shader performance without noticeably compromising image quality. Like many of the other technologies in the GameWorks VR toolkit this is an implementation of a suggested VR practice; however, in this case NVIDIA believes they have a significant technological advantage in implementing it thanks to multi-projection acceleration. With MPA bringing down the rendering cost of this feature, NVIDIA's hardware can better realize the performance benefits of this approach, essentially making it an even more efficient method of VR rendering.
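
To put rough numbers on that claim, consider a simple worked example; the region split and the half-resolution edge scale here are our own illustrative assumptions, not NVIDIA's shipping parameters:

```cpp
#include <cstdio>

// Illustrative arithmetic only: the region split and scale factor are our
// assumptions. The center region (60% of each axis) is shaded at full
// resolution; the surrounding edge regions are shaded at half resolution per
// axis, i.e. a quarter of the pixels.
int main() {
    const double width = 1512, height = 1680;  // example per-eye render target
    const double centerFrac = 0.6;             // center covers 60% of each axis
    const double edgeScale  = 0.5;             // half resolution per axis at edges

    double total  = width * height;
    double center = (width * centerFrac) * (height * centerFrac);
    double edges  = (total - center) * edgeScale * edgeScale;
    double shaded = center + edges;

    std::printf("full-res pixels:  %.0f\n", total);
    std::printf("multi-res pixels: %.0f (%.2fx fewer pixels to shade)\n",
                shaded, total / shaded);
}
```

With these particular assumptions the savings work out to roughly 1.9x fewer pixels to shade, in line with the upper end of NVIDIA's quoted range.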

Getting Behind DirectX Feature Level 12_1

Finally, though not an outright announcement per se, from a marketing perspective we should expect to see NVIDIA further promote their current technological lead in rendering features. The Maxwell 2 architecture is currently the only architecture to support DirectX feature level 12_1, and with DirectX 12 games due a bit later this year, NVIDIA sees that as an advantage to press.

For promotional purposes NVIDIA has put together a chart listing the different feature levels for DirectX 12, and to their credit it is a simple but elegant layout of the current feature level situation. The bulk of the advanced DirectX 12 features we saw Microsoft present at the GTX 980 launch are part of feature level 12_1, while the rest, along with other functionality not fully exploited under DirectX 11, falls under feature level 12_0. The one exception is volume tiled resources, which is not part of either feature level; instead it belongs to a separate feature list for tiled resources that can be implemented at either feature level.
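
For developers who want to check where a given card lands on that chart, Direct3D 12 lets an application query the maximum supported feature level at runtime. A minimal query (error handling trimmed) looks roughly like this:

```cpp
#include <windows.h>
#include <d3d12.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

// Query the maximum supported Direct3D 12 feature level on the default
// adapter; feature level 12_1 is the top tier in NVIDIA's chart.
int main() {
    // Create the device at the 11_0 minimum, then ask how high it can go.
    ID3D12Device* device = nullptr;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    const D3D_FEATURE_LEVEL levels[] = {
        D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_11_1,
        D3D_FEATURE_LEVEL_12_0, D3D_FEATURE_LEVEL_12_1,
    };
    D3D12_FEATURE_DATA_FEATURE_LEVELS info = {};
    info.NumFeatureLevels = sizeof(levels) / sizeof(levels[0]);
    info.pFeatureLevelsRequested = levels;
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS,
                                              &info, sizeof(info)))) {
        std::printf("Max supported feature level: 0x%x (12_1 = 0x%x)\n",
                    info.MaxSupportedFeatureLevel, D3D_FEATURE_LEVEL_12_1);
    }
    device->Release();
    return 0;
}
```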

The Test

The press drivers for the launch of the GTX 980 Ti are release 352.90, which, aside from formally adding support for the new card, are identical to the existing 352.86 drivers.

CPU: Intel Core i7-4960X @ 4.2GHz
Motherboard: ASRock Fatal1ty X79 Professional
Power Supply: Corsair AX1200i
Hard Disk: Samsung SSD 840 EVO (750GB)
Memory: G.Skill RipjawsZ DDR3-1866 4 x 8GB (9-10-9-26)
Case: NZXT Phantom 630 Windowed Edition
Monitor: Asus PQ321
Video Cards: AMD Radeon R9 295X2
             AMD Radeon R9 290X
             AMD Radeon HD 7970
             NVIDIA GeForce GTX Titan X
             NVIDIA GeForce GTX 980 Ti
             NVIDIA GeForce GTX 980
             NVIDIA GeForce GTX 780 Ti
             NVIDIA GeForce GTX 780
             NVIDIA GeForce GTX 680
             NVIDIA GeForce GTX 580
Video Drivers: NVIDIA Release 352.90 Beta
               AMD Catalyst 15.5 Beta
OS: Windows 8.1 Pro

Comments

  • chizow - Monday, June 1, 2015 - link

    Yes, it's unprecedented to launch a full stack of rebrands with just 1 new ASIC, as AMD has done not once, not 2x, not even 3x, but 4 times with GCN (7000 to Boost/GE, 8000 OEM, R9 200, and now R9 300). Generally it is only the low-end, or a gap product to fill a niche. The G92/b isn't even close to this, as it was rebranded numerous times over a short 9-month span (Nov 2007 to July 2008), while we are bracing ourselves for AMD rebrands going back to 2011 and Pitcairn.
  • Gigaplex - Monday, June 1, 2015 - link

    If it's the 4th time as you claim, then by definition, it's most definitely not unprecedented.
  • chizow - Monday, June 1, 2015 - link

    The first 3 rebrands were still technically within that same product cycle/generation. This rebrand certainly isn't, so rebranding an entire stack with last-gen parts is certainly unprecedented, at least relative to Nvidia's full next-gen product stack. Hard to say though, given AMD just calls everything GCN 1.x; like inbred siblings they have some similarities, but they certainly aren't the same "family" of chips.
  • Refuge - Monday, June 1, 2015 - link

    Thanks Gigaplex, you beat me to it... lol
  • chizow - Monday, June 1, 2015 - link

    Cool maybe you can beat each other and show us the precedent where a GPU maker went to market with a full stack of rebrands against the competition's next generation line-up. :)
  • FlushedBubblyJock - Wednesday, June 10, 2015 - link

    Nothing like total fanboy denial
  • Kevin G - Monday, June 1, 2015 - link

    The G92 got its last rebrand in 2009 and was formally replaced in 2010 by the GTX 460. It had a full three-year life span on the market.

    The GTS/GTX 200 series was mostly rebranded. There was the GT200 chip on the high end that was used for the GTX 260 and up. The low end silently got the GT218 for the GeForce 210 a year after the GTX 260/280 launch. At this time, AMD was busy launching the Radeon 4000 series, which brought a range of new chips to market as a new generation.

    Pitcairn came out in 2012, not 2011. This would mimic the life span of the G92 as well as the number of rebrands. (It never had a vanilla edition; it started with the GHz Edition as the 7870.)
  • chizow - Monday, June 1, 2015 - link

    @Kevin G, nice try at revisionist history, but that's not quite how it went down. G92 was rebranded numerous times over the course of a year or so, but it did actually get a refresh from 65nm to 55nm. Indeed, G92 was even more advanced than the newer GT200 in some ways, with more advanced hardware encoding/decoding that was on-die, rather than on a complementary ASIC like G80/GT200.

    Also, prices were much more compacted at the time due to the economic recession, so the high-end was really just a glorified performance mid-range thanks to the price wars started by the 4870.

    Nvidia found it was easier to simply manipulate the cores on their big chip than to come out with a number of different ASICs, which is how we ended up with GTX 260 core 192, core 216 and the GTX 275:

    Low End: GT205, 210, GT 220, GT 230
    Mid-range: GT 240, GTS 250
    High-end: GTX 260, GTX 275
    Enthusiast: GTX 280, GTX 285, GTX 295

    The only rebranded chip in that entire stack is the G92, so again, certainly not the precedent for AMD's entire stack of Rebrandeon chips.
  • Kevin G - Wednesday, June 3, 2015 - link

    @chizow
    Out of that list of GTS/GTX 200 series cards, the new chips were the GT200 in 2008 and the GT218 introduced over a year later in late 2009. For 9 months on the market the three chips used in the 200 series were rebrands of the G94, rebrands of the G92, and the new GT200. The ultra low end at this time was filled in by cards still carrying the 9000 series branding.

    The G92 did have a very long life, as it was introduced as the 8800 GTS 512MB in late 2007. In 2008 it was rebranded the 9800GTX, roughly six months after it was first introduced. A year later in 2009 the G92 got a die shrink and was rebranded as both the GTS 150 for OEMs and the GTS 250 for consumers.

    So yeah, AMD's R9 300 series launch really does mimic what nVidia did with the GTS/GTX 200 series.
  • FlushedBubblyJock - Wednesday, June 10, 2015 - link

    G80 was not G92, nor G92b, nor G94, Mr. Kevin G.
