The NVIDIA GeForce GTX 1080 Ti Founder's Edition Review: Bigger Pascal for Better Performance
by Ryan Smith on March 9, 2017 9:00 AM EST

Driver Performance & The Test
Alongside the launch of the GTX 1080 Ti, NVIDIA is also touting the performance of their drivers. For most users who have been regularly updating their drivers to begin with, I don’t think there’s anything too surprising here. But because of NVIDIA’s talk of driver performance gains, I’ve already seen some confusion here over whether the GTX 1080 Ti launch driver (378.78) is a special performance driver or not. For the record, it is not.
In their presentation, NVIDIA outlined their driver performance gains in DX12 since the launch of various DX12 games, including Ashes of the Singularity, Hitman, and Rise of the Tomb Raider. All of these games have seen performance improvements, but what's critical here is that these gains have accrued over the long run, since the launch of the GTX 1080 and these respective games.
The 378.78 driver in that respect is nothing special. In terms of driver releases, NVIDIA is already a few releases into the R378 branch, so any big code changes for this branch have already shipped to the public in earlier driver builds.
In any case, for reference purposes, here's how the performance of the GTX 1080 and GTX 980 Ti stacks up now compared to their performance last July.
GeForce GTX Driver Performance Gains: July 2016 vs. March 2017 (4K)

Game                     | GTX 1080 | GTX 980 Ti
Rise of the Tomb Raider  | Even     | Even
DiRT Rally               | +8%      | +7%
Ashes of the Singularity | +11%     | +14%
Battlefield 4            | Even     | Even
Crysis 3                 | Even     | Even
The Witcher 3            |          | Even
The Division*            | -7%      | -9%
Grand Theft Auto V       | +2%      | Even
Hitman (DX12)            | +26%     | +24%
As was the case with NVIDIA’s data, the performance gains vary from game to game. Some games have not budged, whereas others like Hitman have improved significantly, and outlier The Division has actually regressed a bit due to some major updates that have happened to the game in the same time period. But at the end of the day, these are performance gains that have accumulated over the months and are already available in the latest drivers from NVIDIA.
The Test
For our review of the GTX 1080 Ti, we’re using NVIDIA’s 378.78 driver.
CPU:           Intel Core i7-4960X @ 4.2GHz
Motherboard:   ASRock Fatal1ty X79 Professional
Power Supply:  Corsair AX1200i
Hard Disk:     Samsung SSD 840 EVO (750GB)
Memory:        G.Skill RipjawZ DDR3-1866 4 x 8GB (9-10-9-26)
Case:          NZXT Phantom 630 Windowed Edition
Monitor:       Asus PQ321
Video Cards:   NVIDIA GeForce GTX 1080 Ti Founders Edition
               NVIDIA GeForce GTX 1080 Founders Edition
               NVIDIA GeForce GTX 980 Ti
               NVIDIA GeForce GTX 780 Ti
               AMD Radeon Fury X
Video Drivers: NVIDIA Release 378.78
               AMD Radeon Software Crimson 17.3.1
OS:            Windows 10 Pro
161 Comments
close - Monday, March 13, 2017 - link
I was talking about optimizing Nvidia's libraries. When you're using an SDK to develop a game you're relying a lot on that SDK. And if that's exclusively optimized for one GPU/driver combination, you're not going to develop an alternate engine that's also optimized for a completely different GPU/driver. And there's a limit to how much you can optimize for AMD when you're building a game using Nvidia's SDK.

Yes, the developer could go ahead and ignore any SDK out there (AMD or Nvidia) just so they're not lazy, but that would only bring worse results equally spread across all types of GPUs, and longer development times (with the associated higher costs).
You have the documentation here:
https://docs.nvidia.com/gameworks/content/gamework...
AMD offers the same services technically, but why would developers go for it? They'd be optimizing their game for just 25% of the market. Only now is AMD starting to push with the Bethesda partnership.
So to summarize:
-You cannot touch Nvidia's *libraries and code* to optimize them for AMD
-You are allowed to optimize your game for AMD without losing any kind of support from Nvidia but when you're basing it on Nvidia's SDK there's only so much you can do
-AMD doesn't really support developers much with this since optimizing a game based on Nvidia's SDK seems to be too much effort even for them, and AMD would rather have developers using the AMD libraries but...
-Developers don't really want to put in triple the effort to optimize for AMD also when they have only 20% market share compared to Nvidia's 80% (discrete GPUs)
-None of this is illegal, it's "just business" and the incentive for developers is already there: Nvidia has the better cards so people go for them, it's logical that developers will follow
eddman - Monday, March 13, 2017 - link
Again, most of those gameworks effects are CPU only. It does NOT matter at all what GPU you have.

As for GPU-bound gameworks, they are limited to just a few in-game effects that can be DISABLED in the options menu.
The main code of the game is not gameworks related and the developer can optimize it for AMD. Is it clear now?
Sure, it sucks that GPU-bound gameworks effects cannot be optimized for AMD and I don't like it either, but they are limited to only a few cosmetic effects that do not have any effect on the main game.
eddman - Monday, March 13, 2017 - link
Not to mention that a lot of gameworks games do not use any GPU-bound effects at all. Only CPU.

eddman - Monday, March 13, 2017 - link
Just one example: http://www.geforce.com/whats-new/articles/war-thun...

Look for the word "CPU" in the article.
Meteor2 - Tuesday, March 14, 2017 - link
Get a room you two!

MrSpadge - Thursday, March 9, 2017 - link
AMD demonstrated their "cache thing" (which seems to be tile based rendering, as in Maxwell and Pascal) to result in a 50% performance increase. So 20% IPC might be far too conservative. I wouldn't bet on a 50% clock speed increase, though. nVidia designed Pascal for high clocks, it's not just the process. AMD seems to intend the same, but can they pull it off similarly well? If so, I'm inclined to ask "why did it take you so long?"

FalcomPSX - Thursday, March 9, 2017 - link
I look forward to Vega and seeing how much performance it brings, and I really hope it does end up giving performance around GTX 1080 level for typically lower and more reasonable AMD pricing, but honestly, I expect it to probably come close to but not quite match a 1070 in DX11, surpass it in DX12, and at a much lower price.

Midwayman - Thursday, March 9, 2017 - link
Even if it's just two Polaris chips' worth of performance you're past 1070 level. I think conservative is 1080 @ $400-450. Not that there won't be a cut-down part at 1070 level, but I'd be really surprised if that is the full die version.

Meteor2 - Tuesday, March 14, 2017 - link
I think that sometimes Volta is over-looked. Whatever Vega brings, I feel Volta is going to top it.

AMD is catching up with Intel and Nvidia, but outside of mainstream GPUs and HEDT CPUs, they've not done it yet.
Meteor2 - Tuesday, March 14, 2017 - link
Mind you, Volta is only coming to Tesla this year, and not to consumers until next year. So AMD should have a competitive full stack for a year. Good times!