Meet The New Future of Gaming: Different Than The Old One

Up until last month, NVIDIA had been pushing a different, more conventional future for gaming and video cards, perhaps best exemplified by their recent launch of 27-inch 4K G-Sync HDR monitors, courtesy of Asus and Acer. The specifications and displays represented – and still represent – the aspirational capabilities of PC gaming graphics: 4K resolution, 144 Hz refresh rate with G-Sync variable refresh, and high-quality HDR. The future was maxing out graphics settings on a game with high visual fidelity, enabling HDR, and rendering at 4K with triple-digit average framerates on a large screen. That target was not achievable with current performance, at least certainly not by single-GPU cards. In the past, multi-GPU configurations were a stronger option provided that stuttering was not an issue, but recent years have seen AMD and NVIDIA take a step back from CrossFireX and SLI, respectively.

Particularly with HDR, NVIDIA was promising a qualitative rather than quantitative enhancement of the gaming experience. Faster framerates and higher resolutions were more known quantities, easily demoed and with more intuitive benefits: higher resolutions meant more possible detail, and higher, more even framerates meant smoother gameplay and video – though in the past there was the perception of 30fps as 'cinematic', and even now 1080p remains stubbornly popular. Variable refresh rate technology soon followed, resolving the screen-tearing versus V-Sync input lag dilemma, though it too took time to catch on to where it is now: nigh mandatory for a higher-end gaming monitor.

For gaming displays, HDR was substantively different from adding graphical detail or allowing smoother gameplay and playback, because it brought a new dimension to gaming: more possible colors, brighter whites, and darker blacks. And because HDR requires support from the entire graphics chain, as well as a high-quality HDR monitor and HDR content to take full advantage, it was harder to showcase. Added to the other aspects of high-end gaming graphics, and pending the further development of VR, this was the future on the horizon for GPUs.

But today NVIDIA is switching gears, going back to the fundamental way computer graphics are modelled in games today. Among the more realistic rendering approaches, light can be simulated as rays emitted from their respective sources, but computing even a subset of those rays and their interactions (reflection, refraction, etc.) in a bounded space is so intensive that real time rendering has been impossible. To get the performance needed to render in real time, rasterization instead essentially boils down 3D objects into 2D representations to simplify the computations, significantly faking the behavior of light.
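To make the contrast concrete, the inner loop of any ray tracer is testing where a ray intersects scene geometry. A toy ray-sphere intersection – a minimal illustration for intuition, not anything resembling NVIDIA's hardware implementation – might look like this:

```python
import math

def intersect_sphere(origin, direction, center, radius):
    """Return the distance t along a ray to the nearest sphere hit, or None.

    Assumes `direction` is a unit vector, so the quadratic's leading
    coefficient is 1. A full ray tracer runs tests like this for every
    ray against every object (via an acceleration structure), then
    spawns secondary rays for reflections, refractions, and shadows.
    """
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None          # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None  # hits behind the origin don't count

# One primary ray per pixel: camera at the origin, sphere 5 units down -z.
hit = intersect_sphere((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0)
print(hit)  # 4.0: the ray strikes the near surface of the sphere
```

Multiply this test by millions of rays per frame, each potentially bouncing several times, and the cost of real time ray tracing becomes clear; rasterization avoids it entirely by projecting triangles to the screen instead.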

It’s on real time ray tracing that NVIDIA is staking its claim with GeForce RTX and Turing’s RT Cores. Covered more in-depth in our architecture article, NVIDIA’s real time ray tracing implementation takes all the shortcuts it can get, incorporating select real time ray tracing effects with significant denoising while keeping rasterization for everything else. Unfortunately, this hybrid rendering isn’t orthogonal to the goals above: the ultimate experience would now be hybrid-rendered 4K with HDR at high, steady, and variable framerates, a point GPUs couldn’t reach even under traditional rasterization.
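Structurally, a hybrid-rendered frame interleaves the two approaches. The sketch below uses hypothetical function names and placeholder values purely to show the shape of such a frame loop; it is not NVIDIA's actual pipeline:

```python
def rasterize_gbuffer(scene):
    # Fast path: project 3D geometry into 2D screen-space buffers
    # (depth, normals, albedo), exactly as games do today.
    return {"albedo": 0.8, "normal": (0, 1, 0)}

def trace_rays(scene, gbuffer, effect):
    # Expensive path: spend a small ray budget on one select effect
    # (e.g. reflections or shadows), yielding a noisy per-pixel result.
    return 0.3  # placeholder noisy reflection term

def denoise(noisy):
    # Heavy filtering reconstructs a clean signal from very few rays
    # per pixel; this is where Turing leans on dedicated hardware.
    return round(noisy, 1)

def render_frame(scene):
    g = rasterize_gbuffer(scene)  # everything else stays rasterized
    reflections = denoise(trace_rays(scene, g, "reflections"))
    # Composite the traced effect over the rasterized base.
    return g["albedo"] * 0.7 + reflections * 0.3

print(render_frame({}))  # composite pixel value for this toy scene
```

The key point the sketch makes is budgetary: the `trace_rays` step is added on top of a full rasterization pass, which is why hybrid rendering competes with 4K, HDR, and high framerates for the same GPU time.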

There’s still a performance cost incurred with real time ray tracing effects, except right now only NVIDIA and developers have a clear idea of what it is. What we can say is that utilizing real time ray tracing effects in games may require sacrificing some or all of high resolution, ultra-high framerates, and HDR. HDR is limited by game support more than anything else, but the first two arguably have minimum performance standards when it comes to modern high-end PC gaming: anything under 1080p is completely unpalatable, and anything under 30fps, or more realistically 45 to 60fps, hurts playability. Variable refresh rate can mitigate the latter, and framedrops are temporary, but low resolution is forever.

Ultimately, the real time ray tracing support needs to be implemented by developers via a supporting API like DXR – and many have been working hard on doing so – but currently there is no public timeline of application support for real time ray tracing, Tensor Core accelerated AI features, and Turing advanced shading. The list of games with support for Turing features - collectively called the RTX platform - will be available and updated on NVIDIA's site.


  • Qasar - Wednesday, September 19, 2018 - link

    just checked a local store, the lowest priced 2080 card, a gigabyte rtx 2080 is $1080, and thats canadian dollars... the most expensive RTX card ..EVGA RTX 2080 Ti XC ULTRA GAMING 11GB is $1700 !!!! again that's canadian dollars !! to make things worse.. that's PRE ORDER pricing, and have this disclaimer : Please note that the prices of the GeForce RTX cards are subject to change due to FX rate and the possibility of tariffs. We cannot guarantee preorder prices when the stock arrives - prices will be updated as needed as stock become available.
    even if i could afford these cards.. i think i would pass.. just WAY to expensive.. id prob grab a 1080 or 1080ti and be done with it... IMO... nvida is being a tad bit greedy just to protect and keep its profit margins.. but, they CAN do this.. cause there is no one else to challenge them...
  • PopinFRESH007 - Wednesday, September 19, 2018 - link

    would you care to share the bill of materials for the tu102 chip? Since you seem to suggest you know the production costs, and therefore know the profit margin which you suggest is a bit greedy.
  • Qasar - Wednesday, September 19, 2018 - link

    popin.. all i am trying to say is nvidia doesnt have to charge the prices they are charging.. but they CAN because there is nothing else out there to provide competition...
  • tamalero - Thursday, September 20, 2018 - link

    Please again explain how the cost of materials is somehow relevant on the price performance argument for consumers?

    Chips like R600, Fermi, similars.. were huge.. did it matter? NO, did performance matter? YES.
  • PopinFRESH007 - Thursday, September 20, 2018 - link

    I specifically replied to Qasar's claim "nvida is being a tad bit greedy just to protect and keep its profit margins.. but, they CAN do this" which is baseless unless they have cost information to know what their profit margins are.
  • Nagorak - Thursday, September 20, 2018 - link

    Nvidia is a public company. You can look up their profit margin and it is quite high.
  • Qasar - Thursday, September 20, 2018 - link

    PopinFRESH * sigh * i guess you will never understand the concept of " no competition, we can charge what ever we want, and people will STILL buy it cause it is the only option if you want the best or fastest " it has NOTHING to do with knowing cost info or what a companies profit margins are... but i guess you will never understand this....
  • just4U - Thursday, September 20, 2018 - link

    Of course they're being greedy. Since they saw their cards flying off the shelves at 50% above MSRP earlier this year they know people are willing to pay.. so they're pushing the limit. As they normally do.. this isn't new with Nvidia. Not sure why any are defending them.. or getting excessively mad about it. (..shrug)
  • mapesdhs - Wednesday, September 26, 2018 - link

    Effectively, gamers are complaining about themselves. Previous cards sold well at higher prices, so NVIDIA thinks it can push up the pricing further, and reduce product resources at the same time even when the cost is higher. If the cards do sell well then gamers only have themselves to blame, in which case nothing will change until *gamers* stop making it fashionable and cool to have the latest and greatest card. Likewise, if AMD does release something competitive, whether via price, performance or both, then gamers need to buy the damn things instead of just exploiting the lowered NVIDIA pricing as a way of getting a cheaper NVIDIA card. There's no point AMD even being in this market if people don't buy their products even when it does make sense to do so.
  • BurntMyBacon - Thursday, September 20, 2018 - link

    @PopinFRESH007: "would you care to share the bill of materials for the tu102 chip? Since you seem to suggest you know the production costs, and therefor know the profit margin which you suggest is a bit greedy."

    You have a valid point. It is hard to establish a profit margin without a bill of materials (among other things). We don't have a bill of materials, but let me establish some knowns so we can better assess.

    Typically, most supporting components on a graphics card are pretty similar to previous generation cards. Oftentimes, different designs used to do the same function are a cost-cutting measure. I'm going to make an assumption that power transistors, capacitors, output connectors, etc. will remain nominally the same cost. So I'll focus on differences. The obvious one is the larger GPU. This is not the first chip made on this process (TSMC 12nm) and the process appears to be a half node, so defect rates should be lower and the wafer cost should be similar to TSMC 14nm. On the other hand, the chip is still very large, which will likely offset some of that yield gain and reduce the number of chips fabricated per wafer. Pascal was a first-generation chip produced on a new full node process, but quite a bit smaller, so yields may have been higher and there were more chips fabricated per wafer. Also apparent is the newer GDDR6 memory tech, which will naturally cost more than GDDR5(X) at the moment, but clearly not as much as HBM2. The chips also take more power, so I'd expect a marginal increase for power-related circuitry and cooling relative to Pascal. I'd expect about the same cost here as for Maxwell-based chips, given similar power requirements.

    From all this, it sounds like graphics cards based on Turing chips will cost more to manufacture than Pascal equivalents. It is probably not unreasonable to suggest that a TU106 may have a similar bill of materials cost to a GP102, with the note that the cost to produce the GP102 has most certainly dropped since introduction.

    I'll leave the debate on how greedy or not this is to others.
