This week, NVIDIA launched its new "The Hunt Begins" game bundle, which adds Monster Hunter: World to qualifying purchases of the GeForce GTX 1070 Ti, GTX 1070, and GTX 1060 6GB (1280 cores). The promotion is effective now and runs until November 29th. It appears to cover the usual add-in-board (AIB) graphics cards, though the terms and conditions are sparse on details, so prospective buyers should double-check with the seller/retailer.

For Monster Hunter: World, NVIDIA recommends the GTX 1070 for 1080p at 60fps on High quality settings, so the bundle is firmly targeted at mainstream resolutions and gaming scenarios.

No other NVIDIA bundles are running at this time, which is unsurprising given the recent launch of the RTX 20 series. While the promotion clearly covers the upper mid-range of Pascal GeForce, the GeForce RTX 2070, 2080, and 2080 Ti are the intended successors for the enthusiast range. With the RTX 2070 sitting in the GTX 1080 performance bracket in particular, NVIDIA won't be looking to encourage GTX 1080 (Ti) sales that cannibalize RTX cards. Either way, NVIDIA will be looking to clear its channel of last-generation Pascal inventory. It remains to be seen how NVIDIA will pursue the mainstream and entry-level tiers, whether with Turing as we currently know it or some other approach.

NVIDIA Current Game Bundles (Q4 2018)

Video Card (incl. systems and laptops)               Bundle
GeForce RTX 20 Series                                None
GeForce GTX 1080 Ti & 1080                           None
GeForce GTX 1070 Ti & 1070                           Monster Hunter: World Bundle
GeForce GTX 1060 6GB (1280 cores)                    Monster Hunter: World Bundle
GeForce GTX 1060 3GB (1152 cores) & GTX 1050 (Ti)    None

On another note, GTX 1060 6GB models have been revised to include GDDR5X memory, though it is clocked the same as the original GDDR5. Since the new memory provides the same amount of bandwidth, the change points to a supply/inventory reason rather than a performance refresh.
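For reference, the bandwidth math comes out identical either way. The snippet below is a minimal back-of-the-envelope sketch assuming the reference GTX 1060 6GB memory configuration of a 192-bit bus and an 8 Gbps effective data rate; the function name and figures are illustrative, not taken from NVIDIA's announcement.

```python
# Back-of-the-envelope peak memory bandwidth, assuming the reference GTX 1060 6GB
# configuration: 192-bit memory bus, 8 Gbps effective data rate (illustrative values).

def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s = (bus width in bytes) x (effective data rate in Gbps)."""
    return (bus_width_bits / 8) * data_rate_gbps

gddr5  = peak_bandwidth_gbs(192, 8.0)  # original GDDR5 cards
gddr5x = peak_bandwidth_gbs(192, 8.0)  # GDDR5X revision, clocked to the same effective rate

print(f"GDDR5:  {gddr5:.0f} GB/s")   # 192 GB/s
print(f"GDDR5X: {gddr5x:.0f} GB/s")  # 192 GB/s -- identical, hence no performance refresh
```

With both configurations landing at the same figure, there is no headline bandwidth gain to market, which is consistent with the supply-side reading above.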

Codes must be redeemed via the NVIDIA Redemption portal on a desktop with the qualifying graphics card installed. More information and details can be found in the terms and conditions. Be sure to verify the participation of any vendor purchased from, as NVIDIA will not give codes for purchases made from non-participating sellers.

Source: NVIDIA

Comments

  • DanNeely - Thursday, October 25, 2018

    "On another note, GTX 1060 6GB models have been revised to include GDDR5X memory, though it is clocked the same as the original GDDR5. Since the new memory provides the same amount of bandwidth, the change points to a supply/inventory reason rather than a performance refresh."

    Basically NVidia is making these cards to use up GP104 dies that either had at least 6 bad groups of cores or 1 bad memory controller and were thus unsuitable to make into GTX 1070s. It's the same deal as with the 5GB GTX 1060 that came out earlier in the year. It'll be interesting to see how available these end up being; AFAIK the 5GB 1060s only showed up in a few price-sensitive Asian markets. As a higher-performing variant, these might show up in the West instead; OTOH, since they're primarily being made as a way to use up dud parts, there's no guarantee they'll have a lot of them to sell.
  • Flunk - Thursday, October 25, 2018

    No, probably not. GDDR5X is backwards-compatible with GDDR5 controllers (although it then only runs in double-data-rate mode). Yields on the GP104 have to be pretty good or Nvidia wouldn't have launched the 1070 Ti.
  • ImSpartacus - Thursday, October 25, 2018

    They probably are excellent by now, but that doesn't mean Nvidia hasn't spent the past three years stockpiling "broken" GP104s.

    This is a common practice for GPU makers. We know how this works.
  • TheinsanegamerN - Thursday, October 25, 2018

    Two and a half years later, STILL pushing Pascal.

    Remember when we had a new generation of GPUs every 12 months?
  • PeachNCream - Thursday, October 25, 2018

    Product refreshes every 12 months were nice, but when the industry was stuck on 28nm, those new GPUs were often mild tweaks of existing chips or just rebranded models. Those models that did offer performance increases often came from bigger dies that increased TDP and costs. Thus the dual slot blower and vapor chamber cooling that are status quo in the present day became normalized across all products rather than positioned at the upper end of the stack (though I would be remiss not to accept the idea that there were competitive market forces and demand-based pull from consumers that also played a role).
  • TheinsanegamerN - Thursday, October 25, 2018

    True, but even then, going from the 680 to the 780, then the 780 Ti, and then the 980 Ti over the course of three years offered great performance improvements with just tweaking.

    Same as the 7970 to the beastly 290X.
  • PeachNCream - Thursday, October 25, 2018

    All good points. I often overlook the highest end graphics cards since gaming hasn't been a priority in over a decade or so. The rebrands were mainly a thing in the mobile space and in the lower end. I recall seeing multiple iterations of the same Radeon GPU. For instance, the HD 6450 is basically identical to the HD 7450 despite the implied higher performance from the increased model number. It was much the same situation with the chipset integrated HD 3200 and 4200 IIRC. At the top end, I would imagine that significant engineering effort was invested because the halo effect of having higher performance was helpful across a given graphics generation as was the trickle-down nature of developing a good GPU that could be later incorporated into subsequent lesser models.
  • Cellar Door - Thursday, October 25, 2018

    What you are missing here is how complex GPUs have become. Go ahead and compare the transistor counts, then think about your comment.
  • TheinsanegamerN - Thursday, October 25, 2018

    How about you compare the GTX 480 and GTX 680, released just two years apart, and get back to me, bud? Or how about the 780 Ti -> 980 Ti -> 1080 Ti? Seems there were plenty of improvements on those "complex GPUs".
  • yeeeeman - Friday, October 26, 2018

    You didn't understand a thing from what the guy said.
    The GTX 680's GPU was pretty small for a high-end model - only ~300mm2. That was because AMD's high-end part was fairly weak, so nvidia could sell something small and cheap for big money.
    The GTX 780 Ti is much bigger because there was room to grow: starting from a ~300mm2 GPU, you can work your way up to ~600mm2 on the same node. So that GPU was ~550mm2, basically the high-end chip nvidia had planned for the 600 series but never released there.
    Maxwell's 980 Ti was over 600mm2, and it got faster mainly because they greatly improved the memory compression algorithms and stripped the GPU of its FP64 units, using the freed space for more graphics resources.
    From Pascal onwards, things changed. The 1080 Ti is on 16nm, but it is still pretty big at ~470mm2.
    GP100, the full-fat GPU, is very big (610mm2), about as big as the 980 Ti, which was the last size increase on 28nm. It used HBM2, which was very expensive at the time, so it was only sold in the Tesla/Quadro lineup. Volta is another beast at over 800mm2 - a HUGE chip that they could have launched on the consumer market, but it would have cost over $2k.
    Now, with the RTX lineup, the process is again pushed to its limits. The 2080 Ti is a huge chip at ~750mm2, and considering the price, I think it is quite cheap for the amount of silicon it has.

    So the bottom line is that nvidia did release better cards than the 1080 Ti, but they would be so expensive that almost nobody could afford to buy them.
