NVIDIA Confirms 12-pin GPU Power Connector
by Dr. Ian Cutress on August 26, 2020 9:00 AM EST

Today, as part of a video showcasing NVIDIA's mechanical and industrial design of its GPUs, and how it gets a large GPU to dissipate heat, the company went into some detail about how it needed to improve the design of all mechanical and electrical aspects of the board to aid cooling. This means implementing leaf springs for the backplate solution, as well as vapor chamber technology, and using the right sorts of fans and fan management software.
As part of this video showcase, the company also revealed its new 12-pin power connector, shown mounted perpendicular to the PCB, which is very interesting indeed.
Users who follow the tech news may have seen a few posts circulating the internet regarding this 12-pin power connector, including a Seasonic cable that combines two of the standard PCIe 8-pin connectors into one of NVIDIA's new 12-pin designs.
Image from Hardwareluxx
NVIDIA states in the video that this 12-pin design is of its own creation. It isn't clear if this is set to become a new standard in power cable connectivity for power supplies; going forward, we assume that most graphics cards with this 12-pin power design will have to come with an additional 2x8-pin to 12-pin power cable included. We will have to wait and see if that's the case, or if it will be up to future power supplies to bundle the appropriate connector.
More details about the connector are expected to appear on September 1st during NVIDIA’s GeForce Special Event.
79 Comments
Rictorhell - Wednesday, August 26, 2020 - link
I'm excited about this card, but concerned about the rumored $1400 MSRP for the 3090, and then, on top of that, what the final TDP will be for the card itself, let alone whatever added power will be required for whatever motherboard and CPU I would choose to run it with.

I haven't gamed in a long time so I don't know what the current status is, but last time I checked into it most of the quality games were being released for consoles first, and then PCs several months later, if at all, and then the games that were ported to PC weren't optimized at all, and were buggy half of the time, upon release.
Has this changed at all in the last year or so? If I actually can get the money to invest in this card and a new PC, is my money going to be well spent or am I going to be frustrated with a lot of poor console ports that make me feel as if my money was wasted?
For me, building a new pc and then having to worry that the TDP for playing a game a few hours a couple of days a week might skyrocket the power bill, dampens my enjoyment. I need to know that this is a worthwhile investment and not just me buying something "cool" for the sake of buying something "cool".
Nobody has the answers right now, I know, because the card hasn't been released yet, but I'm hoping for a TDP for the actual card that is reasonable and not totally insane, and on the flip side of that I am hoping for a performance increase from the card that justifies the over a grand price that it seems Nvidia will be asking for.
How many games are lined up that will actually support ray tracing and use it evenly throughout the game and not just sporadically in a level, here and there? I'm seriously asking, because I don't know.
I haven't heard people raving online about all these awesome ray traced games that are must buy.
I hear a lot about FortNite, which I think will play on most PCs with just average graphics cards.
Anyway.....
Kjella - Wednesday, August 26, 2020 - link
You "haven't gamed in a long time" but still have hopes for an ultra-enthusiast card, mix it up with Fortnite, and worry about the few cents it'll cost in power? I smell troll.

Spunjji - Thursday, August 27, 2020 - link
This is an odd post, but to summarise: there are still a bunch of solid games developed PC-first.
There aren't really any "must-buy" ray-traced games. It's still very much optional.
High-end cards have become a stupid race to the limits of affordability.
blzd - Monday, August 31, 2020 - link
In the last year or so? That hasn't been the case for 10+ years.

If you don't know why you would want a $1400 graphics card, chances are you don't need one lol.
edzieba - Thursday, August 27, 2020 - link
"NVIDIA states in the video that this 12-pin design is of its own creation"

It's Molex's Micro-Fit, almost certainly Micro-Fit+ (12.5A per pin, so potentially 900W per connector).
Zoolook - Thursday, August 27, 2020 - link
12.5A is the rating for the dual-pin Micro-Fit; the maximum for 12-pin is 9.5A with 16-gauge wires (ca. 1.3 mm).

Still, at 684W that's a lot of headroom compared to the ATX standard's 300W over two 8-pin connectors.
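The wattage figures in this exchange follow from simple arithmetic: total power is the per-pin current rating times the 12V rail times the number of power-carrying pins. A minimal sketch of that math, assuming (as the commenters imply, though the article doesn't confirm it) that six of the twelve pins carry +12V and six are ground returns:

```python
RAIL_VOLTAGE = 12.0  # volts on the auxiliary power rail
POWER_PINS = 6       # assumption: 6 of the 12 pins carry +12V, 6 are ground

def connector_budget(amps_per_pin: float) -> float:
    """Total wattage the connector can deliver at a given per-pin rating."""
    return amps_per_pin * RAIL_VOLTAGE * POWER_PINS

# Micro-Fit+ best case cited above: 12.5A per pin
print(connector_budget(12.5))  # 900.0 W
# 12-circuit rating with 16 AWG wire cited above: 9.5A per pin
print(connector_budget(9.5))   # 684.0 W
```

Either figure comfortably exceeds the 300W that two 8-pin PCIe connectors are specified to deliver (150W each).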
Spunjji - Thursday, August 27, 2020 - link
So what I gathered from this is that NVIDIA has gone full Apple. By which I mean: they're claiming to have invented things they didn't (the connector), and they're focusing on the herculean efforts required by their engineers to make an absurdly impractical design work, to distract from the fact that it's an absurdly impractical design (a 300W+ GPU). And all of that is in service of hyping up a product which will be overpriced, and yet somehow still released to rave reviews and bought by legions of dedicated fans.

SkyBill40 - Thursday, August 27, 2020 - link
And to think I just bought a Seasonic PX-750 not that long ago. I think the 850W recommendation is a bit absurd, but I guess it all depends on just how power hungry or inefficient these cards happen to be. I'm thinking I would be fine with a Seasonic-sourced cable, and that the power requirement would be less on anything 3080 or below. Still, it sucks to have to either use an adapter, with the potential risk that comes with it, or buy a new PSU that includes the cable.

dj_aris - Wednesday, September 2, 2020 - link
How hard would it be to route the power cables of every motherboard and component THROUGH the motherboard, thus placing the PCIe aux power connector next to the PCIe slot and improving aesthetics and airflow?