Wow NV redefining how power is delivered to a graphics card….
Way to go!
So much hype, and I can't remember Maxwell, Pascal or Turing being hyped this much. AFAIK, unless next week is the actual launch, it's meant to be a "product reveal". The product reveal is hype for the launch, so this is 21 days of hype for the hype for the launch. So either Ampere is going to be the next 8800GTX or the next Fermi.
Interesting that they're selling the new power connector as a packaging benefit and not on the higher power delivery capacity. More amusing that they felt the need to make 2x8pin look as big as possible in comparison by including both sides of the connection:
https://i.imgur.com/OJE8L2y.png vs
https://i.imgur.com/cq0WXWR.png
edmundhonda
Interesting that they're selling the new power connector as a packaging benefit and not on the higher power delivery capacity. More amusing that they felt the need to make 2x8pin look as big as possible in comparison by including both sides of the connection:
https://i.imgur.com/OJE8L2y.png vs https://i.imgur.com/cq0WXWR.png
They are trying to make the PCB bigger so they can increase the mass of cooling they can use in a specific card size. This means either the cards are using a ton of power, or there are some very hot-running parts (which leaks say is the RAM, which has been pushed to the edge).
It's a lot of effort to put into marketing first-party cards. 1) I can't imagine board partners are too impressed by this. 2) When they're focusing the hype on cooling/power delivery, is that telling us the underlying GPU isn't that amazing?
So, surely they can't make a 12 pin power connector proprietary?
(I'm sure they'll try regardless)
CAT-THE-FIFTH
edmundhonda
Interesting that they're selling the new power connector as a packaging benefit and not on the higher power delivery capacity. More amusing that they felt the need to make 2x8pin look as big as possible in comparison by including both sides of the connection:
https://i.imgur.com/OJE8L2y.png vs https://i.imgur.com/cq0WXWR.png
They are trying to make the PCB bigger so they can increase the mass of cooling they can use in a specific card size. This means either the cards are using a ton of power, or there are some very hot-running parts (which leaks say is the RAM, which has been pushed to the edge).
This is what makes me think they are worried about AMD; perhaps they finally have a good lineup, *top end too*.
It seems they've been forced to push the cards further than they were comfortable with, and now they need pre-emptive damage control to "justify" it.
Is vertical mounting to make it as fragile as possible?
Hopefully the cooler's shroud will support the connector, but there are very good reasons why most laptops do not have the power socket attached to the PCB any more.
If you look at the American sites/bloggers/YouTubers, they all seem to think this is already a bit of an own goal by NV.
Too hot, too power-hungry and too expensive is pretty much what they are calling it. Mind you, they all reckon Intel are dead in the water too, so go figure. The figures doing the rounds are that the 12-pin power connector is there because the card can pull 390-400 watts and needs the equivalent of 3 x 8-pin connectors to provide that juice reliably, although others are not so sure. The very hot RAM and possibly a 2-chip/GPU solution also mean it is a big card, with fans on both sides. I am also reading a few who say it requires PCI-E Gen 4 to reach full potential, so you Intel lovers are going to be a bit out of luck as well…
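To put numbers on that: per the PCI-E spec (75W from the slot and 150W per 8-pin connector - spec figures, not from the rumours), two 8-pins only budget 375W, so a 390-400W card would indeed need a third plug, or the new 12-pin, to stay in spec. A rough sketch of the sums:

# Why 390-400 W is past what two 8-pin plugs allow by the book.
# PCI-E spec limits (assumed here, not from the post): 75 W from the slot,
# 150 W per 8-pin connector.
SLOT_W, EIGHT_PIN_W = 75, 150

def spec_budget(n_eight_pin: int) -> int:
    """In-spec power budget for a card with n 8-pin connectors."""
    return SLOT_W + n_eight_pin * EIGHT_PIN_W

print(spec_budget(2))  # 375 W - short of a 390-400 W card
print(spec_budget(3))  # 525 W - covers it, hence three 8-pins (or the new 12-pin)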
edmundhonda
Interesting that they're selling the new power connector as a packaging benefit and not on the higher power delivery capacity. More amusing that they felt the need to make 2x8pin look as big as possible in comparison by including both sides of the connection:
https://i.imgur.com/OJE8L2y.png vs https://i.imgur.com/cq0WXWR.png
Nope. That size difference is accurate. Both the old and new connectors are made by Molex.
8pin:-
Series: “Mini Fit Jr”
Pin Spacing: 4.2mm
Pin Current: 13A
Size: 18mm x 9.6mm
12pin:-
Series: “Micro Fit +”
Pin Spacing: 3mm
Pin Current: 12.5A
Size: 18.85mm x 7.4mm
TL;DR - The 12 pin is just about the same size as ONE of the 8 pins.
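For what it's worth, a back-of-the-envelope look at the delivery headroom using those Molex ratings (my assumptions, not from the post: 3 live 12V pins on a PCI-E 8-pin, 6 on the new 12-pin; the PCI-E spec still caps an 8-pin at 150W regardless):

# Theoretical connector capacity from the pin ratings quoted above.
# Assumed pin counts: 3 live 12 V pins on an 8-pin, 6 on the 12-pin.
RAIL_V = 12.0  # volts on the +12V rail

def capacity_w(live_pins: int, amps_per_pin: float, volts: float = RAIL_V) -> float:
    """Wattage if every live pin runs at its rated current."""
    return live_pins * amps_per_pin * volts

eight_pin = capacity_w(3, 13.0)    # Mini-Fit Jr: ~468 W (150 W per PCI-E spec)
twelve_pin = capacity_w(6, 12.5)   # Micro-Fit+:  ~900 W

print(f"8-pin: ~{eight_pin:.0f} W theoretical, 12-pin: ~{twelve_pin:.0f} W theoretical")

So the new plug is as much about current headroom as it is about footprint.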
How soon until we return to the CPU being on a daughter board with a single power connector and the GPU on the mainboard?
Yawn… the ultimate cooling solution is liquid cooling; the rest is marketing hype. The guyz who will buy a 3090 can easily afford it if it comes with a closed-loop solution, just like the Fury X.
If Nvidia are having to pump lots of power into their GPU to get the performance they feel they need, it tells you two things:
1. They have a process problem/inefficiency with Samsung silicon and are using brute force to overcome it, just like AMD had to in the past with GlobalFoundries silicon.
2. They are worried about AMD performance.
BigBANGerZ
If Nvidia are having to pump lots of power into their GPU to get the performance they feel they need, it tells you two things:
1. They have a process problem/inefficiency with Samsung silicon and are using brute force to overcome it, just like AMD had to in the past with GlobalFoundries silicon.
2. They are worried about AMD performance.
Both, according to many…
lumireleon
Yawn… the ultimate cooling solution is liquid cooling; the rest is marketing hype. The guyz who will buy a 3090 can easily afford it if it comes with a closed-loop solution, just like the Fury X.
Lightweight, real (wo)men use LN2. ;)
They rotated the power connector so it lines up with the fins! That's adorable. It looks like a very competent cooling design, so I'm interested to see how it compares to aftermarket coolers. Nvidia are clearly gunning to cut out as many of the companies profiting between them and the consumer as possible.
edmundhonda
Interesting that they're selling the new power connector as a packaging benefit and not on the higher power delivery capacity. More amusing that they felt the need to make 2x8pin look as big as possible in comparison by including both sides of the connection:
https://i.imgur.com/OJE8L2y.png vs https://i.imgur.com/cq0WXWR.png
Those 8-pin connectors are also clipping through a couple of the inductors.
kompukare
Is vertical mounting to make it as fragile as possible?
Hopefully the cooler's shroud will support the connector, but there are very good reasons why most laptops do not have the power socket attached to the PCB any more.
Is that really an issue? It's a GPU; five connection cycles would be a lot.
kompukare
Is vertical mounting to make it as fragile as possible?
Hopefully the cooler's shroud will support the connector, but there are very good reasons why most laptops do not have the power socket attached to the PCB any more.
We seem to have managed well enough with plenty of well-used vertically-stacked IO ports on ATX motherboards for decades now, so I should think a “tall” power socket can be mechanically attached to a PCB just fine.
The reason why laptop power sockets are vulnerable is that the plug is usually a sizeable solid protrusion that can be levered or impacted laterally with quite some force and done so repeatedly over a reasonable lifespan; this concern doesn't apply to connectors situated inside a chassis.
Specifications of the RTX3090, RTX3080 and RTX3070 have been leaked:
https://videocardz.com/newz/nvidia-geforce-rtx-3090-and-geforce-rtx-3080-specifications-leaked
The specifications of the RTX3080 hint at yields being very poor. Despite being massively cut down, and having 40% of the VRAM, it only has a 30W lower board power. I suspect the RTX3090 has 24GB of VRAM to justify a much higher price. What is the likelihood that RTX3090 stocks will be much lower than the RTX3080?
CAT-THE-FIFTH
Specifications of the RTX3090,RTX3080 and RTX3070 have been leaked:
https://videocardz.com/newz/nvidia-geforce-rtx-3090-and-geforce-rtx-3080-specifications-leaked
The specifications of the RTX3080 hint at yields being very poor. Despite being massively cut down, and having 40% of the VRAM, it only has a 30W lower board power. I suspect the RTX3090 has 24GB of VRAM to justify a much higher price. What is the likelihood that RTX3090 stocks will be much lower than the RTX3080?
Much as was expected. You can see that the GDDR6X memory is the issue here, as the 3070 with non-X GDDR6 has a much lower TDP.
I expect there to be very few 3090s about.
3dcandy
Much as was expected. You can see that the GDDR6X memory is the issue here, as the 3070 with non-X GDDR6 has a much lower TDP.
I expect there to be very few 3090s about.
But look at the RTX3090 against the RTX3080: 40% of the VRAM, a smaller memory bus, 17% fewer shaders, and yet only a 30W reduction in board power. So that tells me the GA102 is probably yielding very poorly. It's most likely the RTX3090 is just a rebranded Quadro, with the best dies. It is the same problem with the GA100, which is made on TSMC 7nm: it has well over a 350W TDP, but is significantly cut down, and TSMC 7nm is very mature by now.
I suspect the GA104 is using Samsung 7nm/5nm. If it's on TSMC 7nm, something is really up with Ampere as a design.
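As a quick sanity check on those ratios, plugging in the figures as given in the linked leak (assumed here: 10496 vs 8704 shaders, 24GB vs 10GB of VRAM, 350W vs 320W board power):

# RTX 3090 vs RTX 3080 deltas, using the leaked figures noted above.
rtx3090 = {"shaders": 10496, "vram_gb": 24, "board_w": 350}
rtx3080 = {"shaders": 8704,  "vram_gb": 10, "board_w": 320}

shader_cut = 1 - rtx3080["shaders"] / rtx3090["shaders"]   # ~17% fewer shaders
vram_ratio = rtx3080["vram_gb"] / rtx3090["vram_gb"]       # ~42% of the VRAM
power_cut  = 1 - rtx3080["board_w"] / rtx3090["board_w"]   # only ~9% less board power

print(f"shaders -{shader_cut:.0%}, VRAM at {vram_ratio:.0%}, board power -{power_cut:.0%}")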
CAT-THE-FIFTH
But look at the RTX3090 against the RTX3080: 40% of the VRAM, a smaller memory bus, 17% fewer shaders, and yet only a 30W reduction in board power. So that tells me the GA102 is probably yielding very poorly. It's most likely the RTX3090 is just a rebranded Quadro, with the best dies. It is the same problem with the GA100, which is made on TSMC 7nm: it has well over a 350W TDP, but is significantly cut down, and TSMC 7nm is very mature by now.
I suspect the GA104 is using Samsung 7nm/5nm. If it's on TSMC 7nm, something is really up with Ampere as a design.
Oh yeah, don't get me wrong, it appears poor, but there is also a mahoosive difference in power usage going to GDDR6X as well.
Hot, hungry and poor yields - as said by nearly everyone over the last week or so
The board power of the 3080 is… something else. The 2080 was quite efficient, though people complained about getting a mid-tier chip - Nvidia had the performance to get away with it. Now, on a supposedly much more efficient process, they're increasing it by some 50%? Very telling.
It's been a long time since Nvidia gave up the efficiency win. No wonder AMD are waiting with the mid-tier.
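As a rough check of that figure, assuming the commonly quoted ~215W reference TDP for the 2080 against the leaked 320W for the 3080:

# Board power jump, 2080 -> 3080 (assumed: 215 W reference vs 320 W leaked).
rtx2080_w, rtx3080_w = 215, 320
print(f"+{rtx3080_w / rtx2080_w - 1:.0%}")  # roughly +49%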
kalniel
The board power of the 3080 is… something else. The 2080 was quite efficient, though people complained about getting a mid-tier chip - Nvidia had the performance to get away with it. Now, on a supposedly much more efficient process, they're increasing it by some 50%? Very telling.
It's been a long time since Nvidia gave up the efficiency win. No wonder AMD are waiting with the mid-tier.
As long as it means we get a good bump up in the models under £300, that will do me fine!! :P
CAT-THE-FIFTH
As long as it means we get a good bump up in the models under £300, that will do me fine!! :P
Next year… maybe. If Nvidia are having to use dies from a model up (compared to the 2000 series, more like their normal set) and it's a new process, then they're not exactly making cost savings in production, so I don't think there's going to be any charity with the 3070/3060 for now. Then early next year I guess AMD will release Navi 22/23, which should be at least the same performance but with more RAM and better efficiency, and Nvidia will respond with a refreshed version of the 3070 etc. with twice the RAM and hopefully (from Nvidia's perspective) better yield and efficiency.
So yes, next year things should be nicely competing.
More leaks:
https://wccftech.com/zotac-geforce-rtx-3090-geforce-rtx-3080-geforce-rtx-3070-custom-graphics-cards-pictured/
https://videocardz.com/newz/zotac-geforce-rtx-3090-trinity-pictured
Pictures of the GA102 in the RTX3080:
https://videocardz.com/newz/nvidia-ampere-ga102-rtx-3090-3080-gpu-pictured
Even the RTX3080 uses 3 PCI-E 8-pin power connectors!!

The clock speed slider goes very high. Apparently there are 20GB RTX3080 models too.
kalniel
Next year… maybe. If Nvidia are having to use dies from a model up (compared to the 2000 series, more like their normal set) and it's a new process, then they're not exactly making cost savings in production, so I don't think there's going to be any charity with the 3070/3060 for now. Then early next year I guess AMD will release Navi 22/23, which should be at least the same performance but with more RAM and better efficiency, and Nvidia will respond with a refreshed version of the 3070 etc. with twice the RAM and hopefully (from Nvidia's perspective) better yield and efficiency.
So yes, next year things should be nicely competing.
We will see, but if I can run Cyberpunk 2077 on what I have now, then I might not bother upgrading for a while longer. From the leaks, I suspect the mainstream AMD models might be coming a bit quicker after the launch than we expect.
kalniel
Next year… maybe. If Nvidia are having to use dies from a model up (compared to the 2000 series, more like their normal set) and it's a new process, then they're not exactly making cost savings in production, so I don't think there's going to be any charity with the 3070/3060 for now. Then early next year I guess AMD will release Navi 22/23, which should be at least the same performance but with more RAM and better efficiency, and Nvidia will respond with a refreshed version of the 3070 etc. with twice the RAM and hopefully (from Nvidia's perspective) better yield and efficiency.
So yes, next year things should be nicely competing.
It's AMD; they'll drop the prices on the 5XXX series and call it mid-range, just like they did with CPUs (I hope).
Xlucine
It's AMD; they'll drop the prices on the 5XXX series and call it mid-range, just like they did with CPUs (I hope).
That would be an emergency stopgap if they needed to get another release out quickly, but they know everyone is waiting for RDNA2, so it's of very limited value.