
AMD responds to Radeon VII 'short supply' rumours

by Mark Tyson on 18 January 2019, 10:11

Tags: AMD (NYSE:AMD)

Quick Link: HEXUS.net/qad3ty


Earlier this week some sites shared rumours that AMD Radeon VII graphics cards would be in short supply. These concerns seem to have a common source: TweakTown's exclusive report, citing industry contacts, which stated that there would be "less than 5000" of the 7nm Vega 20 GPU-packing, 300W, 16GB HBM2 cards made. Furthermore, TweakTown went on to say that there would be "no custom Radeon VII graphics cards made" by AIBs, but the source wasn't so certain about this assertion.

AMD has since officially responded to the 'short supply' aspersions. ExtremeTech, part of Ziff Davis, says it has received a statement from AMD that addresses the rumours touched upon in the intro, regarding the Radeon VII. I've reproduced the full but brief statement below:

"While we don't report on production numbers externally, we will have products available via AIB partners and AMD.com at launch of Feb. 7, and we expect Radeon VII supply to meet demand from gamers."


Admittedly, AMD's statement is a rather weak rebuttal. The AIB Radeon VII cards are expected to be simply partner-branded AMD reference designs, and AMD doesn't really deny the quantities that will be available; it just asserts there will be enough.

Shortly after its big reveal we got to see quite a few more gaming benchmarks for the AMD Radeon VII. However, despite the lack of product and third-party testing results available, Nvidia CEO Jensen Huang claimed that the rival red team's new hope offered "lousy" performance and added that "we'll crush it".

AMD's new Radeon VII will become available from 7th Feb, priced at $699.



HEXUS Forums :: 12 Comments

300W for a 7nm card…..something ain't right.
It has HBM, I could have told you (have told you repeatedly) that it would be in short supply and cause the price of the card to make it useless vs. the competition. Oh look, it's the same price as a card that has MANY more features, RT+DLSS+VRS+GSYNC etc. See the point? If this card had been say 8-12GB GDDR5x or GDDR6, it could have been sold for $350-550. But they decided to put 16GB (devs just migrating to 11GB, so no point for some time to come), and literally have a disadvantage without features to go with the $699 price tag. Which do you think will be used FIRST? 16GB vs. 11 or less, or RT, DLSS or VRS? I'm thinking the last 3 will be used FAR sooner than 16GB of overpriced vid memory with bandwidth no gamer needs today either.

Build what people want and they will come. This card just seems like a PR move, more than anything gamers would run to the store for. Not that I mind, I just think gamers WOULD run to the store for 12GB GDDR5x for $350-450 as opposed to $699 with missing features vs. NV cards with GDDR6 maybe landing $450-550 (depending on how much 8-16GB?)? I guess they think people will buy the past for $699 vs. the future for $699 (I doubt it). I would not be surprised if they built 5K of them, as I don't think more would sell well without a cut and I'm guessing these aren't much better than break even margin wise (the 16GB alone is $240+ here probably, 8GB was $180 in 2017, doubt it's dropped in half with nobody using it). This card could have sold in decent quantities, but not at $699 missing many features of the future that devs want. I'd expect some form of DLSS/RT/VRS on all future consoles as it really helps weaker units (handhelds, consoles, tablet/phone). Anything aimed at making the masses perform like rich people, will be a big hit, and that is what RT+DLSS and VRS helps. Better looking graphics on weaker hardware is a winner. AMD keeps using HBM as if it will sell a card, but it doesn't. You can't point to something on screen and say, see that's what HBM does, where you can do that for RT+DLSS+VRS. You get speed or looks, or a combo of both. 16GB should be reserved for work/pro cards where it's actually usable.

https://www.gamersnexus.net/guides/3032-vega-56-cost-of-hbm2-and-necessity-to-use-it
Stop doing this to your cards. Perhaps they are forced into this to lower power, as they couldn't build it with other mem (watts would be too high?). But 7nm watt issues vs. 12nm competition with much more features? You built it wrong ;)
5000 cards is 4500 more cards than they expect to sell.

Opposite to Ferrari, where with all limited editions they make exactly 1 less than they expect to sell.
You never cease to amaze me with your tripe, nobodyspecial xD
nobodyspecial
It has HBM, I could have told you (have told you repeatedly) that it would be in short supply and cause the price of the card to make it useless vs. the competition. …

Can I just ask is the Radeon VII missing features or does the RTX have pointless extra features?

… I know, don't feed the troll, I already know its answer…