HEXUS Forums :: 18 Comments

Posted by 00oceanic - Tue 11 Jun 2019 01:14
Like most people, I think 16 cores is of no use to me, but do I want it? Absolutely!
The core count of the 3950X CPU is what I've been waiting for to replace my aging (and recently performance-reduced) 4770K.
Posted by Corky34 - Tue 11 Jun 2019 07:31
Not that it matters much, but I thought the I/O die was going to use 14nm, not the 12nm mentioned in the article. Did AMD confirm 12nm for the I/O die?
Posted by yeeeeman - Tue 11 Jun 2019 07:49
Yes, Anandtech confirmed that it is indeed 12nm.
Posted by Bagpuss - Tue 11 Jun 2019 07:49
“Of course, Intel may well change its pricing stack in the wake of AMD's new threat”


Intel, price cuts? Haha!! Hell will freeze over before they'd stoop so low as to cut prices because of anything AMD comes up with.

Their corporate culture will simply not allow it. They'd rather take the market share hit until they're ready next year with a response.
Posted by DanceswithUnix - Tue 11 Jun 2019 08:04
Bagpuss
Their corporate culture will simply not allow it. They'd rather take the market share hit until they're ready next year with a response.

Whilst I don't think they will have a response as early as next year (10nm sucks for clock speeds, so they are stuck with enhanced 14nm until their 7nm comes out), I agree they probably won't budge on price. After all, people kept buying the Pentium 4 and that was a piece of junk, whereas what Intel are selling now is OK silicon, it just might not be as good as AMD's in some circumstances. So corporate types will keep buying Intel chips in huge quantities.

What I will watch with interest is how the Dell product mix changes. They got labelled “the best friends money can buy” in past litigation; I can't imagine they want that label plastered on them again, but just how much cash will Intel offer to keep them selling key products as Intel-only?
Posted by Corky34 - Tue 11 Jun 2019 08:16
Am I the only one who thinks the 3800X seems rather poor value from a gaming perspective?
Posted by PC-LAD - Tue 11 Jun 2019 08:59
Corky34
Am I the only one who thinks the 3800X seems rather poor value from a gaming perspective?
I agree with you in every regard: if you are gaming at the high end, the GPU matters more than the CPU thanks to the higher workload on that end. No point in putting money into something that won't be utilized properly. The R5 series is most likely to dominate again. I think I might wait to see if they bring out a 4-core/8-thread part again (with no iGPU), but I don't know.
Posted by DanceswithUnix - Tue 11 Jun 2019 09:40
Corky34
Am I the only one who thinks the 3800X seems rather poor value from a gaming perspective?

It is steep. I'm sure I could buy one, but I'm not sure I want to at that price; I'll have to wait for Phoronix to get their hands on one and see how fast it can churn through Linux code compiles. The 3600X is the sensible option for me, but I might stretch to the 3700X.

Actually, the sensible option was always the 2600X which monstered everything I threw at it and is currently about £170 on Amazon, but where is the fun in that ;)

Edit: I gather Zen 2 has some countermeasures to the speculative execution problems; it will be nice to see some stuff like retpoline turned off.
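On recent Linux kernels the active speculative-execution mitigations are reported under sysfs, so checking whether retpoline is still in force is straightforward. A minimal sketch in Python, assuming a kernel new enough to expose /sys/devices/system/cpu/vulnerabilities:

from pathlib import Path

# Each file here names a vulnerability (spectre_v2, meltdown, ...) and its
# current mitigation state; the spectre_v2 entry mentions retpoline if in use.
vuln_dir = Path("/sys/devices/system/cpu/vulnerabilities")
for entry in sorted(vuln_dir.iterdir()):
    print(f"{entry.name:<25} {entry.read_text().strip()}")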
Posted by kalniel - Tue 11 Jun 2019 10:46
The picture I take from that is that gaming is often not bottlenecked by the CPU, though interestingly minor differences across architectures remain at different price points, even though the chip might be slower or faster within the range.
Posted by philehidiot - Tue 11 Jun 2019 12:16
00oceanic
Like most people, I think 16 cores is of no use to me, but do I want it? Absolutely!
The core count of the 3950X CPU is what I've been waiting for to replace my aging (and recently performance-reduced) 4770K.

Aye, I've noticed my 4690K has been playing up recently, to the point where I've applied my end-of-life-extending overclock (I buy an overclockable chip with mobo and cooling, and then overclock when it begins to bottleneck. This is usually when a new game comes out, but I think it's due to the security patches this time, as the games haven't changed but the performance has dropped substantially).

Don't get me started on companies buying the P4 when the Athlon 64 was available and relatively awesome. Utter madness. Way more up-front cost, way more electricity and much lower performance. Also likely shorter longevity due to the running temps and the higher demands on the PSU and power delivery system. I had a desktop P4 in a laptop. It cooked itself.
Posted by LSG501 - Tue 11 Jun 2019 12:51
philehidiot
I think it's due to the security patches this time, as the games haven't changed but the performance has dropped substantially
More than likely. I've seen a ‘noticeable’ decrease with these security patches and as such have been looking to bring forward the upgrade of my 4790K… on some sites it's saying the combined performance loss is around 16% on newer-generation CPUs, which according to Intel aren't as badly affected as our generation.
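Most of the mitigation cost shows up at kernel entry and exit, so a crude way to get a feel for the slowdown being described here is to time a trivial syscall with mitigations on, then again with them disabled (for example by booting with the kernel's mitigations=off switch, where available). A rough Python sketch, not a proper benchmark:

import os
import time

N = 1_000_000
start = time.perf_counter()
for _ in range(N):
    os.getpid()  # a near-trivial syscall, dominated by kernel entry/exit cost
elapsed = time.perf_counter() - start
print(f"{N} getpid() calls: {elapsed:.3f}s total, {elapsed / N * 1e9:.0f} ns per call")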
Posted by preter_s - Tue 11 Jun 2019 17:50
“Of course, Intel may well change its pricing stack in the wake of AMD's new threat”

A few have commented on this, but I'll add to it. Reducing pricing has a direct effect on gross margins, which measure profit per dollar of sales. Intel's share price has always been propped up by its monopolistic or near-monopolistic status, which allowed it to maintain very high margins historically.

The stock market looks very closely at that gross-margin metric. The moment there is a price war, the share price will tank sharply due to the gross-margin deterioration. Therefore Intel will be loath to engage in a straight price war, especially on its latest products (9th and 10th gen crap).

At the same time, if they do not cut prices, market share will take a hit. However, market share is less of a headline metric in the quarterly earnings announcements, so there is less of a knee-jerk reaction in the share price.

However, in Germany and some other countries, reports already show AMD outselling Intel by more than 2 to 1. Despite the unit-sales disparity, the sales dollars taken in so far are about the same for both, thanks to the higher Intel prices.
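To spell out the implied arithmetic with invented round numbers (purely for illustration, not actual retail figures): twice the units at half the average selling price gives the same revenue.

# Hypothetical numbers, not real sales data.
amd_units, intel_units = 2000, 1000   # the reported ~2:1 unit split
amd_asp, intel_asp = 250.0, 500.0     # assumed average selling prices (EUR)

print(amd_units * amd_asp)            # 500000.0
print(intel_units * intel_asp)        # 500000.0 -> roughly equal revenue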

But given that AMD's lead in price-to-performance continues to widen, that will change very soon. When total volume and total sales revenue drop, Intel will be caught with its panties down and the shares will eventually tank one way or another too.

Truth be told, Intel is in a world of hurt whichever path of least hurt it chooses. The recent decline in Intel's share price (and, conversely, the rocketing AMD share price) says it all, and it is poised to continue with greater velocity until Intel gets its act together on 10nm and 7nm. That seems to be a few years away still!
Posted by FRISH - Wed 12 Jun 2019 03:11
I was thinking of going for a Ryzen 7 originally, but now I'm starting to consider a Ryzen 5. Will definitely need to wait for proper benchmarks to come out first.
Posted by Corky34 - Wed 12 Jun 2019 06:38
yeeeeman
Yes, Anandtech confirmed that it is indeed 12nm.

Just to add to this, it seems (per an Ian Cutress tweet) we're both right:
So for clarity:
Rome large IO die = GF 14nm
Matisse small IO die = GF 12nm
X570 chipset = Matisse IO die on GF 14nm
That's right. The X570 chipset is the same floorplan as the Matisse IO die, but with diff chicken bits enabled/disabled. AMD has good reuse of chips.
That wasn't very clear to me at first, but what he's essentially saying is that EPYC uses an I/O die fabricated on 14nm, the Ryzen I/O die uses 12nm, and the X570 chipset is actually the I/O die from EPYC. At least I think that's what he's saying; other interpretations are welcome.
Posted by DanceswithUnix - Wed 12 Jun 2019 07:14
Corky34
, and the X570 chipset is actually the I/O die from EPYC. At least I think that's what he's saying; other interpretations are welcome.

If that's true… well I wasn't expecting that :D

That implies there are wasted memory controllers on the X570 chipset (meh), and more interestingly there are lots of chipset bits already in the CPU. I presume there will be a laptop version that brings those functions out to avoid needing a chipset and save cost and space.
Posted by Tabbykatze - Wed 12 Jun 2019 10:10
I have to admit, AMD have really nailed chip-design reuse. I wonder what their failure/recycle rate is and how much of each wafer is consumed; it must be quite a high percentage.
Posted by Xlucine - Wed 12 Jun 2019 17:27
Corky34
Just to add to this, it seems (Ian Cutress tweet) we're both right:

That wasn't very clear to me at first, but what he's essentially saying is that EPYC uses an I/O die fabricated on 14nm, the Ryzen I/O die uses 12nm, and the X570 chipset is actually the I/O die from EPYC. At least I think that's what he's saying; other interpretations are welcome.

So they've made the IO die for desktop Ryzen in 12nm and 14nm flavours, and have a spare dual-channel memory controller on X570? Ian seems to confirm in the replies to that tweet that the chipset and the IO die are on different process nodes, so it's not failed parts getting reused. This seems very odd, but then again I'm not an electrical/electronic engineer.

DanceswithUnix
If that's true… well I wasn't expecting that :D

That implies there are wasted memory controllers on the X570 chipset (meh), and more interestingly there are lots of chipset bits already in the CPU. I presume there will be a laptop version that brings those functions out to avoid needing a chipset and save cost and space.

Ryzen could do that already with the teased 300 “chipset”
Posted by Corky34 - Wed 12 Jun 2019 18:21
I would imagine they've used a single design for all the I/O dies and thought why not use that for the X570 chipset as well. It makes a lot of economic and time sense: the EPYC I/O die simply needs two memory controllers disabled (or just not connected) to be used in Ryzen, and needs all of them disabled to be used as a chipset. IIRC Zen(+) came with enough connectivity (PCIe, USB, network, etc.) that it didn't need a chipset (southbridge?).

With Zen 2 all that connectivity has been moved out of the CPU core dies and into its own die. Redesigning that just because you need fewer PCIe lanes, memory controllers, USB or SATA ports does seem like a waste of time when you can just churn out the same thing and not bother using the parts you don't need. It's pretty smart when you think about it.
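As a purely illustrative toy model of that reuse idea (the names and the channel/lane counts below are my own approximations, not AMD's specifications): one floorplan, deployed three ways by switching blocks on or off.

from dataclasses import dataclass

@dataclass
class IODieConfig:
    name: str
    process: str
    memory_channels: int   # DDR4 channels left enabled
    pcie_lanes: int        # PCIe lanes routed out

# Three deployments of broadly the same design, per the tweet quoted above.
CONFIGS = [
    IODieConfig("Rome (EPYC) I/O die", "GF 14nm", memory_channels=8, pcie_lanes=128),
    IODieConfig("Matisse (Ryzen) I/O die", "GF 12nm", memory_channels=2, pcie_lanes=24),
    IODieConfig("X570 chipset", "GF 14nm", memory_channels=0, pcie_lanes=16),
]

for cfg in CONFIGS:
    print(f"{cfg.name:26} {cfg.process:9} {cfg.memory_channels} mem ch, {cfg.pcie_lanes} PCIe lanes")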