No doubt the 20x-faster figure comes from very specific tests; until it's actually out in the wild, any figures can be taken with an appropriate amount of salt.
Moore's Law: a rough doubling of transistor count every 2 years, and an approximately equivalent performance increase, assuming programming models allow.
2021 - 2012 = 9 years, or 4.5 doublings.
2^4.5 = 22.6
22.6 > 20
The monster-math works out. But that means it's only “kept up”, so its relative strength compared to Nvidia and AMD's latest will be… about the same as the gap back in Haswell times?
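A quick sanity check of that arithmetic (a minimal sketch; the 2012 baseline and 2-year doubling period are taken from the post above):

```python
# Moore's Law back-of-envelope: one doubling every 2 years
years = 2021 - 2012        # 9 years since the Haswell-era baseline
doublings = years / 2      # 4.5 doublings
speedup = 2 ** doublings   # theoretical transistor-count multiplier
print(f"{speedup:.1f}x")   # 22.6x, which does clear the claimed 20x
```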
I'm stuck in silicon prison and 14nm keeps dragging on?
Terbinator
I'm stuck in silicon prison and 14nm keeps dragging on?
Was there not some testing done that proves that the nm isn't really comparable between Intel and AMD etc due to how things vary..?
GSV Trig
Was there not some testing done that proves that the nm isn't really comparable between Intel and AMD etc due to how things vary..?
Yes and no. The Intel fanboys overstate it. There are some measurements showing similarities between Intel's 10nm and AMD's 7nm in terms of transistor size, but the gate density is different, etc. That gets distorted into "Intel isn't that far behind when you use a microscope"; well, sorry, but that's only looking at part of the picture. The performance stats speak for themselves.
GSV Trig
Was there not some testing done that proves that the nm isn't really comparable between Intel and AMD etc due to how things vary..?
Was meant to be a very bad Johnny Cash gag :O
Steve
Moore's Law: a rough doubling of transistor count every 2 years, and an approximately equivalent performance increase, assuming programming models allow.
2021 - 2012 = 9 years, or 4.5 doublings.
2^4.5 = 22.6
22.6 > 20
The monster-math works out. But that means it's only “kept up”, so its relative strength compared to Nvidia and AMD's latest will be… about the same as the gap back in Haswell times?
This tbh. In benchmarks my RTX 3070 is roughly 10x faster than my old GTX 465 (which hit GTX 470 levels in tests), an average of about 26% per year over the last decade. I will struggle to believe 20x-faster claims without tangible independent proof.
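That yearly figure is consistent with the 10x claim; compounding it out (a rough check using the poster's own numbers, not a measured benchmark):

```python
# ~26% compound improvement per year, over ten years
yearly = 1.26
decade_gain = yearly ** 10
print(f"{decade_gain:.1f}x")  # 10.1x, i.e. "roughly 10x" over the decade
```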
Terbinator
Was meant to be a very bad Johnny Cash gag :O
No, but now you have to tell me the reference there as I'm not a JC fan…
Yours for the low price of $1,000,000.
If, and I say IF, this card is comparable to a 3060/3060 Ti (I'm a bit doubtful in all honesty), there is stock (which is possible with Intel) and it's priced competitively, I can actually see people buying it, because the ‘xx60' range seems to be one of the more popular price/performance spots.
Assuming it comes out before the next round of GPUs from Nvidia and AMD, of course.
LSG501
If, and I say IF, this card is comparable to a 3060/3060 Ti (I'm a bit doubtful in all honesty), there is stock (which is possible with Intel) and it's priced competitively, I can actually see people buying it, because the ‘xx60' range seems to be one of the more popular price/performance spots.
Assuming it comes out before the next round of GPUs from Nvidia and AMD, of course.
Given Intel's stance of no Win7 support, that rules it out for me. I need cards that will still support Win7 and other OSes, not just be tied to Win10.
LSG501
If, and I say IF, this card is comparable to a 3060/3060 Ti (I'm a bit doubtful in all honesty), there is stock (which is possible with Intel) and it's priced competitively, I can actually see people buying it, because the ‘xx60' range seems to be one of the more popular price/performance spots.
Assuming it comes out before the next round of GPUs from Nvidia and AMD, of course.
Personally, it could be a cheaper card and I'd still not be interested until I could see what Intel's long-term driver support is like. This is why I really like AMD: my aging RX 480 still gets driver updates that improve performance. It might be slight, but it's there. Nvidia aren't as good in this regard, but at least they release regular title-specific and stability driver updates for years. Having suffered Intel drivers' failure to fix bugs about a year after a product is released, I'm just not going to risk it yet.
I think my hope is the card is good for crypto and the miners buy it while leaving AMD cards for me :)
Terbinator
GSV Trig
Was there not some testing done that proves that the nm isn't really comparable between Intel and AMD etc due to how things vary..?
Was meant to be a very bad Johnny Cash gag :O
I got it, excellent work.
cheesemp
Personally, it could be a cheaper card and I'd still not be interested until I could see what Intel's long-term driver support is like. This is why I really like AMD: my aging RX 480 still gets driver updates that improve performance. It might be slight, but it's there. Nvidia aren't as good in this regard, but at least they release regular title-specific and stability driver updates for years. Having suffered Intel drivers' failure to fix bugs about a year after a product is released, I'm just not going to risk it yet.
I think my hope is the card is good for crypto and the miners buy it while leaving AMD cards for me :)
That's the kicker: Intel seems pretty crud at long-term support (long term means lost money). I wouldn't even consider an Intel GPU until either deep into their second iteration or even their third (DG1 is not an iteration; that was barely a prototype).
cheesemp
Personally, it could be a cheaper card and I'd still not be interested until I could see what Intel's long-term driver support is like. This is why I really like AMD: my aging RX 480 still gets driver updates that improve performance. It might be slight, but it's there. Nvidia aren't as good in this regard, but at least they release regular title-specific and stability driver updates for years. Having suffered Intel drivers' failure to fix bugs about a year after a product is released, I'm just not going to risk it yet.
I think my hope is the card is good for crypto and the miners buy it while leaving AMD cards for me :)
Not saying I'd buy it either but not everyone is tech orientated like those of us on this forum.
I'm basically stuck in the nvidia camp due to the software I use only supporting cuda :(
“If Intel can get the level of performance we are seeing hinted at, ample supplies out, and its pricing right, we could be looking at a much more interesting three horse GPU race in H2 this year.” Too late for that.
GSV Trig
Was there not some testing done that proves that the nm isn't really comparable between Intel and AMD etc due to how things vary..?
Not so much testing as released figures on dimensions. I've not got the time to double-check, but I think TSMC's 10nm was roughly comparable to Intel's 14nm, whereas their 7nm is a bit smaller in most key areas.
Comparing node sizes between manufacturers isn't very useful, though, due to the number of variables.
This “nm” thing is mostly a marketing term nowadays, as they pick the smallest feature and try to make it like it's the whole transistor at that size, to entice laypeople and the bean counters.
As for the Xe whatever GPU: I'll believe it when I see it, but I'm already expecting a huge flop if it ever gets released.
It is possible that in 10 years there will be quantum processors. Moore's Law is not working 100%; progress is getting slower, and there will definitely be something new.
The conjecture and estimates are all fine in my eyes. But it will still be irrelevant (sadly) for me as a buyer, (just like all AMD cards too) compared to nvidia. And I say that as an AMD fanboy.
That's because Nvidia cards have DLSS (and tensor cores), which, 3D rendering power aside, gives them advantages in ray tracing and, more importantly, in what DLSS can do for framerates, resolution, or both.
I know that some games seem to favour generic DX raytracing which AMD competes quite well with. But the DLSS is the dealbreaker. The fact that nvidia cards can use AI to make games like Cyberpunk 2077 playable at 4K is totally insane. Yes they are rendering at 1080p or 1440p but there are enough videos showing that the fidelity of the AI “upscaling” (although not actually upscaling) is near indistinguishable.
So if I were buying today (for the insane cost these cards are!), it would have to be nvidia, no question.
I just sure hope AMD can get their implementation of DLSS out quickly to compete…
What *is* impressive about Intel Xe is just how fast Intel has closed the gap. Kudos to them. I really hope we have even 3 way competition in the GPU market in the near future.
Noli
What *is* impressive about Intel Xe is just how fast Intel has closed the gap. Kudos to them. I really hope we have even 3 way competition in the GPU market in the near future.
Larrabee has been overdue since 2011, not sure that this is that impressive. And it seems to me they're only pushing this due to stagnation elsewhere.