
Purported Intel Xe-HPG graphics card pictured and detailed

by Mark Tyson on 9 April 2021, 10:11

Tags: Intel (NASDAQ:INTC)

Quick Link: HEXUS.net/qaeqgp


Some interesting pictures and rather detailed information about Intel's upcoming Xe-HPG discrete desktop DG2 graphics cards for PC gamers have been shared on YouTube. Intel teased some revelations about its Xe-HPG last month but all we got was a head-scratching Easter egg scavenger hunt, so this news is welcome, though its anonymous sourcing means it should be taken with a dash of salt.

TechTuber Moore's Law Is Dead (MLID) published the video earlier today. Described as a full leak of the Intel Xe-HPG DG2 graphics card with 512 EUs, it shows images of a graphics card engineering sample, as reproduced above and below.

The DG2 model shown is being tested and used for development right now, but the plain-looking cooling shroud is obviously a make-do affair and said to be one of about three designs being used by testers. Probably the most revealing thing about the pictures shared is the power input section, comprising an 8-pin and a 6-pin power connector side by side. Apparently Intel had been testing this card at between 225 and 250W, but has decided to push it to 275W to make the GPU more competitive against its intended targets.


Other important details about the Xe-HPG with 512 EUs, probably the top-end model, are revealed by MLID above. Key points are that this graphics card will run with a GPU clock of up to 2.2GHz, and that it will come with 16GB of GDDR6 on a 256-bit bus. Apparently Intel is having this GPU made on TSMC's N6 process.

As for performance, MLID's source is quoted as saying that this 512 EU graphics card "should be treated like an RTX 3070 Ti. That is where the majority of samples are performing." Time Spy benchmarks certainly put it in this ballpark, but the leaker tempers expectations in his projections, insisting the top Xe-HPG will definitely not match an RTX 3080. Noted strong points of the Xe-HPG are its encoding and 'prosumer capabilities'.

At this stage the drivers have stability issues, it is said, but the development team should get them sorted out as long as management doesn't push them to release too soon. Certainly complicating driver support in games is the development of XeSS, Intel's DLSS-like technology. It looks like we will get some DG2 releases in Q4 this year, but some models will not arrive until next year. We don't have any prices floated for the top-end Xe-HPG, but mid-range models (sporting 128 and 256 EUs) are supposed to be targeting the US$200 to $300 mark.

Last but not least, MLID claims that the successor to Xe-HPG DG2 has been codenamed 'Elasti' and is scheduled for 2023. It all sounds quite promising, but please remember that salt shaker.



HEXUS Forums :: 23 Comments

Encouraging numbers - if they really do have a unit that can match a 3070…and can price it accordingly, we could finally have a 3rd competitor in the consumer GPU market again for the first time since 3DFX went under.

All good news there imo :)
Spud1
Encouraging numbers - if they really do have a unit that can match a 3070…and can price it accordingly, we could finally have a 3rd competitor in the consumer GPU market again for the first time since 3DFX went under.

All good news there imo :)
Hopefully this will be a card that we can actually buy…. if the numbers are good (and yields are decent) Intel have a real chance to shift a lot of cards.

I wonder if we’ll start seeing rigs with AMD CPU and Intel GPUs?
It's ideal market conditions to launch a new GPU, it would be hard not to succeed so if Intel mess this up they've only got themselves to blame.
A quick glance at the logo and I thought it was a new “X” game being introduced.

Players of X space games will know what I mean :)
IF, and I say IF, that information is accurate and the price is right (not to mention stock and supply) it might be interesting to see how AMD/Nvidia react with pricing etc, especially if Intel take a more aggressive approach with a lower price to gain marketshare.

Sadly unless Intel has something comparable to cuda and the software I use makes use of it then I'm stuck with Nvidia….