
Intel Gen11 GT2 GPU (Iris Plus Graphics 940) benchmarks leak

by Mark Tyson on 25 February 2019, 12:21

Tags: Intel (NASDAQ:INTC)

Quick Link: HEXUS.net/qad42m


Various benchmarks for an Intel Gen11 GT2 GPU have surfaced on the internet. These GPUs are expected to appear in CPUs codenamed Ice Lake later this year. Going by leaks unearthed just ahead of the weekend, it looks like Intel has pushed the envelope, and its integrated graphics will give Nvidia GeForce MX parts and AMD APUs much stiffer competition.

As well as GFX Bench and CompuBench scores, this Reddit thread provides some eyebrow-raising comparisons of the Intel Gen11 GT2 against the Intel Gen9 GT2, the Ryzen 2700U, and the Ryzen 2400G. The charts speak for themselves: the Intel Gen11 GT2, or Iris Plus Graphics 940, is significantly faster than the Gen9 Intel GPU and the Ryzen 2700U with Vega 10 graphics. Compared against the Ryzen 2400G with Vega 11 graphics, it is about 25 per cent slower on average in the given tests. As NotebookCheck comments, Skylake-generation GPUs offered 24 EUs at most, while the Ice Lake GPU will feature up to 64 EUs and a 4x larger L3 cache.
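The EU counts quoted above give a feel for the raw scaling on offer. As a rough illustration only: each Intel EU contains two 4-wide SIMD FPUs, so peak FP32 throughput can be estimated as EUs × 8 lanes × 2 (FMA) × clock. The clock figure below is a placeholder, not a confirmed Ice Lake specification.

```python
# Rough theoretical scaling from the EU counts quoted in the article.
# Clock speeds are placeholders; real parts will differ.

def fp32_gflops(eus: int, clock_ghz: float) -> float:
    """Peak FP32 GFLOPS: EUs * 8 FP32 lanes * 2 ops (FMA) * clock."""
    return eus * 8 * 2 * clock_ghz

gen9_gt2 = fp32_gflops(24, 1.0)   # Skylake-era GT2: up to 24 EUs
gen11_gt2 = fp32_gflops(64, 1.0)  # Ice Lake GT2: up to 64 EUs

print(f"Gen9 GT2:  {gen9_gt2:.0f} GFLOPS")
print(f"Gen11 GT2: {gen11_gt2:.0f} GFLOPS")
print(f"Scaling:   {gen11_gt2 / gen9_gt2:.2f}x")  # 2.67x at equal clocks
```

At equal clocks the EU increase alone is worth roughly 2.7x in theoretical throughput; real-world gains will be smaller, as the benchmark charts show.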

Intel UHD Graphics 620 vs Intel Iris Graphics 940

The generational divide, as charted above, is huge. On average the Gen11 GPU is about 75 per cent faster, falling behind only in the Driver Overhead 2 test. Going by the name of that test, the shortfall may simply be down to driver software that has not yet been tuned for the new GPU.

Intel Iris Plus Graphics 940 vs AMD Vega 10 and Vega 11

Above you see the Intel Iris Plus Graphics 940 easily boss the Vega 10, and trade blows with an AMD APU featuring Vega 11 graphics.

Intel Gen11 graphics do more than just add performance: they also herald the arrival of features such as DisplayPort 1.4a support, plus VESA DSC support for 5K 120Hz output. Gen11 will be the last generation before Gen12, upon which Intel's discrete GPUs will be based.

There is still quite some time before the first 'Sunny Cove'-based Ice Lake CPUs launch, and leaked synthetic benchmarks should be taken with a pinch of salt - but the figures are encouraging for those looking to upgrade a thin-and-light laptop late this year. Such strongly improved integrated graphics could help reduce pricing, power consumption and complexity compared with hybrid solutions packing GeForce MX graphics, for example.



HEXUS Forums :: 14 Comments

Long overdue! If it's a U-suffix chip that's pretty good, and I'm sure it'll get an MSRP twice the 2700U's. If it's a C-suffix chip they really should do better, but it's Intel so they'll charge double the cost of the red chip anyway.
Looks like they're testing the waters to see if they can worry the big boys at the low end.

Maybe also a bit of a teaser to make us wonder what they've got in the oven.

My concern is the market clout they've got and the power they have over OEMs. If Intel come out with decent discrete graphics I can see AMD's GPU production becoming very vulnerable. Intel are the kind to use every advantage they have got, whether it constitutes fair play or not.
At least we know Intel can compete with AMD; let's just wait and see if they will be able to compete with Nvidia with their Gen12 graphics chips.
What we have to remember is that these are integrated graphics, which have very different builds and functionality from discrete parts, so there might not be much transferability. But it does bode well for their discrete department, because the skills and understanding would be transferable.
Tabbykatze
What we have to remember is that these are integrated graphics which have very different builds and functionality from discrete so there might not be much transferability.

How so? AMD use Vega cores in their integrated graphics and their top-end GPUs. Nvidia seem to use the same cores in their desktop parts as in the Nintendo Switch.

Now Intel have traditionally sucked at scaling their graphics up beyond something tiny, but that's just them.