
AMD Infinity Cache patent hints at RDNA 2 secret sauce

by Mark Tyson on 6 October 2020, 15:21

Tags: AMD (NYSE:AMD)

Quick Link: HEXUS.net/qaeorf


The AMD RDNA 2 graphics launch event is a little over three weeks away. If anything, the event has just become even more important, given the Nvidia CEO's statement that RTX 3000 series graphics card demand will outstrip supply for the rest of 2020.

Some of the leaks we have seen for the Radeon RX 6000 GPU series appear to indicate a memory/bus choice that seems rather anaemic compared to previous generations, especially HBM-toting cards, and the latest GeForce RTX 3000 series rivals. However, about a month ago the RedGamingTech (RGT) YouTube channel published a video asserting that RDNA 2 would ship with an "Insane Cache System" to deliver "monstrous" performance.

A few hours ago, Twitter tech leakster Momomo_us unearthed an AMD patent, filed earlier this year, which appears to back up RGT's assertion that Infinity Cache is the RDNA 2 graphics card secret sauce. RGT has subsequently published an updated video acknowledging the revelation and thanking its sources.

The AMD Radeon RX 6900 XT graphics card is quite firmly outlined thanks to ROCm firmware and Apple macOS data mining. From digesting these indicators we expect it to come packing the following key specs: 80 CUs, significantly faster GPU clocks than RDNA 1 cards could muster (and than rival GeForce RTX 3000 cards), and a 256-bit memory interface. That latter data nugget, the memory bus width, combined with the touted choice of GDDR6 memory, caused a few furrowed brows.
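For a sense of scale, peak memory bandwidth follows directly from bus width and per-pin data rate. Below is a minimal Python sketch of that arithmetic; the 16Gbps figure for the rumoured card is an assumption for illustration only (AMD has not confirmed memory speeds), with the RX 5700 XT's known 256-bit, 14Gbps GDDR6 setup as a reference point.

# Toy calculation: theoretical peak GDDR6 bandwidth.
# The 16 Gbps per-pin rate for the rumoured RX 6900 XT is an
# assumption for illustration; AMD has not confirmed memory speeds.

def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s: (bus width in bits / 8) * per-pin Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

print(f"256-bit @ 16 Gbps (assumed): {peak_bandwidth_gbs(256, 16.0):.0f} GB/s")    # 512 GB/s
print(f"256-bit @ 14 Gbps (RX 5700 XT): {peak_bandwidth_gbs(256, 14.0):.0f} GB/s")  # 448 GB/s

Even at an optimistic 16Gbps, a 256-bit bus lands well short of the GeForce RTX 3080's 760GB/s, which explains those furrowed brows.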

This is where the AMD Infinity Cache patent, the possible RDNA 2 secret sauce, comes into play to ease those memory bandwidth concerns. RGT reckons that AMD could be using a sizeable cache on the RDNA 2 GPUs, as large as 128MB on the high-end parts. For comparison, the Radeon RX 5700 XT comes with 4MB of L2 cache. AMD's patented tech would mean fewer cache misses, and it would also reduce data duplication across caches. With such a beefy on-die cache, the GPU would need to access the main GDDR6 memory over the 256-bit memory bus (in the case of the RX 6900 XT) far less often.
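As a rough illustration of why a big on-die cache eases the bandwidth worry: only cache misses generate GDDR6 traffic, so a fixed DRAM bandwidth can service a larger stream of requests. A minimal Python sketch of that arithmetic follows, with hit rates invented purely for illustration (real figures depend on workload and cache design).

# Toy model: effective bandwidth amplification from an on-die cache.
# Only misses reach GDDR6; the hit rates below are invented for
# illustration, not measured or claimed by AMD.

def effective_bandwidth_gbs(dram_bw_gbs: float, hit_rate: float) -> float:
    """Request bandwidth sustainable when only misses reach DRAM."""
    assert 0.0 <= hit_rate < 1.0
    return dram_bw_gbs / (1.0 - hit_rate)

for hit_rate in (0.0, 0.25, 0.50):
    print(f"hit rate {hit_rate:.0%}: "
          f"{effective_bandwidth_gbs(512.0, hit_rate):.0f} GB/s effective")
# hit rate 0%: 512 GB/s; hit rate 25%: 683 GB/s; hit rate 50%: 1024 GB/s

On this simple model, a cache absorbing half of all memory requests would make a 512GB/s bus behave like a 1024GB/s one, which is the kind of arithmetic that would square with RGT's claims.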

Tom's Hardware points out that AMD filed a patent on Adaptive Cache Reconfiguration Via Clustering last year, and the authors went on to publish a paper on leveraging shared L1 GPU caches. Infinity Cache may be the commercialisation of this technology. Unfortunately, we can't say for sure whether the Infinity Cache patent really is the secret sauce behind RDNA 2's "monstrous" performance. Please keep in mind that this kind of leaked information always requires a pinch of salt.



HEXUS Forums :: 25 Comments

really am curious about AMD… it's what has been holding me away from team green this far.
QuorTek
really am curious about AMD… it's what has been holding me away from team green this far.

Oh, be honest. It's 'cause your preorder got cancelled and you haven't been able to find another legitimate seller for love nor money. ;)
Sounds like something to compensate for missing memory bandwidth; I worry it would need per-game optimisations to work well.
If so, it reminds me a bit of the Fury cards, which had a similar problem the other way around: massive bandwidth but short of total memory. At the time AMD put some engineers on optimising benchmark favourites so they looked good, but that was only a short-term sticking plaster that fell off as the cards aged.
ik9000
Oh, be honest. It's 'cause your preorder got cancelled and you haven't been able to find another legitimate seller for love nor money. ;)

I sit on a 1080 Ti, in an HEDT build. It's not so much about the money as what I get for the money; I also usually skip one or two generations as standard. I am very curious about AMD, especially for gaming, mostly because all developers are now developing for the AMD gear resting inside the new consoles… it would seem logical to wait rather than upgrade my graphics card when I can't get the full gain out of it, so I'll wait until DDR5 comes out. And yes, with Nvidia especially, you only really feel the difference if you jump one or two generations. I want to see what the red team has as well. I'm not into brands; I've been on Intel for years, but I'm going AMD next time because I get more value, unless Intel really dishes out something flawless this time next year.
I expected to see the abstract explained in the article. I feel like these articles are getting shallower by the day.
Don't get me wrong, but “new picture of ‘this’ chip” or articles that say “this word will change the world” are not turning me on…