very nice.
now cough up those reviews
Patiently waiting for reviews. It all seems too good to be true, but I want to see them compete so badly. Rooting for them.

The problem with this is the RTX 3070 8GB is around RTX 2080 Ti level performance but starts at $499, which is $80 less.
I'm not getting one
If I pair a 6900 XT with a 5950X CPU, I'll still get under 30 fps in Guild Wars 2 during the most packed player events, at 1080p resolution
The needs of MMORPG gamers have been ignored again
Good, I'm pleased they have at least got a comparable product now in “normal” scenarios - it doesn't really matter if it's a few frames better/worse in specific titles…the key is AMD are finally going to compete at the top end again, yey!
Too soon to say whether they will be worth it or not - we need benchmarks, and for me personally I am really interested in how well they perform with ray tracing enabled…if it lags way behind Nvidia on that score then it just rules them out for me completely.
I am hopeful though - the fact that the leaks have proven to be largely true so far, and the same leaks indicate good DXR performance as well, gives me hope they will put in a decent implementation and compete there too.
Next hope is that AMD don't screw up their actual launch like Nvidia did, so people who don't own a YouTube channel or who won't resort to bots can actually buy one :)
CAT-THE-FIFTH

The problem with this is the RTX 3070 8GB is around RTX 2080 Ti level performance but starts at $499, which is $80 less.
Sure, but it's a 16GB card vs an 8GB card. AMD wouldn't have priced them that high if they hadn't compared the hell out of these cards.
Was really expecting cheaper prices if I'm being honest.
Sure I'll just go ahead and get a Nvidia 3070 now if I can manage to swipe one online somewhere.
I would've bought a 3080 but no stock anywhere it seems and I'm not going into some preorder queue for same, nor for the 3070 either, ridiculous.
So AMD not beating Nvidia on prices, near same on performance, nothing outstanding either.
Only thing they might have over Nvidia at this stage is availability of stock, maybe. Shall wait and see.
Also, proper reviews and teardowns need to be done on the AMD cards.
Thanks for posting the info though.
darcotech
Sure, but it's a 16GB card vs an 8GB card. AMD wouldn't have priced them that high if they hadn't compared the hell out of these cards.
But it is with Smart Access Memory, so probably a bit closer on most systems. But it does leave the sub-£500 market open for Nvidia, as we are already seeing details of the RTX 3060 Ti and possibly RTX 3060 being launched next month.
OTOH, if a 60CU RDNA2 part is faster than an RTX 2080 Ti, then the 56CU part in the Xbox Series X is probably pretty fast too.
liquidflower
I'm not getting one
If I pair a 6900 XT with a 5950X CPU, I'll still get under 30 fps in Guild Wars 2 during the most packed player events, at 1080p resolution
The needs of MMORPG gamers have been ignored again
I only really play MMOs and would agree with you.
World of Warcraft has been updated to include support for RTX though, which is something at least.
GW2 and ArenaNet are awful for letting both their game and their client go out of date and not keeping up with improvements, but there's nowhere near as many players playing GW2 as there once was.
I would guess Black Desert Online and some of the upcoming MMOs out of Asia would include support for RTX at least.
The new PlanetSide Arena is supposedly also going to include support for RTX and DLSS.
Spud1
Good, I'm pleased they have at least got a comparable product now in “normal” scenarios - it doesn't really matter if it's a few frames better/worse in specific titles…the key is AMD are finally going to compete at the top end again, yey!
Too soon to say whether they will be worth it or not - we need benchmarks, and for me personally I am really interested in how well they perform with ray tracing enabled…if it lags way behind Nvidia on that score then it just rules them out for me completely.
I am hopeful though - the fact that the leaks have proven to be largely true so far, and the same leaks indicate good DXR performance as well, gives me hope they will put in a decent implementation and compete there too.
Next hope is that AMD don't screw up their actual launch like Nvidia did, so people who don't own a YouTube channel or who won't resort to bots can actually buy one :)
Nvidia are probably busy right now using their developer relationships to try to bake in ray-tracing implementations which work well with their hardware and not AMD's.
But if they manage to do that for Cyberpunk 2077, Nvidia will sell well. Long term? Ampere might age as badly as the 2GB Kepler GTX 680 did versus the 3GB HD 7970, especially since, aside from the 3090, their cards are once again VRAM starved.
Give me a halfway decent trade-in price for my 5700XT and I will upgrade to a 6800XT, even though I'd then have a new cooler I need to buy and an NIB 5700XT one I need to sell.
But the wiser thing for me would probably be to upgrade to a nice 1440p screen instead.
Besides, I have never claimed to be the wise guy.
Also waiting to see if there will be any price adjustments when either brands cards are actually on sale.
They didn't go into ray-tracing specifics much, so it wouldn't surprise me if ray tracing in Big Navi is worse than RTX 20. They probably only added it because Nvidia has been pumping so much money into convincing gullible fools that it's a ‘must have' feature. Ray tracing is still a fad and won't be of any interest to most people until it works properly. As long as it's a performance hit it's of no use in games, until it actually has advantages besides the pointless “look how shiny it is” bragging.
Until then it's just a gimmick like 3DTV or VR.
cptwhite_uk
5700XT is overpriced
So is a 2080ti. Stock needs to catch up before they will EVER go down.
So far it's looking very much like all the rumours (believable ones) are true.
Nvidia were forced to price the way they did, to preempt AMD, but couldn't get their cards to market due to manufacturing issues.
They'll likely pop AMD to the lower segment though, unless AMD have another surprise in store for us, but I doubt it.
CAT-THE-FIFTH
But it is with smart access memory,so probably a bit closer on most systems. But it does leave the sub £500 market open for Nvidia,as we are already seeing details of the RTX3060TI and possibly RTX3060 being launched next month.
Why doesn't that leave the sub-£500 market open for AMD either?
I mean, you're having a good dunk on AMD as a leftover from their CPU price bump, do you just want to get it all out of your system :P
Disappointing price on the 6800. We've seen recently that the 3070 is roughly on-par with the 2080ti in terms of performance, which is also where the 6800 sits. This won't disrupt any pricing, and makes a big opening for Nvidia to slide in with a 3070 super/ti for the same price as the 6800 with higher performance.
Was hoping they'd come in lower than the 3070, but I'm not kidding anybody, I'm not buying AMD GPUs after the driver experience with 5700XT.
Tabbykatze
Why doesn't that leave the sub-£500 market open for AMD either?
I mean, you're having a good dunk on AMD as a leftover from their CPU price bump, do you just want to get it all out of your system :P
Kat-Fight ;-)
AMD should just price the old cards better; that will do very well for the 1080p segment, and as we know the 5700XT can also do 1440p just fine, it just lacks some of the new spiffy stuff.
Pretty sure if Nvidia start to play funky with the price on 2070/2060 cards, AMD is going to play ball too (well, they sort of have to; either way it is Nvidia that is going to hurt the most playing that game).
I also like the very short RDNA 3 mention that it is on track and looking good.
liquidflower
I'm not getting one
If I pair a 6900 XT with a 5950X CPU, I'll still get under 30 fps in Guild Wars 2 during the most packed player events, at 1080p resolution
The needs of MMORPG gamers have been ignored again
That is the server not being able to handle it, not so much the personal hardware.
Either way, AMD take my money, never thought I'd have to say this, but the 6900XT may be my next purchase, well, currently NVIDIA BSODs and whatever -_-
Sits here waiting on the release of a 3080 (Ti) with 20GB memory… (I'm stuck with Nvidia due to CUDA.)
Not that I'd be able to buy one because of stock levels.
Hoping this might cause a small price war, but as expected they're just priced around the same (so still ‘higher' than most would like).
Dashers
Disappointing price on the 6800. We've seen recently that the 3070 is roughly on-par with the 2080ti in terms of performance, which is also where the 6800 sits. This won't disrupt any pricing, and makes a big opening for Nvidia to slide in with a 3070 super/ti for the same price as the 6800 with higher performance.
Was hoping they'd come in lower than the 3070, but I'm not kidding anybody, I'm not buying AMD GPUs after the driver experience with 5700XT.
I don't think that's where AMD is positioning the 6800. AMD's slides showed a 20-30% average performance increase on the 2080 Ti, which itself narrowly beats a 3070 in most scenarios. Accounting for cherry-picking of results and SAM, if we even just assume a 10-15% average increase in performance on the 3070, along with double the VRAM (which I would happily pay for), I don't think the pricing is unreasonable at all. Again, benchmarks will tell, but it looks good to me so far. It happily sits between the 3070 and 3080 in performance and price and offers double the VRAM.
Agreed. 6800 is a 3070ti fighter, not a 3070. Nvidia kept their options open and they have a nice (for them) $599 slot waiting to be filled. 6800 at $579 is smart - offset a bit of the NVidia premium and better the power consumption.
Nice try, but RTX just destroyed the whole market for a few months ;)
nacasatu
I only really play MMOs and would agree with you.
GW2 and ArenaNet are awful for letting both their game and their client go out of date and not keeping up with improvements
I think there was some talk on the forum saying changing the game to use DX12 wouldn't give much of a speed improvement.
Tabbykatze
Why doesn't that leave the sub-£500 market open for AMD either?
I mean, you're having a good dunk on AMD as a leftover from their CPU price bump, do you just want to get it all out of your system :P
Because there are no sub-£500 AMD RDNA2 cards yet?? The RTX 3070 FE is already out at sub-£500 (barely). Next month the RTX 3060 Ti/RTX 3060 will be out just in time for Cyberpunk 2077, and that was reported by Hexus. ATM, it looks like it is Navi 21 that was revealed. The lower end SKUs are Navi 22 and Navi 23, which are competitors to the GA104/GA106.
If you followed two or three leakers, they got a lot of information accurate so far about Ampere and RDNA2. Paul from RGT leaked in September that RDNA2 would use a 128MB “Infinity Cache” and a 256-bit memory controller, have up to 80CUs, and have clockspeeds up to around 2.3GHz. He also implied from the same sources that Navi 22 and Navi 23 were Q1, maybe Q2, so I hope his sources are wrong on that. If not, that is a few months of Nvidia having the sub-£500 market during the part of the year with the biggest sales.
I hope closer to the launch of the top SKUs we get some more information about the Navi 22 based GPUs. I assume these are the 6700/6600 series. This will be the bread and butter of the RDNA2 range.
lodoya
Nice try, but RTX just destroyed the whole market for a few months ;)
And back to reality..
Infinity Cache! Rumours/guesses were right! Very smart if it works as well as it seems and overcomes the 256-bit bus limitation.
I'm less enamoured by Smart Access Memory though, mostly because I'm a sour Intel user and don't like leaving performance on the table.
I'll wait for the official reviews. I mean, all these charts look promising, but it's interesting that they've now included a “Rage Mode” and “SAM” (perhaps as a counter to DLSS). That part concerns me: if they don't have some form of driver or software-level implementation to counter DLSS and have to rely on hardware, what exactly are we going to be looking at as a performance hit for DXR? Is it going to be the same as the 20xx series with RTX turned on and no DLSS (especially if you don't have a 5xxx Ryzen system)?
Pricing-wise it looks fairly ballpark, perhaps above the level I was expecting for the 6800/XT; the 6900XT looks fairly well positioned vs a future 3080 Ti.
As always, the devil is in the details.
I think that while mid/low-end gamers of course have wants and ambitions, those like me are probably still on 1080p and so are settled on a number of essential gaming parameters.
So I just think these people want to be able to play new games at a reasonable FPS, and don't necessarily need shiny chrome and reflecting puddles of water on the ground to enjoy those games.
As for the cheaper Nvidia cards that do support ray-tracing smartness, it will probably only be enjoyable on a 1080p screen at best.
QuorTek
That is the server not being able to handle it, not so much the personal hardware.
This isn't lag. That's a different problem. Sometimes when I'm nowhere near other players, my framerate will still be around 20fps
My first 3D card, the 3Dfx Voodoo, had 8MB of ram in total. This has 128MB of cache.
I think my 4200ti had 128MB of ram. That's not helping :D
Feeling old now. I'll go get my liniment and the Werther's Originals and go sit by the fire in my slippers…
128MB divided between 5120 shaders isn't a lot. I'm surprised it helps so much vs using that area for more shaders or putting the cost towards HBM.
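As a rough sanity check on that intuition, here's the arithmetic (shader count and cache size as quoted in the thread; the cache isn't really partitioned per shader, this is just the even split):

```python
# Back-of-envelope: how much Infinity Cache each shader would get
# if it were split evenly across all stream processors.
cache_mib = 128    # rumoured Infinity Cache size
shaders = 5120     # stream processors on the top Navi 21 part

per_shader_kib = cache_mib * 1024 / shaders
print(per_shader_kib)  # 25.6 KiB per shader
```

So each shader's "share" is smaller than a typical CPU L1, which is why it's surprising the cache helps as much as claimed.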
Iota
but it's interesting that they've now included a “Rage Mode” and “SAM” (perhaps as a counter to DLSS)
Nope, completely different. They've got their own counter to DLSS (ML super resolution), but Rage Mode is basically a power-budget overclock and SAM is lower-level access to the memory on the GPU.
liquidflower
QuorTek
That is the server not being able to handle it, not so much the personal hardware.
This isn't lag. That's a different problem. Sometimes when I'm nowhere near other players, my framerate will still be around 20fps
Surely it's the developers responsibility to make a game that runs on current hardware rather than a card manufacturer's problem.
Performance exceeds my expectations, however there's still RTX to consider. The pricing is close, but who's to say Nvidia weren't intending to price theirs into the stratosphere and then changed their pricing once they got wind of AMD's.
chj
Performance exceeds my expectations however there's still rtx to consider
I presume you mean performance in RTX features? It looks like every feature accelerated by RTX is also accelerated by these new cards, so it's just a question of how much by.
chj
Surely it's the developers responsibility to make a game that runs on current hardware rather than a card manufacturer's problem.
Guild Wars 2 has been around for years and was once the second biggest MMORPG in the world. World of Warcraft is also still around. Card manufacturers have had years to find out the performance problems of MMORPGs and improve their hardware. They didn't bother.
Current cards aren't good enough to run it at a decent framerate
liquidflower
World of Warcraft is also still around
Yeah, though I wasn't especially wowed by AMD using it to showcase these new cards. “Look at that lighting!” :/
I've estimated the 2080 Ti FPS (by scaling pixel heights in photo editing software), converted this to a percentage performance lead, and aggregated the estimated percentage lead over all the games represented.

On this information it would suggest the 6800 is comfortably between the 3070 and 3080, albeit more likely closer to the 3070. Still, it's a sizeable increase, and alongside the 16GB VRAM might be why they feel they can justify its $579 price. Still, I think $549 would have been an easier sell; they don't have a TDP advantage over the 3070 here, so the perf/watt is likely to be very close.
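For anyone wanting to repeat the bar-measuring exercise, the conversion is a one-liner; the pixel heights below are made-up placeholders, not measurements from AMD's slides:

```python
# Hypothetical bar heights in pixels, read off a benchmark chart
# in an image editor (illustrative numbers only).
px_6800 = 260
px_2080ti = 210

lead_pct = (px_6800 / px_2080ti - 1) * 100
print(round(lead_pct, 1))  # 23.8 -> % lead over the 2080 Ti
```

Averaging that percentage across every game on the slide gives the aggregated lead described above.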
I want to actually see how much AMD can beat Nvidia in performance/watt. If they can do quite well, this will work out very well in laptops, especially if AMD can get away with using a narrower memory bus to do the same job.
If we take the given TBP of 220W for the 3070, and 250W for the 6800, that's a 13.6% power increase for a likely 14-15% performance increase (accounting for cherry picking). So basically they're likely to be neck and neck.
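Spelling out that arithmetic (the ~14.5% performance figure is the poster's estimate, not a measured number):

```python
tbp_3070 = 220          # Nvidia's quoted board power, watts
tbp_6800 = 250          # AMD's quoted board power, watts
perf_lead = 1.145       # assumed ~14.5% average performance lead

power_ratio = tbp_6800 / tbp_3070
print(round((power_ratio - 1) * 100, 1))  # 13.6 -> % extra power
print(round(perf_lead / power_ratio, 3))  # 1.008 -> relative perf/watt
```

A perf/watt ratio of ~1.01 is the "neck and neck" conclusion in numbers.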
This guy is very eager to see reviews, even if I fear they will be on optimal hardware, so the newest AMD CPUs and 500 series motherboards.
Something few of us have or plan to get anytime soon.
Okay, it is still fair as some of the new things are only supported that way, but it would be interesting to see the same CPU/GPU on a non-500 series board, and then what the difference is.
I would be interested in seeing how it performs on an Intel system vs the other guys' nearest competition, but even this is probably going to be on a very new Intel CPU.
Personally, on a 1080p screen and a 12 core Threadripper, I assume I would be CPU bottlenecked in most things, so maybe it's fortunate I can't find a game worth playing.
There are probably also a lot of people on older Ryzen CPUs that would like a new GPU and nothing else, so what can they then expect from a 6800/XT vs if they had gotten an Nvidia card.
cptwhite_uk
If we take the given TBP of 220W for the 3070, and 250W for the 6800, that's a 13.6% power increase for a likely 14-15% performance increase (accounting for cherry picking). So basically they're likely to be neck and neck.
Good for AMD that they managed to get onto TSMC then! :P
Hang on, on that table the 6800 has a 4096-bit memory bus, is that a typo?
Gentle Viking
Something few of us have or plan to get anytime soon.
I may decide to upgrade, it certainly isn't a priority right now. I'd rather wait for reviews and see how things pan out.
kalniel
I wasn't especially wowed by AMD using it to showcase these new cards
Haha nice turn of phrase.
Looks like AMD are going to beat Nvidia to have ACTUAL products available to the general public.
Pricing is comparable, as is performance and power requirements.
The question is: Do you buy AMD, or wait for some undefined future date for Nvidia?
Oh, they will have actual products ready to sell, for about 4 hours.
If you think anything else you don't get the basics of supply/demand 101.
And if they have stock to last a whole day… then I will be very, very surprised.
Some RT results supposedly revealed:
https://twitter.com/ghost_motley/status/1321538287446695939
So AMD ran some DirectX Ray Tracing benchmark on the 6800 XT and got 471fps.
Someone ran the same benchmark with their RTX 3080 TUF and got 630fps
Source: Our super secret Silicon Palace Discord.
You beat me to it. I was going to post a figure for the 3080 there as well.
One more to add then: a stock 2080 scores about 308 fps.
Edit!!
https://twitter.com/tomwarren/status/1321489315428519936?s=20
AMD tells me it's working on its own super sampling technology, similar to Nvidia's DLSS. It will be open and cross-platform, which could mean it'll come to Xbox Series X and PS5. DLSS is a game changer for Nvidia, so this could be big news.
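Taking the three DXR figures quoted in this thread at face value (they're unverified leaks, so treat the ratios accordingly), the relative positions work out to:

```python
# Unverified figures quoted in the thread for one DXR benchmark
fps = {"6800 XT": 471, "3080 TUF": 630, "2080": 308}

print(round(fps["3080 TUF"] / fps["6800 XT"], 2))  # 1.34 -> 3080 ~34% ahead
print(round(fps["6800 XT"] / fps["2080"], 2))      # 1.53 -> ~53% over a 2080
```

So on these numbers the 6800 XT's ray tracing would land well above Turing but noticeably behind a 3080.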
DanceswithUnix
My first 3D card, the 3Dfx Voodoo, had 8MB of ram in total. This has 128MB of cache.
What's worse is it doesn't feel like we've got 1000x better graphics (shoot me if my maths is wrong).
Either that or I've got some very rose-tinted glasses on. :)
What are the prices for the cards in GBP?
Corky34
What's worse is it doesn't feel like we've got 1000x better graphics (shoot me if my maths is wrong).
Either that or I've got some very rose-tinted glasses on. :)
Well, the games aren't 1000x better, and the wow factor of first seeing a 3D card in action after software rendering hasn't really been matched since. But IIRC that Voodoo card could only game up to 800x600, 4MB of texture RAM doesn't store enough for decent realism, and the ability to write shader programs seemed quite an improvement.
… or to put it another way, I wouldn't want to try and run my Rift off of an old Voodoo card :D
I am pretty sure 10x ray tracing performance over software is what Nvidia promised with the 20 series…
I really want these cards to compete with ampere, and they do seem to do that in rasterised graphics, but I fear they will be more on par with the 20 series performance once you start turning all the fancy new DX12 features on.
Primey0
What are the prices for the cards in GBP?
Straight $=£, so £650 and £1,000 (plus gouging and scalping, as per the NV lineup)
Gentle Viking
Oh, they will have actual products ready to sell, for about 4 hours.
If you think anything else you don't get the basics of supply/demand 101.
And if they have stock to last a whole day… then I will be very, very surprised.
Yep.
We will soon see if AMD can provide decent stock levels to their customers… unlike Nvidia, who are busy covering their arses with the scalpers etc. nonsense, to cover up for poor yields on their silicon.
Were they all 2.5 slots? I've only space for 2.2 max (leaving a small slice for breathing space).
DanceswithUnix
But IIRC that Voodoo card could only game up to 800x600, 4MB of texture ram doesn't store enough for decent realism and the ability to write shader programs seemed quite an improvement.
Oh look at you with your lardydar 800x600, common folk like me only had 640x480. ;)
Also OpenGL FTW, or rather Glide.
ohmaheid
Looks like AMD are going to beat Nvidia to have ACTUAL products available to the general public.
Pricing is comparable, as is performance and power requirements.
The question is: Do you buy AMD, or wait for some undefined future date for Nvidia?
That is a huge assumption. Nvidia claimed they would have loads of stock - and actually, they did relative to previous launches - but demand was unprecedented for a GPU launch.
AMD should do slightly better as they have had more time to prepare, and although the numbers currently show near enough parity with Nvidia in ideal conditions - with a Ryzen CPU, in games that favour AMD's APIs - if you look at the facts then AMD are likely to lose out by a few FPS in the real world.
All that said, I am certain bots will grab 90% of the stock as they did with Nvidia, and many people will still be waiting until 2021 before they can buy one of these.
Sooo, we have to wait until next year before we get a <£500 card from AMD; wonder how many impatient people will buy the 3070 rather than wait…
I'm certainly not wanting to spend more than £500, and I am a touch on the impatient side…
DanceswithUnix
… 128MB divided between 5120 shaders isn't a lot …
Don't graphics shaders work with very small discrete subsets of the same data though? IIRC the L1 and L2 caches are comparatively tiny on GPUs too, so it makes sense that the ‘L3' would appear quite small for the number of shaders…
EDIT for crosspost:
GSV Trig
Sooo, we have to wait until next year before we get a <£500 card from AMD …
To get an RDNA2 card < £500, yes. The RDNA1 and Polaris cards are currently filling out the bottom end of AMD's stack.
Many people are not good with waiting, that's for sure :) so if NV can have some 3070 cards on the shelves they should sell well; even for me with a 5700XT that's still a substantial upgrade if I was in the mood to do that.
And many more are on even lesser hardware; my friend has a nice 6/12 core Ryzen 2 CPU but his GFX card is just a 1060 3GB. Fortunately for him, he also doesn't game, or at least nothing new that his 1060 will have problems running.
If AMD & partners could have stock of these new cards, it could be a very, very good thing for AMD, as the timing/circumstances could hardly be any better.
A 5700XT would be an upgrade from my RX580, yes, but I'd feel a bit meh about buying a new last-gen card, unless the price was very good, that is…
Yeah, ideally you would be able to buy a new-gen card in a lesser model and at a good price, but the advent of a new series of cards sadly does not mean that the past generations are just dumped at rock-bottom prices.
So a lot of head scratching goes on in relation to how to make a buck on everything as it slowly fizzles out.
Personally I have always updated to current generation GFX cards, but never the top models, as that kind of money I would rather spend on something else, and this was even true in the days where I spent around 8 hours a day gaming, and then a little other stuff on the side.
Okay, my 5700XT sort of was the top model when I got it, and the Ti 4600 I got back in the day was also a rather expensive experience.
If my memory serves me right, the card I upgraded from to the 5700 XT was a GTX 570… so a substantial update any way you look at it, and the XT not having smart ray-traced stuff did not bother me one bit, because if I was able to find a game I would like to play I probably would still play it with GFX settings pretty low.
5-600 USD is about as far as I am willing to go for any computer component.
Corky34
Oh look at you with your lardydar 800x600, common folk like me only had 640x480. ;)
Also OpenGL FTW, or rather Glide.
I think by then I was rocking a 17" iiyama that in theory would do 1600x1200, but you had to really squint. I am now reminded that in those days if you wanted a monitor 17" across, you had better have the space for it to go back 17" behind as well. It did me well, but I don't miss CRTs (other than for zombie lightgun games on the Dreamcast ofc).
scaryjim
Don't graphics shaders work with very small discrete subsets of the same data though? iirc the L1 and L2 caches are comparatively tiny on GPUs too, so it makes sense that the ‘L3’ would appear quite small for the number of shaders…
Not really; Nvidia did quite a good paper a while back which raved about the wonders of HBM memory and described the footprint of GPU threads. It sounds like they are usually working on different parts of the problem, calculating different outputs from different bits of texture etc. Then there are the waves of threads, where each SP is working on more than one thread, which will dilute the cache further.
I wonder if AMD worked out that there is some break point where if you get a cache huge enough then it starts pulling things together. Interesting that the size is the same as we see in Intel Iris parts.
GSV Trig
A 5700XT would be an upgrade from my RX580 yes, but, I'd feel a bit meh about buying a new last gen card, unless the price was very good that is..
The main problem with the RX5700XT is not even the lack of RT, but its lower tier of DX12 support, i.e. lack of mesh shaders and VRS support.
I'm sure right now, those things don't matter in the games I play, but I intend whatever I drop my cash on, to last me a good few years so anything new that's likely to be picked up by developers in the future, would be a nice addition.
GSV Trig
I'm sure right now, those things don't matter in the games I play, but I intend whatever I drop my cash on, to last me a good few years so anything new that's likely to be picked up by developers in the future, would be a nice addition.
Mesh shaders and VRS are ways of improving FPS in games, so are useful features IMHO.
So, looks like I will be waiting until the new year then…
Wonder if AMD will respond to Nvidia; 8GB cards would be one way to counter a price drop by Nvidia, but still, if you can't buy the damned things it doesn't matter what the price or performance is…
lack of mesh shaders and VRS support.
That would be nice to have indeed, but not a deal breaker, as I would probably have to trim some settings down to get at least 144 fps in a game to match the 144Hz of my monitor.
And I am thinking about stepping up from 1080p/144Hz to 1440p/240Hz, so I would need even more FPS to satisfy my 1:1 FPS/Hz wants.
Visual fidelity doesn't matter much to me, so even a good-looking DX11 game will probably be fine for me in single player, where I do have GFX settings up as high as I can, but for competitive online gaming I will ditch looks for performance any day of the week.
I'm also looking at a new monitor, my old 60Hz 1920/1080 Dell has served me well, but I want moar hurtz!!
Gentle Viking
lack of mesh shaders and VRS support.
That would be nice to have indeed, but not a deal breaker, as I would probably have to trim some settings down to get at least 144 fps in a game to match the 144Hz of my monitor.
And I am thinking about stepping up from 1080p/144Hz to 1440p/240Hz, so I would need even more FPS to satisfy my 1:1 FPS/Hz wants.
Visual fidelity doesn't matter much to me, so even a good-looking DX11 game will probably be fine for me in single player, where I do have GFX settings up as high as I can, but for competitive online gaming I will ditch looks for performance any day of the week.
Mesh shaders and VRS can actually increase FPS if applied in games, hence why they are useful. Now that RDNA2 and the consoles also support them (on top of Turing and Ampere), I can see both being implemented more often now. Hence, if I was buying a new card today for a decent amount of money, they are features I would ideally want to have!
I often feel like Windows is like a trip wire. :mrgreen:
Gentle Viking
I often feel like Windows is like a trip wire. :mrgreen:
Maybe - I wonder if it's a Windows issue or hardware issue, or whether AMD is just segmenting it to Zen 3 and RDNA2?? It's apparently worked under Linux for years with earlier generations of AMD CPUs and GPUs!
CAT-THE-FIFTH
Interesting:
https://www.phoronix.com/forums/forum/phoronix/latest-phoronix-articles/1215570-linux-support-expectations-for-the-amd-radeon-rx-6000-series/page4#post1215694
So it appears the requirement for RDNA2 and Zen 3 for Smart Access Memory may be more a driver issue or Windows issue than a hardware issue. In theory Zen 2 should also work with it!
I was wondering - not based on the extra details you've provided now - if the Base Address Registers being larger than 256-bit (is my bitness correct?) had something to do with PCIe 4.0. I've not looked so don't know, but the first thing I thought of when hearing it would be limited to 5-series CPUs and boards was maybe PCIe 3.0 just can't address more than 256 bits at a time, and because of that 4-series CPUs fixed the BAR at the maximum limit of PCIe 3.0.
I would go read about it but it was only a musing and I'm sure someone will be kind enough to correct me now I've voiced it.
Corky34
I was wondering, not based on the extra details you've provided now, if the Base Address Registers being larger than 256bit (is my bitness correct?) had something to do with PCIe 4.0. I've not looked so don't know but first thing i thought of when hearing it would be limited to 5 series CPU's and boards was maybe PCIe 3.0 just can't address more than 256bits at a time and because of that 4 series CPUs fixed the BAR at the maximum limit of PCIe 3.0.
I would go read about it but it was only a musing and I'm sure someone will be kind enough to correct me now I've voiced it.
As I said in another thread, I wonder if this is some kind of cut-down type of CCIX (scrapped in favour of CXL, as CXL won the DMA battle due to a wider number of companies jumping on).
Corky34
I was wondering, not based on the extra details you've provided now, if the Base Address Registers being larger than 256bit (is my bitness correct?)
I gather some graphics cards can map a window of 256MB, though I thought it was commonly a couple of 128MB windows. 256 bit would be a massive address range, we haven't gotten close to the limits of a 64 bit address range yet, and PCIe can only go up to 64 bit addressing.
From what I remember of the early PCI specs (and that was some quarter century ago and not the most memorable of documents :D ) it seemed like the original intention was to map all the memory of a card into the memory space of the CPU. OFC with 32 bit PCs still being around when graphics cards were getting to 1GB of ram the GPU vendors had to limit the mapped memory segment as it was using up enough address space as it was and giving us that lovely Windows XP limit (was it 3GB? As a Linux user I can't quite remember).
The bit that doesn't make sense to me here is that the Ryzen 3000 series and the Ryzen 5000 series are supposed to use the same IO controller die. So either AMD fibbed about re-using the controller die, or they can't be bothered to enable this feature in their old BIOS modules. The PCI configuration is a BIOS issue, so it does need motherboard support.
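As an aside, those 256MB windows come out of the standard BAR sizing handshake: firmware writes all-ones to the BAR, reads back what sticks, and the bits wired to zero reveal the aperture size. A minimal sketch of the decode, with a made-up readback value:

```python
def bar_size(readback, is_io=False):
    """Decode a BAR's aperture size from the value read back after
    writing all-ones: mask off the flag bits (low 4 bits for memory
    BARs, low 2 for I/O BARs), invert, and add one."""
    flag_mask = 0x3 if is_io else 0xF
    base = readback & ~flag_mask & 0xFFFFFFFF
    return ((~base) & 0xFFFFFFFF) + 1

# Made-up readback for a 256MB prefetchable 64-bit memory BAR.
print(bar_size(0xF000000C) // 2**20)  # 256 (MB)
```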
Is it because of money from Nvidia, or are the people at Hexus being dumb? RTX and DXR are NOT the same. It's misleading and stupid to use the term RTX, as it just reinforces Nvidia's attempts to control industry standards (just like FreeSync/G-Sync).
maxopus
Is it because of money from Nvidia, or are the people at Hexus being dumb? RTX and DXR are NOT the same. It's misleading and stupid to use the term RTX, as it just reinforces Nvidia's attempts to control industry standards (just like FreeSync/G-Sync).
Which people at Hexus say it's the same? By "RTX performance" the article means the AMD RX 6000 series performs as fast as the Nvidia RTX series. Why would that be because of money from Nvidia?
DanceswithUnix
I gather some graphics cards can map a window of 256MB, though I thought it was commonly a couple of 128MB windows. 256-bit would be a massive address range; we haven't gotten close to the limits of a 64-bit address range yet, and PCIe can only go up to 64-bit addressing.
From what I remember of the early PCI specs (and that was some quarter century ago, and not the most memorable of documents :D ) it seemed like the original intention was to map all the memory of a card into the memory space of the CPU. OFC with 32-bit PCs still being around when graphics cards were getting to 1GB of RAM, the GPU vendors had to limit the mapped memory segment, as it was using up enough address space already and giving us that lovely Windows XP limit (was it 3GB? As a Linux user I can't quite remember).
The bit that doesn't make sense to me here is that the Ryzen 3000 series and the Ryzen 5000 series are supposed to use the same IO controller die. So either AMD fibbed about re-using the controller die, or they can't be bothered to enable this feature in their old BIOS modules. The PCI configuration is a BIOS issue, so it does need motherboard support.
Between what Tabbykatze and yourself have said, now I'm totally lost. The first thing I knew about the BAR was having a video GN did about this release on in the background, so I definitely wasn't paying much attention; I can hear my teacher now, shouting for the pupils at the back to pay more attention. :)
I think we've got a great big ball of wires though. I thought CCIX was for cache coherence between CPUs and an "other" cache, something this (AFAICT) isn't, because it seems to be about how much of the system RAM can be addressed at a time. And I'm not sure it's about what you're talking about either, DwU, as you seem to be describing the address space that PCI(e) devices register themselves at, and not the other register that PCI(e) uses when it's addressing data stored in RAM.
EDIT: You seem to be talking about the PCI configuration space whereas “Smart Access Memory” seems to relate to the reserved address space in RAM for when something needs to send data to the device.
Corky34
EDIT: You seem to be talking about the PCI configuration space whereas “Smart Access Memory” seems to relate to the reserved address space in RAM for when something needs to send data to the device.
This feature allows the CPU to simply access the whole of the GPU memory as one linear lump, rather than having to go through small windows into the RAM.
How you access the memory on any PCI(e) card is set up through the PCI configuration space, using the base address registers.
The thing I am confused about is that this shouldn't be hard. I assume the difficulty would be in some older BIOS implementations having expectations on how a video card is set up that it potentially breaks, so perhaps there is some finesse in the configuration for backwards compatibility. That might explain why they claim it only works on 500 series motherboards with 5000 series CPUs. That is odd, when 5000 series CPUs share an IO die with 3000 series, and the 4000 series APUs are more recent.
Edit: I've had a bit of a refresher recently, reading about BAR registers in the context of the Raspberry Pi compute module and people trying to get video cards working on the PCIe x1 interface of the breakout board. The first hurdle was that the Pi configuration only allowed a piddly 64MB of address space to be mapped by the PCIe card. Seems they've got past that now, and into other problems. Wonder if their hacks will be good enough to map 16GB of VRAM, though I suspect a 6800 GPU might be a tad bottlenecked by that ARM SoC :)
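Back-of-envelope on that last point, assuming resizable BAR apertures have to be powers of two: exposing a 16GB card as one linear lump needs a 2^34-byte aperture, versus paging across a fixed 64MB window like the Pi's:

```python
# Resizable BAR apertures are powers of two, so a 16GB card wants a
# single 16GB (2**34 byte) mapping; through a fixed 64MB window you'd
# be flipping between 256 separate windows instead.
vram = 16 * 2**30    # 16GB of card memory
window = 64 * 2**20  # the Pi's 64MB aperture

print(vram.bit_length() - 1)  # 34 address bits for the full aperture
print(vram // window)         # 256 windows of 64MB to cover the card
```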
Yeah, saw those a few hours ago; if they hold water then team red are back in the game. :clapping:
I do like that in Forza 4 at 1440p even the bottom 6800 kicks 3090 ass :surprised: :whip:
Phage
Straight $=£, so £650 and £1,000 (plus gouging and scalping, as per the NV lineup)
I highly doubt that.
For me Nvidia is for RTX and always will be.