Also a note to prospective buyers - the people over on the XS forums tested the various models. At the same voltages, the R7 1700 consumes less power than an R7 1700X or R7 1800X.
This hints that the R7 1700 is a lower-leakage chip, according to them, and will be the better buy for most people. At least for a lot of non-gaming work, the Core i7 6900K looks massively overpriced.
As far as I'm concerned, I feel like this review of the 1700X has successfully sold a lot of 1700s!
Super interesting looking scaling!
Well at a couple of hundred quid cheaper than my 5930K it looks like much better value for money. Although I was somehow hoping for more.
All academic for me as it's a good few years until I'll be upgrading again.
Jowsey
As far as I'm concerned, I feel like this review of the 1700X has successfully sold a lot of 1700s!
Super interesting looking scaling!
The R7 1700 is the star - apparently it consumes less power at 4GHz than an R7 1700X or R7 1800X.
Hexus - how good was the XFR? Did it mean it was boosting to the same level as the 1800X, or did it come in lower even with the same cooler?
I've obviously missed the briefing. I'm seeing “XFR” bandied about all the time - what is it?
Bluecube
I've obviously missed the briefing. I'm seeing “XFR” bandied about all the time - what is it?
It's an auto-overclock of up to 200MHz on the X series and up to 100MHz on the non-X series.
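Since these numbers get muddled in forum threads, here's the arithmetic spelled out as a tiny sketch (the +200/+100MHz headroom figures are the ones quoted above, not official specs, and the boost clocks used below are illustrative):

```python
# Sketch of how XFR headroom stacks on top of the normal boost clock.
# Headroom figures are the ones quoted in this thread, not official specs.
def max_xfr_clock(boost_mhz: int, x_series: bool) -> int:
    """Highest clock XFR could reach, given enough thermal headroom."""
    headroom = 200 if x_series else 100
    return boost_mhz + headroom

print(max_xfr_clock(3800, x_series=True))   # e.g. a 1700X-style part: 4000
print(max_xfr_clock(3700, x_series=False))  # e.g. a 1700-style part: 3800
```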
Just been looking at some comparison sites (with pretty pictures and big, bold numbers, because that's about my level of understanding on these things)… but there doesn't look to be that much between this 1700 and my years-old 3930K, at least until you get into the multicore stuff, anyway.
It has a higher TDP and a few other differences, but many of its scores are around the same as, or slightly better than, the 1700's.
I'm kinda impressed, actually!!
Certainly won't be adopting Ryzen, at least not just yet. I still need to update my GPU and get a better monitor, but it's good to hear AMD might have started getting at least as good as Intel's current stuff. Last AMD processor I had was a 6350, IIRC… Seems AMD like to stay on the same platform for-EVERRRRRRRRRRRRR, while Intel change theirs faster than the Ttaskmistress changes her mind!!
Ttaskmaster
Just been looking at some comparison sites (with pretty pictures and big, bold numbers, because that's about my level of understanding on these things)… but there doesn't look to be that much between this 1700 and my years-old 3930K, at least until you get into the multicore stuff, anyway.
It has a higher TDP and a few other differences, but many of its scores are around the same as, or slightly better than, the 1700's.
I'm kinda impressed, actually!!
Certainly won't be adopting Ryzen, at least not just yet. I still need to update my GPU and get a better monitor, but it's good to hear AMD might have started getting at least as good as Intel's current stuff. Last AMD processor I had was a 6350, IIRC… Seems AMD like to stay on the same platform for-EVERRRRRRRRRRRRR, while Intel change theirs faster than the Ttaskmistress changes her mind!!
When it comes to games, SMT can currently regress performance quite a bit - so much so that you can get 5% to 15% better gaming performance overall with it disabled.
So the best is still to come - it's a bit of a bummer, but I expect gaming performance will improve down the line.
For non-gaming stuff, some of the reviews are showing great performance, especially under Linux, where even a Core i7 6900K or Core i7 6950X is challenged.
8 cores for the price of 4. Don't mind if I do.
Pretty sad that review sites still focus on single-core performance on a multi-core processor. I guess they have to find something negative to say about AMD.
The results are very positive, and certainly look good for situations that require a bigger core count. I'd like better gaming performance ideally, but with patches, and a more mature process, a few months later things may have evened out a little. Thanks for the review!
Thank you for your sensible QHD (1440P) gaming benchmarks in this review :)
A lot of sites have annoyingly been benchmarking at ridiculously low resolutions like 640x480 or 800x600 in order to show a massive Intel lead. I wish I were joking here, but sadly I am not: this is what sites have really been doing.
Other sites have pointed out that AMD is technically “losing” still at 1080P, but that anything above that shows no discernible difference between AMD and Intel processors (even when paired with the highest-end gaming cards available).
Any chance you could do a review on the cheaper R7 1700 now too?
anselhelm
Thank you for your sensible QHD (1440P) gaming benchmarks in this review :)
A lot of sites have annoyingly been benchmarking at ridiculously low resolutions like 640x480 or 800x600 in order to show a massive Intel lead. I wish I were joking here, but sadly I am not: this is what sites have really been doing.
Other sites have pointed out that AMD is technically “losing” still at 1080P, but that anything above that shows no discernible difference between AMD and Intel processors (even when paired with the highest-end gaming cards available).
Any chance you could do a review on the cheaper R7 1700 now too?
This. I'd really like to see an OC'd 1700 with at least a GTX 1080 at 1440p in BF1, against a 6700K and a 7700K.
anselhelm
Thank you for your sensible QHD (1440P) gaming benchmarks in this review :)
A lot of sites have annoyingly been benchmarking at ridiculously low resolutions like 640x480 or 800x600 in order to show a massive Intel lead. I wish I were joking here, but sadly I am not: this is what sites have really been doing.
Other sites have pointed out that AMD is technically “losing” still at 1080P, but that anything above that shows no discernible difference between AMD and Intel processors (even when paired with the highest-end gaming cards available).
…..
We learn very little about CPU performance by benching in GPU-limited scenarios. I agree that 640x480 is irrelevant, as next to nobody uses those resolutions any more. However, 1080p should always be included in any gaming bench. 1080p accounts for 43.23% of Steam users while 1440p is only 1.81% (http://store.steampowered.com/hwsurvey).
We need results for 1080p and 1440p because it will show CPU performance clearly when you compare the two. By only including 1440p in this review we don't know what to expect when we upgrade our graphics cards. The GTX 1080 Ti is out now. If we swapped one of those with the GTX 1080 in this test suite we would see a clearer difference in CPU performance as the GPU bottleneck is lifted.
And that's what we need to know in this Ryzen 7 1700X review - CPU performance in real-world gaming scenarios.
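The comparison being argued for boils down to a simple rule of thumb: if frame rates barely move when the resolution changes, the GPU wasn't the limit. A hypothetical sketch of that test (the 5% tolerance and the fps figures are arbitrary choices for illustration):

```python
def likely_bottleneck(fps_1080p: float, fps_1440p: float, tol: float = 0.05) -> str:
    """Compare the same CPU+GPU combo at two resolutions.

    If fps is nearly identical at both, the GPU wasn't the limiting
    factor at the lower resolution - the CPU (or something else) was.
    """
    if abs(fps_1080p - fps_1440p) / fps_1080p <= tol:
        return "CPU-bound"   # fps unchanged despite far fewer pixels
    return "GPU-bound"       # fps fell as the pixel count rose

print(likely_bottleneck(144.0, 143.0))  # CPU-bound
print(likely_bottleneck(144.0, 100.0))  # GPU-bound
```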
Ideally we'd make the reviewers work harder by insisting on various resolutions and settings even for CPU reviews.
Some of the higher quality settings do use CPU resources so running at 640x480 or 800x600 in low quality settings doesn't necessarily always show how well a CPU can cope with some theoretical monster drawcall load in the future.
Another thing reviewers do not cover is having any kind of streaming going on (ShadowPlay, ReLive etc.).
Also, multiplayer is very tricky to benchmark. This goes back to BF4 and Mantle, where benchmarks noted only small differences but yet people playing multiplayer noticed huge differences with smoothness.
Someone over on the AT forums posted this:
I don't see benchmarks as an indication of what to expect at X or Y resolution; to me they are an indication of how much better or worse X is when compared to Y. There are too many variables involved with PCs to be able to compare one system with another.
Ordered mine today… ASRock Taichi and a 1700, with 3333 RAM, plus a few other goodies. Now to wait for Vega.
anselhelm
Thank you for your sensible QHD (1440P) gaming benchmarks in this review :)
A lot of sites have been annoying benchmarking at ridiculously-low resolutions like 640x480 or 800x600 in order to show a massive Intel bias. I wish I were joking here, but sadly I am not: this is what sites have really been doing.
Other sites have pointed out that AMD is technically “losing” still at 1080P, but that anything above that shows no discernible difference between AMD and Intel processors (even when paired with the highest-end gaming cards available).
Any chance you could do a review on the cheaper R7 1700 now too?
Hi there,
As soon as we have the R7 1700, which should be in the coming week, it will be on page.
Tarinder
Hi there,
As soon as we have the R7 1700, which should be in the coming week, it will be on page.
According to the XS forums testing, they saw the R7 1700 consume less power than the R7 1700X/1800X at similar voltages?
Any chance you could see if that is true? The hypothesis is that the R7 1700X/1800X are leakier parts made for more extreme cooling, and the R7 1700 should do better under normal cooling.
Tarinder
Hi there,
As soon as we have the R7 1700, which should be in the coming week, it will be on page.
Do you know I kinda miss the Tables of Doom ?
(But then I am an Accountant !)
Firejack
We learn very little about CPU performance by benching in GPU-limited scenarios. I agree that 640x480 is irrelevant as next to nobody uses those resolutions anymore. However 1080p should always be included in any gaming bench. 1080p accounts for 43.23% of Steam users while 1440p is only 1.81% (http://store.steampowered.com/hwsurvey).
We need results for 1080p and 1440p because it will show CPU performance clearly when you compare the two. By only including 1440p in this review we don't know what to expect when we upgrade our graphics cards. The GTX 1080 Ti is out now. If we swapped one of those with the GTX 1080 in this test suite we would see a clearer difference in CPU performance as the GPU bottleneck is lifted.
And that's what we need to know in this Ryzen 7 1700X review - CPU performance in real-world gaming scenarios.
A 1080 at 1080p is not a real-world gaming scenario. If we benchmark at a sensible resolution for the graphics card, we learn whether the CPU is a significant bottleneck, which is what is important - every graphics card has been GPU-limited at the resolution du jour, and this is not likely to change within the next decade. A 1080 Ti at QHD is not realistic, because only an idiot would spend almost £700 on a GPU to run at QHD in this day and age - you'd expect it to perform at 4K, so if a 1080 Ti is used then the tests should be run at 4K.
Xlucine
A 1080 at 1080p is not a real-world gaming scenario. If we benchmark at a sensible resolution for the graphics card, we learn whether the CPU is a significant bottleneck, which is what is important - every graphics card has been GPU-limited at the resolution du jour, and this is not likely to change within the next decade. A 1080 Ti at QHD is not realistic, because only an idiot would spend almost £700 on a GPU to run at QHD in this day and age - you'd expect it to perform at 4K, so if a 1080 Ti is used then the tests should be run at 4K.
I don't fully agree - people do run GTX 1080 cards at 1080p, and from my experience of using a GTX 1080, some games I have run, like Deus Ex: Mankind Divided, can't hit a constant 60FPS at QHD(!).
It's still playable, but I do have framerate dips, and I can understand why people might use the card at 1080p instead, which I personally would not do either, but it's a data point which is still useful.
Planetside 2 is even more ridiculous - I mean, I thought it was just a CPU hog, but some of the shadow settings would hit performance badly on the same card at QHD, so I reduced them to the lowest.
CAT-THE-FIFTH
Any chance you could see if that is true - the hypothesis is that the R7 1700X/1800X are leakier parts made for more extreme cooling and the R7 1700 should do better under normal cooling.
Leakier parts are only really good under LN2 conditions, remember TWKR?
http://www.legitreviews.com/amd-phenom-ii-42-twkr-black-edition-processor_1009

Perhaps the non-X 1700 just has a lower default voltage?
The snag, then, is that Intel still has a commanding lead in IPC, which tends to be a key contributor to in-game performance and some everyday workloads. AMD has closed the gap compared to the woeful FX-series, but single-thread proficiency isn't quite what we had hoped. Putting the PiFast result into perspective, the top two Ryzen chips are still playing catch-up with the dated Intel Core i5-2500K, which scored 20.5 way back in 2011.
That statement strikes me as … too bombastic.
a) All the chips in question have faster clocks (4.2-4.5GHz all-core boost vs. 3.5GHz all-core (3.6 if XFR kicks in) and 3.8GHz 2-core boost for the 1700X). That alone should account for performance deficits of 10% (2-core load vs. the i3-7350K or i5-7600K) to 20% (pretty much everything else at four or fewer cores at full load) in up-to-4-thread workloads.
b) We still haven't figured out the balance between lack of optimization for Ryzen/very specific optimization for Core vs. actual performance differences in specific tasks.
c) PiFast is a single application, and as such it's a stretch to make it the be-all, end-all benchmark for IPC.
Am I saying OMG RYZEN IS TEH BEST U GUISE!!!!1!!? No. But it stands to reason that performance outliers might be corrected by software, driver or microcode updates.
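Point a) is just clock-ratio arithmetic; as a rough sanity check, using the clocks quoted in that point (the 4.2GHz Intel figure is an assumption taken from the low end of the quoted range):

```python
# Clock-speed gaps from point a), expressed as percentage deficits.
intel_2core = 4200       # assumed 2-core boost of the competing Intel parts (MHz)
r7_1700x_2core = 3800    # 1700X 2-core boost quoted above
r7_1700x_allcore = 3500  # 1700X all-core clock quoted above

two_core_gap = (intel_2core - r7_1700x_2core) / r7_1700x_2core
all_core_gap = (intel_2core - r7_1700x_allcore) / r7_1700x_allcore
print(f"{two_core_gap:.0%}")  # ~11%, close to the ~10% claimed
print(f"{all_core_gap:.0%}")  # 20%, matching the upper figure
```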
Valantar
The snag, then, is that Intel still has a commanding lead in IPC, which tends to be a key contributor to in-game performance and some everyday workloads. AMD has closed the gap compared to the woeful FX-series, but single-thread proficiency isn't quite what we had hoped. Putting the PiFast result into perspective, the top two Ryzen chips are still playing catch-up with the dated Intel Core i5-2500K, which scored 20.5 way back in 2011.
That statement strikes me as … too bombastic.
a) All the chips in question have faster clocks (4.2-4.5GHz all-core boost vs. 3.5GHz all-core (3.6 if XFR kicks in) and 3.8GHz 2-core boost for the 1700X). That alone should account for performance deficits of 10% (2-core load vs. the i3-7350K or i5-7600K) to 20% (pretty much everything else at four or fewer cores at full load) in up-to-4-thread workloads.
b) We still haven't figured out the balance between lack of optimization for Ryzen/very specific optimization for Core vs. actual performance differences in specific tasks.
c) PiFast is a single application, and as such it's a stretch to make it the be-all, end-all benchmark for IPC.
Am I saying OMG RYZEN IS TEH BEST U GUISE!!!!1!!? No. But it stands to reason that performance outliers might be corrected by software, driver or microcode updates.
I agree. I think we may even see big improvements in certain benchmarks over the next few months as the platform matures.
For me personally though the bottom line is value for money and this is where Ryzen is the big winner (for the moment).
CAT-THE-FIFTH
Tarinder
Hi there,
As soon as we have the R7 1700, which should be in the coming week, it will be on page.
According to the XS forums testing, they saw the R7 1700 consume less power than the R7 1700X/1800X at similar voltages?
Any chance you could see if that is true? The hypothesis is that the R7 1700X/1800X are leakier parts made for more extreme cooling, and the R7 1700 should do better under normal cooling.
“Leakier parts”… what does this even mean in terms of a CPU?
Stu C
“Leakier parts”… what does this even mean in terms of a CPU?
A leakier part uses more power but is able to tolerate higher voltages. Ideal for suicide runs under LN2 at crazy voltages.
A low leakage part usually can't reach the same max clock but uses less power. Ideal for modest overclocks or undervolting.
TPU's GPU-Z has an ASIC quality readout feature for GPUs, and what it says mostly applies to CPUs too.
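For anyone wondering how leakage shows up in the numbers: CPU power is roughly dynamic switching power plus a static leakage term. A toy model (all constants invented purely for illustration, not measured values for any Ryzen part):

```python
def cpu_power(v: float, f_ghz: float, c_eff: float, i_leak: float) -> float:
    """Total power (W): dynamic C*V^2*f plus static V*I_leak.

    A 'leakier' die has a larger i_leak, so at identical voltage and
    clock it draws more power - matching the XS forums observation.
    """
    return c_eff * v**2 * f_ghz + v * i_leak

low_leak = cpu_power(1.35, 4.0, 10.0, 5.0)    # hypothetical low-leakage 1700-like die
high_leak = cpu_power(1.35, 4.0, 10.0, 12.0)  # hypothetical leakier 1700X/1800X-like die
print(low_leak, high_leak)  # leakier die draws more at the same V and f
```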
Am I missing something, or does even the high-end Zen only have 16 PCIe lanes? Seems a bit small for a new architecture/chipset.
Technically 20, but for anything short of trying to make a supercomputer there's no real performance difference for consumers. SLI/CrossFire will happily drop frames whether or not there's a full x16 link to each card, and the 7700K only has 16.
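For a sense of scale on why x8 vs x16 rarely matters: PCIe 3.0 delivers roughly 985MB/s of effective bandwidth per lane, so even half a slot is a lot of headroom. A back-of-envelope sketch (the per-lane figure is the commonly quoted approximation after encoding overhead):

```python
PCIE3_MBPS_PER_LANE = 985  # approx effective MB/s per PCIe 3.0 lane

def slot_bandwidth_gbs(lanes: int) -> float:
    """Approximate one-direction bandwidth of a PCIe 3.0 slot in GB/s."""
    return lanes * PCIE3_MBPS_PER_LANE / 1000

print(slot_bandwidth_gbs(16))  # ~15.8 GB/s for a full x16 slot
print(slot_bandwidth_gbs(8))   # ~7.9 GB/s for x8 - still rarely saturated
```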
Well this is very interesting… For the first time since I got into this hobby, if I were building a new system today I would probably put an AMD chip in it!
Brilliant news for the consumer, and kudos to AMD for making CPUs interesting again :clapping:
Over to you now Intel…
I would too. Waiting for Intel's response. Hopefully we won't just get a price war, but a performance war too. And I hope AMD doesn't get crushed!
Firejack
However 1080p should always be included in any gaming bench. 1080p accounts for 43.23% of Steam users while 1440p is only 1.81% (http://store.steampowered.com/hwsurvey).
Interesting… I thought 1440p was now the standard and I was one of the few peasant gamers still playing games at 1080p!!
I must be spending too long hanging out with you Enthoo lot…!!
Ttaskmaster
Interesting… I thought 1440p was now the standard and I was one of the few peasant gamers still playing games at 1080p!!
I must be spending too long hanging out with you Enthoo lot…!!
One more peasant gamer here! My PC is mostly 8 years old, except for the graphics card which is about 4 years old. Does the job though and still enjoying the games at 1080p!
Despite the cost advantage of 1080p monitors, I went with 1920x1200 IPS as I really like the extra vertical height for web browsing, documents, etc., and I do prefer it for gaming. I've been pondering switching up to 4K, but am unsure of the lower height aspect ratio.
One aspect of Ryzen that has disappointed me a little was the limited nature of XFR. Perhaps I misread earlier reports on what it was going to do, but it sounded like it would allow a CPU to auto-oc as long as there was thermal headroom; I was surprised it actually has a limit of +100 or +200MHz depending on CPU model. Still, I suppose a genuinely “unlimited” auto-oc would be much more difficult to implement safely, it'd need to have some kind of specified max voltage threshold, etc. Perhaps in the future AMD will try something more flexible, eg. something that makes an individual CCX boost up a lot higher if the current active threads can be pulled locally into one CCX, but that will need improvements in OS tuning.
Overall though, very impressed, and I'm sure single-threaded and gaming performance will improve over time as optimisations come along. If anything it's better than I was expecting, given the way most code has been Intel-focused for such a long time now. That Ryzen can do this well right off the bat is remarkable.
However, I do think some solo pro users might be better off waiting to see what comes along with Naples, and/or how Intel responds. The 64GB RAM limit and lesser PCIe provision of Ryzen are a bit on the low side for some pro tasks, especially AE. Still, it's an excellent step up for anyone stuck on older tech who's been put off by newer Intel pricing (one just has to be content with the stock performance), though I'm not sure it would be that much of a boost for anyone who's using a decent X79 setup (eg. the 1700X is 26% faster than a 3930K @ 4.7 for CB); I plan on building a 1700X rig to run some tests. Definitely good though for anyone planning on a fresh build.
Ian.