
3DMark to be updated with DirectX 12 vs Mantle tests

by Mark Tyson on 1 December 2014, 12:05

Tags: AMD (NYSE:AMD), NVIDIA (NASDAQ:NVDA), Futuremark, Ubisoft (LON:UBI), PC

Quick Link: HEXUS.net/qacl7r


One of the key tests in the HEXUS graphics card review and assessment suite is Futuremark's 3DMark. This synthetic benchmark is updated quite often, and we hear that a significant update coming in the not-too-distant future will include 'Farandole', a test to compare the performance of the DirectX 12 and Mantle APIs.

According to a 3DMark roadmap shown at AMD's recent 'Future of Compute' event, the DirectX 12 vs Mantle API comparison tests will come in a version of 3DMark to be released next year, codenamed 'Dandia'. It is understood that both DX12 and Mantle can handle at least 7.5 times as many draw calls as DX11. Thus new tools to measure the performance of these APIs at their limits will be useful until we see a good selection of applications and games that use them in the 'real world'.

Despite there being a relatively long time until 2016, Futuremark showed a slide of an early 'Farandole' test, which will be part of the 'Dandia' software release. The draw call feature test performance estimates in the slide put both DX12 and Mantle at about 7.5 times the performance of DX11. The small print in the slide reads: "These figures represent very early tests at the Beta Stage of drivers and APIs and do not represent the final performance that may be achieved after significant development."
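
To put those draw call figures in context, an API overhead test of this kind typically just keeps submitting trivially simple draw calls every frame, ramping the count up until the CPU and driver can no longer hold a target frame rate. The C++ sketch below illustrates that idea only; it is not Futuremark's code, and submit_draw_call() is a hypothetical stand-in for a minimal real API submission such as ID3D11CeviceContext-style DrawIndexed() on DX11, or the DX12/Mantle equivalent.

    // Illustrative sketch only, not Futuremark code: one way a draw call
    // overhead feature test can be structured. submit_draw_call() is a
    // hypothetical stand-in for a minimal real API submission (e.g. a tiny
    // indexed draw) on DX11, DX12 or Mantle.
    #include <chrono>
    #include <cstdio>

    static volatile long long g_sink = 0;

    static void submit_draw_call()
    {
        // A real test would record/submit a tiny draw here; the volatile write
        // just stops the compiler optimising this placeholder loop away.
        g_sink = g_sink + 1;
    }

    int main()
    {
        using clock = std::chrono::steady_clock;
        const double frame_budget = 1.0 / 30.0;   // stop once we can no longer hold 30fps
        long long draws_per_frame = 1024;

        for (;;)
        {
            const auto start = clock::now();
            for (long long i = 0; i < draws_per_frame; ++i)
                submit_draw_call();
            // a swap-chain present would happen here in a real renderer
            const double frame_time =
                std::chrono::duration<double>(clock::now() - start).count();

            if (frame_time > frame_budget)
            {
                std::printf("Sustained roughly %lld draw calls per 30fps frame "
                            "(%.1f million calls/s)\n",
                            draws_per_frame, draws_per_frame / frame_time / 1e6);
                return 0;
            }
            draws_per_frame *= 2;   // keep ramping until API/driver overhead bites
        }
    }

Run against a lower-overhead API, the same loop simply reaches a much higher draws_per_frame figure before the frame budget is blown, which is essentially what the roughly 7.5x DX12/Mantle-versus-DX11 numbers on the slide express.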

It's interesting to see these tests concerning draw calls in the wake of the Ubisoft Assassin's Creed: Unity problems. During its squirming, Ubisoft said that the graphics performance of the game was being "adversely affected by certain AMD CPU and GPU configurations". However other sources thought the nub of the problem was the game issuing "tens of thousands of draw calls -- up to 50,000 and beyond," causing a juddery experience, even on the best kitted out PCs running DirectX 11. So Ubi could have created a perfectly smooth gaming experience using the currently available AMD Mantle API...

Back to the Futuremark tests. The roadmap shows that 3DMark Sky Diver was codenamed 'Balboa' before its release, so the names given here aren't necessarily the final public names. On the subject of possible bias, Futuremark told WCCFTech that "AMD, Intel, Microsoft, NVIDIA …all have the opportunity to inspect the source code and suggest changes. As a result, we're confident that the 3DMark API Overhead feature test will be the fairest way to compare these new APIs."



HEXUS Forums :: 23 Comments

To be released in 2016?

Crikey, anything could happen between now and then… Don't think I'll hold off on any decisions about DX vs Mantle!
Sorry, as you can see in the roadmap, it looks like mid-2015
shaithis
To be released in 2016?

Crikey, anything could happen between now and then… Don't think I'll hold off on any decisions about DX vs Mantle!
It's interesting to see these tests concerning draw calls in the wake of the Ubisoft Assassin's Creed: Unity problems. During its squirming, Ubisoft said that the graphics performance of the game was being “adversely affected by certain AMD CPU and GPU configurations”. However other sources thought the nub of the problem was the game issuing “tens of thousands of draw calls – up to 50,000 and beyond,” causing a juddery experience, even on the best kitted out PCs running DirectX 11. So Ubi could have created a perfectly smooth gaming experience using the currently available AMD Mantle API…

It's nothing to do with Mantle, DX12, or any other API that you want to throw in there.

Ubi have simply failed to make the game to the standard required to run well on current systems. Even when it does run, it's not just the framerate; it's very buggy.

You make a game to the requirements of today. If systems can't handle it, you scale it back. Serious texture issues and such are largely AMD-based (and are not present on Nvidia hardware), but this isn't the fault of AMD; this is firmly the issue of Ubisoft. They failed to do QA on AMD hardware; it's simply impossible these issues could have been missed otherwise.

To claim they could have created a “perfectly smooth gaming experience” by using a different API (Mantle or anything else) is madness. The issue is bad coding and QA. It's nothing to do with the API used. The game should be perfectly playable on AMD and Nvidia hardware using DX, full stop. They failed at it. Another API doesn't solve these issues, it only gives them more headroom to make the same mistakes again.
Agent
snip

To claim they could have created a “perfectly smooth gaming experience” by using a different API (Mantle or anything else) is madness. The issue is bad coding and QA. It's nothing to do with the API used. The game should be perfectly playable on AMD and Nvidia hardware using DX, full stop. They failed at it. Another API doesn't solve these issues, it only gives them more headroom to make the same mistakes again.

I was just 'pondering' various sources of info rather than claiming it was the truth; I just thought it was interesting as Ubi blamed AMD…
mtyson
I just thought it was interesting as Ubi blamed AMD…

Where?

All I can find is the following statement:

We are aware that the graphics performance of Assassin’s Creed Unity on PC may be adversely affected by certain AMD CPU and GPU configurations. This should not affect the vast majority of PC players, but rest assured that AMD and Ubisoft are continuing to work together closely to resolve the issue, and will provide more information as soon as it is available.

That doesn't sound like they are blaming AMD to me, but they are acknowledging that there are issues on those configurations. Working with AMD to solve them isn't pinning blame.

Is there a different statement that assigns blame?