
AMD publishes video showing FreeSync in action

by Mark Tyson on 4 July 2014, 10:56

Tags: AMD (NYSE:AMD), VESA, NVIDIA (NASDAQ:NVDA), PC

Quick Link: HEXUS.net/qacgeb


We are still seeing Nvidia G-SYNC monitors being launched; AOC outed a new model just over a week ago, and we recently had notice of Philips launching one at the upcoming IFA show in September. However, from what we've seen, the G-SYNC hardware adds quite a premium to the monitor price, almost doubling the cost of a 24-inch gaming monitor. Because of this, we are eagerly awaiting the market debut of the upcoming VESA 'Adaptive-Sync' open standard, which requires no extra proprietary (expensive) hardware.

"FreeSync capable monitors are expected soon"

'Adaptive-Sync' is part of the DisplayPort 1.2a video interface standard which will allow AMD to implement its Project FreeSync. AMD has published a new video showing its FreeSync demonstration from last month's Computex show in Taipei. The video notes describe the value of FreeSync as follows, "FreeSync allows users to eliminate screen tearing and stutter by synchronizing the monitor's refresh rate to the graphics card — even with a monitor that uses only industry standard technology, which requires no additional hardware." Furthermore we are told that "Several AMD Radeon graphics cards support FreeSync now, and FreeSync capable monitors are expected soon."
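The mechanism the video notes describe can be sketched in rough terms: a fixed-refresh panel with v-sync repeats the previous frame whenever the GPU misses a refresh window (perceived as stutter), while an adaptive-sync panel waits for the new frame, so any frame time inside the panel's supported refresh range displays cleanly. The following is a minimal, hypothetical simulation of that idea — the frame times and refresh limits are invented for illustration and are not AMD's figures:

```python
# Hypothetical sketch (not AMD's implementation) comparing how a
# fixed-refresh monitor and an adaptive-sync monitor handle the same
# sequence of variable GPU frame times.

def fixed_refresh_stutters(frame_times_ms, refresh_ms=16.7):
    """Count frames that miss the ~60Hz refresh window on a fixed-refresh
    panel with v-sync: the previous frame is shown again (stutter)."""
    return sum(1 for t in frame_times_ms if t > refresh_ms)

def adaptive_refresh_stutters(frame_times_ms, min_ms=6.9, max_ms=33.3):
    """With adaptive sync the panel waits for the frame, so any frame
    time within the panel's supported range (here ~30-144Hz, invented
    numbers) is displayed without a repeat."""
    return sum(1 for t in frame_times_ms if not (min_ms <= t <= max_ms))

# Frame times in milliseconds under varying rendering load.
frames = [14.0, 18.2, 25.5, 15.1, 30.0, 12.3]

print(fixed_refresh_stutters(frames))     # three frames miss the 16.7ms window
print(adaptive_refresh_stutters(frames))  # all frames fall inside the panel range
```

The point of the sketch is simply that with a fixed clock, any frame slower than the refresh interval causes a visible hitch, whereas an adaptive panel absorbs that variation as long as the frame rate stays within its supported range.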

AMD's video notes say the system specs for the demo were as follows:

  • AMD FX-8350 8-core CPU
  • AMD Radeon R9 290X
  • ASUS Sabertooth 990FX R2.0 Motherboard
  • OCZ Vector 150 240GB SSD
  • 8GB DDR3 AMD Performance RAM
  • DisplayPort driven LED Display @ 2560x1440 – hacked to support FreeSync
  • Windows 8.1

As we have reported before, AMD products supporting FreeSync via DisplayPort 1.2a include: the AMD Radeon R9 290X, R9 290, R7 260X and R7 260 graphics cards, plus the AMD APUs codenamed 'Kabini', 'Temash', 'Beema' and 'Mullins'.

Hopefully Nvidia will release Adaptive-Sync capable drivers too, so green team enthusiasts don't have to stump up so much money for the comparable G-SYNC feature in the few monitors that support it.



HEXUS Forums :: 10 Comments

I wonder how this sort of thing works. 1. Nvidia comes up with G-SYNC; 2. someone over at AMD thinks, why didn't we think of that, and makes its alternative using their existing knowledge or 3. they “study” G-SYNC and then make their own version. I'm assuming Nvidia came up with it first of course.

Also, why bother buying a FreeSync capable monitor if certain cards already support it? They've not done a great job of distinguishing between the two options. I assume the monitor would be for people with weaker cards, but then why pay money for an expensive monitor when you could put that money towards a better card?
Vigil
I wonder how this sort of thing works. 1. Nvidia comes up with G-SYNC; 2. someone over at AMD thinks, why didn't we think of that, and makes its alternative using their existing knowledge or 3. they “study” G-SYNC and then make their own version. I'm assuming Nvidia came up with it first of course.

Also, why bother buying a FreeSync capable monitor if certain cards already support it? They've not done a great job of distinguishing between the two options. I assume the monitor would be for people with weaker cards, but then why pay money for an expensive monitor when you could put that money towards a better card?


There's an interview with an AMD guy (I can't remember where) that seemed to imply that Nvidia just got there first, but they were already working on it. Either way, G-Sync has physical hardware which inevitably adds cost. FreeSync uses parts of the VESA spec (which was more to do with TVs and laptops) to do this. Note, GCN 1.1 has support for this in its hardware, so they were clearly thinking about it.
It seems as though the changes for monitor manufacturers are more firmware than hardware based, so the initial cost may just be a supply and demand thing rather than the considerable licensing and manufacturing costs of G-Sync's case. Only the 260X and 290 support it, so I think the next round of AMD cards will be when this starts to kick off properly.

In the case of Gsync you probably are just better off buying a better card and locking in at 60hz rather than paying to protect from effects of dropping under. Save for the few who already have top end cards and want to run 144hz monitors.
Vigil
I wonder how this sort of thing works. 1. Nvidia comes up with G-SYNC; 2. someone over at AMD thinks, why didn't we think of that, and makes its alternative using their existing knowledge or 3. they “study” G-SYNC and then make their own version. I'm assuming Nvidia came up with it first of course.

Also, why bother buying a FreeSync capable monitor if certain cards already support it? They've not done a great job of distinguishing between the two options. I assume the monitor would be for people with weaker cards, but then why pay money for an expensive monitor when you could put that money towards a better card?

You have completely missed the point, I think.

The real difference between G-Sync and FreeSync (AFAIK) is that G-Sync is proprietary and requires specific hardware, whereas FreeSync doesn't (to a degree). FreeSync is part of DisplayPort 1.2a and as such will be widely adopted.

Also, looking at the AMD cards that already support FreeSync (290X/290 and 260X/260), AMD has already shown it won't limit this to its high-end cards. Thankfully Nvidia does support some of its lower cards, but why would you pair a £100 card with a £350-400 monitor? Whereas because FreeSync is built into DisplayPort, there isn't a ridiculous overhead for it.

Saying that, because Nvidia has such control over G-Sync, you could argue that's a better position to be in as a gamer, as Nvidia can update and maintain the standard far faster than FreeSync (I'd imagine).

Either way, this is good news for everyone, Huzzah for no more screen tearing!!!
“FreeSync allows users to eliminate screen tearing and stutter by synchronizing the monitor's refresh rate to the graphics card — even with a monitor that uses only industry standard technology, which requires no additional hardware.”

I don't get screen tearing and stutter, because I use something video cards have had for over a decade: v-sync, with triple buffering.

So what's the point of this?
This is only for DisplayPort monitors. DisplayPort is different because the screen's refresh rate clock is generated in the monitor itself, instead of by the video card. This will never work for HDMI screens.