AMD is working on a new software feature for its Radeon graphics cards, which it calls "Dynamic Frame Rate Control." Revealed informally to the web by AMD director of PR Chris Hook, who goes by the handle "AMD_Chris" on various forums, Dynamic Frame Rate Control, or DFRC, is a frame-rate limiter that yields power savings when you reduce frame-rates, probably by reducing clock speeds to achieve the desired frame-rates.

Sounds a lot like V-Sync? The way AMD describes it, DFRC is a frame-rate limiter with a slider, whereas V-Sync makes the GPU output frames to match the monitor's refresh rate. When a game runs at, say, 100 FPS and you enable V-Sync to bring that down to 60 FPS, your GPU is still running at 3D-performance clocks, unless the 3D load is so low that the driver decides to change the power state altogether. DFRC probably achieves lower frame-rates by underclocking the GPU, and raising the clocks whenever the scene gets more demanding and the output FPS drops below the target. Hook describes the energy savings with DFRC as "mind blowing." This piques our curiosity.
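AMD hasn't published how DFRC's limiter is implemented, but a conventional software frame-rate cap (the kind third-party tools apply, without DFRC's reported clock scaling) can be sketched as follows. This is an illustrative sketch only; `render_frame` and the numbers are hypothetical:

```python
import time

def run_capped(render_frame, target_fps=60, frames=120):
    """Run render_frame in a loop, idling away any leftover
    frame time so output never exceeds target_fps."""
    frame_budget = 1.0 / target_fps  # seconds allotted per frame
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()  # the GPU work for one frame
        elapsed = time.perf_counter() - start
        if elapsed < frame_budget:
            # Fast frame: sleep out the remainder of the budget.
            # DFRC reportedly goes further and lowers GPU clocks
            # instead of merely idling, which is where the extra
            # power savings would come from.
            time.sleep(frame_budget - elapsed)
```

A driver-level limiter would presumably pace frames in a similar way, but adjust the GPU's power state rather than just sleeping between frames.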

67 Comments on AMD Working on "Dynamic Frame Rate Control" Feature


#1 HTC

Interesting ... but i'll believe it when i see it.



Much was talked about Mantle but it's yet to deliver: perhaps this will be different? Dunno. Posted on Dec 17th 2014, 23:16 Reply

#2 TRWOV

HTC Interesting ... but i'll believe it when i see it.

Much was talked about Mantle but it's yet to deliver: perhaps this will be different? Dunno. Mantle does all AMD said it would. Game developers not using it is another issue altogether.

As for DFRC, sounds interesting. I've yet to upgrade from a 60Hz monitor so this could be big for me. I'm already seeing <350W figures while gaming, but if this can slash 10% or so it would be a welcome improvement. I guess it'll launch along with the Hawaii refresh? Posted on Dec 17th 2014, 23:42 Reply

#3 Sempron Guy

I'm not sure but wasn't DFRC already a feature on Radeon Pro? Posted on Dec 17th 2014, 23:46 Reply

#4 HumanSmoke

TRWOV Mantle does all what AMD said it would. Well AMD said the SDK would also be publicly available by the end of the year. Fourteen days left to uphold that particular promise.

TRWOV As for DFRC, sounds interesting. I've yet to upgrade from a 60Hz monitor so this could be big for me. I'm already seeing <350W figures while gaming but if this can slash 10% or so it would be a welcomed improvement. I guess it'll launch along the Hawaii refresh? Yet another nail in the conventional graphics benchmarking practice coffin. If boost states, "up to X MHz", and 99th percentile latency measurements weren't enough to muddy the water, I'm sure this next round of dialled-in perf v power should send a legion of the OCD afflicted scurrying to the medicine chest. Posted on Dec 17th 2014, 23:52 Reply

#5 Nordic

So amd cards inefficient so amd's solution is to enhance efficiency by reducing fps... is something I would say if I was trolling.



Being serious, I like AMD but I don't see the utility for this software. Posted on Dec 17th 2014, 23:54 Reply

#6 arbiter

james888 So amd cards inefficient so amd's solution is to enhance efficiency by reducing fps... is something I would say if I was trolling.

Being serious, I like AMD but I don't see the utility for this software. Um, isn't this something you can do already with MSI Afterburner/EVGA Precision? You can set an FPS limit in there, which in effect does almost the same thing. About the only use for it would be a laptop, or something like what Nvidia has with Battery Boost. Posted on Dec 18th 2014, 0:08 Reply

#7 SteveS45

This could be ideal for laptops or mobile products. But AMD isn't a big player in that market. Posted on Dec 18th 2014, 0:12 Reply

#8 RejZoR

Only problem is when you need 144fps because of the 144Hz screen... Unless you'll be able to set any kind of limit, including 144fps. Then I'm interested, especially if it can act as V-Sync, but without the annoying lag. Posted on Dec 18th 2014, 0:12 Reply

#10 Xzibit

This is likely a side benefit of a solution to the problem G-Sync suffers from when FPS exceeds the monitor's max refresh rate.

I would think that's what the slider is there for. Adaptive Sync monitors will still be limited to 60Hz, 120Hz & 144Hz like all other monitors. Having a driver cap in Catalyst wouldn't call for a 3rd-party app to limit frame rate, and wouldn't exhibit the increased lag experienced under that scenario. Posted on Dec 18th 2014, 1:20 Reply

#11 DBGT

I have my doubts about the accuracy of this technology Posted on Dec 18th 2014, 1:59 Reply

#12 Sony Xperia S

:D Waiting for these new babies. Allegedly, 20 nm at GF. :D Posted on Dec 18th 2014, 2:28 Reply

#13 buggalugs

I don't think this is designed for power savings, it's just a side effect. It's part of AMD's answer to G-Sync but will work on any hardware (as long as you have AMD graphics)... Posted on Dec 18th 2014, 2:30 Reply

#14 the54thvoid

Remember that Nvidia's adaptive sync can drop to 30fps.

However, dropping frames down is a great idea, but I struggle to see any relevance in AAA games at QHD or UHD resolutions where fps can already be close to the bone.

This level of tech only works if your gfx solution is delivering higher frame rates than are required for a smooth performance.

I wonder if the PR slides use a 295x2 on a 1080p screen.



All being said, it's good they're looking at means of dialling back the power use. Posted on Dec 18th 2014, 2:31 Reply

#15 HumanSmoke

Sony Xperia S :D Waiting for these new babies. Allegedly, 20 nm at GF. :D I wouldn't hold your breath. What are the chances that a single Chinese leaker managed to get hold of not only both AMD's and Nvidia's next top GPUs, but also both AMD's second tier card and Nvidia's salvage part? Access to four unreleased top tier parts across two vendors simultaneously? Can't really see it TBH, especially as not a single other piece of leaked info from any one of these four has surfaced anywhere else.

Looks like an educated guess to me. Posted on Dec 18th 2014, 2:50 Reply

#16 nemesis.ie

As someone with multiple cards (3) I think this is a fantastic feature to have.



I always try to play with vsync on and if you look at the power consumption with it on vs off on many titles there is a substantial difference. If this allows clock rate adjustment to achieve the synced rate it should be even better - if it even went so far as to put "extra" cards into zero-core (power them off) when running a title that doesn't need them to get xx(x)fps that would be excellent too.



Another possible side-benefit would be if the changed clock speed also reduced/eliminated the dreaded coil whine that many cards unfortunately suffer, especially with water blocks on them.



A big thumbs up from me if this does what's claimed, the sooner we get it the better, maybe with Omega release 2? (I can dream). Posted on Dec 18th 2014, 3:53 Reply

#17 Aquinus

Resident Wat-man

nemesis.ie Another possible side-benefit would be if the changed clock speed also reduced/eliminated the dreaded coil whine that many cards unfortunately suffer, especially with water blocks on them. GPU load has very little to do with the whine. Usually a whine is from an improperly secured inductor that's vibrating at a frequency that resonates at 60Hz. If you knew which inductor was doing this, you could fix it yourself.

btarunr Hook describes the energy savings with DFRC as "mind blowing." This piques our curiosity. Too bad that AMD can't put that kind of effort into multi-monitor power usage at idle. :( Posted on Dec 18th 2014, 4:35 Reply

#18 nemesis.ie

Aquinus GPU load has very little to do with the whine. Usually a whine is from an improperly secured inductor that's vibrating to a frequency that resonates at 60Hz. If you knew which inductor was doing this, you could fix it yourself. IME it's also frame output/GPU load related, e.g. it's a lot worse sitting at the menu of an older game title that spits out lots of frames, or e.g. 3DMark, than other things. Others have reported that fine-tuning the clock speed can help, as certain frequencies cause more resonance than others.

In the case of my cards, even with a soft heat pad material/glue around the caps/chokes and between them and the liquid block, they still whine a bit with the water blocks on. I'm very sensitive to it though; others say they can't hear it and think my machine is silent. Posted on Dec 18th 2014, 4:43 Reply

#19 Sony Xperia S

HumanSmoke I wouldn't hold your breath. What are the chances that a single Chinese leaker managed to get hold of not only both AMD's and Nvidia's next top GPUs, but also both AMD's second tier card and Nvidia's salvage part. Access to four unreleased top tier parts across two vendors simultaneously? Can't really see it TBH, especially as not a single other piece of leaked info from any one of these four has surfaced anywhere else.

Looks like an educated guess to me. Doesn't matter what the chances are.

If those deliver at least this performance, or approximately so, then it's fine. Hope to see them soonish. Q1 2015, anyone? :happy: :p :D Posted on Dec 18th 2014, 5:32 Reply

#20 the54thvoid

I've just read the PR guy's quote. He says this:

Being able to limit your FPS to 60fps on a game where it doesn't matter too much if you're able to get 150fps. The power saving were mind blowing.

And then I realised it wasn't a technical software guy. It was another AMD PR guy. I feel sad. I would love to see some hardware info about next gen, something to give us hope, but more and more it's just bloody PR this after PR that.

The irony is, I would buy a 295X2 in a flash if I didn't care about the increased power use (or the possibility of coil whine). Hell, I'd splash out on the Ares version; I think it's tech porn in the extreme. I just don't like where AMD are coming from just now, it all seems words, words, words with little actual substance.

And though it's not required I'll do it for balance: Nvidia pissed their 780ti buyers off by not releasing a 6Gb version (oh, look, another lamo Titan variant instead). They gave the spent underdog 780 a 6Gb version but said "fuck you" to the 780ti crowd. Wait, your Honour, this is relevant to my rebuttal of NV.

It seems Nvidia squeeze the market because they can, and AMD seem to crank up their PR half-wits every couple of weeks.

So close to buying a PS4. Hell, and an XBONE; both together come to the cost of my 780ti Classified.

Screw Red and Green. Posted on Dec 18th 2014, 5:56 Reply

#21 RejZoR

Aquinus GPU load has very little to do with the whine. Usually a whine is from an improperly secured inductor that's vibrating to a frequency that resonates at 60Hz. If you knew which inductor was doing this, you could fix it yourself.

Too bad that AMD can't put that kind of effort into multi-monitor power usages at idle. :( Actually it has. Very high framerate events often cause coil resonance. That's why you hear it more often in main menus or during video sequences. Some games even force 30 or 60fps in menus to avoid that. Posted on Dec 18th 2014, 6:01 Reply

#22 The Terrible Puddle

I don't think I'll be as worked up as Chris. Posted on Dec 18th 2014, 6:57 Reply

#23 Aquinus

Resident Wat-man

RejZoR Actually it has. Very high framerate events often cause coil resonance. That's why you hear it more often in main menus or during video sequences. Some games even force 30 or 60fps in menus to avoid that. Right, but a coil (or many coils) whines because it is not fully secured. Just because more load makes a coil whine more for one reason or another is beside the point. It whines for one reason and one reason only: because a coil is allowed to move and it vibrates. If coils don't vibrate, there is no coil whine. Posted on Dec 18th 2014, 7:25 Reply

#24 RCoon

the54thvoid I've just read the PR guys quote. He says this:

And then I realised it wasn't a technical software guy. It was another AMD PR guy. I feel sad. I would love to see some hardware info about next gen, something to give us hope but more and more it's just bloody PR this after PR that. But if you play Quake 3 on a 295x2 at 800x600 the power savings will be "mind blowing".

#seewhatididthere Posted on Dec 18th 2014, 7:30 Reply

#25 the54thvoid

RCoon But if you play Quake 3 on a 295x2 at 800x600 the power savings will be "mind blowing".

#seewhatididthere I see indeed :toast:

Q3?

DX9, crossfire is broken :laugh:

See what I did there :p

And if you're playing Q3 at 600p, Dr Who wants his Tardis back. :D Posted on Dec 18th 2014, 7:42 Reply