AMD Radeon Technologies Group (RTG) head Raja Koduri, in an interview with VentureBeat, confirmed that the company is currently working on two 14 nm FinFET GPUs based on the "Polaris" (4th-generation Graphics CoreNext) architecture. He referred to the two chips as "Polaris 10" and "Polaris 11," and remarked that both are "extremely power efficient." Koduri walked VentureBeat through what's new with these chips besides the 14 nm process and GCN 4.0 stream processors: a redesigned front-end, new geometry processors, a new multimedia engine, and new display controllers. GCN 4.0 lends the chips up-to-date API support alongside significantly higher performance, the new multimedia engine features native H.265 hardware acceleration, and the display controllers support the latest DisplayPort 1.3 and HDMI 2.0a connectors.

37 Comments on AMD Working on Two "Polaris" GPUs: Raja Koduri


#1 john_

Two GPUs can't cover the whole market, so to begin with AMD is offering a small GPU for those who want something power efficient and good enough for medium-quality 1080p gaming, plus a successor to the Fury X to regain the top spot. The old GPUs will probably remain as rebrands for the sub-$100 market and the price points between $200 and $600. Posted on Jan 18th 2016, 3:05 Reply

#2 HumanSmoke

john_ Two GPUs can't cover the whole market, so for the beginning AMD is giving a small GPU for those who want something power efficient and enough for medium quality 1080p gaming and the successor to Fury X to regain the top spot. Probably the old GPUs will remain as rebrands for the sub $100 market and the prices between $200 and $600. Probably. You would think a high-end GPU would be a priority, if just to maintain marketing momentum, although I suspect it is most definitely needed for the VR push. The other GPU would almost certainly, IMO, be a Pitcairn/Curacao/Trinidad replacement, since it is getting pretty long in the tooth and doesn't have great support for many of the newer GCN features. Posted on Jan 18th 2016, 3:34 Reply

#3 Orijin16

Two GPUs cover the high end and upper mid-range if they just disable a few cores, or use GDDR5 on one and HBM on another, leaving rebrands for the low end / lower mid-range.



When I first heard about Polaris going all out on efficiency I was a bit concerned because efficiency and processing grunt don't often go hand in hand, but they did a pretty good job coming close to Maxwell's efficiency with Fiji, despite being on a much larger process node, so shrinking down to 14nm should give them some pretty good performance per watt even without all the other design tweaks.



I'm pretty excited to see what they release, although, saying that, I was about Fiji too, which turned out to be slightly disappointing and could/should have been much better than it is. Posted on Jan 18th 2016, 4:15 Reply

#4 julizs

Two new GPUs will be enough.



1. R9 490X

R9 490 (disable some cores)



2. R9 480X

R9 480 (disable some cores)



Rebrand the rest if you need to, not interested in those anyway... Posted on Jan 18th 2016, 4:44 Reply

#5 NC37

julizs Two new Gpu's will be enough.



1. R9 490X

R9 490 (disable some cores)



2. R9 480X

R9 480 (disable some cores)



Rebrand the rest if you need to, not interested in those anyway... Sadly this will likely be so.



Course they only really need 2 if they get it right. Heck in the old days ATI got by on about that much at times. Posted on Jan 18th 2016, 4:52 Reply

#6 julizs

NC37 Sadly this will likely be so.



Course they only really need 2 if they get it right. Heck in the old days ATI got by on about that much at times. Well, it can only get better.





The R9 300 series was initially a big disappointment: too many rebrands, and the new cards came way too late and were too expensive.

Imo the 900 series was also a disappointment. The 960 is a turd, and the 970 was good performance per dollar but turned out to have 3.5 GB of usable RAM and also had a buzzing problem. The 980 was €200 more than the 970 with almost identical performance. The 980 Ti later on was cool.





My expectations are at an all-time low, but I think both AMD and Nvidia are gonna deliver this time. Posted on Jan 18th 2016, 5:05 Reply

#7 micropage7

Personally, I want a lower-power-consumption card with middling performance.

just wait.... Posted on Jan 18th 2016, 5:06 Reply

#8 RejZoR

I think I'll be switching to AMD again when this comes out. These Polaris GPUs sound amazing! A proper evolution of what the R9 Fury X should have been when it was released. Posted on Jan 18th 2016, 5:27 Reply

#9 Aquinus

Resident Wat-man GCN 4.0? Wasn't the last version 1.3, not 3.0? Posted on Jan 18th 2016, 5:50 Reply

#10 medi01

Did they ever have more than 2?



Wasn't it even just one back in the 4xxx days? (with an X2 for the flagship) Posted on Jan 18th 2016, 6:42 Reply

#11 okidna

Aquinus GCN 4.0? Wasn't the last version 1.3 not 3.0? There's a difference between AMD's internal naming scheme and the press/tech-site naming scheme regarding GCN.

AMD uses a "generation" naming scheme, while the press/tech sites prefer an "x.x" naming scheme.



And each has a reason for its preferred scheme: press/tech sites consider GCN's progression an evolution or incremental update, hence the small increments in the version number (1.0, 1.1, 1.2), while AMD considers every update a milestone worthy of a "new generation" name.



GCN 1.0 = GCN 1st generation (Southern Islands)

GCN 1.1 = GCN 2nd generation (Sea Islands)

GCN 1.2 and 1.3 (if you consider Fiji) = GCN 3rd generation (Volcanic Islands)

GCN ?.? (probably 1.3/1.4 or 2.0) = GCN 4th generation (Arctic Islands) Posted on Jan 18th 2016, 7:31 Reply

#12 Deeveo

Don't they usually start with smaller chips on a new process node, especially when combined with a new (improved?) architecture? But I really hope both succeed and we see some fierce competition on performance to keep those prices in check. Posted on Jan 18th 2016, 7:53 Reply

#13 jabbadap

HumanSmoke Probably. You would think a high-end GPU would be a priority if just to maintain marketing momentum although I suspect it is most definitely needed for the VR push. The other GPU would almost certainly IMO be a Pitcairn/Curacao/Trinidad replacement since it is getting pretty long in the tooth and doesn't have great support for many of the newer GCN features Yep, AMD needs a new low/mid-range GPU. Think about notebooks: most notebook discrete GPUs right now are Maxwell-based. Pitcairn, Bonaire, and Tonga (and their respective aka names) can't really compete with Maxwell on perf/W. Posted on Jan 18th 2016, 8:05 Reply

#14 Moofachuka

That's the same as Fury... there was a normal Fury and a Fury X chip. They don't need to work on the low end, since the current gen will move down to the lower range (which, in other words, means rebranding). Posted on Jan 18th 2016, 8:47 Reply

#15 deemon

Notebooks will use APUs, not GPUs. Posted on Jan 18th 2016, 8:57 Reply

#16 xfia

deemon notebooks will use apu-s, not gpu-s The high end of the mid-range is definitely Carrizo. The top-tier R7 model is just as good as the game consoles at 1080p. Posted on Jan 18th 2016, 9:17 Reply

#17 jabbadap

deemon notebooks will use apu-s, not gpu-s I said discrete; APUs have an iGPU, but even many AMD APU notebooks have a discrete Radeon with them (hybrid CrossFire or not). And do you suggest it's wise for AMD to leave every Intel-inside laptop to Nvidia?



And the myth that iGPUs will rule the world is just a myth. It's a moving target: game graphics will evolve to require better and better GPUs, which a tiny iGPU can't handle. Posted on Jan 18th 2016, 9:48 Reply

#18 GhostRyder

I guess by that standard we will get four total cards on the new architecture. Well, that is all fine and dandy as long as we get some serious updates this round with the new lineup. We need some cards with 8 GB of HBM, new supported tech like DP 1.3 and such, along with (hopefully) a little better overclocking. I will be purchasing this year based on what's released, so I'll pick my cards based on who has the top dog. Posted on Jan 18th 2016, 9:51 Reply

#19 xfia

jabbadap I said discrete, apus have igpu but even many amd apu netbooks have discrete radeon with them(hybrid crossfire or not). And do you suggest it's vice to amd to leave every intel inside laptop to nvidia.



And the myth that igpus will rule the world is just a myth. It's moving target, game graphics will evolve to require better and better gpus, which tiny core igpu can't handle. AMD throws down 3x the graphics performance of Intel when it comes to APUs... all that Intel-inside stuff with no NV to back it up is kinda garbage. On top of that, HSA is practically made to be best friends with DX12. If they make both sides of the next APUs on 14 nm FinFET, the performance increase will certainly bring them up another tier... if not two. Posted on Jan 18th 2016, 9:54 Reply

#20 jabbadap

xfia AMD throws down 3x the graphics performance as Intel when it comes to apu's... all that Intel inside stuff with no NV to back it up is kinda garbage. Iris Pro in notebooks is quite similar performance-wise to the most powerful Carrizo iGPU, so I don't think your graphics-performance figure is up to date. AMD APUs need more memory bandwidth, while Intel uses dedicated eDRAM from Micron to address the memory-bandwidth problem. Next-gen APUs will hopefully address this with HBM. Posted on Jan 18th 2016, 10:09 Reply

#21 xfia

jabbadap Iris pro in notebooks are quite similar performance vice as most powerful carrizo igpu so I don't think your graphics performance is up to date. AMD apus needs more memory bandwidth, while intel uses dedicated edram from micron to address memory bandwidth problems. Next gen APU's will hopefully address this with hbm. Yeah, I read a leak, but with the 35 W power target considered, eh... and best-in-class online gaming:

www.notebookcheck.net/AMD-Radeon-R7-Carrizo-Benchmarks.144288.0.html

Still, tho, I expect FinFET to supercharge AMD APUs:

www.forbes.com/sites/patrickmoorhead/2015/06/11/for-advanced-micro-devices-it-was-all-about-carrizo-at-computex-2015/#2715e4857a0b16c05f4b3d7a Posted on Jan 18th 2016, 10:29 Reply

#22 jabbadap

xfia yeah i read a leak but with the 35w power target considered eh..

www.notebookcheck.net/AMD-Radeon-R7-Carrizo-Benchmarks.144288.0.html

still tho i expect finfet to super charge AMD apu's Well yeah, 14 nm APUs will be the thing. They can put more shaders into the iGPU, use HBM memory, and possibly Zen CPU cores (hopefully they don't just wait for Zen to be ready if it takes too much time; a die-shrunk Excavator could be a good filler product).



Spoiler: Uhh slightly going offtopic, sorry about that...

www.notebookcheck.net/Intel-Iris-Graphics-540.149939.0.html Heh, I know Intel and AMD watts are different, but a 15 W CPU (the Intel Core i7-6650U from the MS Surface Pro 4): Posted on Jan 18th 2016, 11:28 Reply

#23 xfia

I keep seeing the commercials for the Surface Pro 4. I love the new pen and how it folds up. I would get one for some video editing... Netflix on the go and stuff, with a Lumia to match. Off topic, but I'm tired of Android. Posted on Jan 18th 2016, 11:43 Reply

#24 Deeveo

An APU with 8 GB of HBM could be a thing of wonder; you could make a soldered version on a motherboard with no memory slots at all. El cheapo system, eh? Posted on Jan 18th 2016, 11:44 Reply

#25 xfia

Deeveo APU with 8GB HBM could be a thing of wonder, you could make solder version with a mb with no memory slots at all. El cheapo system eh? I think that's exactly what the next XB and PS will have... they need HBM to drive 4K the way they want to. I also think the XB1 and PS4 will be thought of as the 1080p systems for the same games for a while, until devs start taking advantage of the new features the new consoles will be packed with. Posted on Jan 18th 2016, 12:03 Reply