The GeForce GTX 1080 set high standards for efficiency. Launched as a high-end product that was faster than any other client-segment graphics card at the time, the GTX 1080 made do with just a single 8-pin PCIe power connector and had a TDP of just 180 W. The reference-design PCB accordingly has a rather simple VRM setup. The alleged GTX 1080 successor, called either GTX 1180 or GTX 2080 depending on who you ask, could deviate from this philosophy of extreme efficiency. There were telltale signs of such a departure in the first bare PCB shots. The pictures revealed preparation for an unusually strong VRM design for an NVIDIA reference board: it draws power from a combination of 6-pin and 8-pin PCIe power connectors, and features a 10+2 phase setup, with up to 10 vGPU and 2 vMem phases. The size of the pads for the ASIC, and provision for no more than 8 memory chips, confirmed that the board is meant for the GTX 1080 successor. Adding to the theory of this board running unusually hot is an article by Chinese publication Benchlife.info, which mentions that the reference-design (Founders Edition) cooling solution does away with the single lateral blower, and features a strong aluminium fin-stack heatsink ventilated by two top-flow fans (like most custom-design cards). Given that NVIDIA avoided such a design even for big-chip cards such as the GTX 1080 Ti FE and the TITAN V, the GTX 1080 successor is shaping up to be an interesting card to look forward to. Then again, what if this is the fabled GTX 1180+ / GTX 2080+, slated for late September?
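As a rough sanity check on the connector talk above, the nominal PCIe power-delivery ceilings can simply be summed: 75 W from the x16 slot, 75 W per 6-pin and 150 W per 8-pin auxiliary connector. The sketch below (the function name and parameters are our own, purely for illustration) shows why a 6-pin + 8-pin board hints at a higher power target than the GTX 1080's single 8-pin:

```python
# Nominal PCIe power-delivery limits (spec ceilings, not measured draw).
SLOT_W = 75   # PCIe x16 slot
PIN6_W = 75   # 6-pin auxiliary connector
PIN8_W = 150  # 8-pin auxiliary connector

def board_power_budget(six_pin: int = 0, eight_pin: int = 0) -> int:
    """Nominal maximum board power in watts for a given connector mix."""
    return SLOT_W + six_pin * PIN6_W + eight_pin * PIN8_W

# GTX 1080 reference: slot + one 8-pin -> 225 W ceiling for a 180 W TDP
print(board_power_budget(eight_pin=1))             # 225
# Rumored successor: slot + 6-pin + 8-pin -> 300 W ceiling
print(board_power_budget(six_pin=1, eight_pin=1))  # 300
```

Actual TDPs sit well below these ceilings to leave headroom for transient spikes, so the extra 6-pin alone doesn't prove a 300 W card; it only shows the board is provisioned for noticeably more than 225 W.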

75 Comments on NVIDIA GTX 1080-successor a Rather Hot Chip, Reference Cooler Has Dual-Fans


#1 Midland Dog

i doubt it, if titan v runs at similar power to titan xp then there is no way that these new cards are going to be "hot, or hungry". my guess is that the dual fan reference design is to entice people to buy the founders edition cards before the aib cards drop Posted on Aug 10th 2018, 5:25

#2 INSTG8R

Vanguard Beta Tester Regardless of why, the fact they are not using the ancient blower style anymore is a step in the right direction. Posted on Aug 10th 2018, 5:35

#3 krykry

Midland Dog i doubt it, if titan v runs at similar power to titan xp then there is no way that these new cards are going to be "hot, or hungry". my guess is that the dual fan reference design is to entice people to buy the founders edition cards before the aib cards drop

Remember that Titan V gets a lot of power efficiency from HBM2, which uses about a third of the power GDDR5 uses for the same performance, and doesn't have a significant increase in the amount of cores. So HBM2 efficiency covers the computing cores increase... which means that Titan V and Titan Xp perf/watt are roughly the same. Posted on Aug 10th 2018, 5:55

#4 Caring1

Hotter and hungrier, just to stay ahead of the competition by a few points in Benchmarks.

The way I see it, better efficiency and lower power consumption wins; let's see which uses the most at the wall. Posted on Aug 10th 2018, 6:09

#5 FordGT90Concept

"I go fast!1!11!1!" Or they're just not doing Founder's Edition for the series. NVIDIA has had plenty of time to stockpile chips and send them to AIBs so there's no delay between launch date and AIB availability. Alternatively, it could be an AIB card that was described that NVIDIA may have bulk ordered to carry the Founder's Edition badge (e.g. Asus, MSI, or Gigabyte as an informal apology for GeForce Partner Program). Posted on Aug 10th 2018, 6:19

#6 cucker tarlson

Caring1 Hotter and hungrier, just to stay ahead of the competition by a few points in Benchmarks.

Their competition is a +300W card that offers 200W GTX 1080 performance. I think they just want to sell more FE cards; this time they're offering a solution similar to AIB coolers, and they'll probably offer a blower one as well. As for the "hot and hungry": an 8-pin works for 150-180W cards fairly well, but once you go near 200W you'd better have that extra 6-pin, since max power spikes are always higher than average power consumption. It's gonna be a 1080 Ti with GDDR6 and a slightly more efficient process, 10-15% faster, with TDP probably halfway between the 1080 and 1080 Ti. Posted on Aug 10th 2018, 7:26

#7 Prima.Vera

That's an MSI card in the pic, but to be honest it would be awesome to have something like this on the factory card. Those coolers are the best both performance and noise wise. Posted on Aug 10th 2018, 8:22

#8 stimpy88

If this is true, and it's running hot, then it looks like nVidia is struggling to innovate, and is instead relying on overclocking the chip to get performance...



nVidia has had a long time with no pressure on them to make this "new" GPU, so this is rather telling, if true. Posted on Aug 10th 2018, 9:27

#9 RejZoR

NVIDIA is not struggling to innovate. They just don't really care at this point as they can milk some more money out of old Pascal with minimal effort. I mean, really, why would they throw in millions if they can just stretch Pascal a bit and call it a day. So what if it's hot, it'll be faster than anything one can buy for gaming and that's enough for some. I mean, whoever is buying the fastest Ferrari, petrol consumption is the last thing they care about. It's not much different here. All this efficiency is all nice and fancy, but if you have the fastest card in the world, would you really care? I know I wouldn't. Posted on Aug 10th 2018, 9:34

#10 Liviu Cojocaru

RejZoR NVIDIA is not struggling to innovate. They just don't really care at this point as they can milk some more money out of old Pascal with minimal effort. I mean, really, why would they throw in millions if they can just stretch Pascal a bit and call it a day. So what if it's hot, it'll be faster than anything one can buy for gaming and that's enough for some. I mean, whoever is buying the fastest Ferrari, petrol consumption is the last thing they care about. It's not much different here. All this efficiency is all nice and fancy, but if you have the fastest card in the world, would you really care? I know I wouldn't.

I agree, basically the same as Intel did with the CPUs. As for the two-fan setup, I too think it might be for the more powerful version coming out later in the year. Posted on Aug 10th 2018, 9:40

#11 Xzibit

This reminds me of when TT "industry sources" said the 1180 & 1170 were going to be released in July Posted on Aug 10th 2018, 9:49

#12 tami626

Liviu Cojocaru I agree, basically the same as Intel did with the CPU's.

So, does it mean hope for AMD? :D Posted on Aug 10th 2018, 9:51

#13 softreaper

If a blower could cool Fermi, it could cool anything; it's just a question of design/cost or something else. Posted on Aug 10th 2018, 9:58

#14 Liviu Cojocaru

tami626 So, does it mean hope for AMD? :D

That might be an option as well but I have a G-Sync monitor so... Posted on Aug 10th 2018, 9:59

#15 the54thvoid

stimpy88 If this is true, and it's running hot, then it looks like nVidia is struggling to innovate, and is instead relying on overclocking the chip to get performance...



nVidia has had a long time with no pressure on them to make this "new" GPU, so this is rather telling, if true.

Nobody knows for sure what's in the chip yet. I don't see the OC as the issue. It all depends what they've put in the hardware. By all accounts (metaphorically speaking) Nvidia are bringing a higher degree of compute back to their chips. And that's a hotty right there. Posted on Aug 10th 2018, 10:23

#16 bug

Caring1 Hotter and hungrier, just to stay ahead of the competition by a few points in Benchmarks.

The way I see it, better efficiency and lower power consumption wins, lets see which uses the most at the wall.

AMD can't match Pascal in either efficiency or raw pixel crunching. I'd say there's some room for Nvidia to up the power draw while still keeping its leadership. I tend to be rather laid back when it comes to high-end cards because I don't buy them. I buy mid-range, and the power draw/efficiency is much better in that segment regardless of what happens at the top. Obviously ymmv.

tami626 So, does it mean hope for AMD? :D

Your competition raking in cash because you underperform never equals hope. AMD's hope is that Zen can make them enough cash to fund their GPU game, just like their GPU game kept them afloat during the Bulldozer days. So there is hope for AMD, but it has nothing to do with this announcement right here. Posted on Aug 10th 2018, 10:23

#17 Vya Domus

They are going with the same strategy that they had with the second generation of Kepler-based GPUs: larger dies with higher frequencies and no tangible advancement in power efficiency. They can't rely on new nodes every time. Posted on Aug 10th 2018, 10:54

#18 medi01

Why does the wording in the OP sound like it was written by an nVidia employee working in the marketing department?

tami626 So, does it mean hope for AMD? :D

Elaborate on "hope for AMD", I might have missed the problems they had recently, looking at their stock.

RejZoR NVIDIA is not struggling to innovate. They just don't really care at this point as they can milk some more money out of old Pascal with minimal effort. I mean, really, why would they throw in millions if they can just stretch Pascal a bit and call it a day.

First, it's billions, and second, because developing GPUs takes many months and there are likely numerous projects at nVidia that have never been seen by consumers. It was obvious that Pascal had thermal headroom. It was also obvious that a 20-25% bump from an arch bump on the same process node would not have come for free. Posted on Aug 10th 2018, 11:00

#19 uuuaaaaaa

This is allegedly leaked info by an nVidia employee: Posted on Aug 10th 2018, 11:24

#20 tami626

Liviu Cojocaru That might be an option as well but I have a G-Sync monitor so...

Well, I guess competition is good for everyone, even if you buy NVIDIA exclusively. Look what Intel did after Ryzen. They stole the MOAR CORES tactics. Posted on Aug 10th 2018, 11:36

#21 Liviu Cojocaru

tami626 Well, I guess competition is good for everyone, even if you buy NVIDIA exclusively. Look what Intel did after Ryzen. They stole the MOAR CORES tactics.

Competition is always good for the technology world and consumers in general... unfortunately, sometimes companies that should be in competition arrange prices and the customer has to suffer for it. Posted on Aug 10th 2018, 11:46

#22 FordGT90Concept

"I go fast!1!11!1!" uuuaaaaaa This is allegedly leaked info by an nVidia employee:



Makes sense to me. The Tensor cores in Volta are not things NVIDIA wants to waste fab space on for gamers. Turing always made sense considering the delay between Pascal and now. NVIDIA pushing RTX also fits NVIDIA's modus operandi. Posted on Aug 10th 2018, 11:46

#23 bug

Vya Domus They are going with the same strategy that they had with the second generation of Kepler based GPUs. Larger dies with higher frequencies and no tangible advancement in power efficiency, they can't rely on new nodes everytime.

RTX alone would disagree with that assertion. But it is true that besides tensors and RTX, not many improvements have been advertised for Volta, so you're probably not far off base. Posted on Aug 10th 2018, 11:48

#24 Fluffmeister

Hey, at least their top dog isn't using LC yet, unlike... Posted on Aug 10th 2018, 12:01

#25 uuuaaaaaa

FordGT90Concept Makes sense to me. The Tensor cores in Volta are not things NVIDIA wants to waste fab space on for gamers. Turing always made sense considering the delay between Pascal and now. NVIDIA pushing RTX also fits NVIDIA's modus operandi.

Totally, and I think it will also be on the same node as the current gen, which fits with the subject of this thread's OP. Posted on Aug 10th 2018, 12:04