NVIDIA plans to release a fix for the GeForce GTX 970 memory allocation issue. In an informal statement to users of the GeForce Forums, an NVIDIA employee said that the company is working on a driver update that "will tune what's allocated where in memory to further improve performance." The employee also stressed that the GTX 970 is still the best-performing graphics card at its price point, and that if current owners are not satisfied with their purchase, they should return it for a refund or exchange.

89 Comments on NVIDIA to Tune GTX 970 Resource Allocation with Driver Update


#1 puma99dk|

I doubt we'll see a world-wide recall of this card, because so many people, myself included, purchased it when it was released last year. But if it happens, it wouldn't stop me from getting it exchanged and paying a little extra for a GTX 980 :laugh: Posted on Jan 28th 2015, 7:05 Reply

#2 esrever

Sounds like damage control more than anything. From the look of it, it's a serious hardware problem when you can't use the 0.5GB at the same time as the 3.5GB. It may not show up in NVIDIA's PR, but use of the last 0.5GB seems to be plagued with microstutter, because the bus has to switch from one memory pool to the other with some latency. The drivers already try to avoid ever using the last 0.5GB unless absolutely needed; the best they could do is a hard lock that removes any use of the last 0.5GB, if they want to remove the stutter. Posted on Jan 28th 2015, 7:08 Reply
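To make the two-pool behaviour described above concrete, here is a toy sketch of a tiered allocator that fills a fast 3.5GB partition first and spills into the slow 0.5GB partition only when it must. This is purely illustrative — it is not NVIDIA's actual driver logic, and the pool sizes and class names are assumptions for the example:

```python
# Toy illustration (NOT NVIDIA's real driver code): a two-tier VRAM
# allocator that prefers the full-speed 3.5 GB pool and only spills
# into the slower 0.5 GB pool when the fast pool is exhausted.

FAST_POOL_MB = 3584   # 3.5 GB partition with full-speed access
SLOW_POOL_MB = 512    # 0.5 GB partition behind the shared bus

class TieredVram:
    def __init__(self):
        self.fast_used = 0
        self.slow_used = 0

    def allocate(self, size_mb):
        """Place an allocation in the fast pool if it fits, otherwise
        spill to the slow pool; return the pool used, or None if full."""
        if self.fast_used + size_mb <= FAST_POOL_MB:
            self.fast_used += size_mb
            return "fast"
        if self.slow_used + size_mb <= SLOW_POOL_MB:
            self.slow_used += size_mb
            return "slow"
        return None  # out of memory

vram = TieredVram()
print(vram.allocate(3000))  # fast
print(vram.allocate(500))   # still fits in the fast pool -> fast
print(vram.allocate(200))   # fast pool exhausted -> spills to slow
```

The microstutter complaint corresponds to any allocation that lands in the slow pool: every access there pays the pool-switch latency.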

#3 Lord Xeb

Definitely damage control.



I own a 970 and am happy with it. Posted on Jan 28th 2015, 7:22 Reply

#4 HumanSmoke

puma99dk| I doubt we will could see a return of this card world-wide, bcs soo many ppl incl. myself purchased this card last year when it was released but if it happens it wouldn't stop me to get it exchanged and pay a little extra to get a GTX 980 :laugh: I'm guessing Nvidia, its partners, and sellers will close that down pretty promptly. It probably isn't that big a deal to add a superscript and footnoted conditions in fine print to the spec sheet, in much the same way that AMD does. It doesn't seem particularly logical to continue a bad situation when the remedy should be easy enough to implement. Posted on Jan 28th 2015, 7:40 Reply

#5 Naito

Lord Xeb I own a 970 and happy with it. Rightfully so. The performance hasn't changed since the initial batch of reviews — just, as Anandtech put it, our perception of the GPU. Posted on Jan 28th 2015, 7:41 Reply

#6 zzzaac

True, performance is still top notch (and it's business as usual up to 3.5GB). It is disappointing that the NVIDIA spec sheet was wrong, as I'm sure it wouldn't have ended up like this if they had mentioned it properly.



Not sure about recalls (I highly doubt it, though). If I hazard a guess, at best buyers might get a free game. Posted on Jan 28th 2015, 9:26 Reply

#7 buggalugs

Naito Rightfully so. The performance hasn't changed since the initial batch of reviews just, as Anandtech put it, our perception of the GPU. Yeah, but most reviews missed this phenomenon, and if you're in the category of user with a high-res display who needs 4GB of memory, it's not good. Those users may have chosen a 290X or something else instead. Nvidia deserves heat for this because they were dishonest about specs... and they waited until after the Christmas sales, until the tech media reported on it, before they would admit it.



Nvidia advertised the 970 as "having the same memory subsystem as the 980" when it clearly doesn't. There has been a thread on the Nvidia forums about stuttering over 3.5GB for the 3 months since the 970 came out; I don't believe Nvidia just figured this out now. Posted on Jan 28th 2015, 9:48 Reply

#8 Recus





When you say Nvidia should compensate GTX 970 owners, remember that others should do it too. Posted on Jan 28th 2015, 9:51 Reply

#9 john_

People just found the GTX 970's "kill switch" that Nvidia didn't want to be found. I bet that optimization was already happening. And when Nvidia wants to push 970 owners to upgrade, that optimization will become history. Posted on Jan 28th 2015, 10:05 Reply

#10 Xzibit

john_ People just found GTX 970's "Kill Switch" that Nvidia didn't wanted to be found. I bet that optimization was already happening. And when Nvidia will want to push 970 owners to upgrade, that optimization will become history. Isn't that the typical cycle?



I read it more as a directional path in architecture/software/binning. I think the end goal was to salvage more chips by using software to assist with borderline-unsalvageable parts — a chip with one bad L2 — without taking out the other L2 and the paired MCs and DRAM. They probably would have preferred to keep it all intact, but too many chips that didn't qualify for the 980 were showing one bad L2. They saw a way to make it work with software and were hoping it would go unnoticed, and that they could just take out the two L2+MC partitions and 1GB for a 960 Ti. If they were to cut off the two L2s, that would mean fewer salvageable chips, and the "960 Ti" would be a 2GB, 1K-L2 salvage. That would have put them in another marketing dilemma, given they are still up against the 280/280X with 3GB and the 290 with 4GB.



Time will tell if we see a similarly segmented GPU down the line with GM200, Pascal, or Volta. Posted on Jan 28th 2015, 10:31 Reply

#11 bogami

That Nvidia has failed us is nothing new, especially with products continually priced 100% over their class value! :mad::banghead: Then again, I'm not surprised, seeing how prototypes of stealth aircraft grow in China like mushrooms after rain, right where the main contract manufacturers' factories are. They sell us failures: cut-down processors with shaders locked on the best models... :banghead: I look forward to AMD's 300 series, because it looks like it will again be better than the TITAN X at half the price. I only buy nvidia because of the AGEIA physics acquisition, and now AMD is solving that problem with its own engine. Beyond the crazy prices, it's lies, fraud, and manipulation. I have nothing good to say about nVidia recently — horror. Posted on Jan 28th 2015, 11:30 Reply

#12 Fluffmeister

Good news, I look forward to the driver update(s). Posted on Jan 28th 2015, 11:36 Reply

#13 HumanSmoke

bogami That Nvidia have failed us nothing new. Especially with the continued excessive products 100% over the class value !:mad::banghead: .However, I am not surprised how they grow prototypes of STELTH aircraft in China like mushrooms after rain where are the main co-operating manufacturers factories ..Fuckers to us sell. failure cut processors that have so locked sherders on best models ....:banghead:I look forward to AMD 300 best because it looks like it will be again better than the TITAN X and a half cheaper. nvidia I buy only due to the acquisition of AGEIA physics unit, now is the AMD solve this problem with its engine .Except as crazy prices lie fraud manipulations are not here to say something about nVidia recently horor. It's like trying to read a message in fridge magnet letters during a violent earthquake. Posted on Jan 28th 2015, 11:50 Reply

#14 john_

Fluffmeister Good news, I look forward to the driver update(s). Yes, very good news. With any driver update there are performance gains, of course. What Nvidia will do is take every performance gain that also affects the GTX 970 and call it "better resource allocation". Placebo pills for the owners of the cards — but OK, good news; who am I to disagree with that? Posted on Jan 28th 2015, 11:57 Reply

#15 Fluffmeister

Good point, more performance, can't wait! Posted on Jan 28th 2015, 12:16 Reply

#16 ShurikN

Recus



When you say Nvidia should compensate GTX 970 remember that others should do it too. The PS4 was advertised as 8GB of GDDR5 total system RAM; some is used by the CPU, some by the GPU. I don't see your point. Posted on Jan 28th 2015, 12:24 Reply

#17 Severus

I also own the 970 Strix and am very pleased with it. I haven't encountered the problem in my daily use. Posted on Jan 28th 2015, 12:26 Reply

#18 CounterZeus

The card already performs great for me (1080p), but I imagine the update will be good for those using SLI who actually have the GPU power to use more than 3.5GB of video memory. (Yes I know, Skyrim with mods is an exception.)



I just wish Nvidia had told us everything from the start, because the real-life performance still stands out for the price.



Would have been nice to see reviews testing scenarios where the extra disabled parts actually have an impact. Posted on Jan 28th 2015, 12:32 Reply

#19 HumanSmoke

CounterZeus Would have been nice to see the reviews testing scenarios where the extra disable parts actually have an impact. PCGH have done some testing pitting the GTX 970 against a GTX 980 simulating a 970 able to use the full 4GB vRAM allocation at full speed. Posted on Jan 28th 2015, 12:49 Reply

#20 Mathragh

So what is there to tune? Making more sure that the least-used data gets put in the extra 0.5GB?



From all I've read, it seems kinda unlikely they didn't already spend a significant amount of effort doing this, but time will tell I guess!



This is all becoming quite a complicated and difficult story. I hope people won't be put off by the massive amount of (mis)information going around and forget the real issue: trust has once again been broken. Posted on Jan 28th 2015, 13:52 Reply

#21 bogami

HumanSmoke It's like trying to read a message in fridge magnet letters during a violent earthquake. I hope so. 100% too expensive and locked (meaning a failure-cut processor), and they will do the same with the TITAN X. GTX 480 cores were resold the following year as GTX 580 processors. The GTX 680 was a middle-class chip sold at the top of the range, just as the GTX 980 is now — 28nm, not 20nm, a mid-range processor sold as highest class. The TITAN, with only 2688 cores unlocked (an unsuccessful cut), was sold as the best, a useless step up costing 100% more... lies, fraud, management.

When I hear nVidia has screwed up again, I don't see a good product, just a lot of greed. I've owned 5 generations of nVidia cards, but I think I'll wait for AMD's new generation. I started with AMD (ATI), and it looks like I'll be back. Good luck with the GTX 970 — a card with a 100% overestimated price, I think, and 20nm is at the door. Posted on Jan 28th 2015, 13:52 Reply

#22 iO

Let's hope they don't just limit the card to 3.5GB, multiply the memory read-out by 12%, and then say "Hey, look, there are your 4 gigs of vRAM. Now move along"... Posted on Jan 28th 2015, 13:56 Reply

#23 Mathragh

iO Lets hope they dont just limit the card to 3.5GB and multiply the memory read-out by 12% and then say "Hey, look, there are your 4 gigs of vRAM"... Don't forget also multiplying the memory bandwidth by 12.5%! Can't have people thinking the memory is slower than it would've been with 4GB! Posted on Jan 28th 2015, 13:57 Reply
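As an aside, the percentages being joked about in the two posts above fall out of simple arithmetic; a quick illustrative check (nothing here reflects what the driver actually does):

```python
# The numbers behind the joke: how much would a 3.5 GB read-out (and
# its bandwidth figure) need to be inflated to match the advertised
# 4 GB spec?
advertised_gb = 4.0
fast_gb = 3.5

# Factor to scale the 3.5 GB read-out up to 4 GB (~14.3%).
inflate_pct = (advertised_gb / fast_gb - 1) * 100

# Share of the advertised 4 GB sitting in the slow partition (12.5%).
missing_pct = (advertised_gb - fast_gb) / advertised_gb * 100

print(f"inflate the 3.5 GB read-out by {inflate_pct:.1f}%")
print(f"the slow 0.5 GB slice is {missing_pct:.1f}% of 4 GB")
```

So the "12%" in the first post is really either 14.3% (scaling 3.5 up to 4) or 12.5% (the slow slice as a share of 4GB), depending on which way you count.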

#24 64K

I'm fine with Nvidia doing what they can with the drivers. I really don't want to return my 970, as it's a very good performer — unless they offered me a trade up to a 980 for $100 more with my return. But I still think Nvidia owes us something for their misrepresentation. Hell, even EA gave a free game to people who pre-ordered that mess of a game, SimCity. Posted on Jan 28th 2015, 14:21 Reply

#25 Rahmat Sofyan





Oh yeah, this just in



Truth about the G-sync Marketing Module (NVIDIA using VESA Adaptive-Sync Technology – Freesync): Basically, what NVIDIA is trying to force you to do is buy their module license while using VESA Adaptive-Sync Technology!



Let’s be more clear. CUDA was made to force developers to work on NVIDIA GPUs instead of AMD GPUs. The reason is that if CUDA could be used more widely, like DirectCompute or OpenCL, CUDA would certainly work on AMD GPUs as well. This is not good for NVIDIA, and the same goes for PhysX, which could work on AMD GPUs (it actually works on the CPU and NVIDIA GPUs only).



NVIDIA wants to dominate the GPU segment, and they are ready to close off everything they make, which is not good at all — always lying to their customers and to developers.



The main problem here is that NVIDIA doesn’t like standards, so they always make licenses for each of their products and win a lot of royalties.



Freesync on AMD = Adaptive-Sync, like G-Sync. Basically, the NVIDIA drivers control everything between the G-sync module and the GeForce GPU. The truth is that the G-sync module does nothing other than confirm that the module is present.



Which Monitors are compatible with the G-sync Modded Drivers ?



All external monitors that include DP 1.2 (post-2010) and all laptops that include eDP.



Examples: Yamakasi, Crossover, Dell, and Asus (e.g. the PB278Q) monitors, and MSI, Alienware, and Asus laptops. Source



I wish this wasn't true...



PS: Sorry if OOT :), I'm really bored with the 3.5GB hype. It'll be a miracle if this problem can be fixed only by a driver update... fingers crossed :) Posted on Jan 28th 2015, 14:37 Reply