NVIDIA's G-SYNC technology is rivaled by AMD's Project FreeSync, which is based on Adaptive-Sync, a technology standardized by the Video Electronics Standards Association (VESA). The technology lets GPUs and monitors keep display refresh rates in sync with GPU frame rates, so the resulting output appears fluid. VESA's technology requires no special hardware inside standards-compliant monitors and is royalty-free; NVIDIA G-SYNC, by contrast, relies on specialized hardware that display makers have to source from NVIDIA, which makes it a royalty of sorts. When asked by Chinese publication Expreview whether NVIDIA GPUs will support VESA Adaptive-Sync, the company said that it wants to focus on G-SYNC. A case in point is the display connector loadout of the recently launched GeForce GTX 980 and GTX 970. According to specifications listed on NVIDIA's website, the two feature DisplayPort 1.2 connectors, and not DisplayPort 1.2a, a requirement of VESA's new technology. AMD's year-old Radeon R9 and R7 GPUs, on the other hand, support DisplayPort 1.2a, casting suspicion on NVIDIA's choice of connectors. Interestingly, the GTX 980 and GTX 970 feature HDMI 2.0, so it's not as if NVIDIA is slow to catch up with new standards. Did NVIDIA leave out DisplayPort 1.2a in a deliberate attempt to check Adaptive-Sync?
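The difference both camps are chasing can be illustrated with a toy simulation. This is a hypothetical sketch, not NVIDIA's or AMD's actual implementation: with a fixed refresh rate, a finished frame waits for the next refresh tick; with adaptive sync, the panel refreshes as soon as the frame is ready, limited only by its maximum refresh rate. The function name, interval values, and frame times below are illustrative assumptions.

```python
import math

def present_times(frame_ready, adaptive=False, refresh=1 / 60, min_interval=1 / 144):
    """Times (in seconds) at which each finished frame actually appears on screen."""
    shown, last = [], 0.0
    for t in frame_ready:
        if adaptive:
            # Adaptive sync: refresh the moment the frame is ready, limited
            # only by the panel's maximum refresh rate (minimum interval).
            last = max(t, last + min_interval)
        else:
            # Fixed refresh: the frame must wait for the next refresh tick.
            last = math.ceil(t / refresh) * refresh
        shown.append(last)
    return shown

# Frames arriving at an uneven ~45 fps cadence (illustrative values).
frames = [0.020, 0.042, 0.065, 0.090, 0.111]
fixed = present_times(frames)               # frames wait for 60 Hz ticks
vrr = present_times(frames, adaptive=True)  # frames shown as soon as ready
```

In this toy model the adaptive path presents every frame the instant it is ready, while the fixed path adds up to a full refresh interval of waiting per frame — the stutter and judder that both G-SYNC and Adaptive-Sync set out to eliminate.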

114 Comments on NVIDIA Sacrifices VESA Adaptive Sync Tech to Rake in G-SYNC Royalties


#1 v12dock

Gotta love Nvidia Posted on Sep 24th 2014, 22:24 Reply

#2 INSTG8R

Vanguard Beta Tester



:shadedshu: :rolleyes: :shadedshu: :rolleyes: Posted on Sep 24th 2014, 22:28 Reply

#3 Cheeseball

Not a Potato From the article, it sounds like NVIDIA is actively blocking FreeSync, when in fact the 900 series simply lacks DisplayPort 1.2a support, which is required for FreeSync. Posted on Sep 24th 2014, 22:28 Reply

#4 CookieMonsta

Utterly stupid move by NVIDIA if it proves to be true. History has not been kind to proprietary technology; NVIDIA does not want its competitors to unite over a common standard, lest it be marginalized by the LCD manufacturers. Posted on Sep 24th 2014, 22:35 Reply

#5 The Von Matrices

I wouldn't jump to conclusions until there actually are monitors supporting adaptive refresh in the market. It's completely normal for companies to avoid mentioning and outright deny upcoming features/products in order to avoid cannibalizing sales of current products. Posted on Sep 25th 2014, 0:53 Reply

#6 Hitman_Actual

Gsync works,



Gsync is awesome



Gsync is here now.





Nvidia pushing technology forward with their ingenuity and innovations



AMD needs to start making some leaps and strides if they expect to survive.



I wish AMD was pushing technology more; then there would be actual competition between red and green, pushing performance and technology forward at a faster rate instead of the snail-like speed of the past 6+ years. Posted on Sep 25th 2014, 1:29 Reply

#7 ISI300

The fact that they support HDMI 2.0 (the first GPU to support that standard, which is the latest HDMI revision), yet refuse to implement the latest DisplayPort standard, makes the whole thing stink. Mother****ers! Posted on Sep 25th 2014, 1:40 Reply

#8 arbiter

I read somewhere that in a closed-door meeting it was said the 900 series can do 1.2a; all it needs is a software update. Posted on Sep 25th 2014, 1:42 Reply

#9 RejZoR

We change graphics cards way more often than monitors, so being stuck with a single graphics card brand because the monitor supports G-Sync only is dumb. But if you have a FreeSync-enabled monitor, you are free to choose whichever graphics card brand suits the price/performance best. G-Sync monitor users, on the other hand, are stuck with NVIDIA whether they like it or not (if they want HW adaptive sync). Posted on Sep 25th 2014, 1:44 Reply

#10 RejZoR

arbiter I read off somewhere that a closed door meeting that 900 series can do 1.2a, all needs is software update so. Sure, and all i need is a software update to convert my VGA port into DisplayPort... Things don't work that way with connectors and standards associated with them. Posted on Sep 25th 2014, 1:46 Reply

#11 jigar2speed

Hitman_Actual Gsync works,



Gsync is awesome



Gsync is here now.





Nvidia pushing technology forward with their ingenuity and innovations



AMD needs to start making some leaps and strides if they expect to survive.



I wish AMD was pushing technology more; then there would be actual competition between red and green, pushing performance and technology forward at a faster rate instead of the snail-like speed of the past 6+ years. I totally agree, AMD are light years behind Nvidia and should kiss Nvidia's feet for coming up with Gsync - Nvidia has every right to charge everyone for this tech, and btw, how dare AMD copy Gsync and call it FreeSync - such dumb asses. /if I have to put a sarcasm tag here, I seriously feel bad for you. Posted on Sep 25th 2014, 1:48 Reply

#12 dansergiu

This doesn't say that Nvidia decided not to support Adaptive Sync. It simply says that they are focusing on G-Sync. Also, it's unlikely that they will not support it, since it's part of the standard; most likely they will market G-Sync as the better solution, at least for a while.



Anyway, my point is that the title of the article is slightly misleading; it looks more like click bait from a tabloid than an article on a tech publication to be honest. Posted on Sep 25th 2014, 2:03 Reply

#13 semantics

dansergiu This doesn't say that Nvidia decided not to support Adaptive Sync. It simply says that they are focusing on GSync. Also, it's unlikely that they will not support it since it's part of the standard, but most likely they will market the GSync as the better solution, at least for a while.



Anyway, my point is that the title of the article is slightly misleading; it looks more like clickbait from a tabloid than an article on a tech publication, to be honest. G-Sync is currently a better product. It's an actually buyable product, it can even do QHD at 144 Hz right now, and it works pretty reliably. These are all things we'll have to assess with FreeSync; they are not the same technology, they are two different approaches, and we have yet to see FreeSync in the erratic frame-rate environment of games. That said, it's perfect for watching video, which is unlikely to see the frame-rate drops and rises a game would. Posted on Sep 25th 2014, 2:06 Reply

#14 Animalpak

First of all, we must see if FreeSync works the same as or better than G-Sync; then we can talk...



For now, G-Sync is already a reality and is proven to work flawlessly. Posted on Sep 25th 2014, 2:12 Reply

#15 arbiter

RejZoR Sure, and all i need is a software update to convert my VGA port into DisplayPort... Things don't work that way with connectors and standards associated with them. It could be just as simple as that. Why say you support Adaptive-Sync when there are zero monitors out, and according to the press release there won't be until at least Q1 for some?

"Today, AMD announced collaborations with scaler vendors MStar, Novatek and Realtek to build scaler units ready with DisplayPort™ Adaptive-Sync and AMD's Project FreeSync by year end."

ir.amd.com/phoenix.zhtml?c=74093&p=irol-newsArticle&ID=1969277

That is off AMD's press release. If they say end of year, that sounds like no monitor until near the end of Q1. As others said, there's no reason to say you support something that isn't out, when you have a working product now, just to kill off your own sales, when you don't even know if Adaptive-Sync will give any benefit to games or if that is limited to AMD proprietary code. Animalpak first of all we must see if Free-Sync works the same or better than G-Sync then we can talk... For now G-Sync is already a reality and is proven that works flawlessy. I agree with that. semantics Gsync is currently a better product. Given it's an actually buyable product, it can even do QHD at 144hz right now and it works pretty reliably. These are all things we'll have to assess with freesync they are not the same technology they are two different approaches and we have yet to see Freesync in the erratic frame environment of games. That being it's perfect for watching video which is unlikely to see frame rate drops and rises like a game would. They are two different approaches; we know well how G-Sync works, but we have yet to see how AMD's works. I said this before: I don't take AMD's claims at face value. Prove it works like you say, and then I will give them the credit they are due. Posted on Sep 25th 2014, 2:31 Reply

#16 john_

When I posted about this one week ago, I was either ignored or treated as someone talking conspiracy theories from a very negative perspective. Whether we like or hate Nvidia, we know that they stick with their proprietary standards, especially when they have the upper hand in the market. There is nothing strange here with them implementing 1.2 and not 1.2a; it's what they have been doing for years. OpenCL is another example, with even the 900 series cards supporting only OpenCL 1.1, if I am not mistaken, when AMD has supported 1.2 for years. OpenCL 1.2 is 3 years old.



This is not negative posting, or bashing, or conspiracy theories, and there is nothing strange here. It is business as usual for Nvidia. Posted on Sep 25th 2014, 3:09 Reply

#17 the54thvoid

It's very simple. Nvidia have worked on g-sync, whether you want it or not, they have a hardware implementation to address frame rate from GPU to monitor.

If you want it, you buy Nvidia's product. If you don't want it, you buy AMD product. Nvidia have pushed a business model to increase profit, to please shareholders.

Nvidia is a business, not a charity, it has zero requirement to work along 'free' business models. It has arguably spent a lot on R&D and builds a very capable vanilla card.

If you do not like what they do, you have no need to buy their products. Buy AMD instead.

By all review accounts, on the whole, G-Sync works like a dream; why, as a private company, would they support a free or cheaper version?



If people start seeing Nvidia and AMD as businesses and not charities, a lot of arguments and misplaced anger venting could be avoided. Posted on Sep 25th 2014, 3:10 Reply

#18 astrix_au

Hitman_Actual Gsync works,



Gsync is awesome



Gsync is here now.





Nvidia pushing technology forward with their ingenuity and innovations



AMD needs to start making some leaps and strides if they expect to survive.



I wish AMD was pushing technology more; then there would be actual competition between red and green, pushing performance and technology forward at a faster rate instead of the snail-like speed of the past 6+ years. What do you think they are doing with Mantle? Thanks to AMD, Microsoft started working on DX12 after they announced they were focusing on other areas, but Mantle forced their hand on the issue.



Nvidia is forcing people to buy monitors with G-Sync; if they released DP 1.2a, sales of G-Sync monitors would collapse once FreeSync became available. I'm not loyal or a fanboy. I was going to get 2x 780 Tis, but Mantle's 290X CrossFire performance in BF4 won me over. The 780 Tis would sit at 75% each, wasting GPU performance. When DX12 comes out and they can run at 100% all the time and use max performance all the time, I might switch over, but then moves like this make you think twice.



AMD has planned to make Mantle open, so we will wait and see at the end of this year, when it gets released, if that happens. Posted on Sep 25th 2014, 3:21 Reply

#19 astrix_au

semantics Gsync is currently a better product. Given it's an actually buyable product, it can even do QHD at 144hz right now and it works pretty reliably. These are all things we'll have to assess with freesync they are not the same technology they are two different approaches and we have yet to see Freesync in the erratic frame environment of games. That being it's perfect for watching video which is unlikely to see frame rate drops and rises like a game would. You better hope it's not better than G-Sync, since if you have that card you're shit out of luck. Being on AMD, I don't need to care about spending an extra $150 on something that you probably won't even notice, lol; I'd rather have that for free. I don't see the jittering on my 120Hz display that they show in those demos... lol, as they say, a sucker is born every day. Those tests are probably worst-case scenarios, if not purposely exaggerated. Posted on Sep 25th 2014, 3:35 Reply

#20 RCoon

The real crime here is that I don't care about Freesync or Gsync, and have no intention on buying into either.



Jesus, gamers these days think they deserve to get everything for free. Posted on Sep 25th 2014, 3:51 Reply

#21 astrix_au

RCoon The real crime here is that I don't care about Freesync or Gsync, and have no intention on buying into either.



Jesus, gamers these days think they deserve to get everything for free. Me neither, and I guess we can thank Nvidia for the extra cost of the Asus Swift. I hope there is a version without the G-Sync; I doubt it will be that noticeable. If you limit your FPS to your monitor's refresh rate using RTSS or maxvariable in BF4, it's not needed IMO. Like I said, my monitor plays smooth thanks to my 2x 290Xs at 120Hz. Those demos are laughable, marketing 101 on display, that is all. Posted on Sep 25th 2014, 3:57 Reply

#22 RCoon

astrix_au Me neither and I guess we can thank Nvidia for the extra cost on the Asus Swift, I hope there is a version without the GSync I doubt it will be that noticeable, if you limit your FPS to your monitors refresh rate using RTSS or maxvariable in BF4 it's not needed IMO. Like I said my monitor plays smooth thanks to my 2x 290x's at 120hz. Those demos are laughable, marketing 101 on display that is all. Maybe we're bad examples; we have decent systems and don't see sub-45 FPS instances. I imagine G-Sync and FreeSync are more important for people with low-end systems, or midrange systems on 4K ridiculousness. I would assume that's where the sync magic comes in handy, in the low FPS ranges and dips. Either way, I don't know why people with high-end systems care; they wouldn't see much improvement with G-Sync or FreeSync anyway. Posted on Sep 25th 2014, 4:05 Reply

#23 the54thvoid

astrix_au Me neither and I guess we can thank Nvidia for the extra cost on the Asus Swift, I hope there is a version without the GSync I doubt it will be that noticeable, if you limit your FPS to your monitors refresh rate using RTSS or maxvariable in BF4 it's not needed IMO. Like I said my monitor plays smooth thanks to my 2x 290x's at 120hz. Those demos are laughable, marketing 101 on display that is all. The problem is you need to experience the sensation of G-Sync; you cannot gauge it through a video or YouTube. All the reviews are exceptionally positive about it, for the most part.

I too don't care about either product but if Freesync works as well as Mantle, Nvidia might adapt their business model to compete. It might just remain peripheral technology though, much like Mantle.

FTR, one of my BF4 mates went Mantle with a 290 and truly appreciated the smoothness, though he did come from a 2 GB GTX 680. For my part, I only game on one 780 Ti, but our perf.render display outputs were identical. But I do use a decent CPU, so Mantle is limited in purpose for my GPU needs. Posted on Sep 25th 2014, 4:08 Reply

#24 wickedcricket

astrix_au What do you think they are doing with Mantle, thanks to AMD Microsoft started working on DX12 after they announced they were focusing on other areas but Mantle forced their hand on the issue.



Nvidia is forcing people to buy monitors with Gsync, if they released DP 1.2a then sales of GSync monitors would collapse when freesync would be available I'm not loyal or a fan boy. I was going to get 2x 780ti's but Mantle's 290x crossfire performance on BF4 won me over. The 780ti's would sit at 75% each wasting GPU performance. When DX12 comes out and they can run at 100% all the time and use max performance all the time I might switch over but then moves like this makes you think twice.



AMD has planned to make Mantle open so we will wait and see when it gets released at the end of this year if that happens. Well, following your line of thought here, isn't AMD "forcing" you to buy new Microsoft software, i.e. Windows 9?



I agree with the above "foreworders": Nvidia pushing technology forward with their ingenuity and innovations For now G-Sync is already a reality and is proven to work flawlessly. Exactly, people need a solution right here, right now. I am having a blast using it; it is incredible and fun! As a consumer, I am not going to wait for "I don't know how long in a distant future" for something that hasn't even been implemented/tested yet.



I also read that it can do 1.2a and all it needs is a software update, so we are talking about a simple software solution to be able to use "freesync" on Nvidia cards. Why all this doom and gloom and ranting? Nvidia has every right to charge everyone for this tech Umm, yes they do, if there are people willing to pay for it, duuuh??? Posted on Sep 25th 2014, 4:25 Reply

#25 astrix_au

the54thvoid FTR one of my BF4 mates went mantle with a 290 and truly appreciated the smoothness, though he did come from a 2 GB gtx680. On my part, I only game on one 780ti but our perf.render display outputs were identical. But, I do use a decent CPU so mantle is limited in purpose for my GPU needs. Yeah, Mantle runs awesome on the 290X; other cards are a different story, especially older GCN cards. I almost bought 2 GTX 780 Tis, but I liked the idea of a low-level API. I'm thinking of possibly going to Nvidia one day, maybe 2x 980 Tis; they seem well priced. Posted on Sep 25th 2014, 4:27 Reply