With its GeForce "Maxwell" family, NVIDIA, riding the multi-monitor wave, began equipping its graphics cards with up to three DisplayPort connectors, besides an HDMI and, optionally, a legacy DVI connector. Prior to that generation, AMD dabbled with equipping its cards with two mini-DisplayPorts, besides two DVI and an HDMI. With the latest GeForce RTX "Turing" family, NVIDIA could push for the adoption of USB Type-C connectors with DisplayPort wiring, and perhaps even USB-PD standards compliance, pushing up to 60 Watts of power from the same port. This USB+DP+Power connector is called VirtualLink. It could make it easier for VR HMD manufacturers to design newer generations of their devices with a single USB Type-C connection carrying display and audio input from the GPU, USB input from the system, and power. We reckon 60 W is plenty of power for a VR HMD.

18 Comments on USB Type-C with DisplayPort+USB Wiring Could Get a Big Push by NVIDIA

#1 Tsukiyomi91

the future is coming... if a single USB-C cable can drive a VR headset, then setting one up would be much easier. Posted on Aug 20th 2018, 9:18 Reply

#2 anthony256





You mean, like VirtualLink? The new USB-C connector for next-gen VR headsets, which could be re-purposed to power next-gen USB-C monitors, too. www.tweaktown.com/news/62539/virtuallink-next-gen-vr-over-single-usb-type-cable/index.html Posted on Aug 20th 2018, 9:46 Reply

#3 FordGT90Concept

"I go fast!1!11!1!" USB-C (10 gbps) has less bandwidth than DisplayPort (26 gbps) and USB-C gen 2 (10 gbps) only has 1m cable length for that speed (far from enough for VR). I think the VR headsets are going to be using Thunderbolt (up to 40 gbps and if fiber optic, cable length doesn't really matter). If the cables are just copper, this is going to be a serious problem for future headsets.



USB-C powered monitors have the same bandwidth issue. There's also a lot of monitors that 27w wouldn't be enough to drive. I don't think USB-C powered monitors were really the intent or NVIDIA would have put at least two on the card. In general, DisplayPort is the better connector for monitors (better signal separation, longer cable lengths, cheaper to manufacture, and locking mechanism) even though it only has minimal power supply. Supplying monitor power via GPU just relocates power supply from a brick/internal to the monitor to the computer's power supply, motherboard, and GPU. VR headsets are about the only scenario where that's a preferable arrangement. Posted on Aug 20th 2018, 10:03 Reply

#4 randomUser

I would really like my monitor to get its power through a single cable coming from the MB or the GPU. That's a dream.



10 years ago, monitors could be connected to the PSU, as it had input and output slots. This allowed for better cable management and an empty slot in the outlet.



Having it powered through a single cable would be a dream. Posted on Aug 20th 2018, 10:19 Reply

#5 Xzibit

USB PD standards have been around since 2012 afaik



Mini connectors are limited to up to 60 W and regular ones up to 100 W if they comply with the profiles. Cables too.



Slow adoption is probably because very few want to add what's needed to support the added power. Phones adopted it to charge the battery. Posted on Aug 20th 2018, 10:21 Reply
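The wattage figures Xzibit cites fall straight out of voltage × current. As a rough sketch, assuming the fixed voltage/current pairs from the USB-PD spec as commonly listed (not taken from this thread; 5 A requires an electronically marked cable):

```python
# USB Power Delivery wattage is simply voltage x current.
# Voltage/current pairs below are assumed from common USB-PD fixed profiles.
profiles = [
    (5.0, 3.0),   # baseline USB-C
    (9.0, 3.0),
    (15.0, 3.0),
    (20.0, 3.0),  # 60 W, the limit with a standard 3 A cable
    (20.0, 5.0),  # 100 W, requires a 5 A e-marked cable
]
for volts, amps in profiles:
    print(f"{volts:>4.0f} V x {amps:.0f} A = {volts * amps:>5.0f} W")
```

The 20 V / 3 A row matches the 60 W ceiling mentioned in the article, and 20 V / 5 A is the 100 W case debated below.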

#6 Shamalamadingdong

FordGT90Concept USB-C (10 gbps) has less bandwidth than DisplayPort (26 gbps) and USB-C gen 2 (10 gbps) only has 1m cable length for that speed (far from enough for VR). I think the VR headsets are going to be using Thunderbolt (up to 40 gbps and if fiber optic, cable length doesn't really matter). If the cables are just copper, this is going to be a serious problem for future headsets.



USB-C powered monitors have the same bandwidth issue. There's also a lot of monitors that 27w wouldn't be enough to drive. I don't think USB-C powered monitors were really the intent or NVIDIA would have put at least two on the card. In general, DisplayPort is the better connector for monitors (better signal separation, longer cable lengths, cheaper to manufacture, and locking mechanism) even though it only has minimal power supply. Supplying monitor power via GPU just relocates power supply from a brick/internal to the monitor to the computer's power supply, motherboard, and GPU. VR headsets are about the only scenario where that's a preferable arrangement. According to VESA, you're wrong. Posted on Aug 20th 2018, 11:03 Reply

#7 ZeppMan217

You guys want more juice pushed through the GPU instead of using an external power supply? Posted on Aug 20th 2018, 11:14 Reply

#8 FordGT90Concept

"I go fast!1!11!1!" Shamalamadingdong According to VESA you're wrong Up to, up to, up to. The longer the cable, the worse the performance. ZeppMan217 You guys want more juice pushed through the GPU instead of using an external power supply? Small, lower panels, why not? Large TVs, nope.



I have a very difficult time believing USB-C can handle 100w. Those cables and contacts are tiny. DC produces a lot of waste heat when the conduits are tiny. Seems like an all around bad idea. 20w? Sure. Posted on Aug 20th 2018, 11:52 Reply

#9 jabbadap

FordGT90Concept Up to, up to, up to. The longer the cable, the worse the performance.



Small, lower panels, why not? Large TVs, nope.

I have a very difficult time believing USB-C can handle 100w. Those cables and contacts are tiny. DC produces a lot of waste heat when the conduits are tiny. Seems like an all around bad idea. 20w? Sure. Well yeah, "VirtualLink is designed to enable a new level of immersion in VR, with power, display, and data bandwidth specified to meet the needs of future VR headsets. That includes support for four lanes of HBR3 DisplayPort for high-resolution displays, USB 3.1 Gen2 (SuperSpeed USB 10Gbps) for headset cameras and sensors, and up to 27 Watts of power delivery." The VirtualLink standard says up to 27 W over the connector. They also specify four lanes of HBR3 over VirtualLink, which is 8.10 Gbit/s per lane, thus 32.4 Gbit/s overall. So in other words, it should be full DisplayPort 1.4 bandwidth. Posted on Aug 20th 2018, 12:11 Reply
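jabbadap's arithmetic checks out. A quick sketch of the HBR3 math, with the raw-vs.-effective distinction added (the 8b/10b line coding and its 80% efficiency are from the DisplayPort standard, not from this thread):

```python
# DisplayPort HBR3: 8.1 Gbit/s per lane, four lanes over VirtualLink.
lanes = 4
raw_per_lane = 8.1                 # Gbit/s, HBR3 link rate
raw_total = lanes * raw_per_lane   # 32.4 Gbit/s, as quoted in the comment
# DP 1.4 uses 8b/10b line coding, so only 80% of the raw rate carries data.
effective = raw_total * 8 / 10     # ~25.92 Gbit/s of actual video data
print(f"{raw_total:.1f} Gbit/s raw, {effective:.2f} Gbit/s effective")
```

So the link exceeds USB 3.1 Gen 2's 10 Gbit/s several times over, without needing Thunderbolt.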

#10 FordGT90Concept

"I go fast!1!11!1!" And the only way to get that kind of bandwidth is via Thunderbolt 3 (up to 40 gbps). If VirtualLink doesn't use Thunderbolt 3 over fiber optic, bandwidth will severely suffer. Posted on Aug 20th 2018, 13:14 Reply

#11 coonbro

''besides an HDMI, and optionally, a legacy DVI connector''



DVI? If it's DVI-D, what a waste; if it's DVI-I with analog support, great. I'll assume it's another card that you've got to support, instead of it supporting you and your needs. Posted on Aug 20th 2018, 15:57 Reply

#12 FordGT90Concept

"I go fast!1!11!1!" DVI-I died several generations ago (they eliminated the RAMDAC on the card). If you need VGA, you might as well invest in a DisplayPort to VGA adapter. Posted on Aug 20th 2018, 16:45 Reply

#13 Kaotik

VirtualLink was already released a while back and it's backed not only by NVIDIA, but by AMD, Valve, Oculus and Microsoft, so of course it will "push through". You could do a little bit of background research for these pieces. Posted on Aug 20th 2018, 16:54 Reply

#14 coonbro

FordGT90Concept DVI-I died several generations ago (they eliminated the RAMDAC on the card). If you need VGA, you might as well invest in a DisplayPort to VGA adapter. dang, did not know my 900 card was several gens old. I thought the 10 series was the next and latest? What series did I miss between the 900 and the 10? Posted on Aug 20th 2018, 17:03 Reply

#15 notb

ZeppMan217 You guys want more juice pushed through the GPU instead of using an external power supply? Of course. Why not? Posted on Aug 23rd 2018, 3:40 Reply

#16 FordGT90Concept

"I go fast!1!11!1!" coonbro dang, did not know my 900 card was several gen's old I thought the 10 series was the next and latest ? what series did I miss between the 900 and the 10 ? It was phased out of higher end cards before low end cards. HD 5870 Eyefinity 6 I think was the first mainstream card to completely do away with it. Fiji followed suit. R9 290X and beyond have DVI-D, but no RAMDAC to support analog. They've been phasing it out for a long time. It was phased out of higher end cards before low end cards. HD 5870 Eyefinity 6 I think was the first mainstream card to completely do away with it. Fiji followed suit. R9 290X and beyond have DVI-D, but no RAMDAC to support analog. They've been phasing it out for a long time. Posted on Aug 23rd 2018, 8:44 Reply

#17 notb

FordGT90Concept I have a very difficult time believing USB-C can handle 100w. Those cables and contacts are tiny. DC produces a lot of waste heat when the conduits are tiny. Seems like an all around bad idea. 20w? Sure. What? ;-)

Open your PC case and check the cables that handle a much larger current.

USB-C (in 100W mode) is just 5A, so less than on a single Molex cable - really nothing special.

The cable is fairly wide and the contacts are huge. There is a margin for more. :-) Posted on Aug 23rd 2018, 10:46 Reply
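notb's 5 A figure is easy to verify: current is I = P / V, and conductor heating scales with I²R, which is why delivering the same wattage at a higher voltage produces less waste heat. A sketch (the cable resistance below is an assumed illustrative value, not a measured one):

```python
def current(watts: float, volts: float) -> float:
    """Current drawn at a given power and voltage: I = P / V."""
    return watts / volts

# Assumed round-trip conductor resistance of a short USB-C cable,
# for illustration only.
cable_r = 0.1  # ohms

for watts, volts in [(15, 5), (60, 20), (100, 20)]:
    amps = current(watts, volts)
    loss = amps ** 2 * cable_r  # resistive heating in the cable, I^2 * R
    print(f"{watts:>3} W @ {volts:>2} V -> {amps:.1f} A, "
          f"~{loss:.2f} W dissipated in cable")
```

Even the 100 W case is only 5 A at 20 V, which is why it stays within what ordinary conductors of that size can carry.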