
Last week NVIDIA announced the GeForce GTX 1660 SUPER as their newest Turing "SUPER" graphics card, priced from $229 USD and delivering around 1.5x the performance of the GeForce GTX 1060. For those wondering about the Linux gaming performance potential of this graphics card, here are our initial tests using the EVGA GeForce GTX 1660 SUPER.

On launch day I purchased the EVGA GeForce GTX 1660 SUPER for carrying out these Linux benchmarks. The EVGA GeForce GTX 1660 SUPER (06G-P4-1068-KR) was in stock on launch day and indeed hitting the $229 USD retail price. This graphics card features a dual-fan setup and a metal backplate. While the GTX 1660 SUPER reference specifications put the boost clock at 1785MHz, the EVGA model advertises a boost clock of up to 1830MHz. The rest of the specs, including 6GB of 14Gbps GDDR6 video memory, are in line with the GTX 1660 SUPER reference specifications.

The graphics card's power rating is 125 Watts, and a single 8-pin PCI Express power connector is required. Video outputs on the EVGA GeForce GTX 1660 SUPER include one DVI-D, one HDMI, and one DisplayPort output.

While NVIDIA hadn't issued a new driver update since last week's GTX 1660 SUPER launch (edit: they have as of this morning), the GTX 1660 SUPER does work fine with the existing NVIDIA 435.21 Linux binary driver. The only caveat is the driver reporting it as just an NVIDIA "Device" rather than the GeForce GTX 1660 SUPER product string, but that will be updated in their next public Linux driver build.

Like the rest of the Turing line-up, when using the NVIDIA proprietary driver there haven't been any fundamental bugs or issues to note. Of course, the proprietary driver is the only real option for Maxwell and newer, at least until NVIDIA releases the PMU firmware needed for the open-source driver to support graphics card re-clocking. For Turing GPUs there is just mode-setting support at this point with the Nouveau driver stack and no hardware acceleration.