It's safe to say that these publications, and PC gamers in general, have not reacted very positively to NVIDIA's new GPUs. Many of the marquee features of the RTX line, namely hardware support for ray-tracing and machine learning acceleration, are bets on the future that have little to no impact on gaming as it exists today.

While the RTX 2080 Ti will sell well and undoubtedly outperforms every gaming card released prior, it does so at a $1,000-plus price tag that most are unwilling to even contemplate. And until the RTX-specific features mature, the lower-end models like the 2080 and 2070 are essentially more-expensive versions of the 1080 Ti and 1080.

Just as Engadget might cover a new Pixel phone in great depth, PC specialists were obviously planning a barrage of RTX-related content to quench their audience's thirst for details on the new GPUs. So what to do when interest just isn't there? If you're Gamers Nexus, you start the hardware geek version of a rap beef. The reason? It wants the top spot on a benchmarking leaderboard.

The battle over this spot -- the "two-card" score in 3DMark's Time Spy Extreme benchmark -- started when people began testing the 2080 Ti in SLI. For gaming purposes, SLI, which involves connecting two cards with a bridge so a computer can address both at once, is best left to those with more money than sense. Most game engines offer diminishing returns from multi-GPU scaling, and many games are unstable at launch if you have an SLI setup. When it comes to synthetic benchmarks, though, it's a different story: You're rendering a few predefined scenes, which are built to support multi-card setups.

The 2080 Ti is going to sit at the top of every GPU table it's able to compete in. That's a given. But these tables aren't purely about the strength of the parts: scores vary enormously depending on the rest of the setup and the way those parts are configured. The top spots are usually reserved for big names in the overclocking scene.

The first time I became aware of this battle was when Gamers Nexus put up an announcement for a "#RIPJAY #RIPPAUL" stream on YouTube, in which Steve Burke, the site's editor in chief, would overclock his components in an attempt to take the world-record score. The Jay who was about to "rest in peace" was JayzTwoCents, a YouTuber with a penchant for water-cooling loops, and Paul is the guy behind Paul's Hardware, a channel more focused on custom-build videos. Paul briefly sat atop the leaderboard, but Jay overtook him with a score of 14,043. Burke was confident he could beat them both.

When you're trying to unlock more performance from your parts, the challenge is typically dissipating the additional heat generated by the increased power draw. That's why there's a massive custom liquid-cooling market, heat sinks that weigh almost three pounds and mounts that let you pour liquid nitrogen directly onto a processor. With the RTX line, though, the major issue seems to be delivering the power needed for increased performance. Out of the box, the cards aren't intended to draw the power needed to sustain extreme overclocks. To get around this, Burke used a shunt mod, which -- well, you can Google it if you're actually interested, but essentially it tricks the card into thinking it's using less power than it is, allowing it to draw more.

Using a shunt mod, a jury-rigged cooling system and some hefty CPU overclocks, Burke easily took the top spot during his livestream, posting a score of 14,367 while simultaneously answering hardware questions from thousands of viewers. I haven't overclocked anything in almost a decade, and with the constant questions and commentary, the stream ended up being equal parts education and entertainment. It sent me down YouTube rabbit holes, to the point where I'm now convinced I'll delid the CPU on my next gaming system.