See, this is why flawed analyses are a problem (not that I'm picking on this poster directly). If people actually watch the video the OP cites (which I finally did now that I'm home from work), they'll see the testers not only ran everything on an intentionally old CPU (an eight-year-old 2700K), they also didn't run repeatable benchmarks. Instead, they played through whole levels/missions and compared the results. Anyone familiar with benchmarking knows that even minor run-to-run differences produce fluctuating results, even on the exact same setup. And even then, the framerate difference was either literally zero (which was the case in a good chunk of their results) or very minor, on the order of 3-5%. Those results even include cases where the games with Denuvo ran better. I'm obviously not saying that means Denuvo makes games run better, but it does show their tests are flawed.
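To put a number on the variance point, here's a minimal sketch (Python, with completely made-up numbers, nothing from the video): simulate two identical builds whose only "difference" is a few percent of run-to-run noise, then compare single playthroughs the way the video did.

```python
# Minimal sketch with made-up numbers (NOT data from the video): how single,
# non-repeated playthroughs can "show" a 3-5% gap between identical builds.
import random

TRUE_FPS = 60.0   # both hypothetical builds have the same true average FPS
RUN_NOISE = 0.03  # assume ~3% run-to-run variation (scene, background tasks, ...)

def single_run(true_fps: float) -> float:
    """One playthrough: true performance plus random run-to-run noise."""
    return true_fps * (1.0 + random.gauss(0.0, RUN_NOISE))

# Compare one run per "build", like comparing one whole mission per version.
for trial in range(5):
    drm_free = single_run(TRUE_FPS)
    denuvo = single_run(TRUE_FPS)  # identical true performance by construction
    delta = (drm_free - denuvo) / denuvo * 100.0
    print(f"trial {trial}: {drm_free:.1f} vs {denuvo:.1f} FPS -> {delta:+.1f}%")
```

Run that a few times and you'll routinely see "differences" of a few percent in either direction between two builds that are identical by construction, which is exactly the range the video reported. You'd need repeated, fixed-path benchmark runs (and the spread across them) before a 3-5% delta means anything.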



And even if the tests weren't flawed, you have people taking results that show, at worst, a 5% difference on an 8-year-old CPU and using them to say, "See! This must be why MHW/ACO runs poorly!" Even if Denuvo did cause a 5% FPS drop (which these results are far too noisy to establish, and in places even contradict), clawing back that 5% wouldn't transform the experience: a game chugging along at 40 FPS would get to 42.



Honestly, when people take stuff like this as evidence that Denuvo is destroying games and act like it's a 50% performance hit on every game even on new (or moderately recent) CPUs, it gives both publishers and Denuvo themselves an excuse to hand-wave criticism away. I'd rather we be honest and focus on the major issues DRM actually raises, like games no longer working when the company or DRM maker goes under, or when activation servers shut down, than jump on anything and everything without a critical eye. The latter only makes those against the practice easier to dismiss, and that hurts the cause rather than helps it.