This week, following a small amount of fanfare, the OnePlus 5 nabbed a DxOMark Mobile score of 87. A day later, as if timed perfectly to demonstrate the flaws of its rating system, DxO gave the LG G6 an 84. And the Internet is suitably riled up. Comment threads suggest something untoward has happened as a result of OnePlus's recently-announced partnership with DxO. Reddit is swimming in incredulous anger. Let me start by saying I don't think DxO has allowed however much money changed hands between it and OnePlus to influence the objectivity of its testing. Nobody is directly buying or selling higher benchmark scores — that would be crazy. Nevertheless, it's become clear that as a basis for judging whether one smartphone camera is better than another, the firm's numbered scores are, at best, flawed.

DxO's overall scores are taken from an average of sub-scores for exposure and contrast, color, autofocus, texture, noise, artifacts and stabilization. There's a brief explainer (dating all the way back to 2012) detailing how DxO generates these mobile scores, apparently showing a mix of automatic and perceptual testing — the latter involving a human using the phone out in the real world. For the automated tests, DxO relies on software like its own DxO Analyzer, which is used by the world's top camera makers to gauge image quality. The specifics of DxO's partnership with OnePlus (and other manufacturers like HTC and Google) haven't been publicly disclosed. But presumably, it's this software, along with other testing equipment, that the imaging teams at these phone makers get access to.
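DxO hasn't published the exact weighting behind its overall number, but taking the sub-scores at face value as a flat, non-weighted average, a quick sketch shows how two very different cameras can collapse into the same headline score. The category names follow DxO's published sub-scores; every number below is invented purely for illustration:

```python
# Illustrative sketch, NOT DxO's actual methodology: a flat average
# of per-category sub-scores erases real differences between cameras.

CATEGORIES = [
    "exposure_and_contrast", "color", "autofocus",
    "texture", "noise", "artifacts", "stabilization",
]

def overall_score(sub_scores):
    """Plain non-weighted mean of the sub-scores, rounded to an int."""
    return round(sum(sub_scores.values()) / len(sub_scores))

# Two hypothetical phones: one consistent all-rounder, one that is
# excellent in stills but weak at autofocus and artifact control...
phone_a = dict(zip(CATEGORIES, [90, 88, 85, 84, 86, 87, 89]))
phone_b = dict(zip(CATEGORIES, [95, 93, 70, 92, 94, 70, 95]))

# ...yet both collapse to the same headline number.
print(overall_score(phone_a))  # 87
print(overall_score(phone_b))  # 87
```

The averaging hides exactly the situational variation the article goes on to complain about: a phone that hunts for focus in every shot can still "tie" with one that nails it, as long as its daylight stills pull the mean back up.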

Single numbered scores for phone cameras are at once too vague and too specific.

First, let's address the flaws of using a single number to sum up the entire mobile camera experience. Reducing a smartphone camera to a percentage score is at once too vague and too specific. A single number, a non-weighted average, can't do justice to the complexity of modern smartphone cameras, where performance varies widely depending on the situation and not all factors are equally important. At the same time, a score out of 100 implies precision. The OnePlus 5, Huawei P10 and Samsung Galaxy S6 edge+ are all equally good, the numbers tell us. The LG G6 and Moto G4 Plus are also equal, both with DxO scores of 84. Anyone who's used these devices out in the real world will tell you the reality is not even close.

Meanwhile, DxO rates the Galaxy S6 edge+ an 87 and the Galaxy Note 5 an 86, even though both phones have the same internal hardware and camera modules. That's a one-point variance between two phones that, in imaging terms, are identical. There's also a one-point difference between the Samsung Galaxy S8 and the Sony Xperia Z5, which are light-years apart in real-world performance. When the same variance can exist between two physically identical cameras and two very, very different ones, it underscores the craziness of putting stock in these single numbered scores. DxO scores may well serve as a decent benchmark for the raw capabilities of each camera (personally, I think even that is debatable; see the LG G6 vs. Moto G4 Plus example above), but they also have the effect of muddying important details around real-world use.

DxOMark scores often don't line up with reality, telling us the LG G6 is only as good as last year's Moto G4 Plus.