While I won’t speak for all of the staff at XDA Developers, I can and will state that a recent article on Android Central, apparently prompted by articles written by my colleagues at XDA, was not well received. Much of the information was misconstrued, its meaning twisted to fit a narrative that casts a negative light on both XDA and my colleagues there. As such, I believe a response is warranted from a personal perspective.

Some ground rules are warranted in advance of that response. As this is obviously not XDA, I state clearly that these views should be considered mine alone, unless someone else wants to speak up and say otherwise. There’s a reason I’m not posting this on XDA, and if that reason isn’t clear from the above, feel free to ask for a direct answer. My main concern is that the writer didn’t take the time to contact anyone at XDA, drawing his own conclusions from the articles without even attempting to get clarification or a response from anyone on what was written. I’m sure most of us would be willing to give such a response, even over social media.

Here are my rebuttals and responses to specific passages within that article.

1. “(…) that seemed to demonstrate this shiny new phone was, in some ways, not deserving the space at the top of the heap so many tech reviewers has placed it.”

We never said it did not deserve a spot at the top. In fact, we called it a remarkable phone for life, one of the best phones you can buy, and listed every aspect that can justify its price.

2. “(…) It’s a fun read, especially if you only kind of understand what you’re reading”

If reading the article is difficult for him, I believe any of the contributors who own a Note 7 would be happy to clarify. I’m pretty sure we’ve also done so at length, as seen in the other links here and in his own article.

3. “If you use a benchmark app to tell you how great or terrible your phone performs, you’re not getting anything anywhere near a complete picture.“

None of us ever stated this. (And while I haven’t personally chimed in on XDA about this topic, I will soon, here, in a separate piece.)

4. “Today, many manufacturers implement special code that forces that hardware to perform above the typical thresholds when a benchmark app is being used, which irreparably alters the results.“

Please offer a citation for this, for credibility’s sake. I asked our Editor-in-Chief about it, and his response was as follows:

I personally always use Trepn when first firing up benchmark apps to look for odd behavior. We only used one real benchmark, GFXBench, for performance over time and not a score, where keeping clock speeds high (which didn’t happen) would be detrimental.

We’ve seen attempts to boost numbers on mobile and PC blow up quickly and be abandoned or defeated. If there is proof that this is still actively occurring, please feel free to offer it; otherwise, it’s conjecture at best.

5. “Samsung and Lenovo both have background tasks that can’t be replicated on the Nexus 6P. Features that can’t be disabled to get a 1:1 compare of the software performance.”

That’s precisely the point of comparing them. TouchWiz cannot be reshaped, so knowing just how much of a sacrifice you make when buying this phone is important. Speaking as someone who offers assistance in the open source community: no two OEMs share a similar framework once they diverge from the open source base, and we have near-zero visibility into that code because it’s their intellectual property.

The whole point is that Samsung’s software is the problem. We want them to take this as constructive criticism and look at ways to address it.

7. “If you see a Note 7 performing anywhere near as smoothly as a Nexus 6P, consider how many more things that Note 7 is doing. Better yet, take a look at the immeasurably more thorough Anandtech review of the Note 7 performance as it compares to all other high performing phones, and see how it regularly outpaces the Nexus 6P.”

As written in the various articles, the performance concern is that the phone isn’t meeting the expectations of what its hardware can do. If a 6P can handle general user tasks (not gaming or intensive use, just simple, mundane tasks), why is the UI so much less responsive in a direct comparison? It can and should do better, and I’ve said as much in comments on my own colleagues’ articles.

8. “In this Vladitorial, it was pointed out that some of the findings on XDA weren’t really findings. Specifically, claiming that a 200ms difference in launching apps was an example of “embarrassing performance” is silly and not representative of how people actually use smartphones.”

It’s important to look at it proportionally and in context, the very context you say is missing. So let’s use your argument instead: benchmarks don’t matter, and users will go by what they see. Put a cleanly wiped 6P and Note 7 side by side, install the same apps, and try the simple, mundane tasks an average user would perform. I did exactly that when I didn’t believe what my colleagues were saying, and I ate a slice of humble pie afterwards.

9. [GIF OF CAMERA BEING OPENED IN DIFFERENT PHONES] “What’s fascinating about this demonstration is the cherry picking.”

Context matters again here. The Note 7 camera is kept resident in RAM at all times, so opening it is a hot launch, and it was compared against a cold launch of an application on a different device. The best way to adjust for this is to use a third-party camera app or Google’s own. This is why our DiscoMark test did not include system apps or apps with system hooks (such as preinstalled Note 7 bloatware like Dropbox).
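For readers who want to check the hot-versus-cold distinction themselves: Android’s `adb shell am start -W <component>` prints a `TotalTime` figure for an activity launch, and a hot launch (the activity already resident in RAM, like the Note 7 camera) reports a far smaller number than a cold launch, which is exactly why the two are not comparable across devices. A minimal sketch of pulling that number out, with a hypothetical component name and sample output for illustration:

```python
import re

def parse_launch_time(am_start_output: str) -> int:
    """Extract TotalTime (ms) from `adb shell am start -W` output.

    A cold launch includes full process startup; a hot launch of an
    activity already kept in RAM reports a much smaller TotalTime,
    so mixing the two kinds of launch skews any comparison.
    """
    match = re.search(r"TotalTime:\s*(\d+)", am_start_output)
    if match is None:
        raise ValueError("no TotalTime line found; was -W passed to am start?")
    return int(match.group(1))

# Hypothetical output, shaped like what `adb shell am start -W` prints:
sample = """Starting: Intent { act=android.intent.action.MAIN cmp=com.example.camera/.MainActivity }
Status: ok
Activity: com.example.camera/.MainActivity
ThisTime: 612
TotalTime: 612
WaitTime: 630
Complete"""

print(parse_launch_time(sample))  # 612
```

Run the same command several times per device, discarding none, and the hot/cold gap becomes obvious in the numbers alone.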

10. “The thing is, that’s not how real world testing works. The point of real world testing, as the name suggests, is to offer performance examples of how the whole phone functions as though an “average” user is going to use the phone.”

After asking the contributors of those articles about it (because I was equally confused), I learned the footage was meant to offer additional context to the argument. These were primarily GIFs of those contributors using Hangouts and the Google Keyboard, as well as scrolling performance and the time delta when simply sharing items in Chrome. Hopefully that context is better understood now.

11. “Showing how a share menu loads, especially when those phones are clearly not set up the same way with the same apps”

Here is the benchmark methodology used in the Android Central review: “Nothing special was done before we tested. We even tested in the evening after a day of normal use. None of the phone’s features were disabled. Really, we treated them like we do every phone and just ran the apps we installed to benchmark them.” If you’re going to use that as your argument, you can’t then cherry-pick the results. Our EIC actually makes sure that no third-party background service interferes with the benchmark; it’s in our guidelines and in the specific instructions sent to contributors who do reviews. A factory reset prior to all testing is even recommended, to minimize the impact of anything that isn’t “out of the box.”

In my own use, having come from an LG G5 to the Note 7, I can say this is a vast improvement, but the delay is still noticeable. And I struggle to understand why: both are powered by the 820. The Nexus 6P, again, performs this function even faster than the Note 7. That makes it noteworthy.

11.5. “There’s value in testing for things like dropped frames, and reporting on those dropped frames in context is an important thing to do when your goal is to educate and inform potential buyers. It’s hard to say that’s what happened with the presentation from XDA, given the lack of context or proper comparison. Does the Note 7 drop more or less frames than the Galaxy S7 or S7 Edge? Could this be an issue exclusive to the Snapdragon variant of this model? Is this happening because Samsung’s new Grace UI was rushed out and could be fixed in a future update? None of these questions are answered, because the goal wasn’t to inform.”

Reading a GPU profiling screenshot or video already tells you what you need to know and just how much of an impact it has; the whole point is to quantify what you can’t otherwise see. At XDA and elsewhere, most readers understand what a bar (or, in this case, several) above the green line represents: a frame that took longer than the roughly 16 ms budget required to hold 60 FPS. We include such examples in pretty much all of our reviews, either screenshots or videos, along with commentary.
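The “bars above the green line” reading can itself be quantified. A minimal sketch, assuming you’ve collected per-frame render times in milliseconds (for example from `adb shell dumpsys gfxinfo <package>`; the frame values below are hypothetical):

```python
FRAME_BUDGET_MS = 1000 / 60  # ~16.67 ms: the "green line" for 60 FPS

def jank_summary(frame_times_ms):
    """Count frames that blew the 60 FPS budget (bars above the green line)."""
    janky = [t for t in frame_times_ms if t > FRAME_BUDGET_MS]
    return {
        "frames": len(frame_times_ms),
        "janky": len(janky),
        "jank_pct": round(100 * len(janky) / len(frame_times_ms), 1),
        "worst_ms": max(frame_times_ms),
    }

# Hypothetical frame times captured during a scroll, in milliseconds:
frames = [8.2, 9.1, 15.8, 33.4, 17.2, 7.9, 50.1, 12.3]
print(jank_summary(frames))
# {'frames': 8, 'janky': 3, 'jank_pct': 37.5, 'worst_ms': 50.1}
```

A 33 ms frame means at least one refresh was skipped outright; a 50 ms frame is visible stutter. That is what the bars encode, no narrative required.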

Asking things like “is this happening because Samsung’s new Grace UI was rushed out?” is silly, because none of us outside Samsung can really answer that. We did answer, however, that this is exclusive to the Snapdragon variant of the phone, and we did mention we’d be testing the other variant for our full review. AC claims that because we don’t answer these questions (although we did, and in some cases couldn’t), our goal wasn’t to inform. Android Central says this even though its own full review makes nothing but glowingly positive remarks about the Note 7’s performance, some of them entirely unrealistic, such as claiming no flaws with multitasking and that the hardware performs as you’d expect of a Snapdragon 820 device with 4GB of RAM. (I respectfully disagree, because again, if the 6P can perform basic UI tasks better, there’s room to improve.)

On a side note, I am also concerned about this practice of posting reviews after only four days. It is starting to border on the gaming industry’s “reviews,” which I challenged in 2014 at Fatal Hero. Perhaps it’s time to make that same argument beyond gaming.

12. “As a result, Samsung phones are optimized in whatever way they deem most important. Right now those optimizations are for delivering unique Samsung features, like Samsung’s camera, Samsung Pay and the unique S Pen functions. Android, by which I mean the OS, doesn’t place priority on those things. In recent releases there’s been a focus on things like battery consumption when you aren’t using the phone, security at all times, and a consistent 60FPS user interface. It’s difficult to argue that any of these things aren’t important, but neither Samsung’s Android nor Google’s Android places a priority on all of these things.”

Then please explain how you justify writing off or dismissing the fact that my colleagues addressed exactly this in their article, specifically how the phone does not accomplish that goal of a consistent 60 FPS experience.

13. “Trying to claim the Note 7 is somehow underperforming because it doesn’t behave like a phone it wasn’t built to behave like is ridiculous, no matter how you tightly you try to wrap that narrative in benchmarks.”

Does that mean, in your mind, that because a phone is “not built to [be fast],” we cannot criticize it, specifically when it packs top-of-the-line hardware and costs twice as much as devices with similar specifications? As mentioned several times here, the contributors used only one benchmark, and not for its score but for performance over time.

If it’s still unclear, please feel free to contact me by replying, by e-mail, or via social media. I’m sure I can speak for the others on this: if context is believed to be missing from those articles, we are willing to explain what that context is.