They’re (Almost) All Dirty: The State of Cheating in Android Benchmarks is a pretty damning headline, but that’s only the start. In the article with that title, Anand Lal Shimpi (AnandTech’s namesake) and Brian Klug take a look at how various OEMs try to make themselves look better by detecting the execution of well-known benchmarking applications and, in response, tweaking the CPUs to run at full speed and never drop into power-saving mode. Yes, you could run all the cores on a mobile device’s CPU at full speed constantly for all apps, but that isn’t done because doing so is such a power drain that you’d be recharging a half-dozen times a day. This practice of running all cores at full speed only when a benchmarking app is detected is at best the equivalent of pretending to look really busy when the boss is walking by; at worst, it’s lying to the consumer.
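To make the trick concrete, here’s a minimal sketch of the kind of detect-and-boost logic the article describes. This is purely illustrative: the package names, governor strings, and function are my assumptions for the example, not actual OEM firmware code.

```python
# Hypothetical illustration of benchmark-detection cheating: if the
# foreground app looks like a known benchmark, pick the "performance"
# cpufreq governor (all cores pinned at max frequency); otherwise use
# a power-saving default. Package names here are assumptions.
BENCHMARK_PACKAGES = {
    "com.antutu.ABenchMark",          # AnTuTu (assumed package name)
    "com.primatelabs.geekbench3",     # Geekbench 3 (assumed package name)
    "com.futuremark.dmandroid",       # 3DMark (assumed package name)
}

def choose_cpu_governor(foreground_package: str) -> str:
    """Return the cpufreq governor a cheating firmware might select."""
    if foreground_package in BENCHMARK_PACKAGES:
        # Benchmark detected: look as fast as possible.
        return "performance"
    # Ordinary app: normal power-saving behavior.
    return "interactive"

print(choose_cpu_governor("com.antutu.ABenchMark"))  # performance
print(choose_cpu_governor("com.android.chrome"))     # interactive
```

The point of the sketch is how crude the trick is: it keys off the app’s identity, not its workload, which is exactly why renamed or unlisted benchmarks (as AnandTech used) expose the real, throttled performance.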
In their article, they posted a table showing a number of current Android Devices and the benchmark tests for which they cheat. I love the title of their table, which I’ve copied and pasted below: I Can’t Believe I Have to Make This Table:
**I Can’t Believe I Have to Make This Table**

| Device | SoC | 3DM | AnTuTu | AndEBench | Basemark X | Geekbench 3 | GFXB 2.7 | Vellamo |
|---|---|---|---|---|---|---|---|---|
| ASUS Padfone Infinity | Qualcomm Snapdragon 800 | N | Y | N | N | N | N | Y |
| HTC One | Qualcomm Snapdragon 600 | Y | Y | N | N | N | Y | Y |
| HTC One mini | Qualcomm Snapdragon 400 | Y | Y | N | N | N | Y | Y |
| LG G2 | Qualcomm Snapdragon 800 | N | Y | N | N | N | N | Y |
| Moto RAZR i | Intel Atom Z2460 | N | N | N | N | N | N | N |
| Moto X | Qualcomm Snapdragon S4 Pro | N | N | N | N | N | N | N |
| Nexus 4 | Qualcomm APQ8064 | N | N | N | N | N | N | N |
| Nexus 7 | Qualcomm Snapdragon 600 | N | N | N | N | N | N | N |
| Samsung Galaxy S 4 | Qualcomm Snapdragon 600 | N | Y | Y | N | N | N | Y |
| Samsung Galaxy Note 3 | Qualcomm Snapdragon 800 | Y | Y | Y | Y | Y | N | Y |
| Samsung Galaxy Tab 3 10.1 | Intel Atom Z2560 | N | Y | Y | N | N | N | N |
| Samsung Galaxy Note 10.1 (2014 Edition) | Samsung Exynos 5420 | Y(1.4) | Y(1.4) | Y(1.4) | Y(1.4) | Y(1.4) | N | Y(1.9) |
| NVIDIA Shield | Tegra 4 | N | N | N | N | N | N | N |

The benchmark columns indicate whether the device cheats in that test (Y) or not (N).
They also write (the emphasis is mine):
> We started piecing this data together back in July, and even had conversations with both silicon vendors and OEMs about getting it to stop.
>
> **With the exception of Apple and Motorola, literally every single OEM we’ve worked with ships (or has shipped) at least one device that runs this silly CPU optimization.**
>
> It’s possible that older Motorola devices might’ve done the same thing, but none of the newer devices we have on hand exhibited the behavior. It’s a systemic problem that seems to have surfaced over the last two years, and one that extends far beyond Samsung.
Kudos to Apple and Motorola for not playing this game, and to the other OEMs: cut it out. Even if we set aside the question of right versus wrong, there’s still a coldly Machiavellian, pragmatic reason not to do it: the performance gains you get from rigging benchmark tests in this fashion are below 10%, and they’re not worth the bad press you’ll have to live with for the next little while.