Benchmark cheating

I remember benchmarks with the Desire; scores were way higher with Cyanogen than on stock with Sense. :O

 
In before some clown says it's normal and happens on PC too, but fails to realize this kind of thing only happens with benchmarks on phones.
 
This is why I value real-world performance over benchmarks and spec sheets. When I was deciding between the S4 and the HTC One at launch, I didn't care that the S4 had the higher clock speed or that the two phones were nearly identical in benchmarks. I went into the AT&T store and handled the devices. In the five minutes I spent playing with the S4 on the day the display model went up, I saw easily visible stutter and lag, even on the home screens. Real-world performance issues. So I went with the One. Yes, Samsung did eventually fix that issue on the S4, but at launch, no one knew if they would fix it, or if it even could be fixed.
 
Honestly, I don't think it's a big deal. If you read the last paragraph, Anand states that the performance gains from these optimizations are only around 5%, which isn't significant. Take 5% off a phone's AnTuTu score of 30,000 and you get 28,500, which is still far above a phone scoring 10,000 or 20,000.
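Quick back-of-envelope check of that claim, if anyone wants to play with the numbers. These figures are just the illustrative ones from this post, not real review data:

```python
# Rough sanity check of the ~5% "benchmark optimization" gain mentioned above.
# The score and percentage are hypothetical examples, not measured results.
antutu_score = 30_000        # example score with the benchmark boost active
optimization_gain = 0.05     # ~5% gain attributed to benchmark detection

honest_score = antutu_score * (1 - optimization_gain)
print(honest_score)          # 28500.0 -- still well above a 10,000 or 20,000 device
```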

Benchmarks give us metrics to compare one SoC to another. If two phones use the same SoC, chances are they'll post similar scores, with one coming out slightly ahead. No surprise there.

What REALLY grinds my gears are the battery drain tests. Most tech sites use a video loop test, which sets the screen to a near-unviewable brightness setting and loops a video until the battery runs out. WHAT KIND OF PERSON WATCHES VIDEOS ON THEIR PHONE/TABLET FOR 10+ HOURS UNTIL THE BATTERY DIES? Most unrealistic benchmark ever. Users who read these reviews, see that the battery lasts 10+ hours, and expect their phone/tablet to last that long are sorely disappointed when they discover it actually lasts about half that time under normal usage.
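To put rough numbers on the "half that time" point, here's a sketch. Every figure in it is a made-up placeholder for illustration, not a measurement from any review:

```python
# Back-of-envelope estimate of real-world battery life vs. a video-loop rating.
# All numbers are hypothetical assumptions, not data from any device or review.
battery_capacity_wh = 10.0     # assumed battery capacity in watt-hours
video_loop_hours = 10.0        # runtime a review might report from a dim video loop

avg_draw_video_loop = battery_capacity_wh / video_loop_hours   # ~1.0 W during the loop
mixed_use_multiplier = 2.0     # guess: web, games, and radios roughly double the draw

real_world_hours = battery_capacity_wh / (avg_draw_video_loop * mixed_use_multiplier)
print(real_world_hours)        # 5.0 -- about half the advertised figure
```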
 