ARM Beats Intel With Revised AnTuTu Benchmark

By Jim McGregor  07.12.2013

The AnTuTu Benchmark, a benchmarking tool for Android smartphones and tablets, has been revised following some discrepancy over whether the latest Intel Atom processor outperformed ARM-based chips from several vendors in key aspects of the benchmark. Under the revised benchmark, overall scores for the Atom Z2580 dropped by about 20 percent.

As I indicated in a post on EE Times earlier this week, the AnTuTu results appeared inconsistent with both older versions of the benchmark and other benchmarks. (See: Has Intel Really Beaten ARM?) Technical consulting firm BDTI pointed out that the code compiled for the Intel processor was not executing all of the instructions intended for the RAM test, which artificially improved the results for the Lenovo K900 smartphone and the Intel Atom processors. The problem appears to stem from the ICC compiler, introduced around version 2.9.4 of the AnTuTu benchmark and used only for the Intel processors; code for all other processors is built with a different compiler, GCC.
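To make the compiler angle concrete, here is a minimal, hypothetical sketch (this is not AnTuTu's actual source code) of how an optimizing compiler can legally skip most of the work in a naive "set bits in memory" test, and how marking the buffer volatile forces the stores to actually be issued:

```c
/* Hypothetical sketch only -- this is NOT AnTuTu's source code.
 * It illustrates how dead-store elimination can inflate a naive RAM score. */
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>
#include <time.h>

#define WORDS (4u * 1024u * 1024u)    /* 16 MB of 32-bit words */
static uint32_t buf[WORDS];

/* Naive test: the buffer is never read afterwards, so an optimizing
 * compiler is free to drop some or all of these stores, making the
 * loop look far faster than the memory system really is.            */
static void ram_test_naive(void) {
    for (size_t i = 0; i < WORDS; i++)
        buf[i] |= 1u;                 /* "set bits in memory" */
}

/* Hardened test: the volatile qualifier forces every load and store
 * to be issued, so the loop exercises memory on any compiler.       */
static void ram_test_volatile(void) {
    volatile uint32_t *p = buf;
    for (size_t i = 0; i < WORDS; i++)
        p[i] |= 1u;
}

static double seconds(void (*fn)(void)) {
    clock_t t0 = clock();
    fn();
    return (double)(clock() - t0) / CLOCKS_PER_SEC;
}

int main(void) {
    printf("naive loop:    %.3f s\n", seconds(ram_test_naive));
    printf("volatile loop: %.3f s\n", seconds(ram_test_volatile));
    return 0;
}
```

How much of the naive loop survives depends entirely on the compiler and its optimization settings, which is exactly why building one platform's binary with ICC and the others with GCC makes the resulting scores hard to compare.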

To rectify the situation, AnTuTu issued revision 3.2.2 of the benchmark Wednesday evening. The revision still uses the ICC compiler, but the resulting scores are drastically different for the Intel processor. The AnTuTu CPU and overall scores dropped by approximately 20 percent, while the AnTuTu RAM score plummeted by approximately 50 percent, as shown in the figure below. All other scores for the Intel processor remain essentially the same. Similarly, scores for the Samsung Exynos 5 Octa and Qualcomm Snapdragon 600, both used in the Samsung Galaxy S4, remain relatively unchanged.

As a result of the revised scores, the AnTuTu benchmark is no longer an outlier; it now paints a competitive picture similar to that of the other benchmarks, with the Samsung Exynos 5 Octa processor outperforming the Intel Atom Z2580 processor (see the figure below). But are these numbers valid? It is unclear exactly what changes were made in revision 3.2.2 of the benchmark.

However, AnTuTu did indicate that new testing standards would be issued in August, presumably with the next major revision of the benchmark. At that point, it will be interesting to see the new “standards” and the resulting changes in the benchmark scores.

Suddenly, the battle between smartphone processors looks much different from the picture all the hype around the AnTuTu benchmark had painted. It's clear that the ARM processors still hold a significant advantage over the Intel processors, whether you include the new AnTuTu scores or eliminate them from the evaluation entirely.

The current and upcoming revisions to the AnTuTu benchmark will also drastically alter the scores of Intel’s upcoming Bay Trail Atom processor. As with the Atom Z2580, many in the press prematurely proclaimed it the victor over the next-generation ARM processors. Now that also appears to be highly questionable.

So, will all the sensationalistic bloggers retract their stories about Intel beating ARM? Likely not, but this episode calls their credibility into question, just as it does the credibility of the benchmarks. The moral of the story is that you have to question all benchmark data and use many data points from different sources before drawing a conclusion. It's clear that Intel still has an uphill climb to catch the ARM camp, much less surpass it. It is also clear that we need better benchmarks for mobile devices that test for platform efficiency and usage models.

— Jim McGregor is founder and principal analyst at Tirias Research.

17 comments
DMcCunney   2013-07-12 14:00:20

There are two target audiences for benchmarks: engineers and end-users.

Engineers have price/performance targets they want to meet when designing systems, and benchmarks are metrics that tell them where they are on meeting the targets.

End-users will see benchmarks as ways of comparing systems.

But in the smartphone market, where the AnTuTu benchmarks seem to be popular, the question is what decisions are actually made on the basis of them.

Smartphones are better seen as fashion accessories than as tech.  Most buyers are interested in cool.  Smartphones are status markers, and the incentive will be "My phone is cooler than yours!" 

What makes a phone cool?  Brand will be critical.  iPhone buyers aren't buying iOS, they're buying Apple.  There are scads of Android phones, so while Android is cool, it's a common denominator, and just running Android is not a deciding feature.  Windows Phone is not cool, and that may be the biggest challenge Microsoft and Nokia have in getting a share of the market.

The smartphone market reminds me of the movies.  In the movie business, you are as good as your last hit picture, and if your studio hits a dry patch and doesn't have hit pictures for a while, you may go out of business.  The smartphone market is similar.  Motorola had a hit with the Razr, didn't have a followup hit, and was rumored for a bit to be looking at getting out of the smartphone business.

I'd bet that most folks who run the AnTuTu benchmarks are measuring performance on the phone they already bought for other reasons, and aren't making a decision on which phone to buy based on them.

Given that, how much should anyone care what the AnTuTu benchmarks say?

 

Questioner   2013-07-12 14:09:53

Great post. Fantastic to see that the EE Times article was influential enough for AnTuTu to update its benchmark, though it does cast doubt on the value of traditional benchmarking. Also a great lesson not to 'follow one number' when judging the capability of a smartphone. I'm sure Intel isn't too happy about such a post, so kudos to EE Times for sticking with the facts and data.

DMac   2013-07-12 14:58:29

Jim, it seems as though you kicked up a hornet's nest and forced AnTuTu to take another look at the benchmark. This is an example of great analysis that apparently caused action. Kudos to you, Jim.

Jim McGregor   2013-07-12 17:08:08

Your point is well taken and I completely agree. End-users don't care about benchmarks. PC-generation and pre-PC-generation users usually look for common brands and features that they prefer. The mobile generation looks for something new. Benchmarks really come into play with the OEMs and carriers that are making decisions about which technologies and products to select. So, they still play a role within the industry, even if we all agree that they should not.

And, when we really look at an ARM or Intel processor, the decision is very similar to selecting the OS: what ecosystem do you want? Because a processor alone does not make a successful product.

DMcCunney   2013-07-12 17:59:51

Oh, I can see end-users caring about benchmarks. I just don't think most will use them to make a purchase decision. They will use them to validate the decision they already made. The AnTuTu benchmarks are the sort of thing I can see a smartphone owner point to and say, "Look how much better the benchmarks are for the phone I bought than they are for your choice" (with the implicit subtext, "I made a better choice than you did, so I'm smarter and cooler than you").

And I can see cases in the industry where benchmarks will be used, and there will even be agreement they should be used.  I just hope for better understanding of what the benchmarks measure, how they measure it, and what the results actually mean.  (This may be wishful thinking on my part.)

I also wonder how many mobile device users are really aware of what processor is under the hood, or care if they are? As you say, they are buying an ecosystem, and what is available in that ecosystem will be far more important than what that ecosystem runs on. There are probably device owners who are aware that Intel and ARM are battling for share in the mobile device market, but I really doubt anyone will say, "I'm running ARM, and you're running (yuck!) Intel! You're a real dweeb!"

 

 

DMac   2013-07-12 18:18:49

I agree with your point about mobile phone users not giving a hoot what kind of processor their phone is running. Obviously there are some that do, many of whom read EE Times, but the vast majority of people buying smartphones couldn't care less.

 

Mike2020   2013-07-12 19:37:03

Sounds like the Intel compiler folks found a way to fool the benchmark by bypassing a whole lot of instructions.

 

This is nothing new with Intel compilers. They seem to have an army of engineers making specific releases for specific benchmarks.

 

All is fair in love and benchmarking?

nick.ro   2013-07-12 19:52:36

Jim, was it really on Wednesday evening?

I confirm the AnTuTu 3.3.2 release on July 8 or July 9.

JeffBier   2013-07-12 20:29:47

 

"I confirm the AnTuTu 3.3.2 release on July 8 or July 9."

The Google Play store shows AnTuTu 3.3.2 released on July 10, 2013: http://bit.ly/15BV4EA

nick.ro   2013-07-12 20:36:41

Google Play isn't the only source; you should search google.com. It first appeared in China.

nick.ro   2013-07-12 20:54:48

Please google 安兔兔 3.3.2 (AnTuTu's Chinese name).

results:

5 days ago

4 days ago

3 days ago

nick.ro   2013-07-12 20:59:56
perbacco   2013-07-12 23:14:45

Just showing performance numbers without measuring how much energy was actually used to run the benchmark is meaningless. A quad-core out-of-order CPU beating a dual-core in-order CPU in multi-threaded benchmarks... color me surprised!

Jim McGregor   2013-07-14 13:26:51

In terms of the timing, I and many of my colleagues were made aware of the 3.2.2 revision on Wednesday evening, which is what I am going by.

In terms of performance alone being meaningless, I agree. The best measure is efficiency (performance per watt), and the most accurate measurement is platform efficiency. Even measuring just the CPU core or SoC is rather meaningless. Take the Intel Atom vs. the Qualcomm Snapdragon. In a fair comparison you should include all the functions that are integrated into the Snapdragon, such as Wi-Fi and the cellular baseband modem. On a platform without those functions integrated into the SoC, like the Atom, you have to factor in the combo chip, the baseband modem chip, and the filters managing coexistence issues. Then you need to add in any external accelerators that the other platforms may or may not have or require, such as image signal processors, video processors, DSPs, etc. And this is just to make a viable silicon comparison. If you really want to measure overall performance and power, you need to factor in all the other system components and software as well.

So, I agree that we need a better way to evaluate mobile platforms and the key components that drive them. Any suggestions?
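As a rough editorial illustration of the platform-efficiency argument above, here is a minimal sketch that simply divides a composite benchmark score by average platform power; every number in it is an invented placeholder, not measured data for any real device.

```c
/* Minimal sketch of "performance per watt" as a platform-level metric.
 * All scores and power figures below are invented placeholders.        */
#include <stdio.h>

struct platform {
    const char *name;
    double score;       /* composite benchmark score (made up)            */
    double avg_watts;   /* SoC + modem/combo chips + other platform parts */
};

int main(void) {
    const struct platform p[2] = {
        { "Platform A (functions integrated in the SoC)",  26000.0, 2.1 },
        { "Platform B (SoC + external modem/combo chips)", 26000.0, 2.9 },
    };
    for (int i = 0; i < 2; i++)
        printf("%-48s %6.0f points/W\n", p[i].name, p[i].score / p[i].avg_watts);
    return 0;
}
```

Two platforms with identical scores come out with noticeably different efficiency once the parts outside the SoC are counted, which is the comparison the comment above is arguing for.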

luting   2013-07-16 14:45:05

Jim,

I feel it is not fair to compare GCC on the ARM processor against ICC on the Intel processor. From my past experience, ARMCC used to outperform GCC 2:1 in some cases. So a fair comparison should use either GCC on both the ARM and Intel processors, or ARMCC on the ARM processor and ICC on the Intel processor. Of course, that would give the ARM processor more of an edge against the Intel processor. I know ARM has been working to improve GNU compiler efficiency, so the gap between ARMCC and GCC might not be as big as it used to be.

JeffBier   2013-07-17 00:08:31

Another way to look at this is: since we're talking about the performance of mobile devices running Android, perhaps benchmarks should use whichever compiler is typically used for a given processor when that processor is used in Android mobile devices.

In other words, rather than choosing the compiler that yields the best benchmark results, let's use the compiler (and compiler settings) that yields the most realistic benchmark results.

Wilco1   2013-07-18 06:33:29

Agreed. GCC is currently the only official compiler on Android, so the OS and all native code are compiled with it. Therefore any Android benchmarking should be done using GCC. Using GCC also has the advantage that, unlike some other compilers, it is not optimized to do well on benchmarks (or even break them) but actually focuses on performing well on real code.

The next version of AnTuTu will revert back to GCC. Hopefully they will also fix the compiler options to be the same on all targets (the ARM version was compiled with -Os and with inlining and unrolling disabled) so that we finally end up with a fair comparison. Of course, none of this changes the fact that the benchmark itself remains rubbish: setting bits in memory is not a good memory performance test.
