
Microsoft releases IE9 Platform Preview 7

by Pete Mason on 19 November 2010, 12:24

Tags: Internet Explorer, Microsoft (NASDAQ:MSFT), Mozilla

Quick Link: HEXUS.net/qa26q


Microsoft has just released the latest developer preview of Internet Explorer 9, bringing with it a whole host of performance improvements to the new Chakra JavaScript engine.

Even though the announcement makes the point that synthetic benchmarks are "at best not very useful" because they aren't representative of real-world usage, Microsoft was happy to show off just how fast IE9 is. The differences between most of the browsers amount to tiny fractions of a second, but the latest preview of Internet Explorer still manages to edge to the front of the pack in the SunSpider benchmark.

However, Mozilla developer Rob Sayre found his own results using the browser a little odd, so he decided to look into it. What he discovered was that IE9 uses 'dead code analysis' to speed up JavaScript performance. The gist of this optimisation is that code that doesn't do anything - such as conditional checks whose outcome never changes, or calculations whose results are never used anywhere - is removed or simplified, speeding up the entire process. However, the math-cordic code in the SunSpider benchmark happens to be built around these sorts of operations, so IE9 was able to score very highly by simplifying large chunks of the test.
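To give a rough idea of what dead code elimination means in practice - this is a generic, hypothetical sketch, not IE9's actual implementation or the real math-cordic source - consider a function that does a lot of arithmetic whose result is never read:

```javascript
// Hypothetical example in the spirit of SunSpider's math-cordic test:
// the loop performs real work, but `accumulator` is never used, so the
// loop has no observable effect on the function's result.
function deadLoop(iterations) {
    let accumulator = 0;
    for (let i = 0; i < iterations; i++) {
        accumulator += Math.sin(i) * Math.cos(i); // result never read
    }
    return 42; // independent of everything the loop computed
}

// An engine that performs dead code elimination can prove the loop is
// side-effect free and unused, and reduce the function to the equivalent:
function optimisedLoop(iterations) {
    return 42;
}

// Both behave identically to any caller, but the second does no work.
console.log(deadLoop(100000) === optimisedLoop(100000)); // true
```

A benchmark built largely out of code like `deadLoop` ends up measuring how aggressively an engine discards work, rather than how fast it performs it - which is the crux of the disagreement described below.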

Sayre saw this as an anomaly and implied that Microsoft was trying to fix the numbers by specifically tuning the browser for this particular benchmark. He even submitted it as a bug to the developers.

Understandably, Microsoft developer Dean Hachamovitch updated the original announcement to comment on the minor stir that this had caused. He denied any specific optimisations, and argued that dead code was relatively common in the wilds of the internet. Returning to his point about the difference between benchmarks and real-world performance, Hachamovitch argued that the dead code elimination did exactly what it was supposed to - removed redundant operations that were slowing the JavaScript engine down.

Sayre wasn't satisfied by this explanation, though, and argued that, while the approach itself is valid, IE9's implementation was selective, inaccurate and only removed inexpensive operations. It appeared, he claimed, to be triggered only by the specific patterns found in SunSpider, making it impossible, in his view, to call what the Internet Explorer devs had done a "serious general purpose optimization".

The Mozilla developer admitted that he "could be missing something", especially since the IE9 releases are still developer previews, but appears to be standing by his assertions for the time being.



HEXUS Forums :: 12 Comments

Synthetic benchmarks FTW… erm…
Sounds like a valid optimization to me. The benchmark needs to be updated if this type of optimization should be excluded.
So he is complaining that they have tuned their optimiser for a specific type of test?

Well how about making a better test then, one that is more real-world? I mean, this wasn't a test from MS after all.

Or you could make your browser into something other than a leaking creaky old thing that gets worse with every revision.
Sayre doesn't need a synthetic test to tell him about this. He already has lots of people who actually use Firefox, across numerous forums, journals and blogs, telling him to get his act together on Firefox. So he should just stop wasting time on this.
Botia
Sounds like a valid optimization to me. The benchmark needs to be updated if this type of optimization should be excluded.

I agree - there's a very good article on this at Ars:

http://arstechnica.com/microsoft/news/2010/11/lies-damned-lies-and-benchmarks-is-ie9-cheating-at-sunspider.ars

I think I side with this - if the benchmark isn't properly constructed (i.e. has dead code) then optimising it out (so long as the optimisation isn't specific to this particular benchmark) is absolutely valid. And to be frank, Mozilla have plenty of optimisations of their own they need to attend to (hence the delay on the next FF, no doubt).