the missed opportunity of acid 3

Ian Hickson released his Acid 3 test a little while ago, and though there have been a couple of changes recently, it’s pretty much settled down in its final form. There’s been a lot of discussion of how different browsers do on the test, and it’s clear that some projects are really buckling down to get to 100 and the pixel-perfect rendering. You might ask why Mozilla’s not racking up daily gains, especially if you’re following the relevant bugs and seeing that people have produced patches for some issues that are covered by Acid 3.

The most obvious reason is Firefox 3. We’re in the end-game of building what I really do believe is the best browser the web has ever known, and we expect to put it in the hands of more than 170 million users in a pretty short period of time. We’re still taking fixes for important issues, but virtually none of the issues on the Acid 3 list are important enough for us to take at this stage. We don’t want to rush fixes in, or rush out a release, only to find that we’ve broken important sites or regressed previous standards support, or worse, introduced a security problem. Every API that’s exposed to content needs to be tested for compliance, security, and reliability, and we already have some tough rows to hoe with respect to conflicts with existing content as it is. The late-stage patches we do take are worth that test burden, usually because they make the web platform much more powerful or address real-web compatibility and capability issues. Acid 3’s contents, sadly, are not often of that nature.

Ian’s Acid 3, unlike its predecessors, is not about establishing a baseline of useful web capabilities. It’s quite explicitly about making browser developers jump — Ian specifically sought out tests that were broken in WebKit, Opera, and Gecko, perhaps out of a twisted attempt at fairness. But the Acid tests shouldn’t be fair to browsers; they should be fair to the web. They should be based on how good the web will be as a platform if all browsers conform, not on how far any given browser has to stretch to get there.

The selection of specifications is also troubling. First, there is a requirement that a specification had to be suitably finished by 2004, meaning that only standards finalized during the darkest period of web stagnation were eligible: standards that predate the work of people actively reviving the web platform, such as the WHATWG’s <canvas> specification. While this did protect us from tests requiring the worst of SVG’s excesses (1.2, with support for such key graphical capabilities as file upload and sockets, was promoted to CR in 2007), it also means that the test includes @font-face, a specification so poorly thought of that it was removed in CSS 2.1. I can think of no reason to place such time-based restrictions on specification selection other than perhaps to ensure that there has been time for errata to surface — but in the case of CSS, those errata are directly excluded! Similarly, the WHATWG’s work to finally develop an interoperable specification for such widely-used things as .innerHTML was excluded. (If the test is about “fairness”, I could see not wanting to target specifications so new that people just hadn’t had time to catch up to them, but again I think such a test should be built on more long-term criteria than lining up the starting blocks for a developer sprint. We could be using Acid tests as an “experience index” for the web, if you will.)

I also believe that the tests should focus on areas where missing features can’t be easily worked around; SMIL’s capabilities are available to interested authors via an easy-to-use script library, and if Hixie could stomach digging around in the SVG specification I wish he’d spent his time on things like filters or even colour profiles, the lack of which is much harder to work around.
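To make that concrete, here’s the flavour of script-based stand-in I have in mind (a minimal sketch, not any particular library’s API; the animateAttribute name is my own invention):

```typescript
// Emulate SMIL's <animate attributeName="r" from="10" to="50" dur="1s"/>
// in plain script: linearly interpolate an SVG attribute over time.
function animateAttribute(el: Element, attr: string,
                          from: number, to: number, durationMs: number): void {
  const start = Date.now();
  const timer = setInterval(() => {
    const t = Math.min((Date.now() - start) / durationMs, 1); // progress, 0..1
    el.setAttribute(attr, String(from + (to - from) * t));
    if (t === 1) clearInterval(timer); // finished; stop ticking
  }, 16); // roughly one frame at 60Hz
}

// Grow the first <circle> on the page from r=10 to r=50 over one second.
const circle = document.querySelector("circle");
if (circle) animateAttribute(circle, "r", 10, 50, 1000);
```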

People who misread my disappointment, or our lack of last-minute Acid-boosting patches, as a lack of commitment to standards would do well to study our history as a project. For more than a decade, we’ve been dedicated to supporting standards, and to improving them, even though it has often been painful to do so. In 1998 we threw away our entire layout engine in order to rebuild on one that provided, we believed, a better basis for the new standards of the day: DOM, CSS, etc. We’ve changed our implementations of <canvas>, of globalStorage, of JavaScript — all to track changes in standards, and to avoid conflicting with them. Last night, we disabled cross-site XHR because we aren’t certain which way the spec is going to go, and if, e.g., we ship something that doesn’t send cookies, we would constrain where the spec could go by building developer expectations about that behaviour. (These developer expectations about what sorts of requests can be triggered from cross-domain content are basically the entire reason that cross-site XHR mashup technology needs special work at all.) We will fix standards-compliance bugs; it’s what we do. But we won’t fix them all with the same priority, and I hope that we won’t prioritize Acid 3 fixes artificially highly, because I think that would be a disservice to web developers and users. Where Acid 3 happens to test something that we believe is important to fix, we will of course pursue it: surrogate-pair handling or some of the selector bugs seem like good candidates.
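To make the cookie concern concrete, consider a sketch like this (the URL is hypothetical, and the API details are exactly the part still in flux):

```typescript
// A page on example.com making a cross-site request. The contentious detail
// is invisible in the code: does the request carry the user's cookies for
// api.example.org, or not?
const xhr = new XMLHttpRequest();
xhr.open("GET", "http://api.example.org/public-data"); // hypothetical URL
xhr.onreadystatechange = () => {
  if (xhr.readyState === 4) {
    console.log(xhr.status, xhr.responseText);
  }
};
// If we ship "never send cookies" and developers build on that behaviour,
// the spec can no longer decide otherwise without breaking deployed content.
xhr.send();
```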

Acid 3 could have had a tremendously positive effect on the web, representing the next target for the web platform and helping developers prioritize work so as to maximize the aggregate capabilities of the web. Instead, it feels like a puzzle game, and I can easily imagine the developers of the web’s proprietary competitors chuckling about the hundreds of developer-hours that have gone into adding another way to iterate over nodes, or twiddling internal APIs to special-case a testing font. I don’t think it’s worthless, but I think it could have been a lot more, especially with someone as talented and terrifyingly detail-oriented as Ian at the helm.
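To make the node-iteration jab concrete, a sketch: the DOM has always let you walk the tree yourself, and DOM Traversal’s NodeIterator, which Acid 3 tests, is largely one more spelling of the same loop.

```typescript
// One way we've always had: walk the tree directly.
function collectElements(root: Node, out: Element[] = []): Element[] {
  for (let child = root.firstChild; child; child = child.nextSibling) {
    if (child.nodeType === Node.ELEMENT_NODE) out.push(child as Element);
    collectElements(child, out); // recurse into subtrees
  }
  return out;
}

// The other way Acid 3 wants: DOM Traversal's NodeIterator, the same walk
// behind one more API (it also yields the root element itself first).
const it = document.createNodeIterator(document.body, NodeFilter.SHOW_ELEMENT);
const viaIterator: Element[] = [];
for (let n = it.nextNode(); n; n = it.nextNode()) {
  viaIterator.push(n as Element);
}
```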

Maybe next time.

[Update: the WebKit checkin to special-case Ahem wasn’t one that used a private OS X API; it was one that used an internal CoreGraphics API on Windows. I should have been reading more closely; apologies to the WebKit folks.]

credit where it’s due

There’s going to be a lot of talk this week coming out of MIX about IE8. Early reports are interesting, if still often hidden behind “NOT PUBLISHED” links on MSDN, but the most interesting thing I’ve seen yet is their change to IE8’s default rendering mode. The specific change is a good one, but even more than that I think it’s very promising as a matter of process.

The original decision to make IE8 default to matching IE7’s legacy rendering mode was made in secret, secret even from many of the web experts in the organizations linked to from Dean’s post. Once the conversation was opened to input from the rest of the world’s experts on web content compatibility, they were able to reach a much better decision, and happily that’s reflected in the updated plans for IE8. I hope that this case helps Microsoft understand more generally that significant web-compatibility decisions are too important to be left to closed groups: the web is simply too big, with too many stakeholders, for that to be a workable path to success. (Not that simply operating under “a standards body” is sure to avoid such secrecy, since standards bodies can be similarly closed, but they at least tend to bring more diverse viewpoints to bear, which is a half-measure of some value.)

I was also heartened that they were able to make such a change after announcing it. We’ve all heard before that the IE team can’t talk about their plans until they’re very certain, because people build businesses on those plans and will be harmed if changes are made after an announcement. The conspicuous lack of bankruptcies attributable to WinFS being dropped from Vista aside, that they were able to listen to “global” feedback and make a significant change based on it gives me hope that a new, more open process may be beginning here. Bravo and thanks to Microsoft for listening genuinely and making a change that I think will have a very positive effect on standards-based content on the web.

Some of the items listed in the “IE8 Readiness Toolkit” look pretty interesting, and in at least one case (the “XDomainRequest” API) seem pretty close to the subjects of some recent discussions in standards groups and various open projects about solving similar problems. I’m not sure if Microsoft proposed their API or semantics to those groups, or shared the design thinking that went into their choices, but I’m permitting myself to hope anew!
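From what’s published so far, the shape of XDomainRequest looks roughly like this; a sketch based on my reading of their documents, untested against a real build, and declared by hand since it’s IE-only:

```typescript
// XDomainRequest as the IE8 material describes it; declared here because it
// is IE-only and no standard typings know about it.
declare class XDomainRequest {
  onload: () => void;
  onerror: () => void;
  responseText: string;
  open(method: string, url: string): void;
  send(body?: string): void;
}

const xdr = new XDomainRequest();
xdr.onload = () => console.log("got:", xdr.responseText);
xdr.onerror = () => console.log("request failed or was refused");
xdr.open("GET", "http://api.example.org/data"); // hypothetical URL
xdr.send(); // notably, no cookies or other credentials travel with it
```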

(In light of my previous post about compatibility liability, I was also very interested to see this titbit:

[...]we do not believe any current legal requirements would dictate which rendering mode a browser must use[...]

It’s much more likely that they’re referencing the Opera suit than that they’re talking about Microsoft representatives’ previous claims of lawsuit risk stemming from changes in new product versions, but it made me smile nonetheless.)

suing and blockading for compatibility

One thing that came up periodically when people on the HTML working group were discussing version selectors was the notion that there is legal liability associated with breaking compatibility. Chris Wilson is the person who most frequently brings up this point (unsurprising, given his affiliation and experiences with IE), though he may well not be the only one. I’ll excerpt one example here:

We (Microsoft) have to be in control of our own destiny there. Unless you’re suggesting that the WG would shoulder the financial burden when we (Microsoft) are sued because we broke compatibility and caused some company’s multi-million-dollar intranet app to break.

And later, though not referring to liability but rather a government “lockout”:

A single government who locks us out of their market because we broke their intranet app (even if they were ua-switching and giving us bad content, and it was “clearly their fault”)? Probably a very big deal.

I have a couple of questions, then, for the combined legal minds of the lazy web:

  • What would be the legal basis for a suit by a customer, given the provisions of typical EULAs which explicitly disclaim pretty much all warranty they can? Let’s assume that the compatibility break is caused by an upgrade to a new version of software (Firefox 2 to Firefox 3, for example) that’s under the control of the customer. You can choose your jurisdiction, and assume the worst case for the vendor having end-of-lifed the previous version, etc.
  • Can anyone find an example of such a suit having been brought?
  • Not a legal question, but very much on my mind: given the impact that IE7 had on Korea, why would they have gone ahead and done the release anyway, if it was such a big deal for them to be locked out of a national market?

I invite your informed, or perhaps just creative, speculation on the topic! Personal attacks on Chris Wilson, or rehashed “Microsoft is evil so they got the illuminati to block Korea’s secret sanctions in the UN” conspiracy theories are not welcome, and might well be moderated out or disemvoweled.

X-IE-Version-Freeze

There are a lot of good posts describing problems with IE8’s version selector feature, so I’ll just leave you to read those for some insight into problems that it creates (and how it pushes the IE-compatibility burden off Microsoft and onto other competing browsers in a very impressive way).

They’ve announced what they’re doing in IE8, and as we know from previous conversations with Chris Wilson, they don’t announce until they’re sure:

[W]e have to be very, very careful to be 100% confident when we announce things like “that’s exactly what we’re doing,” or “that’s the date that we are aiming for.” When we don’t, we tend to get a lot of people upset with us, of course, but it’s not just them being upset with us; it’s actually, it can be damaging their business model if they bet on us releasing something in a given timeframe, or bet on us releasing a given feature, and we don’t ship it

So we’re going to see X-UA-Compatible in IE8, with very high likelihood. It’s positioned as something that other browsers could use — if they wanted to ship multiple rendering engines to make download size a further impediment to competing with Microsoft, say, or totally lock themselves out of the mobile market — and the introductory document has examples like “FF=3” in it.
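For the curious, the opt-in takes the form of a response header (or an equivalent meta tag). Here’s a hypothetical sketch of a server freezing its pages this way, with the header value lifted from their examples; the server code itself is purely my illustration:

```typescript
// A hypothetical sketch: a server opting its pages into frozen rendering
// modes via the header from the IE8 document. The "FF=3" token is their
// example; Firefox has agreed to no such thing.
import * as http from "http";

http.createServer((_req, res) => {
  res.setHeader("X-UA-Compatible", "IE=8;FF=3"); // freeze per-engine versions
  res.writeHead(200, { "Content-Type": "text/html" });
  // The same opt-in can also ride along in the markup itself:
  res.end('<meta http-equiv="X-UA-Compatible" content="IE=8;FF=3">' +
          "<p>Rendered as IE8 would, no matter what IE9 fixes.</p>");
}).listen(8080);
```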

And we know from the ES4 discussions that Chris believes in different groups working out proposals together, which is why they didn’t make a specific counter-proposal to the one put forward by TG1, after all:

It’s been pointed out that we haven’t made an alternate proposal – well, I’d kinda hoped we could work it out together. “Open to input” should be the way of the web, should it not?

So that makes me wonder: why were no other browser developers involved in this discussion? I guess that would ruin the protective cloak of secrecy, though I don’t know how else people would work things out together. (If Microsoft knows, they’re not telling, but I guess that shouldn’t be surprising?)

The naming of the header is sort of generic, but unless the next big announcement on the IEBlog is the release of IE8’s source code, “render this like IE8 would” really only helps Microsoft, and I think it works against the promising trend of convergence on open standards.