Stay alert

Rich Harris

Posted on August 10, 2021

A short while ago, Chrome broke the web by disabling alert(), confirm() and prompt() dialogs from cross-origin iframes. The justification was that "the current UX is confusing, and has previously led to spoofs where sites pretend the message comes from Chrome or a different website"; removing the feature was deemed preferable to fixing the UX.

But legitimate uses were affected too. Users of CodePen, the widely-used code-sharing site co-founded by Chris Coyier, suddenly discovered that they were unable to use these functions in their projects, since CodePen runs your code inside a cross-origin iframe to guard against XSS attacks. Reports from other sites followed, and in the ensuing chaos the change was rolled back until 2022.

Hidden in the replies to Coyier's tweet was a surprising statement from Domenic Denicola, an engineer on the Chrome team:

It's best that such teaching sites be prepared for the eventual end state where these are removed from the web platform entirely.

Wait, what?

Reading the intent to remove thread confirms that this is indeed Chrome's stance: blocking dialogs (including onbeforeunload) were a mistake, and their future removal is a fait accompli.

After I tweeted about the situation last week, my notifications tab became a Boschian hellscape, so I'm hesitant to write this post. But there are several aspects to this story that are too important for us not to talk about. It's not just a story about unloved APIs, it's a story about power, standards design, and who owns the platform — and it makes me afraid for the future of the web.

Onramps

Facebook's Dan Abramov pointed out that the changes nuked many programming tutorials. Google's Emily Stark suggested they should use the <dialog> element instead. For the moment, we'll gloss over the fact that <dialog> is sufficiently flawed that Denicola floated removing it from the spec — or that MDN's suggested fallback for browsers that don't support it is none other than alert — and instead consider what this would look like in real life.

Often, when I'm teaching people web development, they begin learning JavaScript by building a simple number guessing game along these lines:

function game() {
  // pick a random integer from 1 to 100
  let number = Math.floor(Math.random() * 100) + 1;
  let guess = prompt(`Guess a number between 1 and 100`);

  // prompt returns a string — convert it before comparing
  guess = Number(guess);

  while (guess !== number) {
    if (guess < number) {
      guess = prompt(`Too low! Guess again`);
    } else {
      guess = prompt(`Too high! Guess again`);
    }

    guess = Number(guess);
  }

  alert(`That's right! The number was ${number}`);
}

game();

It's pretty straightforward-looking stuff, but in the space of a few lines of code the students are exposed to many unfamiliar concepts:

  • Data types (strings vs numbers, and converting between them)
  • Functions, both built-in and the ones you write yourself
  • Loops and if-else statements
  • Operators

It's a popular lesson, and even foreshadows future discussions of algorithms (the smartest students soon intuit that they can 'win' by conducting a binary search), but it's hard — easily an hour's worth of material. Imagine now that before they could complete it they were required to learn about the DOM, event handling, and asynchronous programming. Educators gravitated towards blocking dialog APIs for a reason.
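To see what the <dialog>-based alternative asks of a beginner, here's a sketch of a prompt() replacement — a rough illustration rather than production code, with the markup created from JavaScript for brevity:

// a rough sketch of a prompt() replacement built on <dialog>
// (assumes browser support; <dialog> is not universally available)
function ask(message) {
  return new Promise((resolve) => {
    const dialog = document.createElement('dialog');
    dialog.innerHTML = `
      <form method="dialog">
        <p></p>
        <input autofocus>
        <button>OK</button>
      </form>
    `;
    dialog.querySelector('p').textContent = message;
    document.body.append(dialog);

    // a <form method="dialog"> closes the dialog on submit, firing 'close'
    dialog.addEventListener('close', () => {
      resolve(dialog.querySelector('input').value);
      dialog.remove();
    });

    dialog.showModal();
  });
}

// and every call site changes, because the game must now be async:
// let guess = Number(await ask(`Guess a number between 1 and 100`));

Before writing a single line of game logic, the student now needs the DOM, event listeners, promises and async/await — exactly the concepts the blocking APIs let them defer.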

Failing to understand why these APIs are so valuable in an educational context is inevitable if you don't consider teachers part of your constituency when designing standards. It's cliché (and only partly accurate) to say that the web used to have better onramps for developers, but there's truth behind the nostalgic grumbling: the web platform's learnability has long been essential to its success. We damage it at our peril.

Hidden signals

The 'primary signal' Chrome uses to determine whether something can safely be removed from the web platform is the number of page views impacted. A feature appearing on 0.001% of page views is considered 'small but non-trivial' usage. (Cross-origin alert appears on around 0.006% of page views — six times that threshold — and same-origin usage is 50x higher still, at roughly 0.3%.)

It's easy to overindex on the things you can quantify, especially if you're Google. But when the data comes predominantly from public-facing production websites, not every use of a feature shows up in it. Teaching is one such case. There are others.

For example, I've had several experiences in which a well-placed alert was the only way to test hypotheses during debugging. In an ideal world we'd all have well-stocked device labs and be able to remotely inspect our code wherever it's running, no matter how imminent the deadline. Reality isn't always so accommodating.
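To give a flavour: when you can't attach a debugger — an old smart TV, an embedded webview, a colleague's phone — something as crude as this sketch (a last resort, not a recommendation) can be the quickest way to see what's happening:

// surface uncaught errors on a device with no devtools access
window.onerror = (message, source, line, column) => {
  alert(`${message}\n${source}:${line}:${column}`);
};

// ...or test a hypothesis directly at the point of interest
alert(`viewport: ${innerWidth}x${innerHeight}`);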

Even when my code is working as intended — it happens sometimes — I'm likely to reach for alert before adding complex error handling, if I'm building something for myself or my coworkers and I expect errors to be rare occurrences.
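Something along these lines — a sketch, with a made-up /api/save endpoint — is often all the error handling an internal tool needs:

// minimal error handling for a tool where failures are rare:
// loud, blocking, visible, and no dialog UI to build or maintain
async function save(data) {
  try {
    const res = await fetch('/api/save', {
      method: 'POST',
      headers: { 'content-type': 'application/json' },
      body: JSON.stringify(data)
    });
    if (!res.ok) throw new Error(`save failed: ${res.status}`);
  } catch (err) {
    alert(err.message);
  }
}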

And security researchers frequently use alert to demonstrate vulnerabilities. (Yes, in future they could use something less concise and less visible like console.log, but in the meantime years' worth of literature would instantly fall out of date if alert vanished.)
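The convention makes sense: alert is terse, unmissable, and alert(document.domain) shows exactly which origin executed the injected script. A typical reflected-XSS proof of concept — illustrative only, with a made-up victim URL — looks something like this:

// if the site reflects ?q= into its markup without escaping,
// the dialog firing proves arbitrary script execution in its origin
const payload = `<img src=x onerror="alert(document.domain)">`;
location.href = `https://victim.example/search?q=${encodeURIComponent(payload)}`;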

All of these are legitimate uses, but none will affect the metric that determines whether they're important enough to be supported by Chrome. Even when we do focus solely on production websites, usage doesn't necessarily correlate with importance, as noted by Dan Abramov.

Breakage

According to Emily Stark, a security expert on the Chrome team, breakage is something that happens often on the web.

But if that's true, it's very largely because of Chrome. For a long time, 'don't break the web' was considered something of a prime directive in standards work. Recall #smooshgate: a proposal to add a flatten method to Array.prototype turned out to be a breaking change because an ancient version of MooTools, still in use by a handful of sites, added its own incompatible flatten. Disappointingly, some developers argued that breaking the web was acceptable, but TC39 took its backwards compatibility responsibilities seriously and ended up renaming flatten to flat instead. Google's Mathias Bynens wrote:

As it turns out, “don’t break the Web” is the number one design principle for HTML, CSS, JavaScript, and any other standard that’s widely used on the Web.
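The mechanism was subtle. As I understand it: assigning to an existing property changes its value but not its attributes, so once browsers shipped a native (non-enumerable) flatten, MooTools' overwrite stayed non-enumerable too — and the for-in loop MooTools used to copy array methods onto its Elements class silently skipped it. A simplified sketch, not MooTools' actual source:

// 1. browsers ship a native flatten, non-enumerable like all built-ins
Object.defineProperty(Array.prototype, 'flatten', {
  value: function () { /* spec behaviour */ },
  writable: true,
  enumerable: false,
  configurable: true
});

// 2. MooTools assigns its own version — but assignment only changes
// the value, so the property stays non-enumerable
Array.prototype.flatten = function () { /* MooTools behaviour */ };

// 3. for-in only visits enumerable properties, so flatten is never
// copied across and Elements.prototype.flatten ends up undefined
for (const key in Array.prototype) {
  // copy Array.prototype[key] onto Elements.prototype...
}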

This time around, the approach was rather more cavalier.

Reasonable people can disagree about the balance of priorities when considering breaking changes, but it's good to be clear-eyed about what 'breakage' means. One of the many anecdotes I heard in the wake of the cross-origin alert changes stood out:

I was attempting to delete my recurring payments account from my local waste management's super old-school site. I was bit by the cross-domain confirm() in Chrome 92. I switched to Firefox to complete.

What if Firefox were no longer an option, either because a cash-strapped Mozilla had stopped developing it, or because it had implemented the now-standardised spec changes? We're not talking about the Space Jam website rendering incorrectly, we're talking about people being unable to use essential services on the web. A frequent implication in the discussion last week was that website owners could simply re-engineer their apps to not use blocking dialogs, regardless of the cost of doing so. But many sites are no longer maintained, and they're no less valuable because of it.

We can't normalise the attitude that collateral damage is the price of progress, even if we accept the premise — which I don't — that removing APIs like alert represents progress. For all its flaws, the web is generally agreed to be a stable platform, where investments made today will stand the test of time. A world in which websites are treated as inherently transient objects, where APIs we commonly rely on today could be cast aside as unwanted baggage by tomorrow's spec wranglers, is a world in which the web has already lost.

What if alert is... good, actually?

We're often reminded to use the web's built-in form elements instead of recreating checkboxes and buttons with a <div> salad. Not only are they more accessible than what you'd likely build yourself, the visual consistency makes your app easier for users to navigate even if you consider the default appearance 'ugly'.

Yet when it comes to dialogs, the ugly default is treated as a bug rather than a feature. Why? As Heydon Pickering puts it:

Using alert(), prompt(), and confirm() in an MVP is the closest most devs will get to providing accessible dialogs. Chrome removing them just cuts out that step. Devs can go straight onto building their own underperforming, inaccessible dialogs

In the bad old days, the behaviour of alert was somewhat obnoxious — it would focus the tab in question, and prevent you from navigating away. Thanks to years of hard work, that's no longer the case, to the extent that I'd argue alert is in many cases better than whatever you'd have cobbled together yourself.

There are real spoofing concerns with dialogs triggered from cross-origin iframes. But I remain unconvinced that removal is a better solution than redesigning the dialogs so that their provenance is clearer.

Who owns the web?

A common response to last week's kerfuffle was 'use Firefox'. But that's not a solution. Even though the change was proposed by Chromium (the intent to remove preceded any discussion with other browser vendors), Firefox ultimately supported it. That's all it takes for something to become a 'standard' — support from two vendors, and stated opposition from none.

Put differently: when it comes to web standards, browsers call the shots exclusively.

Whenever I've questioned the wisdom of this or that proposal, I've been told I should simply get involved in the standards discussions — they're right there on GitHub! But openness means nothing without the power to effect change, and browsers have all the power. This should strike us as odd — the W3C's priority of constituencies explicitly states that the needs of users and authors (i.e. developers) should be treated as higher priority than those of implementors (i.e. browser vendors), yet the higher priority constituencies are at the mercy of the lower priority ones. (Chrome developers argue that they are acting in the interests of users in this case, but this thread from Mike Sherov makes a convincing case that this is a fig leaf for the real motivation, which is technical debt.)

Meanwhile, we don't seem to be learning from the past. If alert is fair game for removal, then so is every API we add to the platform if the web's future stewards deem it harmful. Given that, you'd think we'd expand the platform's surface area with extreme caution; instead, we're adding APIs at breakneck speed, to the almost-guaranteed detriment of its future stability.

Given Chrome's near-monopoly control of the browser market, I'm genuinely concerned about what this all means for the future of the web. An ad company shouldn't have this much influence over something that belongs to all of us. I don't know how to fix the standards process so that it's more representative of the diversity of the web's stakeholders, but I'm increasingly convinced that we need to figure it out.
