Update 2015-12-03: Added Postscript 1 in response to reader feedback. Thanks!
Who cares? It’s the future!
Of course, this isn’t true, but the gospel has spread to some fairly well-known shops:
These encounters are vexing, not only because it is trivial to build these sorts of basic, information-driven Web sites without requiring client-side scripting, but because these sites are intentionally broken for some users, with no added benefit to anyone else.2
Actually, they remind me of something else from the past:
Trouble with evangelists
While people still occasionally talk about progressive enhancement at Web design & development conferences, the core philosophy of universal access is often distorted to serve a regressive agenda that makes the lives of lazy developers easier at the expense of minority users’ access to content & services.
“Taking time to design sites that work within the basic constraints of the Web’s primary accessibility model isn’t smart,” they say, “because all these minority users we’ve been told about for years don’t really exist.”
Just because you can’t see them, doesn’t mean they don’t exist
Talking about progressive enhancement often means specifically telling developers to write sites in consideration of assistive technology and user-agents without scripting, because these are the groups that need a small amount of consideration in order for the Web to be at all usable. They aren’t large market segments—unless you count Web spiders—but they aren’t inventions, nor are they simply ghosts from an irrelevant past.
While WAI-ARIA hypothetically3 provides some hope for users that need to use assistive technologies in order to interact with scripted Web pages, there is a growing segment of users who are disproportionately affected by today’s progressive enhancement failures: those who are concerned with their online security and privacy.
This group is often difficult to see in analytics because many of them are deliberately avoiding being tracked, but it's not hard to find evidence of their existence. NoScript is the third most popular add-on for Firefox, with over 2.3 million active users.
In fact, it’s probably more likely that a user will try browsing the Web today without scripting than at any other time since the 1990s, due to the fact that nearly 2% of Web sites now serve malware4 (many of which are reputable but hacked), antivirus is not sufficient to protect against zero-day threats, and the ramifications of a malware infection are much more serious than they were in the past.
While plenty of experiences legitimately require client-side scripting, the vast majority of sites that break in its absence do so from a lack of care and effort, not from any real requirement to execute arbitrary code on the client. In several cases, sites are broken because people use off-the-shelf libraries like Twitter Bootstrap, which don't follow progressive enhancement principles and instead ship unsafe defaults that hide critical navigation behind #-hrefs.
This means that security- and privacy-conscious users are forced to choose between not getting the information they need, messing around in Dev Tools to figure out the magic URL for the actual Web service they need to hit, or increasing their exposure to potential malware infection. Nobody should ever end up with their data encrypted and held for ransom because a lazy Web developer did something stupid like this.
In this case, and in many others, the page's content has already been delivered and is simply being withheld from the user by CSS.
From a user experience perspective, there’s little rationale for intentionally neglecting progressive enhancement and breaking the Web for anybody by requiring client-side scripting. Only bad libraries and immature tools make breaking the Web seem necessary or right.
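One common-sense way to respect that default, sketched below with hypothetical class names, is to invert the assumption: content stays visible unless a script actually runs and opts the page into the enhanced behaviour, so unscripted visitors still get everything.

```html
<!-- Sketch: content is visible by default; only a script that
     actually executes opts the page into hiding things.
     The class names here are hypothetical. -->
<html class="no-js">
  <head>
    <script>
      // Scripting is available: announce it before styles apply.
      document.documentElement.className = 'js';
    </script>
    <style>
      /* Collapse content only once scripting has announced itself. */
      .js .collapsible { display: none; }
    </style>
  </head>
  <body>
    <div class="collapsible">Critical content, readable without scripting.</div>
  </body>
</html>
```

With this pattern, a failure to load or execute the script leaves the page fully readable, which is the whole point of progressive enhancement.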
Universal access with progressive enhancement isn’t new or difficult; you just have to care a little
You wouldn’t know it by looking at the Web today, but progressive enhancement for even highly complex Web apps has been around for a while. Mature technologies like Wt Toolkit provide fallbacks to HTML forms for rich single-page Web apps without compromising on “modern” user experience. Other server-side frameworks in many languages also support progressive enhancement to improve the core user experience without breaking users.
Without regular instruction and reinforcement in how to develop with progressive enhancement techniques, and with many Web evangelists undermining progressive enhancement by trying to redefine what it means, it is currently far too easy and far too accepted for developers to break the Web. More and more, this is exactly what is happening, and many users of the Web are being disenfranchised as a result.5
If you believe in the fundamental principle of the open Web, that content should be universally accessible, then following progressive enhancement principles is a critical part of your job every time you crack open a code editor and start writing.
If you’re already ensuring your content is universally accessible, the next step is to talk back to evangelists who try to redefine universal access using metrics and leading questions about who really benefits. Additionally, talk more to other Web developers about how they can add dynamic content without scripting: collapsible sections with <details> instead of jQuery, menus with <menu> instead of Bootstrap, and single-tab interfaces with :active instead of AngularJS. Ask browser vendors and standards committees to add other extremely common UI functionality that is frequently implemented using custom scripts. Use more mature tools that don’t break the Web.
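The <details> suggestion, for instance, needs nothing but markup; a minimal sketch (the content is illustrative):

```html
<!-- A collapsible section with no scripting: the browser itself
     handles toggling a <details> element open and closed. -->
<details>
  <summary>Shipping information</summary>
  <p>Orders placed before noon ship the same day.</p>
</details>
```

Browsers that don’t yet implement <details> simply show the content expanded, which degrades gracefully instead of hiding anything.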
If enough people start talking again about progressive enhancement, the way that they were in the mid-to-late 2000s, we can get back to a more balanced Web where universal access is the norm, not the exception. If not, the platform may be destined to become the next Adobe Flash Player: slow, insecure, inaccessible, and loathed by its users. And then native apps will take over once and for all.
Šime Vidas sent a nice response to this article on Twitter (thanks Šime!).
Yes, it is ironic that the Web may be overtaken by native apps both because people are trying too hard to use it like an application platform instead of the World Wide Web, and because it doesn’t have enough features versus native platforms to remain relevant for application development. (I honestly don’t think the second one is a big deal; native platforms have always been more feature-capable than the Web, but that didn’t stop Web apps from becoming a thing. Platforms are not one-size-fits-all!)
That said, the good news is that Service Workers are actually not necessary for basic offline functionality of Web pages. Offline support has existed in browsers for significantly longer than mobile data connections have existed, since dial-up connections in the 90s were often time-metered and not terribly reliable. So, there are some very old (and some new!) technologies in your Web stack to allow for some offline browsing of content today.
The most important of these technologies is the Cache-Control HTTP header, which allows Web servers to tell browsers (and proxies) how to cache resources. As long as must-revalidate is not included in a Cache-Control header, browsers and proxies are allowed to serve cached content when a network connection is unavailable, even if the content is older than the max-age specified in the header. When a connection is available and max-age has been reached, the browser asks the server whether the content has changed, and the server responds either with “304 Not Modified” or with the newer content.
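As a sketch (the header values here are illustrative), a response like the following lets caches reuse the page for a day and revalidate with the server afterwards:

```http
HTTP/1.1 200 OK
Content-Type: text/html; charset=utf-8
Cache-Control: max-age=86400
ETag: "v42"
```

Because must-revalidate is absent, a browser that cannot reach the network may keep serving the cached copy even after the 86,400 seconds elapse; once reconnected, it revalidates with If-None-Match: "v42" and may receive a 304 Not Modified in place of the full body.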
For preloading content, HTML5 introduced the “prefetch” link type, which lets authors ask user-agents to download additional resources in the background. Even if the user’s connection goes away, the browser’s cache can be primed with related content, so they can browse a bit more while waiting for the connection to return. HTTP/2 also includes Server Push, which allows a server to preemptively transmit content to a browser, where it can again be cached and retained for offline use.
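The prefetch hint is a single line of markup (the URL here is made up):

```html
<!-- Ask the browser to quietly fetch and cache a likely next page. -->
<link rel="prefetch" href="/articles/part-2.html">
```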
HTML5 also originally introduced the HTML cache manifest, which adds reliability to the offline cache by avoiding random eviction. However, it is pretty broken: once filled, it always loads offline content instead of falling back to it only when the network is unavailable, and it only updates resources after redownloading the entire manifest. So I’m mentioning it here only because it already exists, not endorsing it as a worthwhile feature.
Nominally, Service Workers represent only a small additional benefit for offline support: they give authors greater control over network resource fetching, transmission, and invalidation, and they make the local cache much less likely to be evicted by the browser. In practice, browser vendors haven’t worked very hard at making Cache-Control work effectively for managing offline content, so the benefits of Service Workers are larger.
That said, the non-starter problems with Cache-Control can be mitigated today with a local caching proxy like Squid, or with browser extensions that hook into the networking internals of the browser to retain offline content more effectively. HTTP 1.1 also allows for extensions to Cache-Control, which could be used to add the cache hints it currently lacks to become highly reliable.
According to Wikipedia, progressive enhancement as a distinct term was introduced by Steven Champeon and Nick Finck in Inclusive Web Design for the Future, March 2003. Graceful degradation as a concept has existed for much longer: since at least 1994 for the Web, and quite a bit longer in fault-tolerant industrial design. ↩
WAI-ARIA support is honestly really, really bad, even today. It’s a very complex system. Unlike new browser features that usually just require the browser to implement them, ARIA requires Web page authors, Web browser vendors, assistive technology vendors, and operating system vendors to all correctly implement each assistive technology role and attribute in order for proper information to make it to the end user—and none of them do a very good job at it, because just like nobody cares much about progressive enhancement, nobody cares very much about disabled people. ↩
Over the last week I’ve run into many sites that are completely broken or missing large sections of content when scripting is disabled: AngularJS (broken menus, links, content, & documentation), Bugmenot (broken form), Business Insider (blank content), CloudUp (blank content), Creative Commons (broken menus and forms), Droplr (missing drop content), edX (blank content), Forbes (blank content), Imogen Heap (block message), Madcatz (broken links), Mentimeter (blank content), Microsoft Support (block message), Microsoft Visual Studio Marketplace (spinner), MyFonts (blank content), Netgear (blank content, broken menus), Opera Philadelphia (blank content), Secunia (broken forms), and Sennheiser (block message), to give a sense of how broad this problem has become. It’s not even just marginalised groups that are routinely disenfranchised by lazy developers incorrectly implementing Web tech; hundreds of problems with sites using default Web browser settings are reported annually to webcompat. ↩