Progressive Enhancement and Modern JavaScript

Tuesday, 31 May 2016

Recently, I realised that I've been writing single page applications (SPAs) for about six years. In January 2010, Firefox 3.6 introduced window.onhashchange, and I remember eagerly anticipating code without window.location polling. At the time, SPAs held a lot of promise: responsive interfaces, a clean separation between data and UI, less code duplication, and faster development times.

Of course, it was no panacea. In September 2010 Twitter joined the bandwagon with the now-infamous hash-bang: a URL pattern designed to offload all navigation to JavaScript. The hash-bang broke a basic building block of the web (the URL) and, crucially for Twitter, it made page loads dead slow: the browser had to download, parse and execute JavaScript before your request was even understood.

Frameworks and the server

In 2010 we were not just busy breaking the web, we were busy getting organised. Backbone, Angular, and Knockout all delivered a first release that year. Each new framework has inspired imitators and usurpers, some completely re-thinking the DOM along the way (React, 2013). But while JavaScript has travelled far from those early SPAs, the traditional multi-page site was never invited along for the ride. To this day, server-side languages and JavaScript frameworks maintain an uneasy relationship. Recently, JavaScript frameworks have tried to circumvent this with their own server-side rendering: a technique called isomorphic JavaScript.

Isomorphism and graceful degradation

Isomorphic frameworks can skip the load time of a hash-bang (or other client-side) URL, and provide a static page to search engines, by emulating their browser APIs on the server. Now a snapshot of your page can be used as a fallback and your SPA can gracefully degrade. Great news!

But isomorphism is a JavaScript-centric view. I began to wonder how well other languages were supported, and what exactly had happened to Progressive Enhancement?

So, I started asking friends:

Do you still do progressive enhancement? (or at least graceful degradation): if so, what does your JS stack look like today?

Hardly a scientific survey, but the answers on Twitter and offline have been surprisingly consistent: server-side React for graceful degradation, jQuery and possibly shared templates (e.g. Mustache) for progressive enhancement. While the SPA has moved on, progressive enhancement stays steadfastly in 2009. Yet, in over six years, SPAs have still failed to answer the concerns of 2010 effectively. We still re-implement basic browser features, each time in subtly different ways. We still subvert the humble URL. Perhaps that's why many sites still rely on jQuery.

Ideas on modern progressive enhancement (PE)

So, what could modern PE and multi-page JavaScript look like? If we use jQuery, the data structure we share with the server is HTML. That's awkward: most of our APIs use JSON. Shared templates seem more promising, but to abstract away the HTML and focus on the data they should meet some criteria:

1. A portable, logic-less syntax that can be implemented in any server-side language
2. Efficient client-side updates that patch the DOM rather than re-render it
3. Event binding that is not tied to the HTML structure of the page

Mustache meets the first requirement. Rendering via an HTML-to-Incremental-DOM compiler could meet the second. To meet all three, perhaps it's time to consider new ideas. I've been trying an approach that I'd like to share.
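To illustrate what an HTML-to-Incremental-DOM compiler might emit, here's a minimal sketch. The elementOpen/text/elementClose names mirror the Incremental DOM primitives, but the stub versions below simply record their calls so the example runs anywhere; the renderTitle function and the template fragment it represents are assumptions for illustration:

```javascript
// Stubbed Incremental DOM primitives: the real library walks and
// patches the live DOM; these just record the instruction stream.
var calls = [];
function elementOpen(tag) { calls.push(['open', tag]); }
function text(value) { calls.push(['text', value]); }
function elementClose(tag) { calls.push(['close', tag]); }

// Roughly what a compiler might emit for the fragment <h1>{{title}}</h1>:
function renderTitle(data) {
    elementOpen('h1');
    text(data.title);
    elementClose('h1');
}

renderTitle({title: 'Example'});
```

Because the template compiles to a sequence of instructions rather than an HTML string, re-running it with new data lets the library diff against the existing DOM instead of replacing it wholesale.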

Some experiments

Below, I've included an experimental Magery template. It looks a lot like Handlebars, but in practice differs in a few ways.

{{#define main}}
  <h1>{{title}}</h1>
  <ul>
    {{#each items}}
      <li>{{name}}</li>
    {{/each}}
  </ul>
  <button onclick="addItem">Add</button>
{{/define}}

First, it prevents custom JS extensions and stays close to the logic-less philosophy of Mustache. This keeps it portable. Second, the JavaScript version patches the DOM instead of re-rendering. This minimises DOM updates and keeps elements stable:

var templates = Magery.loadTemplates(source);
var data = {title: 'Example', items: []};

// Render the 'main' template into document.body
Magery.patch(templates, 'main', document.body, data);

By incrementally updating the DOM via patch it becomes convenient to rely on a template for all page updates. As both the server and client share the same template, the two sides are easy to align.

Finally, Magery templates will detect on* attributes and attach JavaScript event listeners instead (see onclick="addItem"). These events can be handled directly on the container by adding a dispatch function. Each event passed to dispatch will also get a reference to its context in the template. This provides a base for event handling no longer tied to the HTML structure of a page:

container.dispatch = function (name, event, context) {
    if (name === 'addItem') {
        context.items.push({name: 'new item'});
    }
};

These three features (portable syntax, incremental updates, and event binding) combine to make shared templates much more convenient. The idea would not be to compete with React or Angular in JavaScript 'applications' but instead to consider a new way of solving problems we currently approach with jQuery. With just three characteristics our multi-page site has already gained access to modern JavaScript techniques like Redux.
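To make that last point concrete, here's a minimal, dependency-free sketch of how a Redux-style store could sit behind a dispatch function like the one above. Everything here (createStore, the reducer, the action shape) is an assumption for illustration, not part of Magery or Redux itself:

```javascript
// A tiny Redux-style store: state changes only via dispatched actions,
// and each change is computed by a pure reducer function.
function createStore(reducer, initialState) {
    var state = initialState;
    var listeners = [];
    return {
        getState: function () { return state; },
        dispatch: function (action) {
            state = reducer(state, action);
            listeners.forEach(function (fn) { fn(state); });
        },
        subscribe: function (fn) { listeners.push(fn); }
    };
}

// Pure reducer: returns new state rather than mutating the old one.
function reducer(state, action) {
    if (action.type === 'addItem') {
        return {
            title: state.title,
            items: state.items.concat([{name: action.name}])
        };
    }
    return state;
}

var store = createStore(reducer, {title: 'Example', items: []});

// In a Magery container, container.dispatch could forward events here,
// with a subscriber calling Magery.patch using store.getState().
store.dispatch({type: 'addItem', name: 'new item'});
```

The appeal is that template events feed actions into the store, and the store's subscribers re-patch the page: the same unidirectional data flow SPAs enjoy, without abandoning the multi-page site.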

In future posts, I plan to explore these features using a progressively enhanced example.

The future

I think JavaScript frameworks have been blinkered to the needs of many developers for too long (most websites are not SPAs or run by Node, nor should they be). We need to find a way to apply the lessons of modern frameworks to the rest of the web - it would be sad if everyone had to run JavaScript on their server and good old resilient HTML was considered only as a fallback.

What do you think JS for progressive enhancement and multi-page sites should look like? Do you have a good setup already? Please share your ideas! For me, the best web experience has always been a blend of standard browser behaviour and a sprinkling of thoughtful JavaScript.

Source Code

Magery is currently an experiment, but if you'd like to help shape its future you can find the JavaScript library on GitHub and provide feedback. There's also a server-meets-browser example in my quick and dirty Python version.

Both are available via package managers:

# Front-end JS version
npm install magery

# Python version
pip install magery