About The Author

Luigi De Rosa works as a Senior Front-End Developer at EPIC in Liège, Belgium. He is passionate about creative development, animations, JavaScript and performance.

Improving User Flow Through Page Transitions


Any time a user’s experience is interrupted, the chance of them leaving increases. Changing from one page to another will often cause this interruption by showing a white flash of no content, by taking too long to load or by otherwise taking the user out of the context they were in before the new page opened.

Transitions between pages can enhance the experience by retaining (or even improving) the user’s context, maintaining their attention, and providing visual continuity and positive feedback. At the same time, page transitions can also be aesthetically pleasing and fun and can reinforce branding when done well.

In this article, we’ll create, step by step, a transition between pages. We will also talk about the pros and cons of this technique and how to push it to its limit.

Examples

Many mobile apps make good use of transitions between views. In the example below, which follows Google’s material design guidelines, we see how the animation conveys hierarchical and spatial relationships between pages.

Why don’t we use the same approach with our websites? Why are we OK with the user feeling like they are being teleported every time the page changes?

How To Transition Between Web Pages

SPA Frameworks

Before getting our hands dirty, I should say something about single-page application (SPA) frameworks. If you are using an SPA framework (such as AngularJS, Backbone.js or Ember), then creating transitions between pages will be much easier because all of the routing is already handled by JavaScript. Please refer to the relevant documentation to see how to transition pages using your framework of choice, because there are probably some good examples and tutorials.

The Wrong Way

My first attempt to create a transition between pages looked more or less like this:

document.addEventListener('DOMContentLoaded', function() {
  // Animate in
});

window.addEventListener('beforeunload', function() {
  // Animate out
});

The concept is simple: Use one animation when the user leaves the page, and another animation when the new page loads.

However, I soon found that this solution had some limitations:

We don’t know how long the next page will take to load, so the animation might not look fluid.

We can’t create transitions that combine content from the previous and next pages.

In fact, the only way to achieve a fluid and smooth transition is to have full control over the page-changing process and, therefore, not to change the page at all. Thus, we have to change our approach to the problem.

The Right Way

Let’s look at the steps involved in creating a simple crossfade transition between pages the right way. It involves something called pushState AJAX (or PJAX) navigation, which will essentially turn our website into a kind of single-page website.

Not only does this technique achieve smooth and pleasant transitions, but we will benefit from other advantages, which we will cover in detail later in this article.

Prevent the Default Link Behavior

The first step is to add a click event listener that intercepts all links, preventing the browser from performing its default behavior and letting us customize how page changes are handled.

// Note: we are purposely binding our listener on the document object,
// so that we can intercept any anchors added in the future.
document.addEventListener('click', function(e) {
  var el = e.target;
  // Go up the DOM until we find a node with .href (an HTMLAnchorElement)
  while (el && !el.href) {
    el = el.parentNode;
  }
  if (el) {
    e.preventDefault();
    return;
  }
});

This method of adding an event listener to a parent element, instead of adding it to each specific node, is called event delegation, and it’s possible due to the event-bubbling nature of the HTML DOM API.

Fetch the Page

Now that we have interrupted the browser when it tries to change the page, we can manually fetch that page using the Fetch API. Let’s look at the following function, which fetches the HTML content of a page when given its URL.

function loadPage(url) {
  return fetch(url, {
    method: 'GET'
  }).then(function(response) {
    return response.text();
  });
}

For browsers that don’t support the Fetch API, consider adding a polyfill or falling back to the good old XMLHttpRequest.
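As a sketch (not from the original article), an XMLHttpRequest-based fallback with the same promise interface might look like this:

```javascript
// Hypothetical fallback for loadPage() using XMLHttpRequest,
// keeping the same promise-based interface as the Fetch version.
function loadPage(url) {
  return new Promise(function(resolve, reject) {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', url);
    xhr.onload = function() {
      if (xhr.status >= 200 && xhr.status < 300) {
        resolve(xhr.responseText);
      } else {
        reject(new Error('Request failed with status ' + xhr.status));
      }
    };
    xhr.onerror = function() {
      reject(new Error('Network error while loading ' + url));
    };
    xhr.send();
  });
}
```

Because both versions resolve with the page’s HTML text, the rest of the code doesn’t need to know which one is in use.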

Change the Current URL

HTML5 has a fantastic API called pushState, which allows websites to access and modify the browser’s history without loading any pages. Below, we use it to change the current URL to the URL of the next page. Note that this is a modification of our previously declared anchor click-event handler.

if (el) {
  e.preventDefault();
  history.pushState(null, null, el.href);
  changePage();
  return;
}

As you might have noticed, we have also added a call to a function named changePage, which we will look at in detail shortly. The same function will also be called on the popstate event, which is fired when the browser’s active history entry changes (as when a user clicks on the back button of their browser):

window.addEventListener('popstate', changePage);

With all of this, we are basically building a very primitive routing system, in which we have active and passive modes.

Our active mode is in use when a user clicks on a link and we change the URL using pushState, while passive mode is in use when the URL changes and we get notified by the popstate event. In either case, we are going to call changePage, which takes care of reading the new URL and loading the relevant page.

Parse and Add the New Content

Typically, the pages being navigated between will share common elements, like the header and footer. Suppose we use the following DOM structure on all of our pages (which is actually the structure of Smashing Magazine itself):
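The markup itself isn’t reproduced here, but based on the elements this article refers to (a main element wrapping a cc content container, between a shared header and footer), a minimal sketch could look like this:

```html
<!-- A minimal sketch of the shared page structure assumed in this
     article: only the contents of the "cc" container change between
     pages; the header and footer stay the same. -->
<header>…</header>
<main>
  <div class="cc">
    <!-- page-specific content -->
  </div>
</main>
<footer>…</footer>
```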

Animate!

When the user clicks a link, the changePage function fetches the HTML of that page, then extracts the cc container and adds it to the main element. At this point, we have two cc containers on our page, the first belonging to the previous page and the second from the next page.
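This behavior isn’t shown as code in the article, but a rough sketch of changePage, using the loadPage() function from earlier and the animate() function shown below, might look like this (the container selectors are assumptions):

```javascript
// A rough sketch of changePage(): fetch the new page, extract its
// "cc" container and hand both containers over to animate().
function changePage() {
  var url = window.location.href;
  loadPage(url).then(function(responseText) {
    // Parse the fetched HTML so we can query it like a document.
    var wrapper = document.createElement('div');
    wrapper.innerHTML = responseText;
    var oldContent = document.querySelector('.cc');
    var newContent = wrapper.querySelector('.cc');
    // Add the new container next to the old one inside <main>.
    document.querySelector('main').appendChild(newContent);
    animate(oldContent, newContent);
  });
}
```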

The next function, animate , takes care of crossfading the two containers by overlapping them, fading out the old one, fading in the new one and removing the old container. In this example, I’m using the Web Animations API to create the fade animation, but of course you can use any technique or library you’d like.

function animate(oldContent, newContent) {
  oldContent.style.position = 'absolute';
  var fadeOut = oldContent.animate({ opacity: [1, 0] }, 1000);
  var fadeIn = newContent.animate({ opacity: [0, 1] }, 1000);
  fadeIn.onfinish = function() {
    oldContent.parentNode.removeChild(oldContent);
  };
}

The final code is available on GitHub.

And those are the basics of transitioning web pages!

Caveats And Limitations

The little example we’ve just created is far from perfect. In fact, we still haven’t taken into account a few things:

Make sure we affect the correct links.

Before changing the behavior of a link, we should add a check to make sure it should be changed. For example, we should ignore all links with target="_blank" (which opens the page in a new tab), all links to external domains, and some other special cases, like Control/Command + click (which also opens the page in a new tab).

Update elements outside of the main content container.

Currently, when the page changes, all elements outside of the cc container remain the same. However, some of these elements would need to be changed (which could now only be done manually), including the title of the document, the menu element with the active class, and potentially many others depending on the website.

Manage the lifecycle of JavaScript.

Our page now behaves like an SPA, in which the browser does not change pages itself. So, we need to manually take care of the JavaScript lifecycle — for example, binding and unbinding certain events, reevaluating plugins, and including polyfills and third-party code.
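As an illustration of the first point, a hypothetical shouldIntercept() helper could gather those checks in one place before our click handler calls preventDefault():

```javascript
// Hypothetical helper: returns true only for links that are safe
// to intercept with our custom navigation.
function shouldIntercept(el, e) {
  // Ignore modified clicks (Control/Command + click opens a new tab).
  if (e.metaKey || e.ctrlKey || e.shiftKey || e.altKey) {
    return false;
  }
  // Ignore links that open in a new tab, window or frame.
  if (el.target && el.target !== '_self') {
    return false;
  }
  // Ignore links to external domains.
  if (el.hostname !== window.location.hostname) {
    return false;
  }
  return true;
}
```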

Browser Support

The only requirement for the mode of navigation we’re implementing is the pushState API, which is available in all modern browsers. This technique works fully as a progressive enhancement: the pages are still served and accessible in the usual way, and the website will continue to work normally when JavaScript is disabled.

If you were considering an SPA framework only to achieve this kind of navigation, consider using PJAX instead, just to keep navigation fast. In doing so, you gain legacy support and create a more SEO-friendly website.

Going Even Further

We can continue to push the limit of this technique by optimizing certain aspects of it. The next few tricks will speed up navigation, significantly enhancing the user’s experience.

Using a Cache

By slightly changing our loadPage function, we can add a simple cache, which makes sure that pages that have already been visited aren’t reloaded.

var cache = {};

function loadPage(url) {
  if (cache[url]) {
    return Promise.resolve(cache[url]);
  }
  return fetch(url, {
    method: 'GET'
  }).then(function(response) {
    return response.text();
  }).then(function(text) {
    cache[url] = text;
    return text;
  });
}

As you may have guessed, we can use a more permanent cache with the Cache API or another client-side persistent-storage cache (like IndexedDB).
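As a sketch of that idea (not part of the original example), a loadPage() variant backed by the Cache API might look like this; the cache name is an arbitrary choice:

```javascript
// A loadPage() variant that persists fetched pages with the Cache API.
// 'pjax-pages' is an arbitrary cache name chosen for this sketch.
function loadPage(url) {
  return caches.open('pjax-pages').then(function(cache) {
    return cache.match(url).then(function(cached) {
      if (cached) {
        // Serve the previously stored response.
        return cached.text();
      }
      return fetch(url, { method: 'GET' }).then(function(response) {
        // Store a clone so the original body can still be read.
        cache.put(url, response.clone());
        return response.text();
      });
    });
  });
}
```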

Animating Out the Current Page

Our crossfade effect requires that the next page be loaded and ready before the transition completes. With another effect, we might want to start animating out the old page as soon as the user clicks the link, which would give the user immediate feedback, a great aid to perceived performance.

By using promises, handling this kind of situation becomes very easy. The Promise.all method creates a new promise that is resolved as soon as all of the promises passed to it (as an array) are resolved.

// As soon as animateOut() and loadPage() are resolved…
Promise.all([animateOut(), loadPage(url)])
  .then(function(values) {
    // …
  });
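As an illustration, here is how that pattern could fit into the page-change logic; animateOut() is the hypothetical exit animation, and loadPage() is the fetch helper from earlier:

```javascript
// Start the exit animation and the request in parallel; continue
// only once both have finished.
function changePage() {
  var url = window.location.href;
  Promise.all([animateOut(), loadPage(url)])
    .then(function(values) {
      var html = values[1]; // loadPage() resolves with the HTML text
      // …parse the HTML, insert the new container and animate it in.
    });
}
```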

Prefetching the Next Page

Using just PJAX navigation, page changes are usually almost twice as fast as default navigation, because the browser does not have to parse and evaluate any scripts or styles on the new page.

However, we can go even further by starting to preload the next page when the user hovers over or starts touching the link.

There are usually 200 to 300 milliseconds of delay between the user hovering over a link and clicking it. This is dead time, and it is usually enough to load the next page.

That being said, prefetch wisely, because it can easily become a bottleneck. For example, if you have a long list of links and the user is scrolling through it, this technique will prefetch all of the pages whose links pass under the mouse.
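One hypothetical way to mitigate this is to delay the prefetch slightly, so that only links the cursor actually rests on are loaded; this sketch reuses the cached loadPage() from earlier:

```javascript
// Prefetch on hover, but only after the cursor has rested on a link
// for a short moment, so scrolling past links triggers no requests.
function attachPrefetch() {
  var timer = null;
  document.addEventListener('mouseover', function(e) {
    var el = e.target;
    // Walk up the DOM until we find an anchor, as in the click handler.
    while (el && !el.href) {
      el = el.parentNode;
    }
    if (!el) {
      return;
    }
    clearTimeout(timer);
    timer = setTimeout(function() {
      loadPage(el.href); // warms the cache; the result is discarded here
    }, 100);
  });
}
```

Calling attachPrefetch() once is enough, because the listener is delegated on the document.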

Another factor we could detect and take into account when deciding whether to prefetch is the user’s connection speed. (Maybe this will become possible in the future with the Network Information API.)

Partial Output

In our loadPage function, we are fetching the entire HTML document, but we actually just need the cc container. If we are using a server-side language, we can detect whether the request is coming from a particular custom AJAX call and, if so, output just the container it needs. By using the Headers API, we can send a custom HTTP header in our fetch request.

function loadPage(url) {
  var myHeaders = new Headers();
  myHeaders.append('x-pjax', 'yes');

  return fetch(url, {
    method: 'GET',
    headers: myHeaders
  }).then(function(response) {
    return response.text();
  });
}

Then, on the server side (using PHP in this case), we can detect whether our custom header exists before outputting only the required container:

if (isset($_SERVER['HTTP_X_PJAX'])) {
  // Output just the container
}

This will reduce the size of the HTTP message and also reduce the server-side load.

Wrapping Up

After implementing this technique in a couple of projects, I realized that a reusable library would be immensely helpful. It would save me time in implementing it on each occasion, freeing me to focus on the transition effects themselves.

Thus was born Barba.js, a tiny library (4 KB minified and gzipped) that abstracts away all of this complexity and provides a nice, clean and simple API for developers. It also accounts for views and comes with reusable transitions, caching, prefetching and events. It is open source and available on GitHub.

Conclusion

We’ve seen now how to create a crossfade effect and the pros and cons of using PJAX navigation to effectively transform our website into an SPA. Apart from the benefit of the transition itself, we’ve also seen how to implement simple caching and prefetching mechanisms to speed up the loading of new pages.

This entire article is based on my personal experience and what I’ve learned from implementing page transitions in projects that I’ve worked on. If you have any questions, do not hesitate to leave a comment or reach out to me on Twitter — my info is below!
