Single Page Applications, or SPAs, have been the talk of the town in the web development world. Even major tech giants like Google and Facebook have jumped in and open-sourced their frameworks, Angular and React. And if you look at GitHub, you'll see SPA frameworks are among the most-starred repos: React has 75k stars, Vue has 66k, and Angular has 57k.

SPAs aren't exactly a new thing (in internet terms) and date back as far as 2002, when Stuart Morris wrote the first one, in its most primitive form, at slashdotslash.com. Early SPAs were only used for SaaS dashboards and form-heavy apps, but lately they're everywhere. Have you visited IMDb, Netflix, YouTube, or Instagram? You may be surprised to learn that they all use SPA frameworks! So let's first look at what an SPA is and whether you need one.

What SPAs Are and Their Advantages

A Single Page Application, aka SPA, is exactly what it sounds like: it's literally made of only one page! These are websites designed to provide a UX like a desktop or mobile application. The HTML, CSS, and JS are retrieved on the first page load, and after that, everything is loaded dynamically in response to user actions. So the view changes without reloading an entire web page, with data usually retrieved from a REST backend.

The most obvious advantage is speed, of course: an SPA consumes far fewer network resources after the initial page load. That's because once the page has loaded, no more HTML gets sent over the wire; only data does, which takes a lot less time and bandwidth. This gives SPAs more of an app-like feel than traditional websites.

These frameworks use JavaScript to communicate with the web server and dynamically build the HTML that renders the page, and that's where our problems begin. While crawlers are advancing at a rapid rate, they are not yet well optimized to fully crawl SPAs, so it's tough for them to index all the dynamic content of modern web apps.
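To make the "data only, no new HTML document" cycle concrete, here is a minimal sketch. The function names (`renderProfile`, `navigateToProfile`) and the `/api/users/:id` endpoint are illustrative assumptions, not any particular framework's API:

```javascript
// Sketch of the update cycle an SPA performs after the first load:
// fetch JSON from a REST backend, turn it into an HTML fragment, and
// let the framework swap that fragment into the page -- no navigation.
function renderProfile(user) {
  // A hypothetical view function: JSON in, HTML string out.
  return `<section class="profile">` +
         `<h1>${user.name}</h1>` +
         `<p>${user.bio}</p>` +
         `</section>`;
}

async function navigateToProfile(fetchJson, userId) {
  // Only data crosses the network here, not a full HTML document.
  const user = await fetchJson(`/api/users/${userId}`);
  return renderProfile(user); // framework injects this into the DOM
}
```

In a real app `fetchJson` would wrap `fetch()`; passing it in as a parameter just keeps the sketch testable without a server.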

Challenges of SPA and Search Engines

Back when SPAs first emerged, there was no way of actually crawling them, because the search engines' web crawlers could not execute JavaScript. Even as time advanced, crawlers struggled to fully index all the dynamic content of web apps built this way. Traditionally, SPAs were used on websites where SEO was neither required nor desired, but that's changing fast as both SPA and crawler technology evolve. For example, React provides server-side rendering for SEO friendliness, and various external prerenderers exist as well.

Showing different content to Google than to normal web visitors is considered "cloaking" by Google, but for single page apps they made an exception. For SPAs, their crawler would send an _escaped_fragment_ query parameter in the request, and the origin server could then choose to return a document representing the content a user would actually see when the single page app is running. Or at least, that's what they did from 2009 to 2015. In 2015, Google released the following statement:

`In 2009, we made a proposal to make AJAX pages crawl-able. Back then, our systems were not able to render and understand pages that use JavaScript to present content to users. Because "crawlers … [were] not able to see any content … created dynamically," we proposed a set of practices that webmasters can follow in order to ensure that their AJAX-based applications are indexed by search engines. Times have changed. Today, as long as you're not blocking Googlebot from crawling your JavaScript or CSS files, we are generally able to render and understand your web pages like modern browsers. To reflect this improvement, we recently updated our technical Webmaster Guidelines to recommend against disallowing Googlebot from crawling your site's CSS or JS files.`

While Google claims this, Bing, Yandex, and others all still support Google's standard for AJAX crawling. So here are some relevant things to keep in mind when building SPAs that are not hidden behind a login page, or wherever SEO is a need.
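The mechanics of that AJAX crawling standard boil down to a URL rewrite: a "pretty" URL containing a `#!` (hash bang) fragment is requested by the crawler as an "ugly" URL carrying the fragment in the `_escaped_fragment_` query parameter, and the server answers with a static HTML snapshot. A small sketch of that mapping (the function name is illustrative):

```javascript
// Rewrite a #! URL the way a crawler following Google's (now
// deprecated) AJAX crawling scheme would before requesting it.
// Example: https://example.com/#!/products/42
//      ->  https://example.com/?_escaped_fragment_=%2Fproducts%2F42
function toEscapedFragmentUrl(prettyUrl) {
  const hashIndex = prettyUrl.indexOf("#!");
  if (hashIndex === -1) return prettyUrl; // no hash bang: nothing to map
  const base = prettyUrl.slice(0, hashIndex);
  const fragment = prettyUrl.slice(hashIndex + 2);
  const sep = base.includes("?") ? "&" : "?"; // keep any existing query
  return base + sep + "_escaped_fragment_=" + encodeURIComponent(fragment);
}
```

The server, on seeing `_escaped_fragment_`, can return the pre-rendered content a user would see once the JavaScript app had run, without this counting as cloaking.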

What to Keep in Mind When Making SPAs from an SEO Perspective

So while Google has deprecated its AJAX Crawling Standard, it has not yet abandoned it: the standard still works, though Google recommends against relying on it. Follow it on SEO-relevant websites if possible, and use server-side rendering where you can to deliver the content that matters most. There are also various pre-rendering services available whose support covers all the SPA frameworks currently on the market, such as:

prerender.io

Brombone

SEO.js

SEO4Ajax

prerender.cloud
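Under the hood, services like these typically sit in front of the app and decide per request whether to serve a static HTML snapshot or the normal JavaScript bundle. A minimal sketch of that decision, assuming a hypothetical `shouldServeSnapshot` helper and an illustrative (not exhaustive) bot list rather than any specific service's API:

```javascript
// Decide whether a request should get a pre-rendered HTML snapshot.
// Real middleware (e.g. Prerender.io's) uses a much longer bot list.
const BOT_PATTERNS = [/googlebot/i, /bingbot/i, /yandex/i, /baiduspider/i];

function shouldServeSnapshot(userAgent, query) {
  // Legacy AJAX crawling scheme: the crawler announces itself by
  // adding the _escaped_fragment_ query parameter.
  if (Object.prototype.hasOwnProperty.call(query, "_escaped_fragment_")) {
    return true;
  }
  // Otherwise fall back to user-agent sniffing for known crawlers.
  return BOT_PATTERNS.some((pattern) => pattern.test(userAgent || ""));
}
```

Ordinary visitors fall through both checks and receive the SPA as usual, so they still get the fast app-like experience while crawlers get plain HTML.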

Prerender.io also provides an open-source version of their service for self-hosting. Although search engine crawlers don't always support SPA indexing 100% today, that doesn't mean they never will. With the sudden surge in SPA usage, they are improving at a tremendous rate. So sit back, relax, make SPAs, and make sure to optimize them with this 60 Second SEO Checklist.