Of late, there has been a great buzz around the use of Progressive Web Apps (PWAs) for delivering websites and applications. The central idea of a PWA is an enhanced website with app-like behavior and qualities, for ease of use on any device and platform.

One of the key differences between traditional web apps and PWAs is that traditional web applications render HTML on the server. WordPress, Magento, Ruby on Rails, and many of the most common web apps and frameworks are built this way.

Server-side rendering delivers complete pages to browsers and crawlers alike, which is why the content of such sites is straightforward to optimize from an SEO perspective.

In the past couple of years, however, JavaScript-based web applications rendered on the client have gained momentum. These were originally used for complex apps with rich functionality, such as Google Docs. But now developers are increasingly using JavaScript, along with frameworks such as React and Vue, to build web applications that would typically have been rendered on the server.

Progressive Web Apps are essentially the culmination of this trend. A PWA leverages web technologies like service workers, web app manifests, and the Cache API to deliver a native app-like web experience, with the added benefits of performance, seamless page transitions, offline functionality, and a home screen icon on mobile devices.
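The service worker piece of this stack can be sketched in a few lines. This is a minimal illustration, not a full PWA setup; the script path `/sw.js` is a placeholder for your own worker file, and registration is guarded so that clients without service worker support simply get the plain page.

```javascript
// Minimal sketch of service worker registration for a PWA.
// The script path "/sw.js" is a placeholder for your own worker file.
function registerServiceWorker() {
  // Guard: older browsers (and many crawlers) lack this API,
  // so the site must keep working without it.
  if (typeof navigator === 'undefined' || !('serviceWorker' in navigator)) {
    return false;
  }
  navigator.serviceWorker.register('/sw.js');
  return true;
}

registerServiceWorker();
```

In a supporting browser, the registered worker can then intercept requests and serve cached responses via the Cache API, which is what enables the offline behavior described above.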

This led to the rapid adoption of PWAs by well-known products and services. Pinterest, for example, rebuilt its application as a PWA and saw several benefits, including a 60% improvement in engagement and a 40% increase in user-led ad revenue. Many others, such as Uber, Flipkart, Tinder, and Starbucks, adopted PWAs for similar benefits.

So the business advantages of PWAs are clear. But what about SEO?

It is a valid concern if you are new to the world of PWAs and toying with the idea of switching over.

Do Bots Crawl PWAs?

Google and other search engine bots treat the pages of a PWA as a JavaScript site. New URLs can be created for a PWA, and bots crawl them just like regular web pages. However, there can be issues with crawling and optimization, so it is good to know the SEO best practices for getting your pages indexed.

With PWAs, Google’s crawlers have some limitations with newer JavaScript. It is wise to check whether Google supports the specific JavaScript features used in your application. For example, if Google doesn’t support ES6 or other modern JavaScript features you have used, consider using a tool like Babel to transpile your JavaScript files.

Tools like Babel transpile your JS files to more widely used and supported versions. Alternatively, you can use server-side rendering to present crawlers with pre-rendered pages.
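As a rough illustration (not Babel's exact output), modern syntax and a wider-supported ES5 equivalent of the same logic might look like this:

```javascript
// Modern ES6+ syntax that an older crawler engine may fail to parse:
const double = (xs) => xs.map((x) => x * 2);

// Roughly what a transpiler such as Babel emits for broader support:
var doubleES5 = function (xs) {
  return xs.map(function (x) {
    return x * 2;
  });
};
```

Both versions behave identically; the transpiled one simply avoids syntax that older engines don't understand.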

In addition, here are some great SEO tips that will further improve the usability and searchability of your PWA sites.

Ensure crawlers can access JavaScript and CSS files: Many SEO practitioners still consider it good practice to block Googlebot’s access to JavaScript and CSS files. This is never a good idea when JavaScript renders the content.

Optimize for other search engines: When it comes to visibility and searchability online, Google is not the only search engine. Even if Google crawls your PWA correctly, other search engines that are important to your business might have issues with it.

Take care of page navigation: Typically, page navigation works well with JavaScript apps rendered on the client side. One exception is URLs that include fragments. For best compatibility, avoid using # in your URLs.
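One common way to avoid fragment URLs in a client-rendered app is the History API. The helper below is a hypothetical sketch, guarded so it is a no-op outside a browser:

```javascript
// Hash routing yields URLs like https://example.com/#/products/42,
// which crawlers may ignore. The History API yields a crawlable
// path like https://example.com/products/42 instead.
function navigate(path) {
  if (typeof history === 'undefined' || !history.pushState) {
    return false; // not running in a browser; nothing to do
  }
  history.pushState({}, '', path);
  return true;
}
```

Most client-side routers (React Router, Vue Router, etc.) offer a "history mode" that does exactly this for you.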

Use canonical URLs: Many businesses deploy a PWA but also leave their original desktop or mobile sites in place. This can cause duplicate-content issues if the pages are not canonicalized correctly. Each PWA page should specify its canonical URL (the original page that is meant to be indexed) using a rel="canonical" link element. If you serve the PWA alongside non-canonical AMP pages, use the rel="amphtml" tag to specify the AMP URLs.
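For example, a page in the PWA might declare its canonical and AMP counterparts like this (the example.com URLs are placeholders):

```html
<!-- On the PWA page: point at the page that should be indexed -->
<link rel="canonical" href="https://example.com/products/42">
<!-- If a non-canonical AMP version exists, point at it as well -->
<link rel="amphtml" href="https://example.com/amp/products/42">
```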

Dynamic serving: Dynamic serving is often used to show different designs depending on the user’s device. Ensure that the content is always the same for search bots and users, to avoid cloaking.

Secure website: Ensure the website is secure and runs entirely over HTTPS, with 301 redirects from HTTP to HTTPS and no non-secure resources. If you haven’t migrated yet, use an HTTPS migration checklist.
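As a sketch, the HTTP-to-HTTPS 301 redirect might look like this in nginx (the server name is a placeholder; equivalent rules exist for Apache and most other servers):

```nginx
# Redirect all plain-HTTP traffic permanently (301) to HTTPS.
server {
    listen 80;
    server_name example.com;
    return 301 https://example.com$request_uri;
}
```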

Page load speeds: Pay attention to the loading times of your pages. You can gain speed by combining PWAs with AMP (PWAMP) and by following performance optimization best practices. Tools like WebPageTest and Lighthouse can report your time to interactive, a key page-speed metric Google uses.

Progressive Enhancement: Service workers are excellent, but PWAs should use progressive enhancement. That is, the site should still work when the client doesn’t support modern functionality, as is the case for many search engine spiders.

Use Search Console for testing: Fetch as Google is a helpful Search Console tool that shows you how Google reads your pages. (It has since been replaced by the URL Inspection tool, which serves the same purpose.)

Don’t forget the sitemap: Add a sitemap to your website and register it in Google Search Console. It is one of the first things Google uses to discover the pages that exist on your site.
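A minimal sitemap is a short XML file listing your indexable URLs; the URLs below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
  </url>
  <url>
    <loc>https://example.com/products/42</loc>
  </url>
</urlset>
```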

To sum it up

PWAs already have the unique advantage of being highly usable, lightweight, and mobile-friendly. By following the SEO best practices above, you can rank better and provide an optimal user experience. For further information, check out Google’s PWA checklist, which can help you take your PWA from baseline to exemplary.



Related Post:

If you are looking for a more detailed post on PWA SEO, you can read SEO for Progressive Web Apps.