Google PageRank was one of the first algorithms used to rank pages in search results. It takes its name from Larry Page, a co-founder of Google. The quantity and quality of links to a webpage help it earn a better rank from PageRank. If a webpage has useful content, other websites are likely to link to it from their own pages. How do you judge the quality and relevance of a particular website’s content? Simple: you check how many other websites link to it.
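The idea above can be sketched with a short power-iteration loop. This is a toy illustration of the classic PageRank recurrence, not Google's production code; the three-page link graph, damping factor, and iteration count are all made-up assumptions.

```python
# Toy sketch of the original PageRank idea (not Google's real code).
# `links` maps each page to the pages it links to.
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            for target in outgoing:
                # Each page shares its current rank among the pages it links to.
                new_rank[target] += damping * rank[page] / len(outgoing)
        rank = new_rank
    return rank

# Made-up three-page web: "c" is linked to by both "a" and "b".
ranks = pagerank({"a": ["b", "c"], "b": ["c"], "c": ["a"]})
```

Here "c" ends up with the highest score because two pages link to it, which is exactly the intuition in the paragraph above: more (and better-ranked) inbound links mean a higher rank.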

This algorithm was patented more than two decades ago and was one of the earliest used to rank webpages. Over the years it has received several updates, which have reshaped the science (and art!) of SEO to a great extent. Given below are some of these Google updates:

• Panda, launched in 2011 – original content rewarded, duplicates and plagiarizers punished

• Penguin, launched in 2012 – shady backlinking caught out

• Pirate, again in 2012 – a big blow for copyright infringers

• Hummingbird, launched in 2013 – enough of keyword stuffing. Three cheers for natural writing!

• Pigeon, launched in 2014 – Think global, but search local

If you are an experienced webmaster, these are already well known to you. Later came further updates such as Mobilegeddon, RankBrain, Possum, and Fred. Recently the patent saw another update. Google has downplayed the significance of this change, yet it would be instructive for webmasters to understand what it is about, so they can align their SEO efforts accordingly.

This update ‘measures’ the distance between two webpages, at least one of which links to the other. Think of a diagram in which these pages are connected by arrows of different shades, where the shade of each arrow depends on the relevance of the page it points to.

The thickness of each arrow would depend on the distance between the seed page and the webpage. This ‘distance’ depends on the properties of the page to which the link points. The shorter the ‘distance’, the higher the page ranks on Google’s search engine result pages.
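The patent’s exact formula is not public, but the idea of ranking by ‘distance’ from trusted seed pages can be sketched as a shortest-path computation over the link graph. Everything below — the `seed_distance` function, the `length` callback, and the toy graph — is a hypothetical illustration, not Google’s actual algorithm.

```python
import heapq

# Hypothetical sketch of distance-from-seed ranking. `length(src, dst)`
# models how 'long' a link is; a more relevant link would be shorter.
def seed_distance(links, seeds, length):
    dist = {s: 0.0 for s in seeds}
    heap = [(0.0, s) for s in seeds]
    while heap:
        d, page = heapq.heappop(heap)
        if d > dist.get(page, float("inf")):
            continue  # stale heap entry
        for target in links.get(page, ()):
            nd = d + length(page, target)
            if nd < dist.get(target, float("inf")):
                dist[target] = nd
                heapq.heappush(heap, (nd, target))
    return dist

# Made-up graph: a trusted seed links to two sites, one of which
# links onward to a third.
links = {"seed": ["site_a", "site_b"], "site_a": ["site_c"]}
dist = seed_distance(links, seeds=["seed"], length=lambda s, t: 1.0)
```

With uniform link lengths, `site_a` and `site_b` sit one hop from the seed and `site_c` two hops away, so they would rank in that order; a page unreachable from any seed never gets a distance at all.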

So how would this affect SEO, if at all? Should search engine optimization analysts reboot their strategies and rework their styles? Would you need to redesign your website in light of this new (or reworked) algorithm?

Let’s peel off a few layers of this patent and see what the fine print says.

You build links when authority websites link to your website. Take the food business as an example. The agency that awards the coveted Michelin stars to the best restaurants has a website, and everyone considers Michelin the most authoritative name in the restaurant business. If that website links to yours, that is successful link building. A link from Gordon Ramsay’s website would be similar. A link from the website of a local food delivery chain, not so much.

The restaurant industry in the example above is a huge niche, with millions of pages all over the world. But consider a very small niche: say, an obscure religion practiced by only a few thousand people in a few countries. There might be at most a hundred websites devoted to that topic, or even fewer. Websites in this niche are likely to rank better if they have links from the authority website of the niche. Under the original PageRank algorithm this would not have been possible: a smaller website in an obscure niche would have featured way down the search rankings. That is the game-changing characteristic of this update to the patent.

What Change Would Webmasters Need to Make to Their Webpages to Do Well on This Algorithm?

Webmasters need to keep the following in mind:

• The algorithm aims to build a map of the internet with trusted authority websites at its core.

• Each webpage’s ‘distance’ from such a trusted authority website gets calculated.

• The page is then ranked according to this calculated distance.

• So if your website is spammy, it gets separated from the trusted websites on this map.

• If this happens, your rank will fall before you can do anything to arrest it.

• The algorithm’s filter acts like a qualifying mark for your webpage.

• Once your webpage clears this filter, the other 200 (or more) parameters of ranking rules kick in.

• The importance of meta descriptions and H1 tags (to name two) is now much lower.

• The important things to focus on now are structured data and schema markup.

• Schema markup gives you a standard vocabulary for describing your pages to search engines.

• Responsiveness will continue to be very important. After this update, nothing changes in the importance of mobile usability.

• Another factor that will continue to be important is the loading speed of your website. Navigation from page to page within the site should also be fast and easy.
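The structured-data point in the list above can be illustrated with a schema.org JSON-LD snippet. Every value below is an invented placeholder rather than real page data; Python’s `json` module is used only to build a well-formed block.

```python
import json

# Hypothetical schema.org Article snippet; all field values below are
# invented placeholders, not data from any real page.
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Example headline",
    "author": {"@type": "Person", "name": "Example Author"},
    "datePublished": "2018-01-01",
}

# The resulting string goes inside a <script type="application/ld+json">
# tag in the page's HTML so search engines can parse it.
json_ld = json.dumps(article, indent=2)
```

Marking pages up this way tells search engines what kind of content they are looking at, which is what the schema vocabulary is for.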

Wrap Up:

Google has made scores of updates in the last decade, each aimed at improving the user experience and making search results more useful. The PageRank update will work in the same direction: it will make the search experience easier to navigate and deliver more useful results. Webmasters might need to focus more on the user, but the basics won’t change much. It seems there are exciting times ahead.

You can find more about Google and Google penalties here.