Google rolled out the Penguin 4.0 algorithm update

Google rolled out the Penguin 4.0 algorithm update on 23rd September 2016. It operates in real time and at the page level: Penguin 4.0 evaluates backlinks page by page rather than judging the entire site, as previous updates did. For example, if an inner section or a particular page is linked from a relevant, high-quality webpage, it can rank in the SERPs regardless of how the home page ranks; the reverse holds for a link from a spammy or manipulated page. The aim is to catch perpetrators who spam the web by buying backlinks to boost their search engine rankings.

Every new Penguin update has either tightened the rules or allowed websites that have worked to remove bad links to regain their rankings. The 4.0 update makes these changes real time, so they take effect much faster and more granularly. A welcome change in Penguin 4.0 is that Google can now devalue just the spam, based on specific spam signals, without demoting the whole website. The basic difference in Penguin 4.0 is that changes are "real time": the moment a bad link points to your site, the repercussions show immediately, and as soon as you remove it, your rankings recover. However, if you have been using unethical tactics to boost rankings, Penguin 4.0 will catch up fast, and you may not be able to use the same tactics again for a considerable time. The aim of Penguin 4.0 is best described as a means to "catch spam link profiles as quickly as possible and keep low quality sites from ranking well in search results". It is therefore now essential for websites to audit their links regularly and perform cleanups immediately when necessary, to stay in the game.
Impact of the Penguin 4.0 algorithm

A significant number of webmasters received unnatural link warnings after the Google Penguin 4.0 update and have started working on Penguin 4.0 penalty recovery.

History of the Google Penguin algorithm update

The Penguin update arrived like a tsunami back in 2012 and changed the way online marketers, businesses and websites reach their audience. Over the years, Google's Penguin algorithm has been updated several times, but Penguin 4.0 came late, after a gap of almost two years since the last change. Google has made the wait worthwhile with real-time impact.

The Penguin filter, for those who don't already know, was introduced to make the web more user friendly, relevant and higher quality. Once a site had been marked as spam, it would take another update to get it back into competition, provided the site had fixed the problems. Since the last Penguin refresh came on 17th October 2014, sites had to wait nearly two years for a chance to regain their rankings. Google now promises that such waiting is a thing of the past: with Penguin 4.0, pages are marked immediately for bad links and forgiven the moment the problems are fixed, without waiting for a new update.

Further, Penguin 4.0 penalties are now page-specific. A whole site is no longer compromised by a bad link on a single page; only the offending sections or pages are affected, while the rest of the site ranks normally. It also follows that website owners no longer have to wait for a confirmation, as the process is automatic. This makes things very fast. Historically, here is how backlinks were treated in past updates:

Penguin 1.0 – launched 24th April 2012, impacted ~3.1% of queries

The first Penguin release was aimed at curbing "web spam".
Websites indulging in keyword stuffing, cloaking, link schemes, or deliberate duplicate content were penalized in the search rankings, while organic SEO was still encouraged.

Penguin 1.1 – released 26th May 2012

The launch of Penguin hit the web hard, and many questioned whether it made search results worse. Sites that believed they had been mistakenly penalized were given a web form to report this to Google. At the same time, the Penguin 1.1 update was released. It was described as a "data refresh" and impacted a small fraction of English-language queries.

Penguin 1.2 – released 5th October 2012, impacted ~0.3% of queries

Another "data refresh", but this time it took effect across websites in multiple languages, not just English.

Penguin 2.0 – released 22nd May 2013

This was the first time a Penguin update delivered something genuinely new: Penguin 2.0 was a new generation of the algorithm for curbing bad linking and spam. It affected about 2.3% of search queries.

Penguin 2.1 – released 4th October 2013

The fifth installment of the Penguin algorithm was a refinement of the second-generation spam filtering technology. It was a relatively minor change, affecting around 1% of results.

Penguin 3.0 – released 17th October 2014, impacted ~1% of results

This update took over a year to arrive and specifically targeted sites that still violated Google's linking policies. Publishers were hit hard during the waiting period, the longest gap yet. Still, sites that had been penalized in the previous update and made the necessary changes were reinstated in the search results. Penguin 3.0 also caught websites that had accumulated huge numbers of up-votes from fake accounts, which consequently lost their credibility and visibility.
This was also when Google announced it was working on an algorithm that would make changes in real time and penalize or clear websites faster. The wait was another two years, until the third week of September 2016.

An overview of bad links and link audits

So what exactly are "bad links", and why does Google take them so seriously? Google describes them as "any link that is intended to manipulate PageRank or a site's ranking in Google search results", or participation in a link scheme that violates the Google Webmaster Guidelines. This covers any behavior that manipulates links to (inbound) or from (outbound) a website. All of this effort goes into making the web more user friendly and relevant: as an end user, no one wants to be redirected to a site they did not intend to visit, and a website that does not deliver the type and quality of content it promises is as good as spam and a waste of the user's time. The prime characteristics of bad links include:

Questionable / low-authority domains

This concerns webmasters who pursue links from blacklisted and spammy sites. While a single low-authority page will not hurt much, collecting hundreds of them (a whole link profile built on them) will certainly backfire.

Irrelevant source of content

The modern Google search algorithm treats context as an important part of a page's relevance, so a link should take users to related content. If you own a website for pizza delivery, you certainly cannot force in information and advertisements about lottery tickets!

Too repetitive

While quantity of content matters, too much of the same thing is bad. Google rewards diversity, not the same content repeated just to create more backlinks.

Suspicious keyword embedding

This is a markup-level trick that modern Google bots identify easily.
Using anchor text as a PageRank shortcut is the fastest way to earn a penalty.

Being part of a reciprocal exchange

Commenting on each other's blogs is fine, but if it is part of a scheme between two related websites, it will hurt the authority of both. Google is very particular about excessive trading of links, even between verified sources.

How to audit backlinks and rectify bad links

Bad links have always belonged to the darker side of the web: low-quality directories, PR sites, social bookmarking sites, paid links, article directories, link farms, site-wide links, reciprocal links and so forth. Webmasters should thoroughly comb their link profiles and make sure such links are not being generated. There are several steps involved in identifying and removing bad links from a website:

Backlink audit

A backlink audit should be a completely manual process. Several online tools promise to do the job, but their reliability is never guaranteed, and you should not take chances with the credibility of your website. The precise steps are:

Splitting all links into two categories – no-follow and do-follow
Auditing the do-follow links

Further metrics to examine include:

Plagiarized / low-quality / irrelevant articles, bookmarking, social activity and web directories
Forum signatures
Blog comments
Paid links
Links from malicious websites, content scraper sites, foreign-language sites and banned network IPs
Site-wide footer links

The next step is to categorize these domains as either good or bad. You should also analyze the anchor text density of your do-follow links. Take into account:

Long-tail and irrelevant keywords

Generic keywords ("Read more", "Click here", "Visit page", etc.)

Different landing pages for each keyword / anchor text

Use of special characters in content

Now, you can either reach out to the offending webmaster or create a disavow file for Google.

How to reach out to the webmaster

Try to find an email address on the linking site
Alternatively, look up the email address in the WHOIS database
You may also check the social media accounts of the owner/business for contact details

How to create a disavow file for Google

Work from the indexed version of your website and create a disavow file listing all backlinks and domains that are potentially harmful to your site. (Google's webmaster trends analyst John Mueller has confirmed that the disavow process is unchanged after Penguin 4.0.)
Save the file as a plain text file encoded in UTF-8 or 7-bit ASCII
Upload it through the Disavow Tool in Google Search Console (formerly Webmaster Tools)

Lastly, if the site is under a manual action, you will need to draft a reconsideration request explaining all the efforts you have put into making the site spam-free, supported by documentation (screenshots and emails). Apologize for the inconvenience, and you should be back in no time, provided everything goes right.

We may never fully know the specifics of Penguin 4.0, but it is certain that Google's bots will now be chugging along and tracking everything. It is a good thing that Google will no longer nuke offenders only once in a while, leaving them no chance to recover in between. Webmasters, however, will still suffer sweaty palms over their SEO and linking practices. The game just got more interesting!
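The audit-and-disavow workflow above can be sketched as a short script. This is a hypothetical illustration: the backlink rows, spam heuristics and domain names are invented placeholders, not Google's actual signals; only the disavow file syntax (one URL per line, or a "domain:" prefix to disavow a whole domain, with "#" comments) follows Google's documented format.

```python
# Hypothetical backlink audit sketch. Assumes you have exported your
# backlink profile (e.g. from Google Search Console) as rows of
# (linking URL, anchor text, rel attribute). All data here is invented.
from collections import Counter
from urllib.parse import urlparse

backlinks = [
    ("http://pizza-reviews.example/best-delivery", "best pizza delivery", ""),
    ("http://link-farm.example/page1", "cheap pizza cheap pizza", ""),
    ("http://forum.example/thread?p=9", "click here", "nofollow"),
    ("http://link-farm.example/page2", "cheap pizza cheap pizza", ""),
]

# Step 1: split into no-follow and do-follow; only do-follow links pass
# PageRank, so those are the ones to audit.
dofollow = [b for b in backlinks if "nofollow" not in b[2]]

# Step 2: domains you have manually judged bad during the audit
# (link farms, paid-link networks, etc.) -- a placeholder judgment.
bad_domains = {"link-farm.example"}

# Step 3: anchor-text density over the do-follow profile; a profile
# dominated by one exact-match commercial anchor is a classic
# over-optimization signal.
anchors = Counter(anchor.lower() for _, anchor, _ in dofollow)
total = sum(anchors.values())
for anchor, count in anchors.most_common():
    print(f"{anchor!r}: {count / total:.0%}")

# Step 4: emit a disavow file in the format Google's tool accepts:
# "#" comments, one URL per line, or "domain:" for whole domains.
lines = ["# Disavow file generated from manual audit"]
lines += sorted({f"domain:{d}" for d in
                 (urlparse(u).hostname for u, _, _ in dofollow)
                 if d in bad_domains})
print("\n".join(lines))
```

Saved as plain UTF-8 text, the printed disavow lines are what you would upload through the Disavow Tool; the categorization logic itself stays manual, as recommended above.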