Last March we published a study showing that the majority of website traffic (51%) was generated by non-human entities, 60% of which were clearly malicious. As we soon learned, these figures came as a surprise to many Internet users, offering a rare glimpse of what goes on 'between the lines' of Google Analytics.

Since then we have been approached with numerous requests for an updated report. We were excited about the idea, but had to wait: first, to allow a significant interval between the data sets, and then for the implementation of our new Client Classification features.

With all the pieces in place, we went on to collect the data for the 2013 report, which we’re presenting here today.

Research Methodology

For the purpose of this report we observed 1.45 billion visits over a 90-day period. The data was collected from a group of 20,000 sites on Incapsula's network, which consists of clients on all available plans (Free to Enterprise). Geographically, the traffic covers all 249 countries and territories assigned codes by the ISO 3166-1 standard.

Report Highlights

Bot Traffic is up by 21%

Compared to the previous report from 2012, we see a 21% growth in total bot traffic, which now represents 61.5% of all website visits. The bulk of that growth is attributed to increased visits by good bots (i.e., certified agents of legitimate software, such as search engines), whose presence increased from 20% in 2012 to 31% in 2013. Looking at user-agent data, we can offer two plausible explanations for this growth:

Evolution of web-based services: The emergence of new online services introduces new bot types into the pool. For instance, we see newly established SEO-oriented services that crawl a site at a rate of 30-50 daily visits or more.

Increased activity of existing bots: Visitation patterns of some good bots (e.g., search engine crawlers) consist of recurring cycles. In some cases we see these cycles getting shorter to allow higher sampling rates, which also results in additional bot traffic.
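As a rough illustration of how visits might be bucketed by user-agent string, consider the sketch below. The marker lists are hypothetical, and a production classifier (such as the Client Classification mentioned above) relies on far more signals than the user-agent header alone:

```python
# Hypothetical marker lists for illustration only; real classification
# combines the user-agent with many other behavioral and network signals.
GOOD_BOT_MARKERS = ("Googlebot", "bingbot", "Yahoo! Slurp")
BAD_BOT_MARKERS = ("python-requests", "Scrapy", "masscan")

def classify_user_agent(ua: str) -> str:
    """First-pass bucket for a single visit, based only on its user-agent."""
    if any(marker in ua for marker in GOOD_BOT_MARKERS):
        return "good bot"       # claimed identity only -- can be spoofed
    if any(marker in ua for marker in BAD_BOT_MARKERS):
        return "bad bot"
    return "human/browser"      # default bucket; also where spoofers hide
```

Note that this check trusts the claimed identity, which is exactly what the 'Impersonators' discussed below exploit.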

31% of Bots Are Still Malicious, but with Far Fewer Spammers

While the relative percentage of malicious bots remains unchanged, there is a noticeable reduction in Spam Bot activity, which decreased from 2% in 2012 to 0.5% in 2013. The most plausible explanation for this steep decrease is Google’s anti-spam campaign, which includes the recent Penguin 2.0 and 2.1 updates.

SEO link building has always been a major motivation for automated link spamming. With its latest Penguin updates, Google has managed to increase the perceived risk of comment-spamming SEO techniques, while also driving down their actual effectiveness.

Based on our figures, it looks like Google was able to discourage link spamming practices, causing a 75% decrease in automated link spamming activity.

Evidence of More Sophisticated Hacker Activity

Another point of interest is the 8% increase in the activity of ‘Other Impersonators’ – a group which consists of unclassified bots with hostile intentions.

The common denominator for this group is that all of its members try to assume someone else's identity. For example, some of these bots use browser user-agents, while others try to pass themselves off as search engine bots or agents of other legitimate services. The goal is always the same: to slip through the website's security measures.
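A standard countermeasure for search-engine impersonation is forward-confirmed reverse DNS: resolve the visitor's IP to a hostname, check that the hostname belongs to the claimed search engine, then resolve that hostname back and confirm it matches the original IP. The sketch below is a minimal illustration of this technique, not our actual implementation; the suffix list is an assumption, and the lookups are injectable so the logic can be exercised without live DNS.

```python
import socket

# Assumed suffixes for illustration; each search engine publishes its own.
SEARCH_ENGINE_SUFFIXES = (".googlebot.com", ".google.com", ".search.msn.com")

def is_verified_search_bot(ip, reverse_lookup=None, forward_lookup=None):
    """Forward-confirmed reverse DNS check for a visitor claiming to be
    a search engine crawler. Lookups default to live DNS but can be
    injected (e.g., stubs in tests)."""
    reverse_lookup = reverse_lookup or (lambda a: socket.gethostbyaddr(a)[0])
    forward_lookup = forward_lookup or socket.gethostbyname
    try:
        host = reverse_lookup(ip)           # IP -> hostname
    except OSError:
        return False                        # no reverse record: unverified
    if not host.endswith(SEARCH_ENGINE_SUFFIXES):
        return False                        # hostname not owned by engine
    try:
        return forward_lookup(host) == ip   # hostname -> IP must round-trip
    except OSError:
        return False
```

A bot that merely spoofs a Googlebot user-agent fails this check, because it cannot control the reverse DNS records of its own IP address.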

The catch-all definition of this group also reflects these bots' origins. Where other malicious bots are agents of known malware, with a dedicated developer, GUI, 'brand' name and patch history, these 'Impersonators' are custom-made bots, usually crafted for one very specific malicious activity.

One common scenario: an uncategorized DDoS bot with a spoofed IE6 user-agent.

In terms of their functionality and capabilities, such 'Impersonators' usually represent a higher tier in the bot hierarchy. These can be automated spy bots, human-like DDoS agents or a Trojan-activated barebones browser. One way or another, these are the tools of 'career hackers', who are proficient enough to create their own malware and operate their own DDoS botnets.

The 8% increase in the number of such bots highlights the increased activity of these hackers, as well as the rise in targeted cyber-attacks.

This is also reflective of the latest trends in DDoS attacks, which are evolving from volumetric Layer 3-4 attacks to much more sophisticated and dangerous Layer 7 multi-vector threats.