I'm going to tell you how here, even though I think executing such a script is highly unethical, probably fraud, and something you should not do. I'm telling you about it here because people need to understand how jawdroppingly easy it really is.

So, the goal is mimicking humans. Which means that you can't just send 100,000 visits to the same page. That'd be very suspicious.

So you want to spread the traffic out over a bunch of target pages. But which ones? You don't want pages that no one ever visits. But you also don't want to send traffic to pages that people are paying close attention to, which tend to be the most recent ones. So, you want popular pages but not the most popular or recent pages.

Luckily, Google tends to rank the popular, recentish stories more highly. And included with UBot are two little bots that can work in tandem. The first scrapes Google's search suggestions. So it starts with the most popular A searches (Amazon, Apple, America's Cup), then the most popular B searches, etc. Another little bot scrapes the URLs from Google search results.

So the first step in the script would be to use the most popular search suggestions to find popularish stories on the target domain (say, theatlantic.com) and save all those URLs.

The first search would be "amazon site:theatlantic.com." The top 20 URLs, all of which would be Atlantic stories, would get copied into a file. Then the bot would search "apple site:theatlantic.com" and paste another 20 in. And so on and so forth until you've got 1,000.
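The query-building step is purely mechanical. Here's a minimal sketch; the terms below are hypothetical stand-ins for whatever the suggestion-scraping bot actually returns:

```python
# Sketch of the query-construction step. The seed terms here are
# hypothetical stand-ins for Google's real autocomplete results,
# which the first bot would scrape letter by letter (A, B, C, ...).
def build_queries(terms, domain):
    """Turn popular search terms into site-restricted Google queries."""
    return [f"{term} site:{domain}" for term in terms]

popular_terms = ["amazon", "apple", "america's cup"]  # hypothetical seeds
queries = build_queries(popular_terms, "theatlantic.com")
# e.g. queries[0] is "amazon site:theatlantic.com"
# Fifty such terms at ~20 results apiece gets you to roughly 1,000 URLs.
```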

Now, all you've got to do is have the bot visit each story, wait for the page to load, and go on to the next URL. Just for good measure, perhaps you'd have the browser "focus" on the ads on the page to increase the site's engagement metrics.

Loop your program 100 times and you're done. And you could do the same thing whenever you wanted to.

Of course, the bot described here would be very easy to catch. If anyone looked, you'd need to be fancier to evade detection. For example, when a browser connects to a website, it sends a little identifying string, the user agent, that says, "This is who I am!" It lists the browser and the operating system, etc. Mine, for example, is, "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_8_2) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/31.0.1650.63 Safari/537.36"

If we ran the script like this, 100,000 identical user agents would show up in the site's logs, which might be suspicious.

But the user agent-website relationship is trust-based. Any browser can say, "I'm Chrome running on a Mac." And, in fact, there are pieces of software out there that will generate "realistic" user agent messages, which UBot helpfully lets you plug in.
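You can see how trust-based it is in a few lines of Python: any HTTP client can claim to be any browser just by setting a header. (The URL here is illustrative; nothing is actually fetched.)

```python
import urllib.request

# The User-Agent header is whatever the client chooses to send -- the
# server has no way to verify it. Here a plain Python script claims to
# be Chrome on a Mac, using the string quoted in the article.
spoofed_ua = ("Mozilla/5.0 (Macintosh; Intel Mac OS X 10_8_2) "
              "AppleWebKit/537.36 (KHTML, like Gecko) "
              "Chrome/31.0.1650.63 Safari/537.36")

req = urllib.request.Request(
    "https://www.theatlantic.com/",  # example URL, never requested here
    headers={"User-Agent": spoofed_ua},
)
# If this request were sent, the server's logs would record a Chrome
# browser on a Mac, no questions asked.
```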

The hardest part would be obscuring the IP addresses of the visits. Because if all 100,000 visits came from a single computer, that would be a dead giveaway it was a bot. So, you could rent a botnet — a bunch of computers that have been hacked to do the bidding of (generally) bad people.