It's bad enough that Google's Web crawler bots have been hijacked and used to spread a SQL injection attack.

What's worse is that this sort of exploit -- turning Web crawlers into attack bots -- dates back almost 15 years. It's one that almost any modern software engineering organization ought to know how to thwart, especially Google.

Ars Technica first reported how security researcher Daniel Cid of Sucuri noticed some strange things going on with a client's website. A firewall was blocking requests coming in from Google's address block. That was odd. Even stranger, those requests were all too obviously a variety of SQL injection attack.

What Cid found was that Google's automated site crawlers had apparently been tricked into delivering a malicious payload. The crawlers were following URLs from another site that led to the victim's site -- URLs that had embedded in them, via a simple obfuscation, the very SQL injection attack being performed. Google's bots cheerfully followed the links, not realizing that by doing so they were launching an attack on the victim's server.
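The "simple obfuscation" is typically nothing more than URL encoding. Here is a minimal sketch of the trick, with a hypothetical payload and victim address (neither comes from the reported incident): an attacker plants a link whose query string is a percent-encoded SQL injection, and any crawler that follows it delivers the decoded payload to the victim's server.

```python
from urllib.parse import quote_plus

# Hypothetical payload and victim URL, for illustration only.
payload = "1' UNION SELECT username, password FROM users--"

# quote_plus percent-encodes the quote and comma and turns spaces
# into '+', producing an innocuous-looking link an attacker can
# plant on any page a crawler will visit.
malicious_link = "http://victim.example/item.php?id=" + quote_plus(payload)

print(malicious_link)
# → http://victim.example/item.php?id=1%27+UNION+SELECT+username%2C+password+FROM+users--
```

When the crawler fetches that link, the victim's application decodes the `id` parameter and, if it interpolates the value into a SQL query unsafely, executes the injection -- with the request arriving from Google's address block.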

The issue raised a major question for both Cid and Steve Ragan of CSO Online: If you're suffering from what amounts to an attack by Google, do you block Google and, thus, risk being delisted?

It also raises an even larger question: Should Google's bots be attempting to identify such attack payloads in the first place, or does the burden of protecting a site fall wholly on the site's owners?

Granted, it's any site owner's responsibility to know the risks of running a public-facing resource. Unfortunately, those risks are multiplying daily: cloud-based DDoS attacks, DNS poisoning, the list goes on.

But as the 800-pound gorilla on the Web, Google has an even greater responsibility to ensure that the very tools allowing the rest of us to search the Web -- and bringing Google a handsome profit in return -- don't become weaponized. It might not be easy to do, as obfuscated-information attacks can be notoriously difficult to ferret out. (Sucuri has found obfuscated malware in what looked like a Joomla template.) But Google must put forth the effort.

For companies that rely on Web search rankings for their business, blocking Google's crawlers is out of the question. At the same time, it's clear that Google's bots are following any URL they encounter with no attempt to sanitize or defang them. If we're all going to continue to play Google's game, Google needs to ensure that the things it does in the dark don't become a source of inadvertent risk for others. The Web may be a risky place, but there's no reason Google shouldn't do its part to make it less so.
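A first-pass filter of the kind a crawler (or a site's own firewall) could apply is not exotic. The sketch below -- an assumption about how such a check might look, not a description of anything Google actually runs -- decodes a URL's query string and scans it for common injection fragments before the URL is ever fetched. A production rule set would be far broader than these few patterns.

```python
import re
from urllib.parse import urlparse, unquote_plus

# A handful of common SQL-injection fragments; real WAF rule sets
# contain hundreds of signatures and normalization steps.
SQLI_PATTERNS = re.compile(
    r"union\s+select|drop\s+table|information_schema|or\s+1\s*=\s*1",
    re.IGNORECASE,
)

def looks_like_sqli(url: str) -> bool:
    """Return True if the URL's query string resembles a SQL injection."""
    decoded = unquote_plus(urlparse(url).query)
    # Decode a second time: simple obfuscation often double-encodes.
    decoded = unquote_plus(decoded)
    return bool(SQLI_PATTERNS.search(decoded))
```

A crawler using such a check would simply decline to follow links that match, rather than dutifully delivering the payload.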

This story, "Google's dangerous bots put the whole Web on edge," was originally published at InfoWorld.com.