[Today’s guest post is brought to you by Wayne Smallman, the man behind the Blah, Blah! Technology blog: a focal point of his passion for technology, and a hallmark of his business mentality, writing style, and adeptness at making complex technology issues approachable and accessible.]

In providing a search service, Google have an obligation to ensure their service does not knowingly or willingly cause harm to us. So imagine my dismay to see malware being paraded in front of me after an innocuous and totally unrelated search…

Think of it this way: if you bought a magazine on the subject of fishing and it was full of premium rate telephone numbers for companies that looked like legitimate fishing and tackle suppliers, but were really fronts for eBay scammers, you’d be pretty well annoyed, right?

So what’s the difference when Google sit back and allow malware to persist on their search engine? There is no difference.

To help cement my point, I’ll be playing Devil’s Advocate with myself, highlighting the various points and counter-points, to make this a much more nuanced discussion.

Talking about Malware on Google

“Don’t we see people selling booze and smokes on Google? Isn’t that stuff harmful?”

Of course, but these items are regulated by law, and are ostensibly not made to harm people, despite more recent findings suggesting otherwise. They’re only harmful in excess; alcohol in particular is mostly harmless in moderation.

The things that Google lists as harmful are essentially constructed with the explicit purpose of causing harm, or at the very least, some level of disruption.

“So what about guns and knives — they’re legal, right? We see people selling them and they’re legal here in the US.”

If we use weapons as another example, then people are buying these things with the intention of inflicting harm, obviously. Or, at the very least, to defend themselves.

However, when I type a search for “Steve Jobs ‘I like options’”, as I did, then I’m certainly not looking for guns, knuckle dusters, pepper spray, knives, Mace, or malware.

So why am I even seeing these things? Even if Google can’t guarantee the specificity of a search result, things that are clearly unrelated to the search query, or obviously in conflict with it, should not be present.

As you’d probably hear if you found yourself in a court of law, there’s a balance of probabilities to be considered, as well as degrees of harm.

But when Google needlessly place irrelevant search results in front of me, while I’m clearly looking for something totally harmless, the harm they’re placing in front of me and everyone else amounts to negligence on their part.

“Google places a warning next to the link. What more do you want?”

You’re making a perilously callous defense for Google here, which I find troubling.

That’s like letting me perform brain surgery, which inevitably kills the patient, then claiming that because neuroscience is still in its infancy, I’m not really to blame.

That’s not a particularly sound basis for any business, and on the balance of probabilities it would strongly suggest that Google are putting the interests of their business model first and the personal safety of their users second.

Shall we add pedophilia, or necrophilia, to the list of things you think should be allowed on Google? The question is, where do you draw a sensible line? Of course, pedophilia is outlawed, but then so too are most of the things that result from clicking on some item of malware.

And this isn’t the first time I’ve seen such things for innocuous searches, either.

“I don’t need a baby-sitter! Ever heard the saying: ‘once bitten twice shy’? People can look after themselves. If they click on the wrong thing, they won’t do it a second time, will they?”

Thing is, in following your line of reasoning, we clearly place the legality of something second to its availability, which isn’t logical.

In some instances, the very availability of things like guns, knives and knuckle dusters for purchase is in contravention of the laws of the countries where those searches are being performed.

But what we’re wandering into there are the issues of creating an informational police state. That’s a very different matter altogether, one outside the remit of this discussion.

Both you and Google are making the assumption that the average person using the internet is scrutinous, endlessly vigilant and the enemy of naivety; the fact is, they’re often the diametric opposite.

That’s why these ploys work, because some unscrupulous bastard knows only too well people will fall for clicking on something despite the warnings, sometimes just out of curiosity.

And because the tolerated degree of harm is set too high, more people with less net savvy will be harmed, because people like you are educated just enough to avoid the snares and traps, but not worldly-wise enough to see that such things are wrong.

On the one hand, we have a huge, broadly naive mass of people using the internet, who are prone to having their privacy and security compromised. On the other, a small minority of people who prey on the aforementioned.

What you’re doing is mounting a defense for Google that effectively grants the latter party privileges equal to or greater than those of the former. And because the latter party aren’t being dealt with, Google are essentially a market stall for these people.

Google owe it to the people using their search to make every reasonable effort to eradicate malware from their systems. Simply providing a warning isn’t good enough…