Comment It’s scarcely unusual. You’re preparing an email, you start typing an email address, autocomplete fills one in for you, and then you may or may not notice, as the email speeds off, that it’s going to someone entirely different from the intended recipient. If the email includes personal details of 10,000 people and the person you’ve just sent it to is a journalist, well…

In the case of the autocomplete disaster that’s just happened to Gwent Police, the original error wasn’t spotted until The Register reported it, and was then compounded by a second email alerting a baffled Chris Williams to an update to the internal phone directory (though no, at least they didn’t send us the directory as well). As we understand it, Gwent has an officer with a similar name, so unbeknownst to him our Chris Williams had blundered onto a distribution list, and presumably would have continued to receive Gwent bulletins, perhaps even slowly moving up the distribution pecking order.

That’s if The Register hadn’t informed them, and Gwent’s techies hadn’t spent last weekend ripping autocomplete out of their systems. We are slightly wistful over the loss of a bizarre but potentially useful information conduit that we didn’t know we had until the other day, but we do take people’s privacy seriously, and regard it as having been our responsibility not to use the data, to destroy it, and to give whatever help we can to Gwent in order to stop this kind of leak happening again.

However… Although we accept that Gwent also takes this matter very seriously and will make honest and strenuous efforts to control the data it handles, it is the nature of the beast — the Criminal Records Bureau checking regime — that this kind of leak will happen again and again. Autocomplete errors, poor list management and (we suspect) excessive use of the cc field aside, the elephant in the room is that file — why was it even possible for someone to have that volume of sensitive data in a single file, far less to email it out unencrypted?

Is your data transfer really necessary?

And did six people really need a copy of what appears to have been only a part of the force’s CRB check database? There are two aspects to the real problem here. First, in common with much of government and many other organisations in the public and private sectors, the force’s systems are simply not set up to prohibit the bulk transfer of personal data. It’s conceivable that systems could be built in this way, and in the long run we feel it inevitable that they will be built in this way. But a lot of people’s personal information is going to go walkabout on lost notebooks and USB sticks before that happens.

And quite a lot of it is going to go walkabout because the data has to be bulked up to be sent to an external organisation in the absence of an adequate secure channel. The mother of all leaks was perpetrated by HMRC, which in 2007 contrived to lose 25 million personal records in the post. One could (and one did) question why anybody needed that amount of data in the first place, but granted that somebody needed some HMRC data, the only way to actually get it to them was what we used to call sneakernet.

No amount of huffing and puffing about security and encryption, or dumping on the poor saps who pressed the buttons, is going to change anything — if data needs to be transferred and there isn’t a secure channel, then it’s going to leak.

Now consider what’s happening with the criminal records checking. Millions of people now have to undergo a CRB check in order to get a job, undertake voluntary work or do anything involving children. Records of the personal data of tens of thousands, maybe even hundreds of thousands of people will be collated and exchanged between organisations.

Industrialising privacy invasion

Some of these organisations will be police forces, which obviously have to be involved (although they didn’t exactly ask to collate big piles of CRB check results); others will be government, and others private sector. The CRB will take in lots of money from the regime, while at the same time industrialising the process by farming it out to the private sector.

We’re not suggesting the private sector’s data handling will be any worse than the public sector’s (au contraire…), but there’s a monster here that won’t be tamed without a revolution/revelation in government IT planning, design, security and privacy awareness. They’re invading our privacy industrially, systemically and on the cheap via ill-conceived and ineffectual checking regimes, then blaming operator error and carrying on regardless. They should stop building this stuff until they’ve learned how to control it. Or preferably, they should stop building this stuff. ®