A Dangerous History

This isn’t a trend that begins with Barrett Brown, or Anonymous, or James Risen, or Wikileaks, or any of the recent headline grabbers. It begins with Dmitri Sklyarov in 2001, arrested at Defcon at the behest of Adobe for breaking their inept DRM.

It begins with the CFAA and the DMCA, but it continues with their ever-widening interpretation. In a case like Brown’s we focus on the government’s use of these unrealistic laws, but most of their abuse comes from corporations. This is why Oracle killed Aaron’s Law, an attempt to reform the CFAA that would, among other things, have stopped treating violations of a company’s Terms of Service as a criminal offense. We saw it with Adobe and Sklyarov, with Diebold’s e-voting machines, and we see it in the innumerable DMCA takedown notices recorded by Chilling Effects, aimed at the issuers’ competitors and critics.

What happened in this latest case, in which I served as a witness, was subtle. Barrett Brown had originally been charged with a felony for posting a link in a chat to an archive of stolen credit cards. This charge was dropped, but then used as a sentencing enhancement by the prosecution. This meant that Brown could serve more time for this alleged crime, despite never being convicted of it or pleading to it. Leaving aside for the moment what a terrible idea it is to have this enshrined in our legal system, the prosecution claimed, and the court upheld, that this was relevant conduct — something Brown could serve more time for doing. I don’t believe either the court or the government understands the internet well enough to grasp how catastrophic this idea is.

Security professionals and the journalists who work with them (and who, in cases like mine, work with a broad range of sources of different legal standing) have to piece together what’s happened on the internet from traces left behind. Early in my career I learned this lesson the hard way. I was approached with evidence that a credit card biller who largely dealt with porn sites had lost their customer database and was ignoring it or covering it up. I had information from two different security researchers — one gave me a file with a million users’ worth of data, the other 22 million. I called the FBI office that had been informed of the situation, but they explained that because the data was more than six months old, there wasn’t any interest in pursuing it as a case. The company refused to speak to me, and I ran the story.

Within the next week I had to retract the story and issue an apology — the file of 22 million cards was bogus, one that had been passed around black market carding sites to bilk newbies. In this case, I was that newbie. I was new at Wired, and I not only felt humiliated by my mistake, but terrified my career was going to end before it really got started. My editor forgave me but drove home that in journalism, we check our facts. Scared and upset, I was determined not to let that happen again. This meant I had to examine the data any story was based on, and if I didn’t have the sophistication to understand it, I had to reach out for help from someone who did.

I’ve been inspecting my data and getting help ever since.

In 2011, when Anonymous breached Stratfor and took its customers’ credit cards, I had a lot of claims to verify. Was this really Stratfor’s server? Or just its website? Were the credit cards real credit cards, and if they were, were they really Stratfor’s customers, or purchased from a carding site for a stunt? As Gabriella Coleman noted in her book, a surprising number of Anonymous’ claims have turned out to be true, but not all of them.

So I took the data, and I checked it.

Since my first big blunder I have watched much of the non-technical media repeat hackers’ (and law enforcement’s) claims with breathless and enthusiastic speed. I have criticized my field harshly for this, because we aren’t doing our jobs when we don’t check the claims of politically and personally interested people. I have also, because one should not just criticize and leave, taken a lot of time out of my writing career to help journalists and activists better understand the technology, and consequently how to check these claims. I have helped more journalists than I can count, and I intend to keep doing so. I have explained how to use encryption to many news organizations, and with that, taken time to explain how the net works. I have written explainers on security and networked life and sent them to journalists. I have even started working more on digital literacy issues in children’s education, because journalism isn’t the only place where one informs a polity, and maybe isn’t even the most important.

And then in December I went and told the story of how I did my job in 2011 to a court in Dallas. It was clear that the prosecution considered what I do to check my stories criminal. It was also clear that they don’t understand how security research works.

Ms. Candina Heath, the prosecutor in Brown’s case, said to me — and here I must paraphrase from memory — Isn’t it true that the people who uncover credit cards generally work for the companies that issue or hold them? I told her that is rarely the case. She protested that this was my opinion. I said no, it wasn’t, and gave her the best brief on-the-spot explanation I could of how the security field actually works: People who work in the field (or in security academia) find or are alerted to abnormalities. These can be anything from a phishing link in their email to a DDoS that’s generating notable traffic to a post including code that exploits a flaw in software. When they investigate, they look for traces of what happened, and often this leads to a cache of data that has been collected from a set of victims. There’s no way to know what’s in such a cache until you look at it. It can be anything: pictures, personal info, banking details, credit cards. According to this Texas prosecutor, and many more law enforcement agents, once someone grabs this data cache and examines it, passes it along to expert eyes, or to a journalist, they are committing crimes for which they may be ripped from their home, job, family, and the future they expected to have. They may be incarcerated for doing their job.

I may be incarcerated for doing my job.