Watching the watchers

(the case for an open surveillance network and a panopticoin)

Back in 2008, Adam Jackson pointed a webcam out of his apartment facing the street. He lived in an area of San Francisco — the Tenderloin — notorious for its street crime. The stream got popular, and then the death threats started, and Adam took the camera down. At the time I thought “It’s possible to force one person to stop, but if everyone had a street-facing camera, the threats would be meaningless. We could crowd-source the monitoring of the streams, alert the community when something of note happened, get the neighborhood to identify the perpetrators, and all the evidence would be on video.” It was an outline for a panopticon of the people. At the time, the idea wasn’t that far-fetched. Crowdsourcing was quickly emerging, bandwidth costs were rapidly falling, video consumption was rising (YouTube was in its early days, and Netflix had just begun streaming), and webcams were proliferating. For a number of reasons, which I’ll cover later, the public panopticon never arose.

The choice isn’t if we have a panopticon or not, it’s who runs it.

Though the open version didn’t materialize, we still built the panopticon. Largely private, and operated by government and corporate interests, it has reached a surveillance scale unprecedented in history. The pernicious effects Orwell predicted have come with it. We’ve become more careful about what we speak of, and the truth has been slowly starved to death and replaced with propaganda. The only difference is we invited Big Brother in and faced the cameras to capture our selfies. We also helpfully posted them on social networks, tagged our pictures with our cohort’s identities (creating an extremely helpful corpus of data for facial recognition), codified our connections, and publicized our thoughts. The results of this have been dangerously insidious. China is using social data to evaluate the trustworthiness of its citizens, and adding in some face recognition AR along with it for the police to identify troublemakers. The US is collecting all social media information from immigrants, and wants even more data to perform their evaluations. ICE is planning on scanning license plates nationwide, Facebook data is combined with Palantir’s to track people, and private emails with one’s doctor are being used to justify denial of entry into the country. The UK doesn’t want to feel left out (it did start the trend, after all), and is expanding surveillance both publicly and privately.

When technology reaches common usage, it’s near impossible to pull it back. Not all curves are lines, but one way or another the trend is likely to continue and intensify. There will be more cameras, more of our lives will take place on-line, and more data will be collected about what we do and who we are. Pandora’s box is open and can’t be closed. The only question is, who is watching us, and who is watching the watchers?

Sneakers — No more secrets

Facebook, in its 2006–2009 form, was operating on the idea of a nerd utopia. The premise was simple — if everyone openly shared data about themselves, there’d be no room for judging others. We all have our flaws, and laid bare, it’s much easier for us to see other people as human beings rather than an other. It’s difficult to condemn homosexuality as an abomination when you’re sleeping with other men. It’s harder to imprison people for drug use when there are pictures of you taking drugs. (Though in fairness, or lack thereof, people’s hypocrisy often knows no bounds.) The idea was that if we exposed our true selves, we’d create a society more connected, more resistant to hypocrisy, and nearly immune to selective enforcement of the law. But the future was unevenly distributed. Few people shared publicly for fear of judgement or incrimination. Instead we turned on our privacy filters, used messaging for more personal conversations, and kept our photos limited to friends only. Unfortunately, this wasn’t actually privacy, but “privacy”, with almost all of this data restricted from the public, but freely exposed to corporations and governments.

By keeping our data semi-private (but not actually private), we opened ourselves up to exploitation for profit and compliance. Want some form of actual privacy? Be prepared to pay more for an encrypted device from a company that doesn’t sell your data. Don’t want to turn over customer records to the FBI? Some pictures of you cheating on your spouse might get leaked. Choose not to have a social profile to maintain your privacy? Get ready to be flagged as a security risk.

The point of these examples is to illustrate that it’s not the lack of privacy that’s inherently dangerous, but rather the selective control of that privacy. The latter gives whoever holds the keys leverage over everyone through selective enforcement, and the discomfort this power imbalance causes is showing up in polling data for trust in public and private institutions. Interestingly, Jeremy Bentham, the creator of the original panopticon, came to recognize this core problem, and later in life focused on a system in which a minister — the seat of power — is surrounded and watched by the public.

We’re at an inflection point, and the Blockchain can help

There were a number of reasons the public panopticon didn’t materialize; most of those no longer exist.

Bandwidth — Though relatively inexpensive even in 2008, a constant high-resolution stream would eat up one’s available pipe and slow down every other online activity. Ten years later, broadband speeds have risen enough that streaming has no real impact.

Complexity — Setting up a streaming webcam in 2008 required some technical depth. It’s now dirt simple.

Image quality — The resolution of cheaper cameras was poor. Low-cost, high-resolution cameras are now ubiquitous.

Filtering — Crowdsourced monitoring sounds nice, but in practice it meant people watching a lot of nothing happening. Today we have ML to do the heavy lifting.

Incentives — There was little incentive other than civic pride to operate a camera, and tragedy of the commons / diffusion of responsibility resulted. The blockchain can provide an incentive, and I’ll cover that in more detail below.

Infrastructure cost — Setting up a network of cameras, paying for bandwidth, implementing filtering, and running servers takes significant resources. While the data the network generates would have value at scale, reaching that scale would have required a large amount of invested capital. The blockchain can fix this too.
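To make the filtering point concrete: the core job is discarding the hours of nothing so humans only review the moments that matter. A minimal sketch of that idea, using plain frame differencing as a crude stand-in for a learned event detector (all names and thresholds here are illustrative, not a real system):

```python
import numpy as np

def frames_of_interest(frames, threshold=10.0):
    """Flag frames whose mean absolute pixel change from the
    previous frame exceeds a threshold -- a crude stand-in for
    an ML model that detects events worth human review."""
    flagged = []
    prev = None
    for i, frame in enumerate(frames):
        if prev is not None:
            change = np.abs(frame.astype(float) - prev.astype(float)).mean()
            if change > threshold:
                flagged.append(i)
        prev = frame
    return flagged

# A static scene with one sudden change at frame 3.
still = np.zeros((4, 4), dtype=np.uint8)
event = np.full((4, 4), 255, dtype=np.uint8)
frames = [still, still, still, event, event]
print(frames_of_interest(frames))  # -> [3]
```

A real deployment would swap the differencing for a trained detector, but the pipeline shape — stream in, score frames, surface only the anomalies — is the same.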

The case for a panopticoin

Of the issues above, the two which haven’t been addressed by technology’s evolution are the incentivization of nodes in the panopticon, and the method of paying for the infrastructure to run it. Though I’m something of a blockchain skeptic, these happen to be perfect use cases for a crypto token.

By adding a node to the network, users can earn tokens. By running the infrastructure, users can mine / create tokens through proof of work. Many proposed blockchain use cases lack a reason for the collective to maintain the chain other than some common benefit, but most of these will fall victim to the tragedy of the commons. Others have incentivization for providing infrastructure in the form of mining, but most of that is predicated on the token appreciating and eventually becoming a common currency, not on the value of the content created by the network. The panopticoin doesn’t fall victim to either issue. There’s individual benefit to adding a node, benefit to running the infrastructure, and clear value in the content generated, giving external users an incentive to buy into the system.
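The two incentive paths can be sketched as a toy ledger — tokens credited for keeping a camera online, plus a mining reward for the proof-of-work that maintains the chain. Every name, rate, and parameter below is hypothetical; this is a sketch of the incentive structure, not a protocol design:

```python
import hashlib

LEDGER = {}  # node id -> token balance (purely illustrative)

def credit_stream_time(node_id, minutes, rate=0.1):
    """Earn tokens for keeping a camera node online and streaming."""
    LEDGER[node_id] = LEDGER.get(node_id, 0.0) + minutes * rate

def mine(node_id, block_data, difficulty=2, reward=50.0):
    """Toy proof of work: find a nonce whose SHA-256 hash has
    `difficulty` leading zero hex digits, then credit the reward."""
    nonce = 0
    prefix = "0" * difficulty
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            LEDGER[node_id] = LEDGER.get(node_id, 0.0) + reward
            return nonce
        nonce += 1

credit_stream_time("cam-42", minutes=60)   # 6.0 tokens for an hour of uptime
mine("cam-42", "block-1")                  # plus the mining reward
print(LEDGER["cam-42"])  # -> 56.0
```

The point of the sketch is the dual reward: a node operator is paid for the content they contribute, not only for speculative token appreciation, which is what distinguishes this from most mining-only incentive schemes.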

This approach isn’t a panacea. There are huge risks to a public panopticon just as there are to a private one, and I’m not so naive as to minimize them. In a perfect world, we’d have some sense of privacy, and I’m loath to suggest anything that pushes us in the other direction. However, I also don’t believe a future without mass surveillance is at all likely, so the question of how to do the least amount of harm becomes paramount. If nothing else, I hope this can spark a conversation about how we monitor and control the use and abuse of surveillance, and perhaps pave a path toward a less frightening tomorrow.

I look forward to seeing you soon.