In 1995, law enforcement officials convinced a judge that Kevin Mitnick had the ability to start a nuclear war by whistling into a pay phone.

Mitnick had become a high-profile hacker on the FBI’s Most Wanted list after a two-year cat-and-mouse chase in which he eluded agents under a string of false identities.

Once, when he learned that agents were on their way to his apartment, he kindly left them a box of donuts in the fridge before disappearing.

His crime? Mitnick had hacked into over 40 major corporations.

He had the uncanny ability to weasel his way into nearly any network’s core — from a military computer to FBI and DMV records.

He had intercepted and stolen computer passwords, altered computer networks, broken into private email accounts, and sweet-talked his way into privileged access to proprietary information.

And he did this mainly for the thrills — he was a hacker for the hell of it.

In Mitnick’s eyes, everything on the web is transparent.

Where there’s a will, there’s a way: anything you write in an email, every conversation you have in chat, every link you’ve visited can be read, unless it’s been heavily encrypted.

Mitnick had access to information that could do terrible damage — social security numbers, credit card numbers, proprietary software, email logins.

He could have extorted and blackmailed people for a fortune.

But he didn’t. His central addiction was curiosity. It was a game for him.

In 1995, Kevin Mitnick’s games came to an end: he was arrested and ultimately sentenced to five years in federal prison, eight months of which he spent in solitary confinement.

Today, the potential to do damage is exponentially greater in the age of big data, big tech, and information oversharing.

Central databases store what we like, where we’ve been, who our friends are, and whom we’ve slept with.

The list of known data breaches since 2010 has been staggering — it includes LinkedIn, Uber, Equifax, JPMorgan, Sony, Anthem, Citigroup, Dropbox, eBay, Evernote, and many more.

And since many people reuse passwords across sites, a password leak at one service can compromise accounts at many others.

Meanwhile, company employees have been spying on user activity…for work-related purposes, of course.

It was recently reported that a whistleblower at Lyft had raised concerns that several employees were looking up user data.

The whistleblower used the anonymous workplace app Blind to report the privacy concerns. Allegations include employees using ride data to stalk ex-lovers, checking where their significant others were riding, tracking attractive passengers who had shared a Lyft with them, and looking up the phone numbers of celebrities such as Mark Zuckerberg.

Government employees do it too.

The files Edward Snowden leaked in 2013 gave a glimpse of the extent of surveillance at the National Security Agency.

The NSA had wide-reaching surveillance tools for searching nearly everything a user does on the Internet.

They had legal authority to request user data from tech giants including Google, Facebook, and Apple.

They were also intercepting 200 million text messages every day.

In some cases, NSA employees had been caught spying on love interests, a practice now referred to as “LOVEINT”.

And it’s not going away — on January 11, 2018, the U.S. House of Representatives voted 256–164 in favor of extending the NSA’s surveillance program for another six years with minimal changes.

Ultimately, these organizations are made up of humans, none of whom are infallible, each with their own motivations, biases, and triggers.

And they have all sorts of personal information at their fingertips — information they can use to satisfy whatever desires are pulling at them.

Now consider the implications of wider business AI adoption.

As we increasingly rely on complex algorithms to make high-stakes decisions, a not-so-nice version of Mitnick could engineer his way into the code base.

A large attack can disrupt the power grid, shut down hospitals, compromise a national security system, or as Elon Musk fears, start World War III.

Projects like SingularityNET aim to create a decentralized marketplace for AI, enabling anyone, not just the big tech companies, to buy and sell AI services at scale.

But this will open another can of worms: more participants mean more, and easier, opportunities for social engineering.