It is interesting to see how concerned tech companies like Facebook, Twitter, and Google have become about user privacy. Of course, we know that the impetus for this drive for confidentiality comes primarily from the fallout surrounding the Cambridge Analytica scandal and the EU’s General Data Protection Regulation (GDPR), which goes into effect May 25.

Still, the general tone of the notices companies have been sending out is that they have always been concerned with our data privacy, but they just want to be more transparent about it now. The big talking point on data collection is that they (Big Tech) only want to use our data to “improve the user experience.” This altruistic point of view sounds good, but we all know the data is being used to make money.

But what if there was an ulterior motive to Big Tech’s data collection — something that did not involve improving our experience or making money? What if it was secretly looking to control the world? Well, at least one company was, or at least had put a lot of thought into it.

According to a short film uncovered by The Verge called “The Selfish Ledger,” Google had been thinking about using “total data collection” and social engineering to modify the behavior of entire populations. The nine-minute video examines the possibilities of using Big Data to guide users into conforming to a predetermined agenda. While the video does take a “for the common good” slant by using thought control techniques to solve problems like poverty and global warming, the mere fact that the video is seriously discussing behavior modification on a massive global scale is scary.

The film was made in 2016 by Google X (now just X) head of design Nick Foster and fellow researcher David Murphy for internal use at Google. In it, Foster envisions a future where massive amounts of data are collected on users and stored in what he refers to as a “ledger.” The ledger contains a user's “actions, decisions, preferences, movements, and relationships.”

Artificial intelligence will analyze this data. If the ledger finds a gap in the information it needs to understand the user better, it will search for a device the target might have that could contain the missing piece. If none is found, the AI could use historical information on the user to design, propose, and deliver a custom product they might want via 3D printing. While the product will be for the use of the consumer, it will also be capable of supplying the ledger with the information it requires.

It is an alarming prospect that one would expect from a dystopian novel, but not from real life.

When asked about the film, an X spokesperson told The Verge, “We understand if this is disturbing — it is designed to be. This is a thought-experiment by the Design team from years ago that uses a technique known as ‘speculative design’ to explore uncomfortable ideas and concepts in order to provoke discussion and debate. It’s not related to any current or future products.”

I find this explanation somewhat hard to swallow. I know of no company that spends time, resources, and money on research it has no intention of acting upon. Firms big and small are always looking for a return on investment, and if a company knows there is little or no ROI, it abandons the idea. Does X expect us to believe that the ROI for The Selfish Ledger was only an internal philosophical discussion amongst employees over coffee?

Google has made considerable strides in the field of artificial intelligence. Although I remain skeptical that it was real, its Duplex AI, demoed at I/O, seemed to pass the Turing test by fooling people into thinking they were talking with a human. Having a so-called “ledger” of user data capable of self-analysis with an agenda of behavior manipulation is not a big stretch.

Concerns over privacy are at the forefront right now, but once the fervor dies down, I can see Google revisiting the possibilities of The Selfish Ledger in the near future, trying to find that ROI.