With distributed ledger technology, there's only one way the concept of privacy can go.

Privacy is a hot topic, almost as if everyone woke up one day and realised just how much the Internet knows about them. The realisation has been punctuated by a series of high-profile data breaches that have seen millions of people's personal information crop up in online marketplaces, and by incidents like the ongoing drama around Facebook's collection of user information.

Along the way, the concept of privacy itself is changing.

The first shift was a recognition that withholding personal information entirely was off the table, and largely incompatible with the digital world. From there, the discussion shifted towards exactly which data should and shouldn't be shared, and how data needs to be protected. Cryptocurrencies and distributed ledger technology (DLT) have complicated things further, bringing transparent and immutable decentralised ledgers into wide public use.

It's only now that regulations are catching up, with new systems like the EU General Data Protection Regulation (GDPR) overhauling a 20-year-old privacy framework for the information age.

A tech gap

One of the reasons privacy issues exploded so quickly is that the ability to collect data has far outpaced the ability to make sense of it. One of the clearest examples might be the NSA's "bulk data failure," which saw the agency collect so much information that it actually became less effective, unable to use any of it. Social media, smartphone tracking, Google Earth, email keyword scanning and similar rolled in fairly abruptly, while the ability to make sense of the data only came later.

The data came first and then came the ways to make sense of it. This might be why targeted advertising suddenly seems eerily good, and incidents like the Facebook-Cambridge Analytica data scandal, as Wikipedia semi-formally calls it, are suddenly becoming more consequential.

This tech gap – the difference between data collection technology and data analysis technology – may have let the issue creep under the radar for a long time, as more and more data accrues.

A similar tech gap may have emerged between the ability to collect and analyse data, and the ability to secure it once collected.

Privacy by design

In a collect-first-process-later data environment, and without privacy issues taking the limelight, data security was also a latecomer. It was essentially tacked on as a separate layer, in the form of passwords, encryption and cybersecurity consultants (all of varying quality) layered over these troves of user data.

Cambridge Analytica, for example, was able to harvest so much data because Facebook's platform permissions allowed a third-party app to access the data that users (and their friends) had voluntarily provided. Once users granted the app permission, Facebook had no control over where the data went. The solution was to ask Cambridge Analytica to delete it, but it didn't. The rest is history.

A quick Google search for "data breach" brings back plenty of other recent examples. One of the more awkward might be the John McAfee-endorsed Bezop cryptocurrency exposing data on about 25,000 investors, including names, addresses, copies of photo ID and other data on file. Some of the larger examples might be Under Armour and Uber losing data on millions of customers, including emails, passwords and similar.

Clearly, this isn't working.

The GDPR recognises as much. Among the protections for users, one of the more intriguing requirements is mandatory privacy by design, rather than privacy tacked on as an afterthought.

"Privacy by design as a concept has existed for years now, but it is only just becoming part of a legal requirement with the GDPR," the guidelines read. "At it's (sic) core, privacy by design calls for the inclusion of data protection from the onset of the designing of systems, rather than [as an] an addition... hold and process only the data absolutely necessary... as well as limiting the access to personal data to those needing to act out the processing."

Privacy, on the blockchain

The GDPR requirements, coming into effect on 25 May 2018, are likely to see a surge of interest in DLT-based data security and user identity systems.

Firstly, because decentralised, open-source security measures may be the only genuinely secure way of storing data. If a system depends on obscurity or a central authority to remain secure, then it's only secure as long as it remains secret or the central authority behaves itself. In other words, it's not secure.

Secondly, because DLT-based identity systems are increasingly looking like the future. The idea is to make each individual the controller of their own data, which can then be verified as appropriate without being released. There's a huge amount of development going on to this end, from both cryptocurrencies and institutions. Ethereum, for example, is working on creating new identification standards as a high priority, and Microsoft is currently developing its own blockchain ID system.

"Each of us needs a digital identity we own, one which securely and privately stores all elements of our digital identity," Microsoft developer Ankur Patel explains. "This self-owned identity must be easy to use and give us complete control over how our identity data is accessed and used... Rather than grant broad consent to countless apps and services, and have their identity data spread across numerous providers, individuals need a secure encrypted digital hub where they can store their identity data and easily control access to it."

Giving each person control of their own personal data and identity is most likely the way forward. Fines for GDPR violations can reach €20 million or 4% of global annual turnover, whichever is higher, so the risks of holding user data will be considerable. Crucially, DLT identification systems don't have to cut off access to customer data for advertising, verification, research or other purposes. With a suitably engineered system, individuals can willingly opt to reveal certain verified data points anonymously, or even verify their identity without providing any identification documents. This is good news for companies that want to take advantage of user data without actually touching or needing to store that increasingly dangerous stuff.
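One simple mechanism behind "revealing a verified data point without handing over documents" is a hash commitment. The sketch below is illustrative only and assumes a hypothetical scheme in which a verified attribute is committed to as a salted hash (so the ledger never stores the raw data), and the user later reveals the value and salt to one counterparty of their choosing:

```python
import hashlib
import secrets

def commit(value: str) -> tuple[str, str]:
    """Commit to a data point: publish only the hash, keep value + salt private."""
    salt = secrets.token_hex(16)
    digest = hashlib.sha256((salt + value).encode()).hexdigest()
    return digest, salt

def verify(digest: str, value: str, salt: str) -> bool:
    """A verifier checks a revealed value against the published commitment."""
    return hashlib.sha256((salt + value).encode()).hexdigest() == digest

# The user commits to a verified attribute; only the digest would go on a ledger.
on_ledger, salt = commit("over_18=true")

# Later, the user chooses to reveal just this one attribute to one counterparty.
assert verify(on_ledger, "over_18=true", salt)        # accepted
assert not verify(on_ledger, "over_18=false", salt)   # altered value rejected
```

The salt matters: without it, a low-entropy attribute like a date of birth could be brute-forced from the public hash. Production identity systems use considerably more sophisticated tools (zero-knowledge proofs, signed credentials), but the basic idea is the same.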

The system may also help satisfy the contentious "right to be forgotten" requirement of GDPR. This basically means companies have to permanently erase user data when requested or if it's no longer needed for the purpose it was provided for.

The problem, which has instilled varying degrees of panic, is that it's inherently at odds with the permanent and immutable nature of transparent ledgers. Unnecessary private side-chains are being proposed as a solution, but the much simpler and more likely answer is a system that allows each individual to be the controller of their own data, and to reveal it as desired anonymously, or in ways that don't require the data to be stored.
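A minimal sketch of how that "controller of their own data" approach reconciles erasure with immutability, under the assumption that only a fingerprint of the data is anchored on the chain while the data itself lives in mutable off-chain storage (all names here are hypothetical):

```python
import hashlib

class Ledger:
    """Append-only, mimicking an immutable chain: entries can never be removed."""
    def __init__(self):
        self.entries = []

    def append(self, digest: str):
        self.entries.append(digest)

off_chain = {}   # mutable storage under the data controller's (or user's) control
ledger = Ledger()

def record(user_id: str, personal_data: str) -> str:
    """Anchor a fingerprint on the ledger; keep the data itself off-chain."""
    digest = hashlib.sha256(personal_data.encode()).hexdigest()
    off_chain[user_id] = personal_data
    ledger.append(digest)
    return digest

def forget(user_id: str):
    """'Right to be forgotten': erase the off-chain copy. The on-ledger hash
    remains, but it no longer resolves to any personal data."""
    del off_chain[user_id]

digest = record("alice", "Alice Example, 1 Main St")
forget("alice")
assert "alice" not in off_chain   # personal data gone
assert digest in ledger.entries   # ledger untouched, but points at nothing
```

In practice the off-chain data would also be encrypted (so erasure can mean destroying the key, sometimes called crypto-shredding) and the hash salted, but the division of labour is the point: the immutable part holds proofs, the erasable part holds the data.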

This is likely to be joined by other DLT solutions which focus on securing other elements, like messaging systems.

A secure line

Facebook probably isn't listening in on your calls, but it does scan chats on its supposedly private Messenger platform for the purposes of preventing abuse and protecting users. And maybe other reasons too.

With the right paradigms, the solution to protecting confidential user information is relatively straightforward – put users in control of their own digital identity – and it's increasingly clear how that space will develop. But private communication is a much trickier area, without a completely clear way forward.

Privacy in communications is arguably just as important as the security of personal information, or even more so. Communications can be a relatively easy way to de-anonymise other information, and may be confidential in their own right. However, monitoring communications is also important for law enforcement, and there are plenty of reasons why perfect confidentiality isn't necessarily ideal.

But here too, technology might have run ahead of existing frameworks for managing it. The recent Telegram issues in Russia might be a testament to this.

Telegram, which has become something of a de facto network for cryptocurrency communities, is defined by its assurance that messages are end-to-end encrypted and completely private, at least until someone takes a screenshot. It's safe to assume that it hosts a wide range of illicit communications, but it has also become something of a secure space for whistleblowers, inside sources, subversive politics and similar.

Russian authorities previously ordered Telegram to hand over the service's encryption keys, but it refused, arguing that it couldn't comply even if it wanted to because the keys are generated locally on user devices. Telegram has since been banned in Russia, with little success: as the Moscow Times reports, the service appears to have lost only about 3% of its Russian users, while Internet disruptions continue across the country.
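"The keys are generated locally" is the crux of Telegram's argument, and the underlying idea is a key exchange: two devices derive a shared secret that the relay in the middle never possesses. A toy Diffie-Hellman sketch (deliberately tiny parameters, not Telegram's actual protocol; real systems use standardised 2048-bit+ groups or elliptic curves):

```python
import secrets

# Toy parameters for illustration only; far too small for real security.
P = 2**127 - 1   # a Mersenne prime
G = 5

def keypair():
    """Generated on the user's device; the private half never leaves it."""
    private = secrets.randbelow(P - 2) + 2
    public = pow(G, private, P)
    return private, public

alice_priv, alice_pub = keypair()
bob_priv, bob_pub = keypair()

# The messaging server only ever sees and relays the public values...
relayed = (alice_pub, bob_pub)

# ...yet both devices derive the same shared secret from them.
alice_secret = pow(relayed[1], alice_priv, P)
bob_secret = pow(relayed[0], bob_priv, P)
assert alice_secret == bob_secret
```

Since the server holds only the public values, there is genuinely no key for it to hand over, which is exactly the position Telegram claimed with regulators.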

The future of communication

The easiest way to think about the issues around privacy in communications might be to think of the topic as a question of whether the general public has "the right to bear industrial strength encryption technology".

Situations like the FBI-Apple phone encryption dispute make it clear that there's still no clear legal answer one way or another. But as the Russia-Telegram situation illustrates, there's probably nothing authorities can do once cutting edge encryption becomes the norm for all online communication.

Once again, distributed ledger, or blockchain, technology is likely to play a role thanks to the extra security of open source decentralisation.

One potential version of this future might be found in the plans of DLT-based messaging platforms like Mingo. Mingo is angling to become the de facto messaging standard through the public version of the Hashgraph platform, by consolidating existing messaging systems, such as Telegram, Twitter, Steam, Discord and Facebook Messenger, onto one DLT-based platform.

The Russia Telegram ban, says Mingo CEO Joe Arthur in an email to finder, "emphasises the importance of decentralised messaging systems".

He also points out that cryptocurrency transactions blur the lines between sending a message, sending money and communicating confidential information, and that increased consolidation of all these systems on one decentralised public platform is probably a sensible way forward.

"Decentralised private messaging is essential for communication in the technological world," he says. "With the inclusion of a messaging platform that runs on a decentralised technology, you can enable instant currency transfers, faster-messaging systems, and encryption. What many don't understand is that with a decentralised technological backing, you can employ a more secure messaging application. DLT creates a safer habitat for all messengers to function on."

"With centralised systems, it seems as though the 'user' has been forgotten. Encryption emphasises protection and benefits the user's privacy and safety. Enabling a decentralised infrastructure on a messaging platform lets new users access DLT-based benefits without having to understand the deep technicalities associated with it. It simplifies a complex technology."

However, it's not entirely clear how complete confidentiality might balance with regulatory requirements. There's a curious dichotomy between the mandated protection of user data and communication, and the need to make that information available to authorities. It's becoming relatively common for businesses to push back against government requests for user information, and on a completely decentralised platform an organisation might simply be able to shrug at these requests, and honestly point out that it's simply not feasible.

In the case of Mingo, Arthur maintains that there's a happy solution somewhere.

"Messaging domains that run on decentralised frameworks enable efficiency, security, and privacy in a respectable way that's beneficial for all parties," he said. "We believe effective regulation can still be present in a decentralised economy while still adhering to the importance of the user."

If that happy solution is to rush ahead with the technology now and sort out the finicky legal-ethical implications of privacy technology later, that would be par for the course.

Disclosure: At the time of writing the author holds ETH, IOTA, ICX, VEN, XLM, BTC, NANO

Disclaimer: This information should not be interpreted as an endorsement of cryptocurrency or any specific provider, service or offering. It is not a recommendation to trade. Cryptocurrencies are speculative, complex and involve significant risks – they are highly volatile and sensitive to secondary activity. Performance is unpredictable and past performance is no guarantee of future performance. Consider your own circumstances, and obtain your own advice, before relying on this information. You should also verify the nature of any product or service (including its legal status and relevant regulatory requirements) and consult the relevant Regulators' websites before making any decision. Finder, or the author, may have holdings in the cryptocurrencies discussed.