Trust in the big data refineries is at an all-time low. Data was allegedly used far beyond the permitted (and imagined) scale to influence election results. The situation is dangerous, especially for Facebook. Mark Zuckerberg appeared personally before the US Senate and the European Parliament for hearings.

He didn’t look too bad, in our opinion. But he offered no solutions for the fundamental flaws in today’s data business. The model itself is flawed, and it is flawed in two ways: it is centralized, and therefore prone to data leaks, whether through hacks or through clients abusing the terms of use. The latter appears to be what happened in the Cambridge Analytica scandal.

Facebook and the other data companies have reacted with panic and hasty activism, aiming to improve the protection of their users. At the same time, the EU has moved to offer better legal protection with the General Data Protection Regulation (GDPR), and it is a fair guess that the US will follow. But none of these moves, neither the data giants’ improvements to code, transparency, and terms of use, nor the legal action by the US Senate and the EU, fixes the fundamental flaws of the current data trade: a centralized system, and targeted advertising without user consent as the value proposition.

The conclusion is simple: as long as those flaws are not fixed, the next big data breach is just a matter of time.

People are therefore concerned, and with good reason. wysker, by contrast, has privacy built into its very DNA. Indeed, protecting the user’s data at any cost is our business model, a model that was previously impossible. To understand this critical feature of wysker, we first must understand the “old internet ways.” Only then does it become obvious how disruptive and different the wysker approach is.



This is the classic approach, as used by Facebook: users trade their data for services. No individual user’s data is of particular interest to the refineries. Instead, they refine billions of data points generated by billions of users in giant databases. The individual becomes anonymous in the masses, or at least that is the promise. The data giants record demographics, consumer habits, identified preferences, and customer types. From there they can distill tailored offers for users and advertisers alike. They can target advertising to a whole host of macro and micro target groups, and everything in between. As the algorithms advanced over time, the results became quite astonishing: some refineries claim to be able to predict “purchase intent” with over 60% accuracy, leading to even better services and even more profitable target groups for advertisers. None of this happens with transparent user consent. Instead, we opt in to it when we sign up, use the service, or use an integrated app. Users trade data for a “free service.”

Everybody wins, or so it appears. If it weren’t for one problem, as Evgeny Morozov points out in his book “The Net Delusion”:

“Digitization of information has also led to its immense centralization: One stolen password now opens data doors that used not to exist.”1

It turns out he was right. Not only do we trust the big data refineries with an accumulation of data and power that should not be left in the hands of a few; they are also a security risk by design, because they provide a single vector of attack. In the end, it was a somewhat naïve handover of user data to a client that angered users, not a hack. But that doesn’t mean one couldn’t find plenty of cases of hacked big databases, owned by corporations and government institutions alike.

The public’s suspicion that their data is not safe in a centralized system is not without a factual basis. As early as 2006, a technology consultant proved that he needed only a few public data points to identify an individual, as Morozov points out:

“In 2006 the technology consultant Tom Owad conducted a quirky experiment: In less than a day he downloaded the wishlists of 260,000 Americans, used the publicly disclosed names and some limited contact information of Amazon’s customers to find their full addresses, and then placed those with interesting book requests — like Orwell’s 1984 or the Quran — on a map of the United States.”1

Scared by the recent developments, the data giants now rush to increase security and privacy. We believe they do this with the best intentions. Yet the system remains faulty, because none of the data giants will change their centralized business model. And with that, the risk remains. Another gap might appear, and the next scandal will cause public anger, stock values to drop, petitions to be signed, and governments to discuss regulations, but most likely no change of business model. To stick to the light tone of our Medium articles: a square is not a wheel. No matter how much you change the square, it will not roll.

Michael J. Casey and Paul Vigna conclude in “The Truth Machine”: “We’ve seen how the Internet was co-opted by corporations and how that centralization has caused problems — from creating big silos of personal data for shady hackers to steal to incentivizing disinformation campaigns that distort our democracy.”2

wysker enters at this point. It is no secret that wysker began its life as the idea of a funky mobile product browser. Nothing more, nothing less. We honestly are convinced that the combination of shopping, inspiration, and UX design it offers is one of a kind. And super cool. The catch: it didn’t take us long to figure out that this little gadget we built created data sets that were a bit beyond the state of the art. On “purchase intent” we could match the giants. Indeed, the app performed so well in that respect during our test runs that the founders deemed it unethical in a genuinely Faustian moment. Do not release the devil.

The genie would remain in the bottle until the user privacy issue was solved. But what if there were no central database? Can we decentralize data and trust? Can we put control over data back into the hands of those to whom it belongs? And what would that mean?

Enter the blockchain. The blockchain enables the current wysker business model, which goes far beyond product browsing. It allows data to be traded peer-to-peer: individual consumer to individual retailer, user to advertiser. And it requires cryptographic consent. There is no central data silo. Instead, all identifying data remains on the user’s device, and it needs a key to be unlocked. The public transaction ledger itself contains no data that could be used to identify a specific person; the user remains pseudonymous. The ledger is additionally split into cryptographically linked blocks that verify each other. There is no concentrated power. If one block is tampered with, the others refuse further transactions with it. If one user is hacked, the data of the others remains safe. As Casey and Vigna put it:
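To make the idea concrete, here is a minimal toy sketch in Python (our own illustration, not wysker’s actual code): the public ledger stores only pseudonymous records, each block chained to the previous one by a hash, so any tampering is immediately detected — while identifying data never appears on the ledger at all.

```python
import hashlib
import json

def block_hash(block):
    """Hash a block's contents deterministically."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, record):
    """Append a pseudonymous record, linked to the previous block by its hash."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"prev": prev, "record": record}
    block["hash"] = block_hash({"prev": prev, "record": record})
    chain.append(block)

def verify(chain):
    """Reject the chain as soon as any block has been tampered with."""
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev:
            return False
        if block["hash"] != block_hash({"prev": block["prev"], "record": block["record"]}):
            return False
        prev = block["hash"]
    return True

chain = []
# Only a pseudonym (e.g. a key fingerprint) ever appears on the ledger.
append_block(chain, {"user": "pseudonym-7f3a", "event": "product_view"})
append_block(chain, {"user": "pseudonym-7f3a", "event": "data_release"})
assert verify(chain)

# Tampering with an earlier block breaks the hash link, and the chain is rejected.
chain[0]["record"]["event"] = "forged_event"
assert not verify(chain)
```

A real blockchain adds consensus across many independent nodes, but the core property is the same: no single party can quietly rewrite the record.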

“This means two things: nobody can alter the data to suit their own ends, and everybody has greater control over their own data.”2

Here we are: we have decentralized trade, trust, and safety. This idea changed the course of events for wysker. wysker became distributed, cryptographically safe, public in trust, yet private in data. The concept and the technology behind it took us by storm. Our data model predicts purchase intent with high accuracy, but the resulting data remains locked with the user, who stays in control of what happens with it and who gets access to it. The app itself can only access pseudonymized data points; before these become useful for advertisers, the user must consent to unlock their identity, and must do so for every single transaction. Thus wysker creates target groups that combine high purchase intent with individual user consent. For advertisers, this is the perfect target group, and they are willing to pay each user a premium for it. In other words: it creates real, tangible value.
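The consent mechanism above can be sketched in a few lines of Python. All names and the intent score here are hypothetical, invented for illustration: the point is that identifying data lives only on the device, and an advertiser-facing record carries identity only when the user consents, one transaction at a time.

```python
import secrets

class UserVault:
    """Hypothetical on-device vault: identity never leaves without consent."""

    def __init__(self, name, email):
        self._identity = {"name": name, "email": email}  # stays on device
        self.pseudonym = secrets.token_hex(8)            # all the app sees

    def transaction_record(self, consented: bool):
        """Build a per-transaction record; identity only with explicit consent."""
        record = {"user": self.pseudonym, "purchase_intent": 0.72}  # invented score
        if consented:
            # Consent unlocks identity for this single transaction only.
            record["identity"] = dict(self._identity)
        return record

vault = UserVault("Ada", "ada@example.com")
anonymous = vault.transaction_record(consented=False)  # pseudonym only
unlocked = vault.transaction_record(consented=True)    # identity included once
```

Each call builds a fresh record, so granting consent once never grants it for the next transaction.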

This value translates into a point system that rewards users for creating data (product views) and for releasing it. For the first time, the consumer participates in the data trade. Users can trade the points for products, bonus offers, and discounts. The wysker data model is complete: it guarantees data safety and at the same time increases the data value of each user. The data model finally allows users to profit from their own data.
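The point system can be sketched as a simple ledger (again our own illustration; the reward values are invented, not wysker’s actual rates): browsing earns a little, consenting to release data earns more, and the balance can be spent on products, bonuses, or discounts.

```python
# Assumed reward values, for illustration only.
REWARDS = {"product_view": 1, "data_release": 10}

class PointAccount:
    """Toy account that earns points for data events and spends them on offers."""

    def __init__(self):
        self.balance = 0

    def earn(self, event):
        self.balance += REWARDS[event]

    def spend(self, cost):
        if cost > self.balance:
            raise ValueError("insufficient points")
        self.balance -= cost

acct = PointAccount()
for _ in range(5):
    acct.earn("product_view")   # browsing creates data
acct.earn("data_release")       # consenting to release it pays more
acct.spend(12)                  # redeem points for a discount
print(acct.balance)             # 3
```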

And it is all based on keeping user data private and subject to consent. Privacy is designed into wysker not as a feature: privacy is our business model. The wysker app is merely the code that enables this system.

At this point it is time for the big news: our developers are moving steadily towards the release of the next wysker versions, which will incorporate all the functions necessary to make the wysker data trade work for you. We are beyond excited to take the next and decisive step in the wysker evolution.

1. Morozov, Evgeny. The Net Delusion: How Not to Liberate the World. Penguin, 2012.

2. Casey, Michael J., and Vigna, Paul. The Truth Machine: The Blockchain and the Future of Everything. HarperCollins, 2018.