OK, so it’s fair to say that Facebook has not had the best couple of days. On March 20, a class action complaint was filed against the firm for allowing the political consultancy Cambridge Analytica to access the personal data of 50 million users (without their permission).

Then Mr Zuckerberg was called before a parliamentary committee in the UK to give evidence. At the same time, Facebook was named in a shareholder lawsuit filed in San Francisco over the drop in its share price after the whole sordid tale of data harvesting was revealed.

Oh, and nearly $50 billion of value was wiped off the giant’s market capitalization in just two days.

No wonder Mr Z had to briefly stop counting his grey t-shirt collection and issue a statement admitting that the social network had made ‘mistakes’ (an apology widely viewed as lacking any real remorse).

When it rains, it certainly seems to pour at Menlo Park…

Then amid the sounds of hatches being battened down by Facebook executives near and far came Brian Acton, co-founder of WhatsApp, with four simple words:

“It is time. #deletefacebook”

A breakdown of trust

Now being described as a ‘movement’, #deletefacebook is gathering momentum. It’s also being aided by a deluge of press coverage in the wake of the Cambridge Analytica scandal. Writing in the Guardian, Richard Wolffe predicted that Facebook’s future “is already in serious doubt. It is now a polluted space, where you have no idea if your friends are real, if their posts are disinformation, if the ads are legal, and if your user data is safe”.

In other words, the covenant of trust between the social media giant and its users has been compromised (even Zuckerberg had to admit to a ‘breach of trust’ in his statement).

As Roger McNamee, an early investor in Facebook, told National Public Radio: “The issue is a callous disregard for the privacy of users and a lack of care with respect to data that had been entrusted to Facebook”.

He then went on to say: “I’m not exactly sure what’s going on here, but I’m afraid there is a systematic problem with the algorithms and the business model of Facebook…”

Which points, of course, to a wider problem: most of the established, centralised players operate the same basic business model, built on collecting and selling personal data. Your personal data! Indeed, Brian Acton hasn’t just called on users to unsubscribe from Facebook, but from Instagram and WhatsApp too (both owned by the same company).

The result of all this is not just notoriety for Facebook, and confirmation (if we needed it) that users of the platform come a distant second to the advertising buck in the corporation’s affections. It’s also another stain on the social media fabric, one that, despite being small, is in danger of spreading to the other major players.

The importance of reputation

Then there’s the issue of influence. What this recent weaponization of Facebook has shown is that without the correct controls in place, we as users begin to doubt even our closest acquaintances. We start seeing dark agencies and conspiracies at work, even where none exist. Truth becomes increasingly hard to grasp, to validate, to trust.

In part this is because political activists (and commercial brands as well) now know that exerting real, meaningful influence is no longer a top-down process, but a lateral, side-to-side one. Put another way, people trust and listen to the views and recommendations of people they know: a small, tightly knit network of friends and connections who share similar outlooks and values.

Not a chatbot, unknown ‘agent’, or celebrity endorsement.

The challenge therefore is in knowing who’s who and what’s what. Can we trust our own networks to remain free from outside interference? Do we have any way of judging the veracity of information being pushed our way? Is anybody even checking? Questions that bring us back to the control mechanisms for identifying ‘proof’ of identity, and proof of trust that the established players have so far failed to produce.

The value of control

Yet knowing such controls are absent is one thing; waiting for the mainstream platforms to catch up (if they can) and introduce them is another. So what’s the alternative?

At Howdoo, we think we have the answer. Every user on our social network will automatically generate their own proof of contribution (POC) score. The more trusted and active they are, the more their contributions are clicked on, the more their content is viewed: all these factors and more will drive their POC score up. Get a low score and we’ll all know about it, meaning users are far better empowered to make honest and informed decisions.
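To make the idea concrete, here is a minimal sketch of how a score like this could combine engagement signals into a single number. The factor names, weights, and formula below are illustrative assumptions for this post, not Howdoo’s actual algorithm:

```python
import math

# Hypothetical POC score sketch. The signals, weights, and squashing
# formula are illustrative assumptions, not Howdoo's real algorithm.
def poc_score(clicks: int, views: int, active_days: int, trust_ratings: int) -> float:
    """Combine engagement signals into a single score between 0 and 100."""
    # (value, weight) pairs; weights sum to 1.0.
    signals = [
        (clicks, 0.3),
        (views, 0.3),
        (active_days, 0.2),
        (trust_ratings, 0.2),
    ]
    # Log-scale each raw count so very heavy users don't dominate outright.
    raw = sum(weight * math.log1p(value) for value, weight in signals)
    # Squash into the 0-100 range so scores are easy to compare at a glance.
    return round(100 * raw / (raw + 10), 1)

print(poc_score(clicks=500, views=2000, active_days=120, trust_ratings=40))
```

The point of the sketch is transparency: because the score is just a deterministic function of visible activity, anyone can see why one account ranks above another.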

We think this is important, because influence at this micro level is increasingly up for sale. And it should be too! As users we just want transparency as to who’s trying to swing our vote, and then to have a simple choice: do I accept this use of my personal data in return for a value-based reward, or do I hit the off button and sit back, confident my digital footprint is on lockdown?

We live in the Age of Influence, and the task now is to make it work for us.

What say you?

Howdoo.