The padlock is the internet's talisman of privacy and safety.

It's in the corner of your browser when you have a secure website connection. It appears on a Twitter protected profile. It indicates where to find Facebook's privacy settings.

But in the words of Inigo Montoya in The Princess Bride, "I do not think it means what you think it means".

Or so said Woodrow Hartzog, a professor of law and computer science at Northeastern University in the United States.

Deceptive design nudges, tricks and goads you into sharing more than you might intend to online, Professor Hartzog argues in his new book, Privacy's Blueprint: The Battle to Control the Design of New Technologies.

And when you think you're in control of your own data, you rarely are.

"If you want to know when social media companies are trying to manipulate you into disclosing information or engaging more, the answer is always," he said.

The 'dark patterns' of privacy

Online platforms, from social media apps to ecommerce sites, want you to feel safe.

If you feel secure, you engage — share posts, send emails, join groups.

Facebook also uses lock and badge symbols in its Settings. ( ABC News: Facebook screenshot )

For companies like Google, Snapchat, Twitter and Facebook, which make the majority of their income from advertising, more action equals more data. All the better to understand you and sell that understanding to brands.

Professor Hartzog said consumers need to watch out for the tricks and symbols that denote safety and control — the padlock icon being one of the most common — and ask themselves whether they can be relied on.

These tricks that create a perception of privacy can be "dark patterns".

In a URL, a padlock symbol shows a connection is secure. ( ABC News: Google screenshot )

A term coined by British designer Harry Brignull, dark patterns are online design choices that obscure a website's true intention or function and manipulate users into actions they didn't intend.

Ian Muir, managing director of IDM Design Labs, said these signifiers have evolved over time.

In the early days of ecommerce, for example, the padlock and the key were good representative symbols. Brand marks, including banking logos, were also used to suggest legitimacy.

"In a broad sense, just having a professional looking site was sometimes a bit misleading, but it was a cue for people to say 'oh, this must be OK'," he said.
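The padlock's actual guarantee is narrower than those cues suggest: it tells you the connection is encrypted, not that the site's operator is trustworthy. A minimal sketch of that logic (the domains are made up, and real browsers also verify the site's certificate before showing the icon):

```python
from urllib.parse import urlparse

def shows_padlock(url: str) -> bool:
    """Simplified padlock logic: browsers show the icon for any HTTPS
    connection with a valid certificate, regardless of who runs the site."""
    return urlparse(url).scheme == "https"

# A legitimate bank and a lookalike phishing domain both earn the padlock,
# provided each serves a valid certificate:
print(shows_padlock("https://www.realbank.example"))        # True
print(shows_padlock("https://www.rea1bank-login.example"))  # True
print(shows_padlock("http://www.realbank.example"))         # False: no encryption
```

In other words, the symbol certifies the pipe, not the party at the other end of it.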

A false sense of control

Have you noticed how eager the big technology companies are to tell you that you are in control of your own data?

"It allows companies to over-leverage the notion of control in a way that makes us think we have it, but really our exposure is pre-ordained," Professor Hartzog argued.

Users may also be given choices that oversimplify the impact on their privacy — they can be binary and lack nuance.


Consider a switch to turn facial recognition on or off, for example.

"If you turn it on, you get all the benefits of facial recognition and the risks that come with it — if you turn it off, then, like Willy Wonka says, 'Charlie, you get nothing'," he said.

But the alternative can be just as troubling. If you get too much control — dozens of toggles and switches — you drown in a sea of choices.

A 2012 Carnegie Mellon University study suggested that if users are given "fine-grained privacy controls", they may be more willing to risk sharing personal information.

"Privacy control settings give people more rope to hang themselves," one of the study's authors, Professor George Loewenstein, recently told the New York Times.

"Facebook has figured this out, so they give you incredibly granular controls."

In Facebook's ad settings, for example, you can stop brands that have your contact details from showing you marketing material — but you have to click 'no' on each one individually (I currently have over 300 brands to say 'no' to).
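The burden of that "granularity" scales with the number of brands. A hypothetical model (the brand list and function name are illustrative, not Facebook's actual interface):

```python
def clicks_to_opt_out(brands: list[str], per_brand: bool) -> int:
    """Count the actions needed to refuse ad targeting: one per brand
    under 'granular' controls, or one in total under a global switch."""
    return len(brands) if per_brand else 1

brands = [f"brand_{i}" for i in range(300)]  # roughly the 300 brands mentioned above

print(clicks_to_opt_out(brands, per_brand=True))   # 300 separate 'no' clicks
print(clicks_to_opt_out(brands, per_brand=False))  # 1 click, if a global toggle existed
```

The per-brand design isn't more control in any practical sense — it is the same decision multiplied until most people give up.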

"There's an old saying about security by obscurity," Mr Muir said.

"It doesn't actually create security, but what should be done is obscured by the complexity of it all. It's so complex people can't figure it out, and therefore it is assumed to be safe."

This complexity is also reflected in language, he suggested. The use of double negatives, for example.

Just think about the last time you tried to opt out of an email newsletter: were you offered 'No, Cancel', which effectively ends your attempt to unsubscribe, or 'Learn More'?
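That trick can be modelled as a dialog whose options never include the action the user came for. A sketch with made-up labels and outcomes:

```python
# Hypothetical unsubscribe dialog: neither button actually unsubscribes.
UNSUBSCRIBE_DIALOG = {
    "No, Cancel": "still subscribed",  # reads like refusing the emails, but aborts the opt-out
    "Learn More": "still subscribed",  # a detour that never completes the opt-out
}

def press(button: str) -> str:
    """Return the outcome of pressing a button in the dialog."""
    return UNSUBSCRIBE_DIALOG[button]

print(press("No, Cancel"))  # still subscribed: the double negative reverses the user's intent
```

Whichever button the user reads as "no more emails", every path through the dialog leaves the subscription intact.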

Fight for obscurity

Companies that rely on extracting user data as a business model need users to believe they are safe and in control.

But are we secure and empowered? Professor Hartzog said no, despite what the platforms' privacy settings may suggest.

After all, most companies never stop asking for your data.

"Even if you say no, even if you turn all the knobs to 'off', what's going to happen is you're going to get nudged or outright 'platform-bullied' into saying 'yes' over time," he said.

"Anyone who has downloaded Facebook's Messenger app and turned notifications off is all too familiar with the ceaseless requests when you open the app to get you to turn notifications on."

Professor Hartzog said we need new platform design rules that respect our need for obscurity. We expect our restaurant conversations to disappear into the ether, for example, so why not our text messages?

Collecting data from even the smallest online interaction makes this obscurity impossible — every choice, every line is logged, collected and stored indefinitely.

But he believes these apps and websites can be designed differently, and regulation may be needed to force the issue.

Lawmakers need to get ahead of this issue, because facial recognition, in Professor Hartzog's view, is an "obscurity-eviscerating" technology.

"We need to focus on making sure the incentives for companies are different, because right now, companies have every incentive to extract every ounce of value from the user," he said.