You're on social media, scrolling for something to capture your attention, and you pause.

There's an image or link that looks interesting. But if you were to click and others — your friends, your family, your employer — were to find out, would that be embarrassing?

While the internet and social media have made us more interconnected than ever, they have also brought constant surveillance by companies — in other words, the feeling that on the internet, we are always being watched. Our clicks are logged, categorised, interpreted and rated.

It's leading to what Tijmen Schep, a Dutch technology critic, calls "social cooling" — a society of increasing social conformity and rigidity, in which we self-censor or second guess what we do online for fear of repercussions.

Schep says it all starts with the collection of our data

"There's a huge business of companies called data brokers," Schep, the author of Design My Privacy, a "beginners guide to ethical design for the Internet of Things", told Lateline.

"They gather data about us — everything from our cookies to our emails, everything they can get their hands on. They're creating reputation scores about us, detailed psychological profiles."

Based on what we post and like and share on social media, these firms use artificial intelligence to infer intimate characteristics, creating valuable data.

That's given rise to something called the reputation economy.

"Where in the past you had your money to increase your value, increasingly your social capital — your reputation — is now being gathered in these systems," Schep said.

"That's really creating large social pressure to be perfect, to be a good citizen."

In the next 10 years, Schep said, we will begin to appreciate the wide-ranging effects of this system, "like [how it will impact] your opportunity to get a nice job, to get a cheap bank loan".

"And that will profoundly change how [people] will look at themselves and how they will express themselves," he said.

"There are some companies out there that focus on the human-risk business. They will look at employees' data — what they say, what they do — and try to predict the ones that will leak, basically creating a dashboard with the employees and their risk scores."

How far could this go? Look at China

China is already taking advantage of this kind of information, Schep said — in fact, they are "weaponising" it.

"They're creating something called the social credit system," he said.

"This is a system where each adult citizen in China will get a score that basically represents how well-behaved a citizen they are. And that's using all kinds of details — your credit rating ... what you say on social media, what you buy on sites like Alibaba. It combines it all.

"What they say is 'the state wants to help you to be a better citizen, to be a more ethical person, a more normative person'. They don't trust you to be free; they don't want you to develop your own norms. It's been actively developed by the Chinese Government to increase social pressure and keep people in line."

What are the long-term effects of living in a data society?

Schep sees three things happening:

More self-censorship: "People might not do things, not say things, because they're afraid it might affect their score. There's a paradox here where you have free speech, you're just not using it."

Holding society back: "If people are not willing to protest, or not willing to stand up, that might lessen society's ability to change or to evolve."

Increased risk avoidance: "These reputation systems are very much at odds with taking risks because they want to limit us taking risks; they want us to not stand out."

This story is part of an ABC News series looking at the algorithms that affect our lives.

A small example of social cooling in action, Schep said, took place in 2013, when the former National Security Agency contractor Edward Snowden leaked information about the US Government's electronic surveillance programs.

"There's research by [University of Toronto research fellow] Jon Penney showing that after the Snowden leaks, people were less likely to look up terrorism-related topics on Wikipedia or Google, for example," he said.

Schep said this kind of environment could dampen values like creativity and individualism that Western societies hold dear.

"There's a bit of a paradox there where these systems, in a way, are limiting divergent thought, are limiting different thinking," he said.

What can we do about this?

Schep puts it this way:

"If oil leads to global warming, then data leads to social cooling."

He said data, like fossil fuels, may create a toxic environment.

"It's creating a lot of problems, it's damaging our social environment, and in the next 10, 20 years we'd have to see the same things [happening] that we saw [with] global warming, like moves towards regulating it, moves towards alternatives," he said.

"Privacy is the right to be imperfect. That's an important thing, because we humans are fundamentally imperfect.

"So, you could say privacy is the right to be human."