It’s the software that Amazon uses to tell you to buy a book you know you’ll never read. And Twitter to persuade you to follow some douchebag. And your local council to tell social workers how to act.

A report by Sky News’s Rowland Manthorpe, based on research by Cardiff University’s Data Justice Lab, revealed that at least 53 local authorities and almost a third of UK police forces are using “predictive algorithms” to determine how to intervene in everything from traffic management to benefits sanctions.

Bristol city council’s integrated analytics hub, for instance, uses data on benefits, school attendance, crime, teenage pregnancy and much more to give people a “risk score” that is then used to flag cases for social work intervention.

For local authorities, such algorithms provide cheap solutions in an age of severely reduced budgets. Their advocates insist that there is nothing to worry about, as computers never make the final decision – they simply aid humans. But as a report from the Data Justice Lab observed, in the “context of deskilling and resource limitations in the public sector, the results of data analytics may significantly constrain and guide decision-making”.

It’s one thing for Amazon to entice me to read Jordan Peterson or Twitter to push me to follow Piers Morgan. It’s quite another for public authorities to use similar algorithms, fed with a mountain of sensitive personal data, to determine who may commit crime or be at risk of abuse.

Such data practices, according to the Cardiff University report, “have become normalised before there has been a chance for broader public discussion”. The fact that these systems are already in place “will serve as a rationale for their continued existence and a means to foreclose debate”. Isn’t it time to have that debate before it’s too late?

• Kenan Malik is an Observer columnist