In the early 1970s, Hannah Arendt wrote a devastating critique of the Pentagon’s Vietnam-era penchant for policy by counting. “The problem-solvers did not judge,” she wrote. “They calculated.”

Exuding the spirit of gamblers rather than statesmen, the decision-makers played "the percentage game", counting whatever could be counted while ignoring everything else, including the underlying problems, with "an utterly irrational confidence in the calculability of reality".

With artificial intelligence and machine learning, technologies that are fast becoming very significant actors, “we are in another moment of irrational confidence”, says renowned technology and culture researcher Kate Crawford.

Aiming at population-level predictive gambles, these technologies filter who and what counts, including “who is released from jail, what kind of treatment you’ll get in hospital, the very news that you see”. How we respond is “the biggest challenge facing us for the next 50 years”.


Crawford and three other women at the leading edge of digital scholarship and activism are headlining the 17th annual Association of Internet Researchers conference in Berlin. Their resounding message is that we have an urgent problem with “the machine logics that bind human and non-human rulers together”.

Crawford points to the recent international outrage at Facebook’s censorship of a Pulitzer prize-winning photograph as the tip of the iceberg. This is a high-profile example on top of “a much larger mass of unseen hybrids of automated and semi-automated decision-making,” she says. “They are embedded in back-end systems, working at the seams of multiple data sets, with no consumer-facing interface. Their operations and rules are not apparent to us.”

These concerns are echoed by Carolin Gerlitz, a digital media professor at the University of Siegen. Today’s digital companies “allow us to act, but in a very fine-grained, datified, algorithm-ready way. They put life to work, by rendering life in Taylorist data points that can be counted and measured” and, of course, valorized.

José van Dijck, president of the Dutch Royal Academy and the conference’s keynote speaker, expands further. Datification is the core logic of what she calls “the platform society”, in which companies bypass traditional institutions, norms and codes by promising something better and more efficient – appealing deceptively to public values, while obscuring private gain.

Van Dijck and peers have nascent, urgent ideas. They commence with a pressing agenda for strong interdisciplinary research – something Crawford is spearheading at Microsoft Research, as are many other institutions, including the new Leverhulme Centre for the Future of Intelligence.

There’s the old theory to confront: that this is a conscious choice on the part of consumers and that, if so, there’s always a theoretical opt-out. Yet even digital activists plot by Gmail, concedes Fieke Jansen of the Berlin-based advocacy organisation Tactical Tech. The Big Five tech companies, as well as the extremely concentrated sources of finance behind them, are at the vanguard of “a society of centralized power and wealth.

“How did we let it get this far?” she asks.

Crawford says there are very practical reasons why tech companies have become so powerful. “We’re trying to put so much responsibility on to individuals to step away from the ‘evil platforms’, whereas in reality, there are so many reasons why people can’t. The opportunity costs to employment, to their friends, to their families, are so high,” she says. But there’s now an additional and deeply pernicious dimension. Even if you did manage to avoid using Facebook, Gmail or an iPhone, we are all part of a “broader tracking universe”, she says.

“We still think we have agency about our participation in these processes at an individual level, but I really don’t think we do any more, or at least that sense that we do is very rapidly diminishing.”

If consumer choice is not enough, and transparency is necessary but not sufficient, Crawford argues we need to look for political levers, as well as for collective, co-created models that change information ecologies. Van Dijck casts this as a restimulation of public values, through understanding, struggle and negotiation. Jansen agrees, emphasizing the need to popularize the conversation, particularly through art and culture – and by not shying from the political, when what we are dealing with is undeniably political.

The striking feeling from Berlin, where these scholars are so incisively exposing the sheer power of the tech majors, is that the time for diagnosis has nevertheless passed: we need solutions.