This week marks the end-of-financial-year here in Australia, which means it’s time for a stock-take: see where we are at, count the positives and negatives, and determine our net position. Are we in the red or the black?

So today, I thought I would do a stock-take of the year in privacy, reviewing both positives and negatives.

Herewith I present to you the Privacy Ledger for the Australian Government, FY 2016-17.

First, the privacy-positive side of the ledger:

A data breach notification law was finally passed.

The government backed down on a Bill that would have allowed more sharing of veterans’ personal information without their consent.

The government also backed down on the proposal to allow access to telco metadata for use in civil litigation (although, as the Attorney-General’s Department notes, “Civil litigants will still be able to access data that is not retained solely for purposes of the data retention scheme”).

The Department of the Prime Minister and Cabinet agreed to develop a code with the Office of the Australian Information Commissioner (OAIC) to address concerns about data-handling in the Australian Public Service.

But then there’s also the privacy-negative side of the ledger.

It’s not exactly a well-balanced ledger, is it?

This litany of privacy disasters, solely from the Australian Government and just in the past 12 months, simply doesn’t square with the rhetoric about government having or obtaining the social licence necessary for more data-sharing and data analytics.

We already see considerable scepticism from the Australian public about the re-use of their personal information by government for research or policy-making purposes, with the latest survey from the OAIC suggesting that 40% of Australians are uncomfortable with the idea.

I believe that the privacy ledger is so out of balance that we are now witnessing a profound loss of trust in government. This doesn’t just affect the Andie Foxes or the welfare recipients or the people whose metadata is collected by the police; it affects all of us. Because if the public loses faith in the government’s ability to handle personal information properly, then big-ticket, transformational policies and programs will stall, and public benefits will not be realised. When people don’t trust electronic health records, some will avoid medical treatment, thus impacting on public health outcomes. When people don’t trust what the ABS is going to do with their data, some won’t respond to the Census anymore, thus impacting on the quality and public value of the data.

Privacy Commissioner Timothy Pilgrim hinted at this, when he wrote to the Secretary of PM&C recently that, given the “several high profile privacy incidents in recent times”, there is an “urgent need” for action by the Australian Public Service to ensure compliance with privacy law, and “broader cultural change” to improve privacy protections, so as to “facilitate the success of the Australian Government’s broader data, cyber and innovation agendas”.

Pilgrim said that more work is needed by government to “build a social licence for its uses of data”, particularly in relation to proposed new uses and increasingly ‘open’ data. He suggested that social licence can only be built through transparency about intended uses of personal information, and effective privacy governance – the current deficiencies in which were the trigger for his letter. However, he also noted that social licence depends on public acceptance: “the broader community must believe that the uses of data which are permitted are valuable and reasonable”.

That letter was written in March, before the latest privacy-invading budget proposals were known. I can only imagine this situation will worsen, as people contemplate proposals like the creation of shared e-health records for everyone by default, or the targeted-yet-random drug-testing ministerial thought bubble.

(Did the giant minds at Data61 ever imagine that they and their computing power would be tasked with such a crappy job as sifting through sewage analysis to pinpoint drug-taking areas, so that welfare recipients in those areas can then be chosen at ‘random’ for drug-testing? There I was complaining about the NSW Government seeking to use our water and electricity consumption data to identify slum landlords, but really, this latest proposal to use Big Data on effluent just boggles the mind. As Denham Sadler said in InnovationAus, “The plan has the potential to damage the ‘public good’ reputation of the CSIRO and its data unit Data61 as its research smarts are press-ganged into a politically charged program.”)

Way to go, AusGov! How to ruin public faith in government data analytics: use it not to find a cure for cancer or to tackle wicked policy problems like child abuse or climate change, but to hunt down and punish vulnerable welfare recipients.

As Fairfax economics editor Peter Martin warns, when analysing the impact of the Centrelink ‘robodebt’ program, the failed promises and the targeting of dissenters: “Eventually we will become so sceptical that we will become impossible to win over, no matter how good the budget.”

Post-budget polling indicated precisely that: people simply no longer believe anything they hear from politicians, or they have stopped listening entirely. Only 26% of respondents thought the government could be trusted – the lowest level since the poll began measuring trust in 1969.

This loss of trust is not just about privacy, but has profound implications for the future of our democratic system of government. It’s time the government did its own stock-take, and realised the need to balance up the privacy ledger, before it is too late.

Anna Johnston is the Director of Salinger Privacy, and a former Deputy Privacy Commissioner for NSW. She will be speaking about how to mitigate privacy risks in projects at the Data+Privacy Asia Pacific Conference in Sydney on July 12.

Editor’s note: These incidents collectively serve to dismantle public trust, regardless of whether the initial media reporting was wrong, or mistakes were subsequently rectified. Responses from agencies will be added when identified.