"Two dead sticks," reckoned Kath. When it comes to university fee deregulation, the rankings offer no clear answer. Credit:Fairfax Media The deeper meaning: people don't interpret information objectively. We twist it to fit in with our preconceived ideas and (often subconscious) thought patterns. So it is with the latest Times Higher Education world university rankings. Eight Australian universities made the top 200 this year, led by Melbourne University at number 33. Education Minister Christopher Pyne has no doubt. The poor performance of Australian universities compared with our Asian neighbours is a "clarion call" to the Senate crossbenchers to support the uncapping of university fees.

Greens Senator Lee Rhiannon argues the opposite. The rankings show Australia's universities are already "world class" and that fee deregulation will endanger the egalitarian nature of the sector. When it comes to fee deregulation, the rankings offer no clear answer. People will see what they want to see in them. But how much notice should we take of the rankings anyway? Not much, says one of the nation's leading higher education experts. Grattan Institute higher education program director Andrew Norton says The Times rankings are not "terribly high quality".

"They should not be used as a guide for which university to go to and they shouldn't be used as a guide to higher education policy," he says. In particular, he warns that movements up or down the league table – especially small ones – should not be used a reliable verdict on whether a university is improving or declining. And he's not alone. Australian higher education academic Simon Marginson, one of the leading experts on university rankings, is even more damning. "In social science terms they are rubbish," he told an academic conference last year.

The Times rankings are unusual because they attempt to provide a broad assessment of university quality, covering both teaching and research. That's why the media loves them, and why the experts are especially dubious of them. Single-issue rankings, such as the research-only Shanghai Jiao Tong index, are regarded as more robust. Research quality makes up 60 per cent of The Times rankings – based on surveys, journal citations and research income. Teaching quality makes up 30 per cent – based largely on a "reputation survey" of 10,000 academics. Respondents are asked questions such as, "Where would you send your best graduates for the most stimulating postgraduate learning environment?"

According to Marginson, this survey method allows universities to "recycle reputation". "World-leading universities are dominant for decades, even centuries. Once set they readily hold their position … Ranking reinforces the status closure," he said in 2012. "Rankings feed reputation, which feeds into resources, which sustain reputation and maintain ranking position. And so on." The arbitrary weightings required for broad rankings are also problematic. "Why should, say, the number of PhDs awarded be twice as important as the percentage of international staff?" Marginson asks. "If the ratio is reversed and international staff become twice as important as PhDs, dozens of universities move up and down the league tables."

Flawed or not, The Times rankings will remain powerful. As Buzzfeed shows, everyone loves a list. And there are few better marketing tools for universities – especially to attract international students. But if you're using them to decide education policy in this country, they aren't much more useful than two dead sticks.