The problem: much of how this works is a mystery. Facebook wouldn't say exactly how it calculates the scores, who receives them, or what other factors contribute to a person's trustworthiness. Lyons declined to go in-depth on these factors, arguing that doing so might tip off "bad actors" who could use the knowledge to game the system. There's certainly a degree of truth to that, but it could still leave users wondering whether Facebook's reputation scoring is influencing both their own posts and what they see from others.

The approach also appears to partly contradict Mark Zuckerberg's own remarks from recent weeks. In his Recode interview, he claimed it was hard to "impugn intent and to understand the intent" of people pushing false narratives. That's not entirely true -- Facebook can clearly assign reputation values to people who knowingly submit false reports, among other criteria. It's just not gauging the intentions behind the posts themselves.

All the same, there are reasons for Facebook to use reputation rankings. Far right groups have regularly used false reporting for harassment, and investigations into reports can draw attention to content that runs afoul of Facebook's policies. The firm's executives were skeptical when they saw a surge of activists reporting Alex Jones and InfoWars for promoting hate speech and false conspiracies, but those reports still drew scrutiny that ultimately led to Facebook banning Jones and InfoWars for policy violations. As nebulous as the rating system is, it might curb abuse and bolster legitimate complaints.