Do you trust me?

Do you trust what you are about to read, assuming you keep reading? (Keep reading!) Do you believe that I comported myself ethically during my reporting, did not make anything up, did not use the work of others without credit?

Let me put that another way:

Do you trust that this article will make you feel better, or correct, about the world? Do you think that I, as the writer, have some connection to you, as part of a community? That I want you to be informed, sure, but also protected?

Both of those paragraphs define trust, but very differently. Which makes it both troubling and a little weird that last week the social network Facebook—in a news release attributed to Adam Mosseri, head of the company’s newsfeed—announced it would start prioritizing “trusted” news sources. “We surveyed a diverse and representative sample of people across the US to gauge their familiarity with, and trust in, various different sources of news,” the release says. “This data will help to inform ranking in News Feed.”

By one estimate, Facebook has 214 million US users, and is a major disseminator of news produced elsewhere. Some of that news is fake; the social network’s users are prone to spreading extreme content, and some of that content is literal propaganda. Russian agents used Facebook to disrupt the US elections in 2016, exposing 140 million people to their trolling. Even Facebook knows it has a problem—in a corporate post, the company’s product manager for civic engagement acknowledged that social media could “corrode democracy,” and he listed Facebook’s efforts to expose untruths and deter people from sharing misinformation.

The relationship between Facebook and the news media is, as the site might put it, complicated. Much of the ad money that used to go to independent news outlets now goes to Facebook—the company generated more than $27 billion in ad revenue in the first nine months of last year, topping Comcast and Disney—while advertising in newspapers and magazines fell off a cliff. Money that used to pay for news now pays for Facebook.

So the question you should ask next is not how Facebook can figure out which news organizations people trust. It’s not even whether that’s possible. The question is whether that’s the right question at all.

Facebook plans to gather its data with a poll. “As part of our ongoing quality surveys, we will now ask people whether they’re familiar with a news source and, if so, whether they trust that source,” writes Facebook founder Mark Zuckerberg on, duh, Facebook.

This has turned out to be literally true. BuzzFeed published the complete poll on Tuesday. It asks users which news outlets on a list they’re familiar with, and how much they trust those “domains.” That’s it.

The five possible answers range from “entirely” to “not at all,” easy to code as one through five (or five through one). “The idea is that some news organizations are only trusted by their readers or watchers, and others are broadly trusted across society even by those who don't follow them directly,” Zuckerberg writes.

So, yeah. That’s probably not going to work.

In his 2002 book Trust and Trustworthiness, the late political scientist Russell Hardin writes that trust itself has at best messy definitions, not widely agreed upon. “Quarrels about what it ‘really’ means sound like the worst of Platonic debates,” Hardin writes. “There is no Platonically essential notion of trust.”

That doesn’t stop him from trying, though. “Trustworthiness,” Hardin says, is the raw stuff, the thing that a person or an institution might possess. “Trust” is what someone feels. It’s a three-part relationship: A trusts B to do X. If you have only two of those elements, that’s not really trust.