I’m a big Facebook user. It’s where the bulk of the readers of my blog come from, and I love being able to stay connected with my friends across the country and the world. Of course, it comes with its problems. The biggest one (other than the tremendous time sink it can be) might be the fact that Facebook forces me to see what a lot of my friends and acquaintances really think.

It’s much easier to think the best about people when you’re not faced with a torrent of memes and political posts.

“But Bryce,” you say. “Don’t you do the same thing? You’re posting about politics and other current events all the time on this blog.”

True. But I’d like to think I’m pretty consistent in my views. Anyone who meets me in person will see and hear the same things I’m saying and thinking aloud on this blog. And I also liberally hide people from my Facebook feed. (Something I encourage everyone to do.) If you’re not someone I know well, I’ll happily add you as a friend, but that doesn’t mean you get a spot in my feed.

However, even with all the feed gardening, I still get a slew of posts by well-intentioned people who, frankly, don’t know what they’re memeing about. In a way, I can’t blame them. Social media has made it all too easy to share information. You’d think I’d applaud that, as a librarian. Yay information!

But a whole ton of what gets shared isn’t true. In some cases, this is accidental. A quote is misattributed, and then that’s passed forward from share to share. In other cases, lies are presented as fact. That’s clearly a problem, but even the misattribution of quotes is an issue for me.

When we start playing loose with facts (including who said what when), then it becomes that much easier to play loose with bigger facts. We shift focus from the source and look only at the content. “It’s the idea that matters,” I hear. “What does it matter who said it?”

It matters so much! Not just who said it, but who they were speaking to. The easy example is anything written in the Onion. Some people don’t realize it’s satire, and they get upset when the joke sails past them. But it’s more complicated than that, of course.

I personally don’t believe pretty much anything I read on partisan news sites. And I don’t just mean Fox or MSNBC. (If you follow those sites, fine. But please pair them with some opposing viewpoints.) No, I mean sites like Drudge or Breitbart. Sites that unapologetically skew the news to present a worldview that just isn’t based in reality.

Information isn’t all of equal value. It needs to be sifted and analyzed. It needs to be validated. Facts are very different from opinions, no matter how much one might wish it were otherwise. Base your worldview on opinions, and you can be in for a rude awakening. Forming an opinion based on fact is fine, but opinions based on opinions get you into dangerous territory.

We need to pay attention to the small things, because they lead to big things. A personal example: a few years ago, I shared something that basically talked about how frivolous lawsuits were driving up the cost of health care. I didn’t need to cite anything: everyone knew this was fact. At least, that’s what I believed. But then a lawyer friend spoke up and said that just wasn’t true.

I leapt into action to prove him wrong. I’m a trained librarian, dagnabbit. I can find information with the best of them. Except when I went looking, I discovered . . . he was right. The actual studies done said exactly the opposite of what I had been passing on as fact.

So I admitted I was wrong and changed my mind. Popular opinion doesn’t trump fact. (No pun intended, but it’s a convenient turn of phrase.)

These days, you can find something online that will confirm just about any previously held belief you have. Anything you want to believe, someone will be out there cheering you on and telling you how right you are. How do you tell if what you’re reading is reliable information or just a bunch of fluff?

Use the CRAAP test. It’s a series of questions librarians promote to help people evaluate information.

C is for currency. When did the information come out? Has it been outdated by new findings?

R is for relevance. Does the information actually relate to the issue at hand?

A is for authority. Who provided the information, and why are they someone to be believed? What is their background and education, and do they have any ulterior motives?

A is for accuracy. Is the information corroborated by other sources? If five sources say one thing and a sixth says the opposite, that doesn’t mean you can ignore the five. You have to figure out why the sixth is saying something different and whether it can be believed.

P is for purpose. Why was the information created? Did someone pay for it to be publicized? If so, who? Is it designed to persuade? Is it unbiased?

The thing is, applying this test to everything can be a real pain in the rear. It’s so much easier to just share the post or the meme and forget about it. But if you keep doing that, you can end up having a skewed understanding of what’s actually happening. You end up living in an echo chamber where everything agrees with what you’ve always believed.

Here’s a pro-tip for you: if you haven’t ever had some basic assumptions proved wrong, I don’t think you’re actually thinking that much about the world around you. But that’s just me.

For now, I’d just be a whole lot happier if everyone paid attention to the kind of information they were consuming. Is that too much to ask?
