From the first messengers shouting in cobblestone squares, to the rise of newspapers made possible by the printing press, to today's 24-hour news cycle, our obsession with exchanging news still burns strong.

As news has shifted online, social media has become the digital messenger, amplifying the reach of breaking stories. It has changed the way people share information, personal messages, and opinions about current events and their own experiences. Companies born in the digital age, like Facebook, were first celebrated for letting people interact more easily and quickly across communities around the world. Many of those same companies have since acknowledged being used as tools for spreading misinformation, or fake news, that reaches hundreds of thousands, even millions, of people.

Objectivity suddenly became difficult to pinpoint as information spread online at a rapid, uncontrollable pace. Sharing news has become so frictionless that it is easy to forgo a critical lens while scrolling through our social feeds. Reading the comments section doesn't help either: it is often filled with abuse, trolling, harassment, racism, and misogyny, effectively discouraging constructive discussion. Some sites have responded by closing comments altogether; in 2015, the CBC disabled comments on stories about Indigenous people, citing a lack of moderation resources, a policy that continues today.

As these companies and their teams of coders, engineers, and data scientists scramble to fix these problems, a solution may exist in an unlikely place.

In the Discourse Processing Lab atop Burnaby Mountain, Maite Taboada, a professor of linguistics at Simon Fraser University (SFU), is harnessing the power of big data to make social media and online discussion platforms better, more reliable places for communication. Her research sits at the intersection of linguistics, computational linguistics, and data science.