At Facebook, the algorithm doesn’t always know best. In late August, the social network put an algorithm in charge of its “trending” feature, selecting the most popular topics, articles and keywords. The change came after allegations in the spring that the contract workers who curated the news headline feature altered which articles appeared for political reasons.

Yet in recent days, as The Wall Street Journal reports, the “trending” lists have appeared more flawed than when humans were in charge. There have been false stories, misidentified keywords, and celebrity gossip in the place of more serious news.

“This doesn’t seem like an appropriate solution,” said Jason Turcotte, assistant professor of communication at California State Polytechnic University, Pomona. “Algorithms don’t necessarily respond to news values or journalistic ethics. Algorithms respond to keywords, search terms and trends.”

In doing away with the team of curators who had run “trending,” Facebook redefined what it perceives as news. It modified the guidelines for the feature, giving more importance to smaller publications. “It’s intentionally broad so we can be inclusive of a wide range of interests,” Facebook said in the guidelines for the staff, published in late August.

...

The company was stung by the criticism that “trending” was biased, as alleged in a May report by tech blog Gizmodo. Facebook denied it was biased. Relying on an algorithm distances the company from what appears on the site—though the algorithm itself is written by humans.

However, as The Onion (comedically) explains, this algorithm has major risks...

Exposing Millennials to new ideas...

Assuring users that the company’s entire team of engineers was working hard to make sure a glitch like this never happens again, Facebook executives confirmed during a press conference Tuesday that a horrible accident last night involving the website’s algorithm had resulted in thousands of users being exposed to new concepts. “Unfortunately, late Monday evening, a major failure in our news feed program allowed a significant number of users to come into contact with concepts unfamiliar to them,” said CEO Mark Zuckerberg, appearing contrite as he emphasized to reporters that the issue had been resolved and that it was now safe to visit the social media site again without fear of encountering any opinions, notions, or perspectives not aligning with one’s existing worldview. “To those who were forced to read a headline they did not agree with when they visited Facebook yesterday, we are deeply sorry. It’s an inexcusable failing on our part if your viewpoints were not reinforced by what you saw onscreen. I want all Facebook users to know that you’ll never again encounter any ideas on our site that are in any way novel or ideologically challenging to you—that’s my personal promise.” Zuckerberg then concluded the press conference by thanking users for their support, assuring them that a news article confirming their own individual political and personal biases would be directed to their news feeds with more information on Facebook’s policy.

But as WSJ notes, Facebook has ambitions to use the algorithm to expand “trending.”

“A more algorithmically driven process allows us to scale Trending to cover more topics and make it available to more people globally over time,” Facebook said in a statement last month.

Several former curators now wonder if Facebook will simply get rid of the feature... especially if non-liberal microaggressions escape into the mainstream.