In a statement shared with TechCrunch, Facebook VP of product management Adam Mosseri said that the company is aware of the fake-news problem. "We take misinformation on Facebook very seriously," the statement says. "We value authentic communication and hear consistently from those who use Facebook that they prefer not to see misinformation." The rest of Mosseri's thoughts read as follows:

In Newsfeed we use various signals based on community feedback to determine which posts are likely to contain inaccurate information, and reduce their distribution. In Trending we look at a variety of signals to help make sure the topics being shown are reflective of real-world events, and take additional steps to prevent false or misleading content from appearing. Despite these efforts we understand there's so much more we need to do, and that is why it's important that we keep improving our ability to detect misinformation. We're committed to continuing to work on this issue and improve the experiences on our platform.

Facebook does not label itself a news organization, even though Pew Research Center found in May that 62 percent of adults in the US get their news from social media, and Facebook is a powerhouse in this space.

As BuzzFeed News reported last week, a team of teenagers in Macedonia figured out how to game the Facebook Newsfeed algorithm, and they made up to $5,000 a month circulating fake pro-Trump stories on the site. Their headlines included, "Breaking: Proof surfaces that Obama was born in Kenya - Trump was right all along," and, "Oprah Tells FOX News Host 'Some White People Have To Die.'"

The prominence of these stories speaks to a larger problem of "filter bubbles" on Facebook and other social media sites, where users end up seeing stories and opinions only from sources they agree with. Facebook has not stated how it plans to address the issues of fake news or filter bubbles on the site.

Facebook came under fire recently for getting rid of human editors who curated the site's Trending news section. The Trending stories are now picked by an algorithm with a poor track record of distinguishing truth from fiction.

Facebook CEO Mark Zuckerberg shared his own thoughts on the election, alongside a photo of him holding his daughter as the results poured in. His update read as follows: