A Facebook executive admitted Monday the global social network might be bad for democracy.

Commenting on Russian meddling in the U.S. election, the rise of fake news and the spread of political harassment, Samidh Chakrabarti, Facebook’s civic engagement product manager, wrote: “I wish I could guarantee that the positives are destined to outweigh the negatives, but I can’t.”

Chakrabarti blogged that Facebook had made a positive impact in the past, such as during the Arab Spring, but he admitted the company faces some big issues today.

“As unprecedented numbers of people channel their political energy through this medium, it’s being used in unforeseen ways with societal repercussions that were never anticipated,” he wrote.

CEO Mark Zuckerberg announced earlier this month his goal for 2018 is to “fix Facebook.”

Chakrabarti admitted that Facebook was “far too slow to recognize how bad actors were abusing our platform” in 2016, while detailing some of the ways the platform is trying to neutralize those risks:

Foreign interference: Chakrabarti called Russia’s campaign to influence the U.S. election the “elephant in the room,” adding it is “abhorrent to us that a nation-state used our platform to wage a cyberwar intended to divide society.” He said Facebook is tackling the problem by making politics much more transparent, while balancing the needs of activists in countries where their work might make them targets of corrupt regimes.

Fake news: Facebook is taking a number of steps to reduce the amount of false news on its platform, including emphasizing “trustworthy news” sources, making it easier to report fake news, and employing an increased number of human fact-checkers.

Political harassment: Despite plans to hire 10,000 more people this year to work on safety and security, responding to the problem of political hate speech is going to be one of Facebook’s most difficult challenges, particularly given the nuances surrounding this issue. Chakrabarti said it “is likely to remain a challenge” in 2018.

The company said Sunday that in Europe, where the platform has faced some of its most strident criticism, executives will embark on a charm offensive in a bid to stop lawmakers from implementing harsh new legislation that could cost the network money or market share.

“We have to demonstrate we can bring people together and build stronger communities,” Elliot Schrage, vice president of global communications, marketing and public policy, told the DLD conference in Germany. “We have overinvested in building new experiences and underinvested in preventing abuses.”