SAN FRANCISCO (Reuters) - Facebook Inc on Monday said a human rights report it commissioned on its presence in Myanmar showed it had not done enough to prevent its social network from being used to incite violence.

The report by San Francisco-based nonprofit Business for Social Responsibility (BSR) recommended that Facebook more strictly enforce its content policies, increase engagement with both Myanmar officials and civil society groups and regularly release additional data about its progress in the country.

“The report concludes that, prior to this year, we weren’t doing enough to help prevent our platform from being used to foment division and incite offline violence. We agree that we can and should do more,” Alex Warofka, a Facebook product policy manager, said in a blog post.

BSR also warned that Facebook must be prepared to handle a likely onslaught of misinformation during Myanmar’s 2020 elections, as well as new problems as use of its WhatsApp messaging service grows in the country, according to the report, which Facebook released.

A Reuters special report in August found that Facebook failed to promptly heed numerous warnings from organizations in Myanmar about social media posts fueling attacks on minority groups such as the Rohingya.

In August 2017 the military led a crackdown in Myanmar’s Rakhine State in response to attacks by Rohingya insurgents, pushing more than 700,000 Muslims to neighboring Bangladesh, according to U.N. agencies.

The social network in August removed several Myanmar military officials from the platform to prevent the spread of “hate and misinformation,” the first time it had banned a country’s military or political leaders.

It also removed dozens of accounts for engaging in a campaign that “used seemingly independent news and opinion pages to covertly push the messages of the Myanmar military.”

The move came hours after United Nations investigators said the army carried out mass killings and gang rapes of Muslim Rohingya with “genocidal intent.”

Facebook said it has begun correcting shortcomings.

Facebook said that it now has 99 Myanmar language specialists reviewing potentially questionable content. In addition, it has expanded use of automated tools to reduce distribution of violent and dehumanizing posts while they undergo review.

In the third quarter, the company said it “took action” on about 64,000 pieces of content that violated its hate speech policies. About 63 percent were identified by automated software, up from 52 percent in the prior quarter.

Human Rights Watch (HRW) in Asia said the report showed Facebook was working to address the issues but that it should step up its efforts, especially in combating hate speech ahead of the general election in 2020.

“It’s often said that in Myanmar, for all intents and purposes Facebook really is the Internet because of its widespread use among online users -- so Facebook needs to act accordingly to head off what will likely be a tsunami of hate speech and attacks in the 2020 election,” said Phil Robertson, HRW deputy director.

Facebook has roughly 20 million users in Myanmar, according to BSR, which warned Facebook faces several unresolved challenges in Myanmar.

BSR said locating staff there, for example, could aid in Facebook’s understanding of how its services are used locally but said its workers could be targeted by the country’s military, which has been accused by the U.N. of ethnic cleansing of the Rohingya.