By Kirsten Grind and John D. McKinnon

The world's biggest social-media companies, under fire for failing to police content on their sites, have invited an array of outside groups to help them figure out who should be banned and what's considered unacceptable.

That solution is creating a new set of problems -- public fights, complaints and legal battles.

Silicon Valley giants Facebook Inc., Twitter Inc. and Google's YouTube unit have made a concerted push to seek out input from hundreds of groups, a growing number of which lean to the right. The companies have become receptive to behind-the-scenes lobbying as well.

Among the initiatives, Facebook has privately sought advice from the Family Research Council, a conservative Christian public-policy group, and its president, Tony Perkins, according to people familiar with those meetings. Twitter's Chief Executive Jack Dorsey recently hosted dinners with conservatives, including Grover Norquist, the founder and president of Americans for Tax Reform, which advocates for lower taxes. Advisers on the left include the Southern Poverty Law Center, a civil-rights group that keeps a list of hate groups.

For users frustrated by the lack of clarity around how these companies make decisions, the added voices have made matters even murkier. Meetings between companies and their unofficial advisers are rarely publicized, and some outside groups and individuals have to sign nondisclosure agreements.

And in many cases, posts that are hateful to one group are considered fair game -- or even uncomfortable truths -- to others on the opposite end of the spectrum, opening a whole new arena to continue the political and ideological fights that are often a staple of social media.

When Twitter executives struggled with whether to follow other Silicon Valley companies and remove conspiracy theorist Alex Jones from the platform in August, Mr. Dorsey privately sought counsel from Ali Akbar, a conservative political activist, Mr. Akbar says.

Mr. Akbar advised Mr. Dorsey against kicking off Mr. Jones, despite pressure from users and Twitter employees. Mr. Akbar argued that Mr. Jones hadn't violated any of the site's rules -- a point Mr. Dorsey also made when he explained the decision in a Twitter post. Mr. Dorsey didn't disclose Mr. Akbar's involvement.

"It's important that Jack sought a right-of-center perspective which cannot be found at Twitter," Mr. Akbar says. "Jack was brave."

Twitter ultimately banned Mr. Jones about a month later, citing a violation of its abusive-behavior policy.

Mr. Akbar says in 2018 he also complained to Mr. Dorsey about potential discrimination against a survivor of the shooting at Marjory Stoneman Douglas High School in Parkland, Fla., who was in favor of gun rights. The student wasn't "verified" on Twitter -- a badge the company gives to accounts it deems of public interest -- while several other survivors of the shooting who favored more gun control were given the recognition.

After Mr. Akbar's intervention, the student's account was verified, Mr. Akbar says.

Twitter spokesman Brandon Borrman says the company and its executives personally maintain many outside relationships "to help us benefit from other perspectives on the critical societal issues we deal with." He says outsiders "never override our rules and no outside adviser makes the ultimate decision or dictates our actions," and that Twitter is working to be more transparent on the outsiders involved in its process.

On the Alex Jones decision, Mr. Borrman says Mr. Dorsey "did not and does not personally make enforcement decisions, he defers to the deep expertise of the team."

The reliance on outside opinions goes along with other initiatives tech companies have launched to build their defenses. Companies have added complex internal guidelines on what kinds of posts should be banned and hired thousands of new employees to review content.

YouTube has boosted its "trusted flaggers" program -- groups that are asked by the company to point out inappropriate content on the site -- from 10 to more than 100 between 2017 and 2018. Twitter's Trust and Safety Council spans about 48 organizations around the world.

Facebook says it now consults with hundreds of organizations after it decided late last year to seek more outside counsel on issues like hate speech and misinformation -- broadly known as "content moderation issues."

The tech companies have found themselves in an impossible situation, given the billions of posts that are generated each month and the conflicting agendas of their users, says Klon Kitchen, who manages tech policy for the conservative Heritage Foundation. The foundation has recently forged a relationship with Facebook.

Mr. Kitchen has advised the company that these issues are not likely to ever go away. "These are problems you manage, not problems you solve," he says.

Peter Stern, who handles Facebook's outside engagement efforts from the company's Menlo Park, Calif., headquarters, says the company now seeks advice from up to a dozen outside groups on each policy decision it makes on its platform. He declined to say which groups are consulted.

"If we change the policy, we're going to hear about it, so we might as well involve them," Mr. Stern says. "We had been doing it, but not in a systemized way."

Adam Thierer, a senior research fellow at the right-leaning Mercatus Center at George Mason University, says he used to consult with Facebook and other tech companies. The futility of trying to please all sides hit home after he heard complaints about a debate at YouTube over how much skin could be seen in breast-feeding videos.

While some argued the videos had medical purposes, other advisers wondered whether videos of shirtless men with large mammaries should be permitted as well.

"I decided I don't want to be the person who decides on whether man boobs are allowed," says Mr. Thierer.

Brian Amerige, a former Facebook senior engineering manager, resigned in October after seven years at the company, in part because he objected to the way it decided which content was considered objectionable.

Mr. Amerige says he felt Facebook was trying to avoid allowing anything controversial on the platform, and hampering free speech in doing so. The move to involve more outside groups -- conservative or liberal -- is in his opinion only making things worse.

"What happens when you have an undefinable principle and you defer to other people? It becomes a bunch of one-off-decisions," he says.

A Facebook spokeswoman declined to comment.

While outside groups are technically unpaid, the tech companies contribute to some of the organizations they are seeking out for guidance. Alphabet Inc.'s Google contributes to more than 200 third-party groups, including the Heritage Foundation, National Cyber Security Alliance, and Americans for Tax Reform, according to the company. Facebook and most other companies don't disclose their donations to outside groups.

Executives see the outreach to a cross-section of groups in part as a form of political protection, to defend against the allegation that they are biased against conservatives, a complaint lodged repeatedly last year by President Donald Trump and Republican lawmakers. Some of the conservative groups tapped recently by tech platforms complain that the companies defer too closely to the Southern Poverty Law Center when defining what constitutes hate speech.

Many companies and other groups rely on the center's list of hate groups, which counts nearly 1,000 across the U.S., according to its website. The center also writes about some of those groups on its "Hatewatch" blog.

Keegan Hankes, a senior research analyst at the Southern Poverty Law Center, says the group lobbies tech platforms to remove content it considers hate speech, such as when it successfully asked Facebook to remove content posted by the League of the South, a neo-Confederate group.

A spokesman for League of the South didn't respond to requests for comment.

Mr. Hankes says the center doesn't always win the battle. Groups that say it has too much sway are "overstating the influence that we have," he says.

Gavin McInnes, a conservative activist and founder of the "Proud Boys," which describes itself as "Western Chauvinist" and has recently been linked to violence in New York and other states, has been banned by Twitter, Facebook and Instagram. It's not clear if the companies consulted with the Southern Poverty Law Center on their decisions.

The Southern Poverty Law Center designates the Proud Boys, which Mr. McInnes left in November, as a hate group and has written several online posts about Mr. McInnes. In one post, the group quoted Mr. McInnes saying in a 2016 podcast and YouTube show, "We will kill you. That's the Proud Boys in a nutshell. We will kill you."

Mr. McInnes is planning to sue the Southern Poverty Law Center in the coming days for its role in spreading what he claims is defamatory and false information about him, according to his attorney and a draft of the complaint. The suit plans to mention the decisions by Facebook and Twitter to ban him.

Ronald Coleman, Mr. McInnes's attorney, says the SPLC has made "a very concerted effort to destroy him."

A spokeswoman for the Southern Poverty Law Center declined to comment. A spokesman for Twitter declined to comment. A spokeswoman for Facebook says the center is "one of many groups we work with on our hate-related policies."

Facebook executives have recently reassured some conservative groups that the company doesn't exclusively look toward the Southern Poverty Law Center for advice, according to people familiar with the matter.

(MORE TO FOLLOW) Dow Jones Newswires

01-08-19 1225ET