For weeks, Facebook has been questioned about its role in spreading fake news. Now the company has mounted its most concerted effort to combat the problem.

Facebook said on Thursday that it had begun a series of experiments to limit misinformation on its site. The tests include making it easier for its 1.8 billion members to report fake news, and creating partnerships with outside fact-checking organizations to help it indicate when articles are false. The company is also changing some advertising practices to stop purveyors of fake news from profiting from it.

Facebook is in a tricky position with these tests. The social network has long regarded itself as a neutral place where people can freely post, read and view content, and it has said it does not want to be an arbiter of truth. But as its reach and influence have grown, it has had to confront questions about its moral obligations and ethical standards regarding what appears on the network.

Its experiments in curtailing fake news show that Facebook recognizes it bears a deepening responsibility for what appears on its site. But the company must also tread cautiously in making changes, because it is wary of exposing itself to claims of censorship.