Facebook, Google, YouTube, Microsoft, LinkedIn, Reddit, and Twitter say they're working with one another and government health agencies to ensure people see accurate information about the novel coronavirus and COVID-19.

The companies are hoping to combat fraudulent and harmful content on their platforms, according to a joint statement published on Facebook's website Monday.

The coronavirus pandemic has caused a spike in intentionally false news and profiteering that's testing the industry's ability to crack down on harmful content.


Several of the world's largest social-media companies have announced they're working together to fight misinformation surrounding the coronavirus pandemic and the COVID-19 disease, according to a joint statement published to Facebook's website on Monday.

Facebook, Google and its subsidiary YouTube, Microsoft and its subsidiary LinkedIn, Reddit, and Twitter all cosigned the statement.

"We're helping millions of people stay connected while also jointly combating fraud and misinformation about the virus, elevating authoritative content on our platforms, and sharing critical updates in coordination with government healthcare agencies around the world," the statement said. "We invite other companies to join us as we work to keep our communities healthy and safe."

Social-media companies are under immense pressure to crack down on rampant fake coronavirus cures, false testing methods, and other inaccurate or misleading claims that have spread across their platforms.

Facebook and Twitter have taken steps to ban coronavirus content that could cause harm, and both platforms say they'll highlight information from government agencies in searches for coronavirus-related terms.

Google recently announced a 24-hour coronavirus incident-response team and said it would work to remove misinformation from search results and YouTube while also promoting accurate information from health agencies. On Sunday, Google's sister company Verily released an apparently half-finished website meant to direct Americans to testing locations after President Donald Trump announced it prematurely.

But the sheer volume of intentionally false news, which the World Health Organization has called an "infodemic," is testing whether the industry is capable of effectively limiting the spread of misinformation.

NewsGuard, which rates websites by trustworthiness, said in early March that "health care hoax sites" had received more than 142 times as much social-media engagement in the previous 90 days as the websites of the Centers for Disease Control and Prevention and the World Health Organization combined.

Even before the COVID-19 outbreak, Facebook, Google, Twitter, and others were under fire from lawmakers and other critics who argued the companies weren't doing enough to stamp out harmful and misleading content in other contexts, such as violent extremism, cyberstalking, and political ads.

On Monday, Facebook CEO Mark Zuckerberg said it was easier for Facebook to "take a much harder line" in cases like a global health emergency, while Sundar Pichai, the CEO of Google's parent company, Alphabet, told employees in a memo that this was a pivotal moment for the company, according to Bloomberg. It remains to be seen how much of a difference the companies' aggressive efforts will make in halting the spread of harmful coronavirus content.