YouTube is profiting from videos promoting unproven coronavirus treatments, a new report has found, as the company struggles to crack down on misinformation.

The Google-owned tech company is running advertisements with videos pushing herbs, meditative music, and potentially unsafe over-the-counter supplements as cures for Covid-19, according to a report published on Friday by the Tech Transparency Project, a not-for-profit watchdog organization.

The report found at least seven videos hawking such dubious treatments with advertisements from sponsors including Donald Trump’s re-election campaign, Facebook, Liberty Mutual Insurance, the streaming startup Quibi, and Masterclass.com.

After being contacted by the Guardian, YouTube removed four of the videos in question for violating its policies against Covid-19 misinformation. The other three remain online because, according to a company spokesman, they do not directly promote misinformation but instead offer wellness tips.

“We’re committed to providing timely and helpful information at this critical time, including raising authoritative content, reducing the spread of harmful misinformation and showing information panels, using WHO data, to help combat misinformation,” the spokesman said.

He said YouTube had removed thousands of videos related to misleading and dangerous coronavirus content in recent weeks.

Many social media platforms have struggled to quell coronavirus misinformation, as the number of people infected tops 1 million.

Companies including YouTube may be forced to rely more heavily on artificial intelligence tools to moderate content while employees work from home, said Megan Lamberth, a researcher at the Center for a New American Security, a Washington DC-based thinktank.


“Since the beginning of the pandemic, we’ve seen an enormous rise of misinformation on online platforms,” she said. “Social media companies have tried to respond to the deluge of misinformation, but in many cases, their moderation efforts have not been sufficient.”

TikTok has partnered with the World Health Organization to provide accurate information to users and has put a disclaimer on all videos using the #coronavirus hashtag with accurate information about the pandemic. Other sites, including Facebook, have pushed back against dangerous false cures including cleaning products and cocaine.

The global pandemic has created a situation ripe for the spread of false cures, said Lisa Fazio, a psychology professor at Vanderbilt University who studies misinformation and how it spreads.

“If the most profitable videos are those that feel good and provide easy answers, then it’s highly likely that they will contain misinformation,” she said. “In reality, the current situation is complicated and has few easy answers.”

YouTube initially prohibited the monetization of videos about Covid-19 under its “sensitive events” policy, which bars advertisements on videos regarding armed conflicts, terrorist acts, and “global health crises”.

However, it reversed that policy on 11 March, saying it wanted to “make sure news organizations and creators can continue producing quality videos in a sustainable way”. It then enabled coronavirus video ads for “a limited number of channels”, and on 2 April it expanded monetization of content mentioning or featuring Covid-19 to all creators and news organizations.

That change may make misinformation a source of profit for some creators, the Tech Transparency Project report said.

“In lifting restrictions on advertising in videos about the coronavirus pandemic, YouTube has made disinformation lucrative for some unscrupulous content creators and a liability for the brands that unwittingly support them,” the report said.