Getting bamboozled by online misinformation can be like trying to charge your smartphone in a microwave: embarrassing, expensive, and mildly explosive. A dubious, highly edited clothing hack leaves you with shredded, unwearable garments. Hot glue, bereft of editing software and careful lighting, turns out to be ill-suited to making sandals. It can also get much, much worse, and the dangerous lies that spring to mind most readily—4chan’s bomb-making instructions, racist conspiracy theories that seem designed to whip people into homicidal fury—aren’t the only ones going around the internet.

Other, seemingly innocuous distortions of reality can be just as lethal. Earlier this year, two teenage girls from China, who were reportedly imitating a video from YouTube channel Ms Yeah, were grievously injured when a homemade alcohol burner exploded in their faces. Fourteen-year-old Zhe Zhe succumbed to her injuries in a nearby hospital. After people began to blame the YouTuber behind the hack, Ms Yeah took to Chinese social media platform Weibo to apologize and promise to never make such videos again. She denied the teenagers had been replicating her clip, but also offered compensation to their families. The girls had been trying to make popcorn in an empty soda can.

The kind of misinformation that resulted in Zhe Zhe’s death is unlikely to be caught as dangerous by any social media bot or algorithm. Still, some debunkers, like Australian food scientist Ann Reardon, who helms the YouTube channel How to Cook That, partly blame algorithms for how widespread bad info has become. “It’s all about getting views, it’s all about virality, it’s all about making money,” she says in a video debunking several dangerous and fake baking hacks—including one that, in practice, sent molten caramel flying across the room. “They’re making fake stuff because it’s more shareable, it’s more interesting than real stuff.” The proof is in the numbers: Reardon’s debunkings are successful and have won her almost 4 million subscribers, but the accounts she’s criticizing have between 15 million and 60 million subscribers.

According to Reardon and others, when they’ve tried reporting dubious clips to YouTube, they’re informed that those kinds of videos do not violate the rules—and they don’t. The same problem extends to channels that promote questionable beauty products and make pseudoscientific claims about diets and nutrition. These channels aren’t encouraging violence or hate, just telling viewers to consume only raw fruit, to drink copious amounts of celery juice, to avoid vegetables entirely and go carnivore, to eat nothing at all. (Yes, people who believe they can subsist on light alone do exist, and call themselves “breatharians.”) In many cases, those misapprehensions are the YouTuber’s or Instagrammer’s deeply held beliefs, just as conspiracy theorists and anti-vaccine advocates believe the information they promote.

The difference is, when someone searches for anti-vax or other well-known conspiracy theories, YouTube will promote vetted “authoritative” content from news organizations, and sometimes surface a “fact check” information panel depending on your location. When asked, YouTube offered few specifics on whether it planned to expand this system to other kinds of misleading videos. “Misinformation is a difficult challenge, and we have taken a number of steps to address this,” says YouTube spokesperson Ivy Choi. “Our systems are not perfect, but we’re constantly making improvements, and we remain committed to progress in this space.” Translation: Fact-checking every video and post and anticipating every possible new form of misinformation is practically impossible, but the company is trying. The inevitable imperfections of social media platforms’ misinformation nets have given rise to a whole new class of online creator: the scientist-influencer debunking false information in their area of expertise.

Debunking misinformation, unless done with utmost tact and empathy, will often enrage those who have been duped and make them cling to the pseudoscientific evidence even harder.

These influencers can be found on every platform from Facebook to Twitter, but apolitical debunkers tend to live on Instagram and YouTube (or often both), because that’s where “lifestyle” misinformation gets traction. Trying out suspect hacks has been a YouTube staple for years, and still is, but recently the genre has expanded to include many, many videos best summed up as “Subject Area Expert Reacts to Internet Malarkey.” Instagram (and hence Facebook) has even formalized its relationship with some experts, like Science Feedback, a nonprofit dedicated to debunking bad science online, which recently had to tell Instagram users that no, red spots on bananas were not evidence that the fruit was being injected with HIV-positive human blood.