Instagram and its parent company Facebook are constantly waging a complex battle with bad content on their platforms. But the latest chapter of that fight involves them stepping onto the slippery slope to censorship, worrying artists, people with disabilities, consensual sex workers, and those who are in various ways body- and sex-positive.

As part of a wide-ranging series of updates from Facebook on its content policies, the company said this week that its algorithm would demote content that does not violate Instagram’s community standards, but is considered “inappropriate” or “borderline.”

This means the posts won’t be deleted or banned, but it will be harder to find them in the Explore tab, or through hashtag pages (so if you search for a given hashtag, the borderline content won’t show up or will be buried under all the other posts).
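Instagram has not published how the demotion actually works. Purely as an illustration, and not a description of Instagram's real system, a ranked feed could apply a score penalty to flagged posts so they sink rather than disappear; every name and weight below is a hypothetical assumption:

```python
# Hypothetical sketch only -- not Instagram's actual ranking code.
# Shows how a "borderline" flag can bury posts without deleting them.
from dataclasses import dataclass

@dataclass
class Post:
    id: str
    relevance: float   # base ranking score (assumed)
    borderline: bool   # flagged as "non-recommendable"

BORDERLINE_PENALTY = 0.5  # assumed multiplier; real weights are unknown

def ranked_feed(posts):
    """Order posts by penalized score; borderline posts stay in, but lower."""
    def score(p):
        return p.relevance * (BORDERLINE_PENALTY if p.borderline else 1.0)
    return sorted(posts, key=score, reverse=True)

posts = [
    Post("a", relevance=0.9, borderline=True),
    Post("b", relevance=0.6, borderline=False),
    Post("c", relevance=0.8, borderline=False),
]
# Post "a" has the highest raw relevance, but the penalty drops it last.
print([p.id for p in ranked_feed(posts)])  # -> ['c', 'b', 'a']
```

The point of the sketch is that nothing is removed: a flagged post is still retrievable, it just never surfaces near the top of a hashtag or Explore page.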

“We’re working to ensure that the content we recommend is both safe and appropriate for our community, and that means we are going to be stricter about what content is recommended to people on Explore and hashtag pages,” an Instagram spokesperson told Quartz. Some of this was foreshadowed by Mark Zuckerberg’s long blog post about content governance in November.

The policy update on borderline posts, published on Instagram’s Help Center website, only mentions sexually suggestive content, but, according to TechCrunch, during a presentation to journalists on April 10, the company gave other examples (Instagram later confirmed this to Quartz).

Other types of these “non-recommendable” posts included those containing misinformation and violence, as well as content that is “graphic/shocking,” represented in the presentation by a drawing of a skull and an image of what appears to be a foot with a skin problem.

As TechCrunch points out, the policy as shown to users is not clear at all, since it only mentions “sexually suggestive” content—despite the company telling journalists the policy was broader—and offers no delineations that could help users understand when their content would be demoted.

“We want to be transparent with the community about the types of content we look at, but we’re also aware that publishing this information in full will allow bad actors to get around our efforts,” a spokesperson told Quartz when we asked about the lack of other examples. “We’re thinking about the right way to publish this information, and we’ll keep the community posted on that.”

But news of the policy update, combined with the lack of clarity, sparked criticism and anxiety among a variety of communities.

“I have given up on Instagram,” commented a sex worker on Twitter.

“Another day another institution policing women’s bodies,” tweeted Emily Sears, a model.

Nate Igor Smith is a photographer who shoots, among other subjects, nude, semi-nude, and sexually explicit photos of women. He says his work likely won’t be impacted, since Instagram isn’t a big part of his business model, but that it will affect many women, including sex workers who depend on being able to promote themselves on social media and have in the past used the internet to make their work safer. “Instagram has incentivized sexy, while also punishing it. So now pushing this stuff down seems like a dumb solution to a problem that doesn’t really exist,” he told Quartz via Twitter direct message. “The idea that nudity is harmful is insane puritanical bullshit that the tech community used to see through, but companies don’t want to see their ads next to nudity.”

Katrin Tiidenberg, a researcher and professor at Tallinn University in Estonia, noted in an email that the move is part of a broader set of moves by internet companies that observers have called a “deplatforming of sex” or the “internet war on sex.” Tumblr, the social platform known as one of the last bastions of sexual self-expression, particularly for marginalized communities, completely banned porn and other adult content to widespread outcry late last year. Publicly talking about or celebrating sex—particularly when it’s about personal exploration, and not its commodified version, “using boobs to sell random items”—is still culturally taboo, Tiidenberg said. It “automatically, and without much thinking or discussion, qualifies as content that advertisers are fidgety about.” When social media platforms are giant corporations, that’s what they worry about.

Many commenters, including Tiidenberg, also pointed to the possible role of SESTA/FOSTA, a US law that has made social media companies liable for sex-trafficking content. It “makes social media platforms more wary of anything sexual as it is unclear how they might be penalized for it,” she said.

Instagram has long been criticized for its attitude toward nudity, especially female nudity. It allows images of female nipples only in photos of breastfeeding or mastectomies. It allows nudity in artwork, but only in sculptures or paintings, not photographs. Even that is inconsistently enforced, with some posts being censored and others not.

But the new policy goes beyond nudity, as the slides from Instagram’s presentation to journalists suggest.

“RIP anymore underwear/stoma bag photos,” tweeted Hannah Witton, a social media personality who writes about sex and sexual health, but also about living with chronic illness. If “shocking” content is demoted, per images in the presentation, people with disabilities or illnesses who post empowering content online could conceivably have their reach reduced as well.

Caroline Harrison is an artist who posts her intricate drawings on Instagram. In her art, she explores visible signs of illness, people’s attempts to control their own bodies, and the effects of trauma, she says. “A lot of this is borne out of how I grapple with my experiences with mental illness, my friends’ experiences, body image.”

She hashtags them with “dark art,” “dark surrealism,” or “body horror,” and many people find her account by searching for those and other hashtags. It’s easy to see how her work could be compared to the image in Instagram’s press presentation.

Musicians, particularly heavy metal bands, use Harrison’s artwork for their album covers. With her art demoted in the Explore tab or on hashtag pages, she worries that she might lose commercial opportunities. She also points out that the musicians who use her pieces in their promotional materials, and who generally favor “dark” imagery such as skulls, could suffer as well.

She’s also worried that failing to gain more followers might affect future work, since galleries ask for an artist’s social media handles and could take follower counts into consideration.

Some artists who depend on social media for distribution and promotion feel that, faced with an all-powerful algorithm, they have no choice but to self-censor.

“Hearing about these developments makes me feel as though I need to consider the work I am going to make so that it will fit into the parameters that Instagram will allow,” wrote Caitlin McCormack in an email. She crochets animal skeletons out of string stiffened with glue, and is working on a series involving “very unrealistic” breasts and penises.

Unlike big-name artists who don’t need social media to promote their art, McCormack relies on Instagram for promotion and sales, which are small-scale and often occur via Instagram interactions. She says her reach has already been limited in the past, with her posts failing to appear in hashtag searches. (Instagram has denied this kind of “shadowbanning.”)

“I’m afraid that we’re all going to wind up making work that doesn’t feel sincere, as an attempt to get the attention that we need to gain visibility, instead of having the freedom to make work as we please, and have it be seen via a platform that simply presents content as it is posted, chronologically, without any formulas determining what is and isn’t for people to see,” she said, referring to Instagram’s algorithmically curated feed.

With the policy update, Instagram is probably creating a new problem for itself. It isn’t easy to draw the line between the appropriate and inappropriate.

“You can show very old art that could be considered very shocking. There’s hundreds of things on view at the [Metropolitan Museum of Art] that would violate Instagram’s policies,” said Robin Cembalest, who has a popular Instagram account where she curates images of art.

The curator of the account Creep Machine, which has 300,000 followers and posts dark and surreal art, brought up a drawing of a skull for a life-drawing class, or Francisco Goya’s “Disasters of War” series, as examples of the kind of content the algorithm could get confused by.

“I understand the need to have machine learning/algorithms in place to deal with content at the scale Instagram needs; however, there is also something to be said for a more critical eye when it comes to art,” he said in an email. “Algorithms offer solutions to large-scale issues, but in this case art—and the education and inspiration that comes with it—may suffer.”