YouTube today warned its creator community that video removals may increase during the COVID-19 pandemic. The company said its systems rely on a combination of technology and human review, but the current health crisis is leading to reduced in-office staffing at certain sites, which means automated systems will be removing some YouTube content without human review.

Today, YouTube uses machine learning technology to flag potentially harmful content, which is then sent to human moderators for review. But because of the measures it's taking to protect staff, YouTube plans to rely more on technology than on people in the weeks ahead.

“We have teams at YouTube, as well as partner companies, that help us support and protect the YouTube community—from people who respond to user and creator questions, to reviewers who evaluate videos for possible policy violations,” YouTube explained in its announcement. “These teams and companies are staffed by thousands of people dedicated to helping users and creators. As the coronavirus response evolves, we are taking the steps needed to prioritize the well-being of our employees, our extended workforce, and the communities where they live, including reducing in-office staffing in certain sites.”

YouTube says allowing its technology to remove some content without human review will let it take down potential violations more quickly and keep the ecosystem protected.

However, automated technology is not perfect, and many videos will likely be impacted by this shift away from human moderation. Some may be removed even though their content didn't actually violate YouTube policy. In those cases, creators are being asked to appeal the decision, which will allow YouTube's remaining moderators to review the video in question and make a final call.

Because of these complications, YouTube says creators won't be punished with strikes except in cases where it has “high confidence” that the video content was in violation.

In addition, the company says it will also be more careful about which content gets promoted, including live streams. Some unreviewed content may not be available in search, on the homepage or in recommendations, either.

YouTube last week said it would allow creators to monetize their videos about the novel coronavirus. Previously, such videos fell under the “sensitive events” section of its advertising guidelines, which prevented monetization of videos about mass shootings, natural disasters or, until now, health crises. The original policy was written to protect advertisers from having their brands appear next to exploitative videos capitalizing on human tragedy for views. But YouTube CEO Susan Wojcicki said the company chose to re-open monetization on coronavirus videos because the topic had become an important part of everyday conversation, rather than a short-lived sensitive event.

Today’s news, however, may put a damper on the creator community’s interest in making videos about the COVID-19 pandemic in the hopes of gaining more views for their channels. Creators will likely worry about their videos being suppressed by YouTube’s algorithms or even mistakenly removed. And videos may be stuck in a longer-than-usual appeals process, given the reduced staffing.

YouTube additionally warned that other areas of its business could be affected going forward, including user and creator support, video reviews, applications for the YouTube Partner Program and even responses on its social media channels.

It advised creators to watch the YouTube help center for further changes to the service.

“We recognize this may be a disruption for users and creators, but know this is the right thing to do for the people who work to keep YouTube safe and for the broader community. We appreciate everyone’s patience as we take these steps during this challenging time,” said YouTube.