TikTok directed its moderators to prevent people with “ugly facial looks” or who shot videos in “slums” or “dilapidated housing” from having their posts promoted to its widely viewed “For You” section, according to The Intercept. A TikTok spokesperson confirmed to The Intercept that the policies had once been in place but said the guidelines were “an early blunt attempt at preventing bullying” that are no longer used.

Portions of these discriminatory guidelines have leaked in the past, revealing that TikTok intentionally prevented posts from LGBTQ users and users with disabilities from surfacing in this section. The Intercept has more detail from the leaked documents, showing that they ban people who have an “abnormal body shape” such as a “beer belly” or “ugly facial looks” such as “too many wrinkles.”

“The video will be much less attractive” to new users, TikTok wrote

The guidelines also ban videos from people who appear to be poor. Cracked walls or old decorations are enough to have a video suppressed, according to the leaked guidelines.

While TikTok says the goal was to prevent bullying (and past leaks have referenced bullying as a policy rationale), the notes accompanying these restrictions explain that TikTok viewed these traits as less likely to draw in new users. “If the character’s appearance or the shooting environment is not good, the video will be much less attractive, not worthing to be recommended to new users,” reads the guide, which The Intercept says was translated by TikTok from Chinese to English for use globally.

A TikTok spokesperson told The Verge that the guidelines obtained by The Intercept are regional and “were not for the US market.” The company has made many of these changes over the past year. In that time, it hired a US head of trust and safety and launched trust and safety offices in California, Dublin, and Singapore to oversee the development of moderation policies, the spokesperson said.

TikTok also had separate guidelines for moderating live streams that ban “controversial content” like references to the Tiananmen Square protests, Tibet and Taiwan, police, or criticism of “political or religious leaders.” When The Guardian first reported on these guidelines last September, a spokesperson for ByteDance, which owns TikTok, said the rules were no longer in use as of May 2019.

TikTok has been under growing scrutiny for its moderation and data collection practices as the service has exploded over the past year. Much of that goes back to TikTok’s ownership by ByteDance, which is based in China. TikTok has been criticized for appearing to censor pro-democracy protests in Hong Kong, and the US government has floated the possibility that the app is a national security threat. It’s enough pressure that ByteDance has even considered selling the app off (though a spokesperson previously called the sale rumors “completely meritless”).

TikTok also said today that it would stop using China-based moderators to review international content. In a comment to The Wall Street Journal, a TikTok spokesperson indicated that moderators in China had been reviewing some international content but not videos in the US.