Facebook announced today that it’s updating its group privacy settings and working to better moderate content that breaks the platform’s rules. The platform is renaming its confusing public, closed, and secret group settings to the more straightforward public and private, with an option to make private groups either visible or hidden to non-members. The change is part of an ongoing effort to improve group safety: admins are getting more moderation tools, and members will be able to see a group’s history and preview its content before accepting or declining an invitation.

The new group settings are also part of the Safe Communities Initiative, which the company started two years ago in an effort to monitor and detect bad content in Facebook groups. The announcement comes in the wake of recent findings that secret Facebook groups have been serving as gathering places for racist, offensive activity. One example surfaced last month, when ProPublica found a group of Border Patrol agents joking about migrant deaths.

The name change itself isn’t likely to stop any bad behavior, as secret groups will still exist under a new label. Closed groups, which only let current members view group content and see who else belongs, will now be labeled private but visible. Secret groups, which are hidden from search but still require an invitation to join, will become private and hidden.

Facebook says it uses AI and machine learning to “proactively detect bad content before anyone reports it, and sometimes before people even see it.” The flagged content then gets reviewed by humans to determine whether it violates Facebook’s Community Standards, but the system is clearly flawed if offensive groups are still flying under the radar.

In April, Facebook updated its policies to hold admins to higher standards, committing to penalize an entire group if its moderators approve posts that break the platform’s rules. To help admins keep their groups in line, Facebook is giving them access to a tool called Group Quality, which provides an overview of content that violates Community Standards. Admins will also have the option to share which rules were broken when they decline pending posts, remove comments, or mute members.