Zuckerberg told the audience that people want to communicate freely in private because they can be themselves, but he should know that this isn't always a good thing. After all, it's features like Groups -- which Facebook is now putting front and center -- that have created echo chambers where toxic communities thrive. Perhaps the most concerning part of Facebook's "the future is private" strategy is that it hangs on a "community review process with fairness in mind," which relies on moderators to flag abusive and harmful content in Groups. That, of course, includes misinformation, hate speech, nudity, bullying, harassment and violent posts -- which the company is also trying to combat with artificial intelligence.

Facebook says more than 400 million people now belong to a "meaningful" group on the site, with Zuckerberg noting that the idea is to make such groups "as central as friends." The problem with relying on moderators to police content, however, is that Facebook is essentially handing flagging responsibilities to third parties -- in this case, users who may not have the best intentions, and who may themselves be looking to create trouble.

While there's no doubt that there are positive communities on Facebook, it's well-documented how Groups have been exploited by those looking to spread propaganda, fake news and harassment. Yes, private conversations can be great for users who want to feel safe in an online group, but they can also be used to create toxic echo chambers -- the same kind that Facebook regularly has to take down for "coordinated inauthentic behavior." With the company's new privacy-focused vision, harmful Groups may become harder to trace -- not just for Facebook, but for governments, law enforcement, legal experts, researchers and the media.

This could create a dangerous precedent for Facebook, to say the least, at a time when it's still trying to clean up its platform and salvage its damaged reputation.

Beyond focusing on more private groups and relying on moderators, Zuckerberg points to WhatsApp as an example of how Facebook will rework its family of apps into private and encrypted services. The problem, however, is that WhatsApp is far from perfect. In 2018, the spread of misinformation on the app was so bad that it was blamed for inciting lynchings in India. And that wasn't the only time WhatsApp was connected to violence in the region. In Myanmar, pervasive hate speech and hoaxes have led to serious issues across the country, which Facebook is still trying to control.