Twitter's rules state that "you may not threaten violence against an individual or a group of people." So when US President Donald Trump tweeted in January 2018 that he had a "nuclear button" that was "much bigger & more powerful" than that of North Korean dictator Kim Jong Un, and (a few months earlier) that North Korea "won't be around much longer" if it continued its bellicose rhetoric, critics asked Twitter to take the tweets down for violating those rules.

Twitter rejected those calls. Instead, the social media giant argued that the words of world leaders are newsworthy and that such newsworthiness can trump rules that might otherwise apply. But activists have kept up the pressure on Twitter. So now the company has rolled out a new policy to deal with this kind of situation.

"In the past, we've allowed certain tweets that violated our rules to remain on Twitter because they were in the public's interest, but it wasn't clear when and how we made those determinations," a Thursday blog post says. "To fix that, we're introducing a new notice that will provide additional clarity in these situations."

The new notice won't be just a tag displayed next to the tweet: users will have to click through the notice before they can see the tweet itself.

The new system will apply only to government officials with verified accounts and more than 100,000 followers. That includes Donald Trump and a number of other officials, both in the United States and overseas. Twitter will also take other steps to limit the distribution of tweets that receive this kind of notice: they won't be featured as top tweets on a user's Twitter timeline, in "safe search" results, or in "recommended tweet" push notifications.

"This notice won't be applied to any Tweets sent before today and, given the conditions outlined above, it's unlikely you'll encounter it often," Twitter says. "We cannot predict the first time it will be used."

Facebook is wrestling with the same issue

Traditionally, sites like Twitter and Facebook have preferred to think of themselves as neutral platforms that host a wide range of content without necessarily endorsing any of it. But as social media has become a more prominent part of national conversations, these companies have faced growing pressure to take a more active role in content moderation.

Facebook, for example, recently got into a feud with House Speaker Nancy Pelosi over a video that was slowed down to make her appear drunk or senile. Pelosi wanted the video removed, but Facebook refused, arguing that it didn't violate Facebook's policies. Eventually, Facebook settled on an approach similar to the one Twitter is announcing today: the Pelosi video stayed up, but Facebook added a notice that the video had been altered. Mark Zuckerberg acknowledged Wednesday that Facebook had been too slow to respond to the fake Pelosi video.