Don't call it a dislike button.

Facebook is testing a new "downvote" feature on a limited number of public pages as a way to moderate comments, and the company is adamant this isn't the oft-requested dislike button.

When a user "downvotes" a comment, they're given the option to flag it as misleading, offensive or off-topic. Facebook will then hide the comment from them.

There's no "thumbs down," and the number of "downvotes" a comment receives won't be shared with the commenter or made public. Instead, Facebook said, it is a way for the company to solicit feedback.

The test follows the likes of Reddit and Quora, which use "downvoting" as a way to ensure the best comments rise to the top. A Facebook spokesperson said there are currently no plans to expand the test.

The feature is being shown to a "small group": 5 percent of Android users in the United States with their language set to English, according to Facebook. Additionally, it is appearing only on public page posts, so you still won't be able to "downvote" whatever your crazy uncle has to say on his Facebook page.

There's a reason why it's not a "thumbs down." A dislike button has long been one of the most requested features on Facebook, but it's something Mark Zuckerberg is reluctant to add.

Speaking at a town hall event in September 2015, Zuckerberg said he didn't want Facebook to turn into a place where people "upvoted" and "downvoted" comments.

What people really want, Zuckerberg said at the time, "is the ability to express empathy. ... Not every moment is a good moment."

One month later, Facebook began testing its set of reactions, before rolling out options to express empathy through "like," "love," "haha," "wow," "sad" and "angry" emoji.