Here’s the latest way to accrue tons of useless internet points: Help YouTube moderate its own platform. YouTube announced plans Tuesday to pass the moderation buck onto its own customers—meaning at least some of YouTube’s comment and video flagging duties will now be in the hands of its regular, and presumably unpaid, users.

YouTube has done what it can to heighten the appeal of its new moderation program by branding said moderators as “heroes.” In a video published to its YouTube Help channel Tuesday, the Google-owned video platform said the program, called YouTube Heroes, awards points to users and offers a tiered set of moderation duties, ranging from the ability to add subtitles or captions (subtitles transcribe speech only, while captions also describe other sounds) to testing new YouTube features before they are rolled out to the masses.

YouTube said moderators begin at level one with access to a community dashboard—it’s still unclear what the dashboard contains. Moderators advance through levels two to five by earning points: one point for reporting a video that violates YouTube’s content guidelines, one point for contributing a sentence of subtitles later chosen for publication with a video, and 10 points for answering a question on a community forum that is later marked as the “best answer.” Moderators gain more powers as they move up each level, such as the ability to flag multiple pieces of abusive content at once, attend Google Hangouts and workshops, and directly contact YouTube staff.

Initial response to the program, while muted, appeared to be mostly positive. In its defense, YouTube Heroes may go some way toward reducing the unrelenting spam and hateful commentary that now plagues almost every popular YouTube video.

The program is certainly an unusual move from YouTube, which has faced criticism that it fails to keep users abreast of feature and policy changes. Most recently, many YouTubers, including the platform’s most popular, PewDiePie, berated the platform for failing to tell users it had demonetized videos—prevented advertisements from appearing on videos, and thereby stopped users from earning ad revenue—that did not meet its content guidelines. (YouTube later said it was rolling out “improved notifications” to inform users when it demonetizes their videos.)

Anyone with a YouTube account who is of legal age and willing to receive emails from YouTube can apply for the program. The platform said it “will only respond to successful applications,” but did not say how it would select who gets to moderate.