WAM, in effect, got super powers within Twitter’s moderating environment. After submitting an abuse report to Twitter, users can now also submit one to WAM. WAM will verify that the user’s claims are credible, then “escalate” the report in Twitter’s system, flagging it for immediate handling by the company’s moderators.

While WAM hopes to bring all expedited reports to a “speedy resolution” within 24 hours of receiving them, it cautions, “we’re not Twitter, and we can’t make decisions for them.” It instead will advocate for users within the moderation system.

WAM will also be keeping track of whose reports get handled and whose don’t. Using its access to Twitter’s moderation system, WAM will be collecting data on how poorly gendered abuse is handled across the site.

WAM won’t have these super powers forever, nor does it want them. Its executive director, Jaclyn Friedman, told me that she expected the program’s initial test period to run for only about a month.

Even a few weeks, she hopes, will give WAM a sense of how well or how poorly abuse reports are handled across the site. It will also let WAM figure out what Twitter’s moderators consider acceptable.

“We’ll be escalating [harassment reports] even if they don’t fit Twitter’s exact abuse guidelines,” Friedman said. WAM intends to “cast a wider net” and see what Twitter’s moderators address.

WAM is a small nonprofit outside of Boston with a staff of two. Those two employees will be doing all the work: Friedman and WAM’s community manager, Mina Farzad, will personally read and vet every harassment report that WAM receives.

In other words, it is considered a major improvement over the current situation that two people will now devote serious time and attention to Twitter’s harassment problem—even though they work for a small nonprofit that’s effectively donating that time and attention to Twitter, a for-profit and publicly owned corporation.

Twitter held its initial public offering a year ago. In its third-quarter results, issued in late October, it reported increased revenue but a net operating loss of $175 million.

Friedman said as much in her interview with me. “I’m frustrated,” she told me. “For all the money they make off their users, not to be able to spend a little more to make this safer...” Though she said she was excited and encouraged by the project, she lamented in no uncertain terms that it had to be done at all.

“I don’t think we should have to do this work,” she said. “I think it’s a scandal that a tiny, under-resourced nonprofit with two staff members is having to do free labor for them.”

Twitter hasn’t issued a press release on the initiative, but it did send a short statement saying, “We’re always trying to improve the way we handle abuse issues, and WAM! is one of many organizations we work with around the world on best practices for user safety.”