Nextdoor, your friendly neighborhood social network, was having a racism problem.

Last year, Fusion traced it to the platform's Crime and Safety section, where users were filing incident reports based on little besides the color of a "suspicious person's" skin. Most of the time, those people weren't doing anything suspicious. And when they were, the reports ran light on helpful specifics. "Black man," "African American," and "scary sketchy" are inadequate descriptors at best. At worst, they implicate an entire race of people.

Nextdoor's users probably didn't mean for their reports to read that way. “Ninety-nine percent of Nextdoor’s racial profiling is inadvertent,” says CEO Nirav Tolia, who learned about the troubling messages from the Fusion story. “It’s not a racist person trying to create a firestorm.” Fair enough. Even so, Nextdoor’s user interface did nothing to stop those reports. So Tolia and his team spent six months building one that could. And they broke a cardinal rule of design to do it.

Friction Isn't Always a Drag

Designers like to crow about minimizing the steps in a process. (See: the three-click rule, invisible design, and most things ever written about frictionless interfaces.) But Nextdoor didn't remove steps from its incident reporting process. It added them. "There were a lot of opportunities for things to go more smoothly, and they made an ethical choice to steer away from that," says user-onboarding guru and UX expert Samuel Hulick. Nextdoor's new Crime and Safety section may break numerous conventional rules of user-friendliness, but it breaks them by design.

That's because friction can be useful. “We needed to take a form-centric approach," Tolia says, "by which I mean guide our members more carefully than a standard white text box.” That's all the old incident report was: a single entry field. All a user had to do was fill it out and click "Post." It was as easy as sending an email.


The new interface isn't so straightforward. It uses a series of checkpoints to help you evaluate the content of your report. The moment you begin composing a Crime and Safety message, Nextdoor’s interface asks: “What are you posting about?” The choices are a crime, suspicious activity, or other. Select “suspicious activity,” and a list of tips appears on screen, including one reminding you to focus on behavior, not appearance.
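The branching step described above can be sketched as a small function. To be clear, this is a hypothetical illustration of the pattern, not Nextdoor's actual code; the category names follow the article, but the tip text and function names are placeholders.

```python
# Sketch of a category checkpoint: picking a post type determines
# which guidance the user sees before they can write anything.
# Categories come from the article; tip wording is illustrative.

VALID_CATEGORIES = ("crime", "suspicious activity", "other")

TIPS = {
    "suspicious activity": [
        "Focus on behavior, not appearance.",
        "Describe what the person was doing that seemed suspicious.",
    ],
}

def start_report(category: str) -> list[str]:
    """Return the tips to display for a chosen post category."""
    if category not in VALID_CATEGORIES:
        raise ValueError("What are you posting about? Pick a category.")
    return TIPS.get(category, [])
```

The point of the pattern is that the guidance appears before composition begins, not after a bad post is submitted.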


From there, Nextdoor asks you to describe the incident in question. If you focus on the suspicious person's race, you hit a roadblock.


In fact, the interface won't even ask for descriptions of people and vehicles until the second step. This section is the crux of the new design. It presents you with eight entry fields, only one of which has to do with race. If that's the only one you fill out, an algorithm stops you from advancing.
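That gating rule, where a description with only the race field filled cannot advance, might look roughly like this. The field names are assumptions for the sake of the sketch, not Nextdoor's real schema.

```python
# Illustrative version of the advancement check: eight description
# fields, and race alone is never enough to proceed.

FIELDS = ["age", "height", "build", "hair", "clothing_top",
          "clothing_bottom", "shoes", "race"]

def can_advance(description: dict) -> bool:
    """Allow the report to advance only if race isn't the sole detail."""
    filled = {f for f, v in description.items() if f in FIELDS and v}
    if filled == {"race"}:
        return False           # race by itself is blocked
    return bool(filled)        # otherwise, any filled field will do
```

Note that the check isn't a blanket ban on mentioning race; it only refuses descriptions where race is the single data point.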


To build out this section, Tolia and his team consulted groups like the Oakland-based advocacy organization Neighbors for Racial Justice and the Oakland Police Department. From the OPD, Nextdoor’s design team learned how to pose questions like 911 operators: artfully, thoroughly, and with sensitivity. If a caller says "a dark-skinned man is breaking into a car," the 911 operator asks for elaboration: "OK, what else can you describe?" This type of exchange is what Tolia and his team have baked into Nextdoor's new interface.

The upshot is an incident report that looks not unlike the original, one-step Crime and Safety interface—except now the report includes a thorough and helpful description.


Quantity ≠ Quality

Companies rarely make things difficult for their users on purpose, but it's not unprecedented. Hulick, the UX expert, compares Nextdoor's redesign to Civil Comments, a peer-review platform that requires any user who wants to comment to first review three other comments. It's the online equivalent of taking ten deep breaths before picking a fight.

The trick is to create a system that doesn't feel like a series of reprimands—something Nextdoor's redesign actually suffers from. "Anytime you’re filling out a form online and it admonishes you for not matching up with its own criteria, that’s a universal usability problem," Hulick says. "But this was an admirable decision. The intention behind it seems to be a humanitarian one.”

So far, it seems the redesign worked. A blind study found that racial profiling on Nextdoor is down 75 percent. On the other hand, Tolia says they've also seen a 50 percent increase in abandonment—a lot of the people initiating Crime and Safety reports aren't finishing them.

That's okay. When you add friction, completion rates inevitably drop off, and Tolia says his team anticipated the decline. Still, that's not an easy call for a CEO to make. Nextdoor, like all social media platforms, runs on user participation. But in this case quality trumps quantity: if the quality of users' posts improves, the platform can tolerate posting fewer of them. Nextdoor knows this new interface can’t prevent all racial profiling, but it does ask its users to stop and think twice. Or three times. Maybe even four. That's a habit worth cultivating, online and off.