As Meredith Ahlberg ushered friends into her home in East Oakland’s Ivy Hill neighborhood for a party on a Saturday in early March, she noticed that her phone was lighting up with notifications. There were new messages from agitated neighbors on the localized social network Nextdoor, warning the neighborhood about “sketchy” men—one in a “white hoodie,” the other “a thin, youngish African American guy wearing a black beanie, white t-shirt with dark opened button down shirt over it, dark pants, tan shoes, gold chain.” These men, the poster wrote, were “lingering” and searching for a nonexistent address.

“Scary sketchy,” a poster commented. One neighbor suggested the situation warranted a call to the Oakland Police Department.

But Ahlberg, who is white, recognized the “suspicious” men: they were her friends, looking for her front door. By the time she saw the posts, her friends had found the correct address and Ahlberg was looking right at the “thin, young, black man” with the gold chain. The co-owner of a clothing store in downtown Oakland, he looked “ridiculously handsome and stylish,” she said in an interview. She was horrified at her neighbors’ assumptions.

“They were friends of mine,” she wrote in a response to the Nextdoor post, explaining how she accidentally gave them the wrong address in a party invitation. “Not sketchy at all. Really great men and wonderful folks, just lost,” she added. “That’s a relief,” one neighbor wrote.

While Nextdoor’s ability to assist in crime-spotting has been celebrated as its “killer feature” by tech pundits, the app is also facilitating some of the same racial profiling we see playing out in cities across the country. Rather than bridging gaps between neighbors, Nextdoor can become a forum for paranoid racialism—the equivalent of the nosy Neighborhood Watch appointee in a gated community.

Ahlberg is an East Coast native who moved to Oakland three years ago; Ivy Hill, where she lives, is what real estate agents call a “transitioning” neighborhood. She appreciates the information-sharing benefits of Nextdoor, but is concerned about the racial profiling that happens there. Since signing up for the app in 2012, Ahlberg has repeatedly seen black people in the neighborhood described as “suspicious” characters. “The most agitated alert messages are, by far, in reference to young black men who are seen as dangerous or a possible threat,” she said.

The same week as Ahlberg’s party, the New York Times wrote about Nextdoor’s venture capital-fueled growth, and its attempts to get community leaders onto its platform. It recounted the usual company lore about Nextdoor’s explosive growth over the last four years, leading to the creation of 53,000 micro-communities in the U.S. with users now sending 5 million messages a day. Like most media coverage of Nextdoor, the Times story didn’t mention the tense racial conversations that often play out there, and sometimes spill outside the app’s walled garden onto the open Internet.

“Racism quietly flourishes in San Leandro,” wrote one blogger citing Nextdoor posts in that city bordering Oakland. A woman in St. Louis blogged about Nextdoor becoming the jumping-off point for a discussion of how black mothers raise their sons. “Nextdoor: In case your Facebook feed isn’t racist enough,” is how a woman in Wisconsin titled a Tumblr post about the discomforting posts she saw on the network.

I signed up for Nextdoor last month to get a read on the type of conversations that happen there. I chose a North Oakland neighborhood where I spent time growing up. It’s a racially mixed community of retired black people, younger white artists, and quite a few teachers—not a perfectly safe neighborhood, exactly, but another rapidly gentrifying Oakland enclave. The posters there also seemed to see skin color as a reason for suspicion.
Most of what I saw on Nextdoor didn’t surprise me: a post about a garage sale here and a couple of reports of gunfire there. But I was taken aback by a post about a whites-only meeting to discuss racial profiling on Nextdoor. I wasn’t invited, apparently, because I am black.

The meeting was organized by Miranda Mickiewicz, an old high school classmate of mine who describes herself as “a white person who acknowledges participating in but tries to mitigate the negative consequences of gentrification.” Mickiewicz wrote that she was organizing the meeting after “notic[ing] a trend on this Nextdoor group that makes me very uncomfortable”—namely, her neighbors claiming that “suspicious” African American people were doing things in the neighborhood. She wanted to tackle the issue of racial profiling in a space where attendees would be more comfortable “voicing their (sometimes offensive and hurtful) internal patterns.” She hoped that talking openly about those patterns could help eradicate them, but worried people wouldn’t be honest in a mixed-race setting. According to a follow-up post, 25 people attended the meeting.

“You’ve got to have face to face interactions because it makes people remember their own humanity, and the humanity of others,” Mickiewicz told me in a follow-up call.

At the meeting, she said, Nextdoor users in the neighborhood discussed how to identify people who actually warrant suspicion. She gave the hypothetical example of an unfamiliar person coming out of a neighbor’s backyard, holding something that she knows to belong to the neighbor: “That’s a totally legitimate time to be suspicious.”

Of course, some suspicions aren’t as well-formed. “There’s an internal prejudice that I have, it’s been pushed on to me by the media, and schools and our whole culture that says: a black guy walking down the street, looking at my house, is suspicious,” she said. “You need to ask yourself if your feelings are based on prejudice or if someone is really in danger.”

For its part, Nextdoor says it doesn’t take an active role in moderating racial profiling by its users. Nextdoor’s guidelines state that users should “refrain from using profanity or posting messages that will be perceived as discriminatory.” Kelsey Grady, head of communications for Nextdoor, says that if a user does cross the line, the company “would interject and potentially suspend a user’s account.” And although Nextdoor communities are user-moderated, the company will step in when a post is flagged for a violation of site guidelines. Grady also claims that Nextdoor’s “real identity” requirement—users must use their full names on the site—makes trolling and abuse less likely. “Typically people have good intent, and sharing crime and safety information is them trying to arm their neighbors with as much information as possible,” she said.

If Nextdoor’s racial profiling problems can’t be solved through heavier moderation, they’ll need to be addressed by the communities themselves, in meetings and community forums like the ones being organized by Mickiewicz and other concerned citizens. We don’t need Starbucks baristas to write #RaceTogether on coffee cups to stimulate conversations about race in our communities. Nextdoor is where those conversations are already happening. Let’s hope these semi-public, semi-private exchanges help diverse communities better understand each other, rather than letting Nextdoor and similar services become yet another place to safely air long-held racial assumptions.

“There seems to be a culture of fear on Nextdoor, where anytime someone feels fear, they call the police,” Mickiewicz said. “This is a misplaced solution to feeling fear, because it can have really serious consequences.”