In our third episode of Ars Technica Live, your intrepid hosts Annalee Newitz and Cyrus Farivar talked to journalist Sarah Jeong about online harassment. Jeong is the author of The Internet of Garbage, a book about how companies and online communities are using technology to cope with harassment and bullying. Watch the video, filmed before a live audience of Ars readers in Oakland, California, at Longitude Bar.

Editor's Note: Our apologies for the sound issues. You can hear everything, but there are some crackles and annoyances. We promise to have that fixed for our episode next month.

In our discussion with Jeong, we talked about one of the trickiest aspects of dealing with harassment—that is, what the heck IS it? As Jeong points out, harassment is like spam. It can be almost anything, and the same comment can be perceived as harassment in one context but not in another. There are no helpful legal definitions of online harassment, either. That's why Jeong takes a pretty skeptical view of many out-of-the-box solutions to harassment online. They are "overbroad," she says. They are "blunt instruments" that have the potential to harm free speech. Then again, we have coped with the spam problem by using blunt instruments that interfere with free speech, too. How often have you found valuable mail in your spam folder? Despite the fact that many of us find ham trapped in our spam filters, we accept these limitations in order to make e-mail readable at all. The question is whether we want to do the same when it comes to harassment, bullying, and other unwelcome communication.
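To see why these "blunt instruments" trap ham along with spam, here is a minimal sketch of a keyword filter. All the blocked terms and messages below are invented for illustration; real filters are far more sophisticated, but they fail in the same basic way:

```python
# A deliberately blunt keyword filter (all terms and messages are
# hypothetical), illustrating how legitimate "ham" gets caught.
BLOCKED_TERMS = {"free money", "act now", "winner"}

def is_spam(message: str) -> bool:
    """Flag a message if it contains any blocked term, regardless of context."""
    text = message.lower()
    return any(term in text for term in BLOCKED_TERMS)

print(is_spam("Act now to claim your FREE MONEY!"))    # True: actual spam caught
print(is_spam("Congrats on being the grant winner!"))  # True: innocent mail trapped
print(is_spam("Lunch at noon?"))                       # False
```

The second message is a false positive: the filter has no notion of context, which is exactly the overbreadth Jeong worries about when the same approach is pointed at speech rather than spam.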

We talked about a few systems that companies and communities are using to combat harassment, including Periscope's system where users vote on whether a stream is unacceptable. Other online communities have tried this voting system as well. Twitter users have created a number of blocklists, using apps like Block Together, which allow people to implement a group block of known harassers (there are several flavors of these blocklists, including one that many science journalists use to block trolls who want to yell at us about how evolution is a lie).
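The mechanics of a shared blocklist are simple: a subscriber's effective block set is their own blocks merged with everyone on the list they subscribe to. This is a hypothetical sketch in the spirit of tools like Block Together (the account names are invented, and the real service talks to Twitter's API rather than working with local sets):

```python
# Hypothetical sketch of a group blocklist: a curated, shared list of
# known harassers is merged with a user's own personal blocks.
shared_blocklist = {"troll_account_1", "troll_account_2"}  # invented names
personal_blocks = {"unpleasant_reply_guy"}

def effective_blocks(shared, personal):
    """A subscriber blocks the union of the shared list and their own blocks."""
    return set(shared) | set(personal)

blocked = effective_blocks(shared_blocklist, personal_blocks)
print(len(blocked))  # 3 accounts blocked in total
```

The appeal is that one curator's work protects every subscriber at once; the drawback, as with any blunt instrument, is that a wrongly listed account is blocked by everyone downstream.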

Other solutions to online harassment come from an unexpected place: people who make bots on Twitter. When Microsoft's Twitter bot Tay started spouting racist phrases, Jeong wrote about how most bot makers are already familiar with this problem. They have, in fact, come up with a list of common racist phrases that their bots will not tweet. It turns out that simple, crude tools can go a long way toward curbing racist output from our neural networks and other automated chat apps. Of course, as soon as we begin to implement technical solutions to harassment, the situation will turn into an arms race. We've already seen this in the world of spam, where spammers go to great lengths to fool spam filters. One could easily imagine dedicated harassers doing the same thing.
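The bot makers' approach amounts to an output filter: check every draft tweet against a denylist before posting. A hedged sketch of the idea, with placeholder strings standing in for the actual phrases on such a list:

```python
# Sketch of a bot-output filter: a draft tweet is checked against a
# denylist before posting. The placeholder terms stand in for the real
# phrases bot makers maintain.
DENYLIST = {"offensive_phrase_1", "offensive_phrase_2"}

def safe_to_tweet(draft: str) -> bool:
    """Return True only if the draft contains no denylisted phrase."""
    text = draft.lower()
    return not any(phrase in text for phrase in DENYLIST)

def post_or_drop(draft):
    """Post the tweet if it passes the filter; otherwise silently drop it."""
    return draft if safe_to_tweet(draft) else None
```

Crude as it is, this catches the obvious cases—which is why Tay, shipped without such a check, became a cautionary tale.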

The big difference between spammers and bullies online is that people in the latter camp rarely have a financial incentive. If we make spamming expensive enough, nobody profits from it. But harassers are motivated by psychological factors. Jeong noted that media coverage of harassment is also affected by psychology. Often that means we only hear harassment stories about victims of sexism or sex-themed abuse, and typically these victims are white women. Racially motivated harassment gets underreported, as do other forms of harassment.

Finally, we also discussed the legal repercussions for online harassers. Jeong discussed a few cases where people have been sent to jail for online harassment, which she doesn't feel is the appropriate punishment. We also had a lively conversation with the audience about these issues and more. Watch the video to get all the details.

Coming up next month, on July 27 at Longitude, our guest on Ars Technica Live will be science fiction author and entrepreneur Hannu Rajaniemi. We'll post more details about that soon! Also, keep an eye out for an Ars Technica Live podcast (audio only!), for those of you who don't want to stare at a video for 30 minutes. We'll be announcing how to get the podcast next week.