A couple of days before the Future Today Summit at the 92nd Street Y in uptown Manhattan, a gunman walked into a pizza shop 230 miles south of it, fired into the air, and demanded to see the child sex ring in its basement. The basement didn’t exist, and neither did the child sex ring, but the weird flattening of information that comes with leaving every fact in the world to an algorithm and occasionally checking in on it is very real, and it is here.

It’s what led Edgar M. Welch to easily find a video on InfoWars called “PIZZAGATE: The Bigger Picture,” and text his friend to “watch [the video] on Youtube” a few hours before he failed to find the child sex tunnel behind the fridge where they kept the pepperoni.

This whole thing really messed up the mood for the Future Today Summit, which would typically be one of those events where a guy in a hoodie proclaims that we can cure syphilis if only someone would completely dismantle the taxi industry.

This time, a lot of the attendees were trying to answer one big question: How do we clean up firehoses of non-stop, fetid, feel-good lies like Facebook and Google and Twitter when the companies themselves have no obligation or desire to help?

Sure, there was still the robot that 3D-printed chocolate and a corner full of people goofily stumbling around with VR headsets on, evading reality at all costs. The lineup was diverse and flecked with genius. A researcher from the New York Genome Center. The woman who helped make Watson and the guy who plugs it in.

But then there were the conversations with people not making and breaking tech, but living with it, like academics, ACLU lawyers, government workers, and journalists. The tone of those conversations did not resemble robot chocolate and VR roses.

One panel, dubbed “The Future of the First Amendment,” argued about the perils of asking for help from a private company with an even more private algorithm. Another argued that doing nothing is immensely worse. The ACLU’s Jay Stanley argued at one point that another panelist was basically making a case for government intervention, even though nobody wanted government intervention. They all wanted more transparency from private companies, but no one really knew how to get it.

Here’s the takeaway from some of the leading experts in the fields where there is no longer any such thing as a fact: You’re on your own.

And that might be the best way to fix it.

—

“There’s a lot of responsibility that falls on [Facebook] to figure out where they play in this new realm,” Stacey Dixon tells me. “Facebook will come out and say we’re not a news organization, but a lot of people get their news that way. So, inherently, they may actually be one without really wanting to be one.”

Dixon is the Deputy Director of IARPA (Intelligence Advanced Research Projects Activity—think DARPA, but Intelligence, and around since 2006), whose boss’ boss is DNI chief James Clapper and whose agency’s goal is to “deliver innovative technology for future overwhelming intelligence advantage.”

They recently lost a round to a dangerous combo of Macedonian teenagers, propagandists, and rogue idiots trying to stir up shit and load up a PayPal account with Google AdSense money by inventing outlandish clickbait. It likely swung a part of the election and drove a North Carolinian with a gun to a D.C. pizza shop. Life comes at you fast.

But Dixon has thought plenty about it, and she understands the implications for national security. That’s in part why she’s here on a panel called “The Future of Techno-Governing.” And her solution for disinformation, once it reaches the consumer level, is this: We probably shouldn’t be Techno-Governing a damn thing.

“You run the risk of the government becoming the bad guy because they’re trying to shut down someone’s speech. That’s a very delicate line. It gets back to, I don’t know that it is the government’s role,” Dixon said.

“I don’t know that the government has enough bandwidth to actually start indicating, or start pointing out to all these companies, all the things they’re doing that have created all these unintended consequences they had no idea about.”

The unintended consequences were supposed to be the good part, remember. We got plenty of books called things like “Why the Global Culture of Disruption is the Only Hope for Innovation” in the past half-decade. But now we’ve been fully disrupted, and all we got were these bullet holes in the ceiling of a pizza shop.

The ethos of disruption culture was always break it first, figure out how to fix it later. We’re in the post-break-it society. We’re at the fix-it point, and companies have no desire to do it.

The people who are in the best position to stomp out disinformation campaigns are almost certainly the social media and search companies themselves.

“Internally, if there’s data that’s within a company that’s (showing disinformation) having this impact, they’re the ones better positioned to do this than the government is,” said Dixon.

Problem is, companies like Facebook and Google have no obligation to help.

In fact, Facebook’s only legal obligation—since the company went public in 2012—has been to act in the best interest of its shareholders. That means increasing user growth and advertising revenue. Addict your users, societal implications and ethical obligations be damned.

When particularly disgusting roadblocks show up, the goal for these companies is not to do the right thing, but to minimize negative impact on the perception of their product.

This week, Google was in that precise bind. Reports surfaced that users are bombarded with Holocaust denial when they search the phrase “did the Holocaust really happen?” That left the search giant with a dilemma: override the algorithm, and the integrity of the product degrades while a damaging news cycle gets extended. Do nothing, and conspiracy becomes fact on the world’s largest search engine.

Holocaust denial remains the top search result, because the dollar trumps the ethics. But don’t worry, you can always pay Google to have non-factual and antisemitic information be bumped down a peg.

“I would imagine that there’s enough feedback making it to (executive) levels (of tech companies). I’d be surprised if they had no idea their sites weren’t responsible for some of these things.

“Do you not feel that some of them, though, are feeling somewhat responsible?” asked Dixon.

No. Probably not.

“You’d wonder whether the pressures of society have them do the right thing. That has, in the past, encouraged companies to stop doing some of the more nefarious things that have happened. There’s the social pressure. There’s government pressure—regulations. Then there’s the moral responsibility of the companies. What I’m hearing is that maybe the moral pressure is not the highest priority. So maybe the social is the next one.”

In other words, again: It’s on us. Obnoxious, persistent pressure on the people who can fix our broken technology gets results. It is annoying and it is wildly inelegant, but it works.

—

Facebook, feeling the heat of constant negative press surrounding the fake news plague that has beset its website over the past year, vowed to finally do something about it this week. They’ll be assigning a scarlet letter to all stories that are made up out of whole cloth. “This story has been disputed by third-party fact checkers,” the disclaimer will read.

It’s not a perfect solution, since right-wing blogs are already arguing that the capital-T Truth will now be at the discretion of a series of hand-picked fact-checkers, whom websites like InfoWars and The Daily Caller reflexively deemed to have liberal biases.

It is, however, something. And the public outcry had Mark Zuckerberg changing his tune from “99 percent of the content you see on Facebook is authentic” to “these steps will help make spreading misinformation less profitable” in less than a month.

This approach, however, has enormous holes, created mostly by regular people’s love of bullying random strangers on the internet.

While Dixon and I are speaking in the hallway, fittingly, Jon Ronson is taping a podcast episode inside the auditorium. Ronson wrote So You’ve Been Publicly Shamed, about our new public pastime of ruining the lives of private citizens who do something insensitive or stupid and wind up in the heart of an internet outrage hurricane.

That kind of outrage is extraordinarily dangerous for regular people. A dumb joke tweet that goes viral can make private citizens wholly unemployable.

“There’s the responsibility of the individual to know what they’re doing and know that there’s a consequence for doing something,” said Dixon. “But if people give that feedback to these particular apps and particular services, that we’re not going to continue to follow you because we can’t trust you—that can have an effect.”

Weirdly enough, Ronson himself helped create the disinformation scourge entirely by accident, by going on a half-jokey field trip two decades ago. He and then-unknown Alex Jones went to the annual Bohemian Grove gathering, which is like a ritual sacrifice but with none of the death or fun and attended mostly by high-profile people.

Government figures like the Clintons or Bushes or Henry Kissinger had allegedly attended the event. The two snuck into the festivities in 2000 for Ronson’s BBC documentary show.

Ronson believed it to be kind of a stupid frat party and forgot about it. Alex Jones parlayed the experience into InfoWars, an empire of Illuminati conspiracies and crackpot bullshit that served as one of Donald Trump’s most effective opposition shops in the 2016 campaign.

“That slapstick, that ridiculousness, is now defining our culture,” said Ronson. “Fake news, even if it did just reinforce people’s beliefs, it is a dangerous thing. I don’t believe the alt-right is any bigger than it was in the ‘90s. But now that you have Alex Jones and Stephen Bannon in power, their effect could be profound.”

And that’s the difference: power. The government isn’t going to help quiet disinformation because the executive branch of the United States government is going to be actively spreading it.

Companies have made it clear they will only refine their platforms when their users speak out and don’t stop.

So it’s on you. Organize. Protest for better platforms with a laser focus. That’s the only way to get stuff done in our new lazy dystopia.

After all, a man who primarily complained about Robert Pattinson and another man’s birth certificate on Twitter for the past decade is now the President of the United States. He was onto something.

“The question is not, ‘What do we do now that we’re in a post-truth society?’” said David Litt, who’s Funny Or Die’s head writer and a former Obama administration speechwriter.

“It’s, ‘What do we do now that we have an incoming administration and their political allies who are very comfortable outright lying, and don’t care if they get caught? And how do we deal with that?’”

Outrage culture has fired some practice rounds and missed wildly, but the ammo worked great. It ruined lives of seemingly random private citizens. It had the country fixated on a poacher dentist for an entire month. Combined with disinformation, it put a man with a gun in a pizza shop.

It is also capable of a lot of good, and it might be the only way to ensure transparency from the most powerful media platforms in the world.

“There are challenges that come with this and advantages that come with this. You see citizen journalism has been such a big issue when it comes to criminal justice. They’ve shined a spotlight on where police have acted terribly,” said Litt. “That might not have happened without (those platforms), and at the same time, fake news might not have happened.”

Outrage is the bastard creation of the tech companies that ignored their users’ best interests in an effort to get them hopelessly addicted to their websites.

Well, it worked. Now there are a lot of them, and they need some answers.