That’s how Michael Patrick Lynch, a philosophy professor at the University of Connecticut, explains what he calls “knowledge polarization,” the idea that technological advancements have given us a superficial sense that what we know is right. Because it’s so easy to burrow into our silos, we have lost sight of the basic philosophical principle that, ultimately, we live in a common reality. It’s a fundamental concept that seems obvious, he said, but it has been clouded by artificial, disparate realities that are dividing our society.

Lynch, who wrote the book “The Internet of Us: Knowing More and Understanding Less in the Age of Big Data,” spoke here at the international TED conference and told the audience that what is missing from our public discourse, and what is keeping the sides so entrenched, is humility. And the humility needed is deeper than just being able to admit that you don’t know it all.

“It means seeing your worldview as open to improvement by the evidence and experience of others,” Lynch said in his talk. “That’s more than being open to change, that’s more than self-improvement. It’s seeing your knowledge as being able to be enriched by what others contribute.”

In an interview after his talk, Lynch said our social interactions don’t allow for ambiguity. On social media, you can express only absolute emotions, with a smiley face or a frown, or give something a thumbs up or down. Facebook doesn’t give you the option to click an icon for “maybe,” or “I need more information.” You can only like it, love it, laugh at it, be wowed by it, cry about it or be angry.

But Lynch does not suggest that humility has to come at the expense of convictions. Throwing one’s hands up and saying, “I don’t know anything,” isn’t motivating. Holding strong beliefs is important, but having the humility to accept that those views could be improved upon is what is essential.

“What I’m trying to do is get at that complicated space where when we’re listening to each other, we’re not just waiting for the other person to be quiet so you can say something,” he said. “You’re actually listening so you might learn something, and maybe that you’re wrong about something.”

Lynch is running a research project called Humility and Conviction in Public Life and has enlisted professors across the United States and in Wales and the Netherlands to experiment with ways to spur this kind of deeper dialogue. He is holding conversations at the Hartford Public Library with groups from diverse backgrounds and ideologies to test how debating theories rather than policies or politics might be more productive.

“You don’t put black and white people in a room and say let’s talk about racism — because people retreat to their boxes,” he said. So, he instead has them discuss a line from the Constitution or a philosophical writing. Often they will arrive at similar conclusions.

This approach to discourse is in line with research being done by behavioral social scientist Dan Ariely, a professor at Duke University, who also spoke at TED this week. During his talk, he presented two moral quandaries to the audience. In one case, an AI robot appears to be developing its own consciousness and begs the scientist working on it not to reboot it, as he is supposed to every night. Should the AI be restarted? The other question asked: Should people be able to choose the height, eye color, intelligence and social competence of embryos?

Ariely and his co-presenter, Mariano Sigman, a neuroscientist from Argentina, asked the audience members to assess what was right or wrong in both circumstances and how certain they were about their answers. Then they had breakout groups debate the questions for two minutes.

Ariely said he heard from many people after the talk that their perspectives shifted after even a short conversation about the questions.

What Ariely and Sigman were really measuring was people’s humility: how much they were willing to admit how little they knew for sure.

“We walk around the world with confidence that we know what we’re doing. It’s a humility of saying we don’t know what the right answer is and the intent to try something else,” they said in their joint talk. “It’s about healthy debates and how to change opinions. Figuring out the cases or the times you should be less confident in your beliefs.”

In a separate interview, Ariely described how a similar approach could be used to debate policy. Rather than asking people whether taxes should go up or down, you could ask them a moral question. As an example, he said, imagine saying to people: We know babies will die from time to time, but what percentage of dying babies should come from households in the lowest 25 percent of income?

“We’re not talking about the solution, we’re saying should low-income mothers lose babies more frequently than rich mothers?” he said. “Now we can go back from this: What does it mean for the responsibility of the tax system? When you ask people about deep beliefs, they tend to agree very much with each other, though it feels like we don’t.”

Ariely has a provocative idea for how this could be implemented to improve politics. He thinks people should vote based on their basic beliefs, answering questions about what kind of world they want to live in, what people should have rights to: Clean water? Not having their babies die?

Their answers could be translated by experts to correspond with politicians running for office. But people would no longer vote for the person, they’d vote on their principles and moral beliefs.

“I’m interested in all kinds of ways we can get people to stop freezing in their opinions, to be open,” Ariely said. “I have my own ideology, and, like everyone, I think it’s right, but I want people to be open to revising their opinions because without that you can’t really have a real democracy.”