Science and technology are upending how we learn. We separate the science from the snake oil and look at how parents, teachers, and policymakers respond.

If only we would all just use our rational, scientific minds. Then we could get past our disagreements.

It’s a nice thought. Unfortunately, it’s wrong.

Yale behavioral economist Dan Kahan has spent the last decade studying whether the use of reason aggravates or reduces partisan beliefs. His research shows that aggravation easily wins. The more we use our faculties for scientific thought, the more likely we are to take a strong position that aligns with our political group. That goes for liberals as well as conservatives.

Rather than use our best thinking to reach the truth, we use it to find ways to agree with others in our communities.

“The process is called biased assimilation,” says Kahan. “People will selectively credit and discredit information in patterns that reflect their commitment to certain values.”

Kahan’s research began as a challenge to the contention of some behavioral economists that public policy disagreements are the result of an overreliance on emotion-driven decision making—what the Nobel Prize-winning psychologist Daniel Kahneman calls “System 1” thinking. These researchers argued that public policy formed by experts using deliberate, analytical decision-making processes (“System 2” thinking in Kahneman’s lingo) would be better and less partisan.

Kahan’s research suggests this is wishful thinking.

In one illustrative study, Kahan asked over 1,500 respondents whether they agreed or disagreed with the following statement: “There is solid evidence of recent global warming due mostly to human activity such as burning fossil fuels.” For these same respondents, Kahan also collected information on their political beliefs, and measured their “science intelligence”—a metric based on answers to questions developed by the National Science Foundation, Pew Research Center, and others. These questions are intended to gauge a combination of scientific knowledge and quantitative reasoning proficiency.

When Kahan analyzed the data, he found that those with the least science intelligence actually hold less partisan positions than those with the most. A conservative Republican with strong science intelligence will use those skills to find evidence against human-caused global warming, while a liberal Democrat will find evidence for it. The same holds for issues like fracking, evolution, and the risks associated with gun possession—whatever your preconceived political belief on an issue, you’ll use your scientific intelligence to try to prove you’re right.

In the chart below, the y-axis represents the probability of a person agreeing that human activity caused climate change, and the x-axis represents the percentile a person scored on the science intelligence test. The width of the bars shows the confidence interval for that probability.

This disagreement does not appear for less political questions. For example, you can predict how likely someone is to reach a correct answer to the question “Are electrons smaller than atoms?” based entirely on their scientific knowledge. Partisanship plays no role.

Perhaps Kahan’s most disconcerting finding is that people with more scientific intelligence are the quickest to align themselves politically on subjects they don’t know anything about. In one experiment, Kahan analyzed how people’s opinions on an unfamiliar subject are affected when they are given some basic scientific information, along with details about what people in their self-identified political group tend to believe about that subject. It turned out that those with the strongest scientific reasoning skills were the ones most likely to use the information to develop partisan opinions.

Kahan argues that this is actually a very rational way of using our best thinking. “A person who forms a position out of line with her cultural peers risks estrangement from the people on whom she depends for emotional and material support,” writes Kahan. Better to use your intellectual faculties to stick to the company line.

On a brighter note, Kahan told Quartz that his research shows people who score well on a measure called “scientific curiosity” actually exhibit less partisanship. These are people who may want to agree with their group, but they just can’t help themselves—they need to know the truth.

In other words, it is curiosity, not smarts, that helps us come together.