Steel union boss Chuck Jones just accused President-elect Donald Trump of lying "his a-- off," saying Trump's announcement that he had saved over 1,000 jobs from leaving the U.S. was nonsense.

The number is more like 800, Jones told the Washington Post. Trump angrily responded to Jones on Twitter, attacking his union work, but did not deny that he had exaggerated the number.

This is just one more on a long list of fibs Trump has told over the past few years.

Donald Trump has said that Barack Obama founded ISIS.

He said that Texas Senator Ted Cruz's father was involved in the plot to assassinate JFK. He claimed he saw thousands of Muslims in New Jersey celebrate on 9/11. He said that he won the popular vote.

There isn't a shred of evidence to support any of this.

We also know that some of those claims might've come from fake news stories that spread conspiracy theories across the internet.

None of this would matter if lying weren't a two-player game. For a lie to work, the liar must be believed.

For example, it wouldn't matter that there's a conspiracy theory that says Hillary and Bill Clinton run a child sex ring out of a D.C. pizza shop if people didn't believe it. But some people do.

One of those people was a North Carolina man with a semi-automatic rifle who was willing to drive to Washington and "self-investigate" the situation. He managed to fire one or two shots before being arrested.

It sounds like madness because it is, so why do some people keep participating? Why do people believe lies? Research tells us it's actually quite simple: It's because humans are desperate to be in control.

Born believers

Science writer and historian Michael Shermer believes that human beings are conditioned to believe, rather than disbelieve, things. He explained it all in a 2010 TED Talk called "The pattern behind self-deception."

During the talk he asked attendees to do a thought experiment, and pretend they were an early human named Lucy, walking the plains of Africa millions of years ago. Just go with it:

...you hear a rustle in the grass. Is it a dangerous predator, or is it just the wind? Your next decision could be the most important one of your life. Well, if you think that the rustle in the grass is a dangerous predator and it turns out it's just the wind, you've made an error in cognition, made a Type 1 error, false positive. But no harm. You just move away. You're more cautious. You're more vigilant.

On the other hand, if you believe that the rustle in the grass is just the wind, and it turns out it's a dangerous predator, you're lunch. You've just won a Darwin award. You've been taken out of the gene pool.

A "Type 1" error makes you more cautious, but it really costs you nothing to believe that there may be danger behind the rustling.

What Shermer later calls the "Type 2 error" — not believing in the danger, but actually having it exist — is deadly.

And so we believe. But more than that, in that belief we create patterns. That helps us structure our lives. It gives meaning to what could easily be random. It is from there that Shermer believes we develop things like superstition and conspiracy theories. They make sense of what is random.

I see dead patterns

If this were all humans had to rely on for cognition — our limited brains making sense of that which we can't understand — we would be in big trouble. Thankfully, however, we have verifiable facts. They ensure that what is random not only makes sense, but is also true.

And that's where we get into some even more fascinating research about why people believe overt lies — easy-to-disprove lies like the kinds Donald Trump tells.

According to Jennifer Whitson, an assistant professor at the McCombs School of Business at The University of Texas at Austin and Adam Galinsky, a professor at the Kellogg School of Management at Northwestern University, people tend to believe lies when they feel vulnerable.

"The less control people have over their lives, the more likely they are to try and regain control through mental gymnastics," said Galinsky. "Feelings of control are so important to people that a lack of control is inherently threatening. While some misperceptions can be bad or lead one astray, they're extremely common and most likely satisfy a deep and enduring psychological need."

To test this, Galinsky and Whitson gathered a group of subjects and put them in situations where they had varying levels of control. Then they showed their subjects "snowy" pictures. Some of the pictures were just dots; others actually showed an image.

Ninety-five percent of the time, the subjects, no matter what situation they were in, saw the images that were actually there. What's interesting, though, is that 43% of the time, people in low-control situations saw images that were not there. Their minds were naturally assigning structure, pattern and meaning where there was none.

In normal life people find false patterns in data all the time (think about the stock market). When they do, it's usually because they feel a given situation is out of their control. The man who went to "self-investigate" the pizza parlor was, in a sense, acting very rationally. He was acting on his innate human desire to take control.

Comet Ping Pong, the pizza place attacked because of a conspiracy theory. (Reuters)

Trump or fiction

Of course, the more false the pattern, the more vulnerable you have to be to believe it. And of course, the more gymnastics your brain has to do to accept it.

Enter Donald Trump, a man known for spreading falsehoods. He captured the imaginations of many people who felt vulnerable about the past, present and future. Business Insider's Harrison Jacobs aptly pointed out that he won in parts of the country ravaged by the opiate epidemic — the 'Oxy electorate.'

Talk about vulnerable. These are also places of higher unemployment, where manufacturing jobs have been on the decline for decades. These are places where a rational structure is needed to explain why things got so bad and why they don't seem to be getting better.

But again, this is not "rationality" as we think of it in an economic sense — as a cost-benefit analysis. If it were, individuals would seek the truth no matter their state of distress, because it is only when a problem is truly understood that it can be solved.

Indeed, Whitson herself has said that "strategy is better based on reality, not tempting illusions."

Here's a relevant example. Trump has dumped on China over and over since he entered the national stage. He's said that we're losing jobs to Chinese manufacturing and that the Trans-Pacific Partnership "was designed for China to come in, as they always do, through the back door and totally take advantage of everyone."

In the world of facts, however, we know that thousands and thousands of manufacturing jobs have been lost to automation, not offshoring to China. We also know that China is not involved with the TPP in any way. In fact, the country has been upset about the TPP since talks for the agreement started.

The human desire to feel in control supersedes all of those facts, and in turn, pushes us to believe what may be irrational, but is simple, understandable, and gives us a sense of control.

Think about it: Say you believe all the lies Trump has told about trade, China and the global economy. It's a comforting notion, ultimately, because it means he also has the solutions. Voting for him, then, is a way to take control of that untruth.

Unfortunately, since it's a lie, the problem will remain. Lies never solve anything.