This is Michael Shermer's list of 25 fallacies that lead us to believe weird things.

(1) Theory influences observation. Heisenberg wrote, “What we observe is not nature itself but nature exposed to our method of questioning.” Our perception of reality is influenced by the theories framing our examination of it.

(2) The observer changes the observed. The act of studying an event can change it, an effect particularly profound in the social sciences, which is why psychologists use blind and double-blind controls.

(3) Equipment constructs results. How we make and understand measurements is highly influenced by the equipment we use.

(4) Anecdotes do not make science. Stories recounted in support of a claim are not scientific without corroborative evidence from other sources or physical proof of some sort.

(5) Scientific language does not make a science. Dressing up a belief in jargon, often with no precise or operational definitions, means nothing without evidence, experimental testing, and corroboration.

(6) Bold statements do not make claims true. The more extraordinary the claim, the more extraordinarily well-tested the evidence must be.

(7) Heresy does not equal correctness. Being laughed at by the mainstream does not mean one is right. The scientific community cannot be expected to test every fantastic claim that comes along, especially when so many are logically inconsistent. If you want to do science, you have to learn to play the game of science. This involves exchanging data and ideas with colleagues informally, and formally presenting results in conference papers, peer-reviewed journals, books, and the like.

(8) Burden of proof. It is the person who makes the extraordinary claim who has the burden of proving the validity of the evidence.

(9) Rumors do not equal reality. Repeated tales are not of necessity true.

(10) Unexplained is not inexplicable. Many people think that if they themselves cannot explain something, it must be inexplicable and therefore a true mystery of the paranormal.

(11) Failures are rationalized. In science, the value of negative findings is high, and honest scientists will readily admit their mistakes. Pseudoscientists ignore or rationalize failures.

(12) After-the-fact reasoning. Also known as “post hoc, ergo propter hoc,” literally “after this, therefore because of this.” At its basest level, this is a form of superstition. As Hume taught us, the fact that two events follow each other in sequence does not mean they are connected causally. Correlation does not mean causation.

(13) Coincidence. In the paranormal world, coincidences are often seen as deeply significant. As the behavioral psychologist B.F. Skinner proved in the laboratory, the human mind seeks relationships between events and often finds them even when they are not present.
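This example is not from Shermer's text, but a quick arithmetic sketch helps show why coincidences are far more common than intuition suggests. The classic "birthday problem" computes the chance that at least two people in a group share a birthday; most people guess it is rare, yet it passes 50% with only 23 people:

```python
def shared_birthday_prob(n, days=365):
    """Probability that at least two of n people share a birthday,
    assuming birthdays are independent and uniform over `days`."""
    p_unique = 1.0
    for i in range(n):
        # Multiply in the chance the (i+1)-th person avoids all earlier birthdays.
        p_unique *= (days - i) / days
    return 1 - p_unique

print(round(shared_birthday_prob(23), 3))  # → 0.507
```

With so many opportunities for "meaningful" matches in daily life, seemingly uncanny coincidences are exactly what chance predicts.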

(14) Representativeness. As Aristotle said, “The sum of the coincidences equals certainty.” We forget most of the insignificant coincidences and remember the meaningful ones. We must always remember the larger context in which a seemingly unusual event occurs, and we must always analyze unusual events for their representativeness of their class of phenomena.

(15) Emotive words and false analogies. Emotive words are used to provoke emotion and sometimes to obscure rationality. Likewise, metaphors and analogies can cloud thinking with emotion and steer us onto a side path. Like anecdotes, analogies and metaphors do not constitute proof. They are merely tools of rhetoric.

(16) Ad ignorantiam. This is an appeal to ignorance or lack of knowledge, where someone claims that if you cannot disprove a claim it must be true. In science, belief should come from positive evidence, not a lack of evidence for or against a claim.

(17) Ad hominem and tu quoque. Literally “to the man” and “you also,” these fallacies redirect the focus from thinking about the idea to thinking about the person holding the idea. The goal of an ad hominem attack is to discredit the claimant in hopes that it will discredit the claim. Tu quoque works similarly: as a defense, the critic is accused of making the same mistakes attributed to the criticized, and nothing is proved one way or the other.

(18) Hasty generalization. In logic, the hasty generalization is a form of improper induction. In life it is called prejudice. In either case, conclusions are drawn before the facts warrant it.

(19) Overreliance on authorities. We tend to rely heavily on authorities in our culture, especially if the authority is considered to be highly intelligent. Authorities, by virtue of their expertise in a field, may have a better chance of being right in that field, but correctness is certainly not guaranteed, and their expertise does not necessarily qualify them to draw conclusions in other areas.

(20) Either-or. Also known as the fallacy of negation or the false dilemma, this is the tendency to dichotomize the world so that if you discredit one position, the observer is forced to accept the other. A new theory needs evidence in favor of it, not just against the opposition.

(21) Circular reasoning. Also known as fallacy of redundancy, begging the question, or tautology, this occurs when the conclusion or claim is merely a restatement of one of the premises.

(22) Reductio ad absurdum and the slippery slope. Reductio ad absurdum is the refutation of an argument by carrying the argument to its logical end and so reducing it to an absurd conclusion. Surely, if an argument’s consequences are absurd, it must be false. This is not necessarily so, though sometimes pushing an argument to its limits is a useful exercise in critical thinking; often this is a way to discover whether a claim has validity, especially when an experiment testing the actual reduction can be run. Similarly, the slippery slope fallacy involves constructing a scenario in which one thing leads ultimately to an end so extreme that the first step should never be taken.

(23) Effort inadequacies and the need for certainty, control, and simplicity. Most of us, most of the time, want certainty, want to control our environment, and want nice, neat, simple explanations. Scientific and critical thinking does not come naturally. It takes training, experience, and effort. We must always work to suppress our need to be absolutely certain and in total control and our tendency to seek the simple and effortless solution to a problem.

(24) Problem-solving inadequacies. All critical and scientific thinking is, in a fashion, problem solving. There are numerous psychological disruptions that cause inadequacies in problem solving. We must all make the effort to overcome them.

(25) Ideological immunity, or the Planck Problem. In day-to-day life, as in science, we all resist fundamental paradigm change. Social scientist Jay Stuart Snelson calls this resistance an ideological immune system: “educated, intelligent, and successful adults rarely change their most fundamental presuppositions.” As individuals accumulate more knowledge, theories become more well-founded, and confidence in ideologies is strengthened. The consequence of this, however, is that we build up an “immunity” against new ideas that do not corroborate previous ones. Historians of science call this the Planck Problem, after physicist Max Planck, who made this observation on what must happen for innovation to occur in science: “An important scientific innovation rarely makes its way by gradually winning over and converting its opponents: it rarely happens that Saul becomes Paul. What does happen is that its opponents gradually die out and that the growing generation is familiarized with the idea from the beginning.”