This is the next installment of a series of articles about cognitive biases and other irrational tendencies—or, in layman’s terms, why and how you err, make decisions, get emotional, and occasionally feel out of control. I recommend reading the previous two articles if you haven’t already; this article will make more sense if you do. And with that, here is part three!

Biases, Illusions, Delusions, Fallacies & Other Cognitive Errs — An Introduction (pt 3)

Conversation Skewers | Language Fallacies | Persuasion

Contrast Effect

Consider a ‘special offer’ in your local supermarket, any offer. Upon noticing the offer you will be helplessly more attracted to it than to other items without offers—even if it is an offer on something you don’t usually purchase; and, if the offer is good enough, you will buy. But why? The initial noticing of the offer is an example of the Von Restorff Effect, which is our tendency to notice things that stick out. How persuaded you are to buy, however, depends on how well the Contrast Effect works on your psyche; it depends on the framing you subconsciously put around the offer. For example: ‘75% off’ a packet of smoked salmon on the ‘reduced items’ shelf in Costco. Even if you don’t eat salmon, even if you have dinner planned that night and you know full well that salmon will not be part of it, even though you know the ‘Best Before’ date is that day, still you will feel compelled to snap up the offer—for it is just too good to pass up.

The Contrast Effect is in full swing when you consider the object and only the object, and not much else; when you confuse relative value with absolute value; when you are too narrow-minded, too focused, and too engaged to properly evaluate the offer for all it’s worth. This is a powerful bias, and one that businesses know all too well.

The next time you encounter a special offer, stop, and confront the possibility that the Contrast Effect may be pulling your biological strings.

This effect can also be seen in conversation—more specifically, in debates, negotiations, and other difficult conversations. In negotiating, for example, one party may table an extremely unrealistic condition (e.g. a parent company wanting 40% commission) knowing full well they will not get it; they do so with the intention of coming down, but by setting the bar (the anchor) very high, any counter-offer from the other party is likely to still be very high—unless, of course, the other party is playing the contrast game too.

There is a funny—perhaps also cruel—joke you can play on friends. When they say ‘OMG, guess how much I paid for this ice cream!?’, as friends so often do, you, knowing they are indeed signalling that they paid a lot more than expected, can say ‘hmmm, I don’t know, erm, £10?…’ Be sure to capture their response.

Forer Effect

Horoscopes are popular not because they are profound pieces of mystical beauty, written by sagacious, omniscient beings, but because the information they communicate is so general that, by definition, it applies to almost everyone. The Forer Effect is when we encounter a description of our personality and suppose it to be something unique and specifically tailored to us, when, in truth, the description is vague, imprecise, and therefore applies to most people who read, see, or hear it.

We also experience the Forer Effect with our own feelings, thoughts, sensations and insights; we think they are somehow unique and special—but most of the time, we are but fooling ourselves.

Simple-Easy

Contrary to how it seems, simple does not mean easy. The simple thing is in fact often the hardest thing to do. Take running, for example:

If you ask a regular runner ‘How do you do it?’, they will likely answer, ‘Easy! You just put your running shoes on and you go out, and you run!’ In other words, it is rather simple.

Now ask someone who is struggling with the practice, and they will answer something like ‘ahhhh…. First you gotta buy an outfit, then you gotta fit it into your schedule, and you gotta get your diet right… then you gotta get dressed and then put your shoes on, and “oh, it’s raining,” so you have to go find your jacket, then you have to actually run!… then you have to shower, and then eat again… it’s such a drag!!’

See the difference? The regular runner is of course under no illusion: the running part is not easy; the practice, however, is rather simple. But for the non-runner, nothing is simple, not to mention easy. It is usually the non-runner-type of person who goes looking for shortcuts—you know, the kind of person who buys all the gear, who buys the gym membership, who writes down the goals and tells all their friends, who talks the talk better than anyone can talk the talk—but who never actually does the work.

Simple does not mean easy; easy means easy. Complicated does not mean hard; hard means hard.

Nominal Fallacy

The tendency to think that naming something means understanding it; that the label contains explanatory information. It doesn’t. The Nominal Fallacy is more widespread than you think; we talk about such things as ‘happiness’, ’success’, ’nature’, ‘instinct’, ‘health’, ‘wellbeing’ and even ‘bias’ and ‘fallacy’, but do we even understand what we are talking about? Maybe, but because there are so many interpretations of meaning, understanding is rarely mutual.

The Illusion of Transparency

We have a tendency to think we’re coming across very clearly to others—simply because we sound clear to ourselves. Again, seldom is this the case, especially when we have not considered the high possibility of us not being as transparent as we think we are.

Writing, debating, and holding difficult conversations are all effective ways of working on the transparency and coherency of your arguments.

Semantic Fallacy

The Semantic Fallacy is when we confuse the definition of a word with that of a similar one—or worse, with a unique subjective definition of our own—that is technically and importantly different. We do this all the time without knowing it, and usually we get away with it; but in difficult conversations and other matters requiring critical thinking, it can cause many unnecessary problems. In conversation and debate, this can be solved very easily by having each person define their meaning of the word in question, preferably using words that have no association with the word in question; this will lay on the table any trivial differences, allowing the conversation to carry on (and no, it doesn’t help at this point to start debating the definitions of words—which you can do after you’ve settled the first matter).

Naturalistic Fallacy

Mistaking natural for good. Our different interpretations of the word ‘natural’ (an example of the Nominal Fallacy) are likely to blame here. You do not need a technical definition of ‘natural’, nor a survey of how people use it, to identify this fallacy in yourself and others: simply listen for the sentence ‘but it is natural!’, or one of its many variations. All too often, this is a sentence that means absolutely nothing. Technically, to say something is natural is to say it is not man-made, but derived from nature; it has nothing to do with what is good or bad.

Why, then, do we make the natural-equals-good mistake? Quite possibly, it is a side-effect of our (mostly justified) views about processed foods, fossil fuels, carbon footprints, declining animal populations, war, and all such negative man-made forces. Advertisers have smartly jumped on this side-effect, which has amplified it exponentially. We see products labelled ‘100% natural’ and automatically think they must be environmentally friendly or good for us; whilst we can sometimes be right, we are biased to believe this is always the case. It is not. It helps to know, therefore, what natural actually means: not man-made.

The Naturalistic Fallacy is especially relevant to this very discussion; our irrational tendencies are natural, but by no means are they good.

The Hypnotising Effects of ‘Because’ and ‘Yes’

When an argument is front-loaded with the word ‘because’, we are more likely to believe it. The word appears to have a sort of hypnotising effect on our attention to detail.

Socrates was known for his syllogistic method of argumentation. He would attempt to persuade people by firing a series of questions to which the answer returned would inevitably be ‘yes’; he would repeat this a number of times before finally delivering his ultimate question, a question that would either stump the target or have them blindly say ‘yes’ one final time (Socrates was a skilled sophist). Saying ‘yes’ many times over appears to have a similar hypnotising effect to the one ‘because’ has on our ability to separate wheat from chaff.

It’s difficult to know whether syllogistic reasoning is so powerful because of the ‘Yes’ Effect, or for some other reason—such as the fact that it tables the important details of any given argument. Done properly it is a great way to settle arguments; but it is easy to see why someone with ulterior motives would adopt it as their primary weapon of persuasion or manipulation.

Colour Theory

Without our knowing it, colours have an impact on the way we think. We associate the colour red, for example, with love, intimacy, hate, anger, danger, and death. Red is the colour of blood; blood reminds us of injury, therefore danger, therefore death. The association survived because it has real relevance.

Red is also the colour produced when blushing. Blushing happens when we receive compliments, or in social situations, and usually has sexual relevance, namely, that we blush when interacting with a potential partner. Blushing—and therefore, red—hence became associated with love.

White tends to generate thoughts of openness, space, godliness and simplicity. Black may represent power, dominance, danger, boldness, importance, or death. We associate orange with energy, liveliness and happiness. This is such a fascinating phenomenon, in my opinion; it is also one which marketers know all about. Ever wonder why the woman in the advert is wearing a red dress, or why Dog the Bounty Hunter and his team wear 99% black, or why Facebook, Twitter and Microsoft love the colour blue? Colours influence how we think.

Sophistication Fallacy

Our tendency to associate higher quantity or complexity with better. We may think a big, complicated-looking plan is a good plan, when it could in fact be horseshit; its intricacy distorts our perception of value. We may think fancy use of words and fluency are a sign of profundity, but Socrates was just an eminent impresario.

The Conjunction Fallacy is a term commonly used in psychological literature, but it usually focuses only on the addition of premises, whereas the Sophistication Fallacy refers to quantity and apparent complexity; hence, consider the Conjunction Fallacy a subset of the Sophistication Fallacy.
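The mathematical core of the Conjunction Fallacy is that a conjunction can never be more probable than either of its parts: P(A and B) ≤ P(A). Here is a minimal simulation sketch of that rule; the 30% and 50% figures (and the traits themselves) are arbitrary assumptions chosen purely for illustration:

```python
import random

random.seed(0)
n = 100_000

# Assume (arbitrarily) that 30% of people fit trait A and 50% fit
# trait B, independently. However the numbers are chosen, the
# conjunction 'A and B' can never beat 'A' alone -- yet a richer,
# more detailed description often *feels* more likely.
a = [random.random() < 0.30 for _ in range(n)]
b = [random.random() < 0.50 for _ in range(n)]

p_a = sum(a) / n
p_a_and_b = sum(x and y for x, y in zip(a, b)) / n

print(f"P(A)       ~ {p_a:.3f}")
print(f"P(A and B) ~ {p_a_and_b:.3f}")
```

Adding detail multiplies probabilities together, so every extra premise can only shrink the odds; the fallacy is that it so often feels like the opposite.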

Misinformation

In scientific literature, the Misinformation Effect refers to a type of memory fallibility I will cover below—namely, the tendency of our memories to deteriorate in accuracy over time. But I think about it in a different way. Many disagreements about impersonal matters are caused not by actual differences of opinion, but by each party having different information, such as a different news source; and it is usually disagreements of this nature that escalate into impossible conversations, or in other words, arguments. Such disagreements can be prevented—or at minimum, attenuated—by each party laying their information, and its source, on the table.

Memory | Knowledge | Learning

The Illusion of Understanding

The difference between surface knowledge and deep knowledge has been known for thousands of years; and pre-internet, it was considerably easier to identify. Surface knowledge is telling your friend how ‘E=mc² is the secret of the universe’, but getting stumped when they ask the simplest question in the universe: Why?

Deep knowledge is being able to explain E=mc².

It is easy to think those who confuse surface knowledge for deep knowledge do so intentionally, but that is scarcely the case, especially not today. The internet has given everyone instant access to most of the information they will ever need; this includes information about things they don’t need but may be interested in, e.g. Einstein’s Theory of Relativity. But merely reading an article about the Theory of Relativity does not automatically result in understanding—far from it. The problem is that it generates in the mind of the reader an impression of understanding—the Illusion of Understanding. Because so many people use the internet today, the Illusion of Understanding is more prevalent than ever. Be on guard.

On a side note, watching debates is a good way to identify this illusion. Generally, you will see those whose arguments are picked apart do one of two things: admit to their lack of depth, or try to divert the subject. What could also happen is that they lose control altogether (see Cognitive Dissonance).

Inattentional Blindness

This is more like random error. We may totally miss important information and make irrational judgements for no other reason than ignorance. Intentional blindness is another beast altogether, and one you entertain at your peril.

Cognitive Overload

There is some debate as to how many pieces of information our working memory can handle. Incidentally, some scientists believe IQ tests are only good for testing working memory—which can be improved. This is not to say they are no good, however; working memory is important, but its limit is clear (for example, how much of what you have read so far can you remember?…). It appears that seven pieces of information may be the golden number. But it is a number that, as most people can probably attest, still seems too high. More realistic is three or four pieces of information at any given moment.

Cognitive Overload, then, is when working memory has reached its capacity, when all the cogs are spinning and it feels like your head’s about to explode; in this situation, because all the principal equipment is in use, our critical thinking skills become tremendously compromised; and so we become increasingly likely to miss important details, and we find it harder to make connections and to be coherent. In this situation, our attention span is often equivalent to that of a plank of wood. The best antidote is a one or two-hour break, preferably involving nothing related to the work, and a bite to eat.

False Memory

Forgetting where we put our keys, names, phone numbers, the location of our parked car, our next doctor’s appointment, to check the Best-Before date on that block of Stilton, or 90% of the last book we read is only half the problem; the memories that we can recall—those we believe are definitely correct, the experiences and important events that seem so accurately and irrevocably rooted in the heart of our memory banks—well, unfortunately, they are just as vulnerable to error, inaccuracy, and fallacy as our attempts to recall the name of the person we chewed the rag with only five minutes ago.

The Positive Memory Effect refers to our tendency to remember only the positives from previous experiences; we either put in a positive frame what in truth was a dreadful experience, or selectively recall only that which we want to remember. Consider childhood memories, or more specifically, junior school memories: you’ll hear many people say things like ‘oh, those were the “days,” man!’ and ‘I wish I could go back!’ and ‘life was so easy, fun, happy…’ and ‘I wish I’d made the most of it’ but the truth (as it so often does) tells a different story.

Most childhoods contain their fair share of painful moments, worries, and struggles, but these get conveniently pushed to the bottom of the memory pile, or even forgotten altogether. Most of this happens subconsciously; we are completely ignorant of an automatic process of re-writing that happens in our brains every time we recollect and reminisce. Yes, that’s right, re-writing: every time a moment from the past is recalled, the memory itself is changed not only by what you want to remember, but also by the state of your mind as you conjure up the memory, and your reasons for doing so. Once you’ve recalled a memory, the pathway to that (selected) bit of memory gets strengthened. This is how memories rewrite themselves.

There is, in fact, a separate list of biases solely connected with memory, rather appropriately named Memory Biases, which explains how memory can be self-serving, distorting, only positive, imaginary, motivated, and even entirely inaccurate. The most important thing to be aware of, though, is just how fallible your memory can be: it is not only selective, but self-modifying. If you are wise, you won’t put too much trust in memory.

Illusory Truth Effect

When a piece of information that initially made an impression on us turns out at a later date to be inaccurate or false, and we are made aware of it, we may still (wrongly) associate it with truth in the future. The reason is that truth, as beautiful as it can be, is rarely portrayed as beautiful; truth cannot be played around with like falsehoods—which can be spun into the most wonderful and spectacular stories, which we remember more. A massive newspaper headline makes a much deeper impression on our memory than a five-line apology three weeks later. It is as if false information stays rooted in the brain’s ‘truth bucket’, because transferring it to the ‘false bucket’ requires applied effort. Of course, it would not require so much applied effort if the truth were as shocking as the falsehood.

Humour Bias

We tend to remember people, ideas and experiences of a humorous nature. Think back to your school days: teachers and mates who made you laugh have a more established slot in your memory bank. The reason is emotion; the emotions an experience triggers have a profound impact on the detail of the recordings we make of our experience. As it turns out, laughter is a powerful emotion—perhaps the most powerful—so we cannot help but remember funny people from our past. The best teachers know this and use it to their advantage.

Hindsight Bias

The ‘I knew it all along!’ bias. We have an inclination to think that past events were predictable and, if they featured us making a mistake, that we ‘could have done better’. You may be told the answer to something and feel that you knew it all along, and you could actually be right; but the problem is, anyone can say this about anything. We may believe something was predictable when, in reality, there was no way of knowing at the time of the event.
Whilst reflecting on past events of your own life you may think you could have done ‘something different’, but it is extremely likely you did your best with the knowledge and resources you had at the time. Hindsight Bias, if not managed, can be crippling to one’s psychological state. The only way to overcome ruminating thoughts is to accept the past, learn the lesson, and move on—better put: first accept the past, then analyse what you think you could have done differently, and then resolve not to repeat it in the future. Generating principles from experiences is the basis of learning.

Framing and the Peak-End Effect

The framing we put around our experiences has a direct influence on our interpretation of them—that is, how we think about them before, during, and after. Whether we experience loss or gain is the crucial factor. For example: we are able to endure the pains of diet, exercise, childbirth and hard work because in the end we will be rewarded; whereas injury, financial struggle, and depression have very little gain as far as we’re concerned. We would not necessarily use the word ‘suffering’ when describing the former; but it is the first word that pops to mind when describing the latter. It may help to think of framing as a filter which your brain uses to interpret each experience, a filter that it swaps whenever it deems necessary.

The Peak-End Effect is our inclination to judge an experience not by its totality or duration, but by how we felt when it was most intense (its peak), and/or as it finished (its end). With the latter, for example, we may recall a dreadful experience in a more positive frame purely because it ended on a less dreadful note. Memories are fickle. The former (the peak) is slightly more complicated. We are all familiar with the phenomenon of ‘time slowing down’, and can usually remember an intense event from the past during which we felt it. The key word here is intense, because it is in such moments that our brain records at a much higher resolution—that is, in times of high emotion our brain writes memories in substantially more detail.
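As a toy illustration of the peak-end idea (the minute-by-minute discomfort ratings below are invented for the example): if remembered intensity is scored as the average of the worst moment and the final moment, then a longer episode containing strictly more total discomfort can nonetheless be remembered as less unpleasant, provided it ends gently.

```python
# Two minute-by-minute discomfort logs (0 = fine, 10 = agony).
# 'longer' repeats 'short' but adds a milder tail: more total
# discomfort, yet a gentler ending.
short = [2, 4, 8, 8]          # ends at its worst
longer = [2, 4, 8, 8, 4, 2]   # same peak, gentler ending

def peak_end(ratings):
    """Remembered intensity under the peak-end rule:
    the average of the worst moment and the final moment."""
    return (max(ratings) + ratings[-1]) / 2

print(sum(short), peak_end(short))    # total 22, remembered 8.0
print(sum(longer), peak_end(longer))  # total 28, remembered 5.0
```

The longer episode is objectively worse in total, yet scores as the milder memory—which is exactly the distortion described above.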

Primacy and Recency Effect

We have stronger recall of input (visual, olfactory, tactile, auditory, gustatory, or other bodily senses) that was presented first—the Primacy Effect. We also have stronger recall of input that was most recent; for example, we’ll have better memory of the end of a book or movie than the middle—the Recency Effect. This happens subconsciously, which means that if we are asked why we remember a specific piece of information, often the best we can manage is a post-hoc rationalisation.

Misreading Events | The Fairness of Life

Just-World Fallacy and Reciprocity Bias

We have a hard time accepting the fundamental truth that life is not fair. Mother Nature does what she does and cares not about our ideas of morality, philosophy, science, or fairness; despite all our knowledge and wisdom, there is still so much we do not know about the world. We do know, though, that randomness is actually a thing; actions do not necessarily have predictable or just consequences; fortune lands in the hands of some people the way a penny drops down a drain. Our view of this as ‘unfair’ is just that, our view—and fortune cares nothing about our view.

Wonder where the idea of ‘Karma’ came from? It may be random, of course, but it may just be the Just-World Fallacy. Reciprocity Bias stems from this false notion of a just world; it is the expectation that the good we have done will be done unto us. E.g. you let people pull out from the side-road not so much because you are good, but because you expect others to let you out; if nobody ever let you out, the chances of you ever letting anyone out would be about as slim as everyone always letting you out.

Single Cause Fallacy

We humans pride ourselves on our thinking abilities—we like to think we are rational, reasonable, sensible actors. A symptom of this is our desire to find reasons for occurrences, or, more specifically, one reason. Uncertainty is not comfortable; not knowing the reason(s) for something that has significance to us is a problem, and one our brain will rapidly ‘fix’ by coming to what it believes to be the most sensible conclusion(s). There are two problems with this, however. First, the accuracy of the just-about-automatic conclusion our brain reaches depends heavily on our familiarity with the issue at hand. For example: the first explanation that materialises when an electrical appliance suddenly stops working is something like ‘Gotta be a darn old blown fuse!’, and yes, it usually is a darn old blown fuse; but if the appliance plays up, blows up, or, as they commonly do, grows legs and takes off, your reasoning may be a little off.

The second problem is more important: there is rarely a single reason for any event. Generally, most occurrences have more than one cause; this is especially the case when the subject matter is a foggy one, like depression, market fluctuation, or cancer. In such cases, critical mass is what ultimately triggers the event. What’s more, an event may appear to have a single cause, but that cause may have a cause, and that cause may have a cause, and so on. For example, the fuse may have blown—but fuses don’t just blow (unless the appliance is ancient). Does this now mean the cause must be updated? If you are interested in finding the true cause—in this case, fixing your appliance—it sort of does. Better put, you have to update your question: you are not looking for what caused the appliance to stop working—that was the blown fuse—but rather, what made the fuse blow. Fix that, then fix the fuse, and you may make the electrician grade after all.

If your TV grows feet, it may be time to lock the doors—because tomorrow it may have shins. Presuming it is trying to escape, determining why it wants to is not easy. We cannot identify a single cause until it grows a larynx; but what we can do is hypothesise. Perhaps it doesn’t like you smoking around it, it is fed up with your whining, it has fallen in love with the TV next door, or it wants to go travelling—it could very well be one of these, or it could be all of them. ‘The law of compounding is always in motion.’ It may help to think about cause and effect the same way you do about the interest on your annual statement (more precisely, compound interest): there is a reason for the interest, but it is not one you understand, nor necessarily care about.
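The compound-interest analogy can be made concrete with a small sketch (the £1,000 principal, 5% rate, and ten-year term are made-up figures): each year’s interest is itself a cause of the next year’s interest, just as causes stack up behind causes.

```python
def compound(principal, rate, years):
    # Balance after compounding annually: principal * (1 + rate) ** years.
    balance = principal
    for _ in range(years):
        balance += balance * rate  # this year's interest feeds next year's
    return balance

print(round(compound(1000, 0.05, 10), 2))  # 1628.89
```

The final figure is not the result of any single year’s interest; it is the accumulation of every step feeding the next, which is the point of the analogy.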

The upshot, simply, is to know that your brain likes reasons because it dislikes uncertainty; it will find them as soon as it can, then sort through them and pick the one that fits best; and it does this for (ironically) reasons you don’t understand, and not because it is rational. Knowing this will serve as a reminder to double-check your lines of reasoning for any unjustified ‘single’ causes. One more thing. Trying to eliminate the Single Cause Fallacy is not only impossible, but potentially harmful. In order to make real progress—individually, or as a unit—in any domain, it is essential to make presuppositions and hypotheses; without these, even the smallest problem can be overwhelmingly daunting. Progress is driven by, built on top of, and fundamentally reliant upon uncertain belief, and not, as most people think, rationale, absolute truth, or happenstance. But this raises the question: why does not all belief result in progress? Well, it depends on what you mean by ‘belief’… If you mean belief in the way one tends to believe in Heaven, then, well, sort of—but it is a slow, perhaps sufferable, and enduring type of progress. If you mean belief in the same way you believe your business will succeed, it absolutely does result in progress.

The former belief is a dogmatic, rigid, unchangeable type of belief; this slows down progress, sometimes causes regression, sometimes deterioration, and sometimes annihilation. The latter type of belief is uncertain, unfixed, and changeable; it can be updated or ditched when better, more pragmatic and fruitful beliefs come along.

If your doctor tells you ‘fix up—or you’ll be dead within two years’, you can be darn sure, you will fix up. You will feel compelled to stop drinking, smoking and eating junk, and to start exercising, meditating and laughing. In other words, you will make progress. You could, of course, ignore your doctor, but unless you live an isolated life and no longer give a shit, you will find it very hard not to make changes; friends, family and your own conscience will constantly remind you of your potential impending visit from the Grim Reaper.

The indispensable part of uncertain belief is (believe it or not) the uncertainty. Beliefs allow us to move forward, regardless of their nature. But only beliefs that can be updated, discarded, and turned on their heads can transform us from floundering snails to almighty eagles. Without uncertain belief, the staggering, almost fictitious technological innovation of the past fifty years would not have been possible; at best, it would have made a most splendid science fiction book.

The next and final installment will be the most dense of the four; it will look at truth, money, gambling (and other marketing exploitations), the role of religious and other beliefs on cognition, and perception in general.