The Art of Thinking Clearly by Rolf Dobelli (PDF)


Authorities pose two main problems to clear thinking: first, their track records are often sobering. There are about one million trained economists on the planet.

Language: English, Spanish, Japanese
Genre: Health & Fitness
Published (Last): 03.07.2016
Distribution: Free* [*Registration Required]
Uploaded by: SHAUNA



This book, The Art of Thinking Clearly by Rolf Dobelli (translated by Nicky Griffin), presents the most common cognitive mistakes that lead to bad decisions. The failure to think clearly, or what experts call a "cognitive error," is a systematic deviation from logical, rational thinking and behaviour.

For example: a product is not better because it sells more (social proof). Elite swimmers do not have ideal bodies because they train extensively; rather, they are good swimmers because of their physiques (the swimmer's body illusion). And "if we stop now, it will all have been for nothing" is the sunk cost fallacy, which takes the error a step further by letting past investments dictate future decisions.

We prefer wrong information to no information. Thus, the availability bias has presented the banks with billions in losses. What was it that Frank Sinatra sang? Fend it off by spending time with people who think differently from you — people whose experiences and expertise are different from yours.

See also Ambiguity Aversion ch.

The symptoms were new to me, and the pain was growing by the day. Eventually I decided to seek help at a local clinic. A young doctor began to inspect me, prodding my stomach, gripping my shoulders and knees and then poking each vertebra. To signal the end of the examination, he pulled out his notebook and said: "Take one tablet three times a day."

The pain grew worse and worse — just as the doctor had predicted. The doctor must have known what was wrong with me after all. After two more days of agony, I finally called the international air ambulance. The Swiss doctor diagnosed appendicitis and operated on me immediately. When he asked why I had waited so long, I replied: "That Corsican doctor had no idea. Probably just the same type of stand-in you find in all the tourist places in high season."

A CEO is at his wits' end: sales are in the toilet, the salespeople are unmotivated, and the marketing campaign has sunk without a trace. In his desperation, he hires a consultant, who tells him: "I can fix it for you — but not overnight. The measures will require sensitivity, and most likely, sales will fall further before things improve."

A year later, sales fall, and the same thing happens the next year. If the problem continues to worsen, the prediction is confirmed. If the situation improves unexpectedly, the customer is happy and the expert can attribute it to his prowess.

Either way he wins. Suppose you are president of a country and have no idea how to run it. What do you do? You predict "difficult years" ahead, and naturally you leave the duration and severity of the period open. Prophets of doom play the same game: disasters, floods, fires, death — they are all part of the larger plan and must take place. Believers will view any deterioration of the situation as confirmation of the prophecy, and any improvement as a gift from God. But beware: sometimes things really do have to get worse before they can get better. For example, a career change requires time and often involves a loss of pay.

The reorganisation of a business also takes time. But in all these cases, we can see relatively quickly if the measures are working. The milestones are clear and verifiable.

Look to these rather than to the heavens. See also Action Bias ch.

Imagine that an invisible Martian decides to follow you around with an equally invisible notebook, recording what you do, think and dream. The resulting record would be a chaotic jumble of details. We like to knit this jumble of details into a neat story.

We want our lives to form a pattern that can be easily followed.

We do the same with world history, shaping the details into a consistent story. We comprehend why the Iron Curtain had to fall or why Harry Potter became a best-seller, or so we believe. We simply build the meaning into events afterward. Stories are dubious entities. But apparently we cannot do without them. Why remains unclear. What is clear is that people first used stories to explain the world, before they began to think scientifically, making mythology older than philosophy.

This has led to the story bias. In the media, the story bias rages like wildfire. Suppose a car drives across a bridge and the bridge collapses. What do we read the next day? We hear the tale of the unlucky driver, where he came from and where he was going.

We read his biography. The absurd thing: none of this explains the accident. The real risk lies with the bridge, not the driver. Was it material fatigue? If not, was the bridge damaged? If so, by what? Was a proper design even used? Where are there other bridges of the same design? Stories attract us; abstract details repel us. Consequently, entertaining side issues and backstories are prioritised over relevant facts. On the upside, if it were not for this, we would be stuck with only non-fiction books. Here are two stories from the English novelist E. M. Forster.

Which one would you remember better? Story A: "The king died, and then the queen died." Story B: "The king died, and then the queen died of grief." According to information theory, we should be able to hold on to A better: it is shorter. But our memories prefer B, because the causal link turns it into a story. Advertisers have learned to capitalise on this too. Take a look at it on YouTube. From our own life stories to global events, we shape everything into meaningful stories.

Doing so distorts reality and affects the quality of our decisions, but there is a remedy: take the stories apart. Ask yourself: what are they concealing? Visit the library and spend half a day reading old newspapers. To experience the effect once more, try to view your life story out of context.

Whenever you hear a story, ask yourself: who is the sender, what are his intentions, and what has he left out? The omitted elements might not be of relevance. But then again, they might be even more relevant than the elements included. The real issue with stories: they give us a false sense of understanding. See also False Causality ch.

A man emigrated from a tiny Swiss village to Paris to seek his fortune in the movie industry. In August, two months after Paris was occupied, he noted in his diary that the Germans were certain of victory: "Their officers also confirmed this to me. England will fall as fast as France did, and then we will finally have our Parisian lives back — albeit as part of Germany." In retrospect, the actual course of the war appears the most likely of all scenarios.


Shortly before the financial crisis, economic experts were painting rosy pictures. However, just twelve months later, the financial markets imploded. Asked about the crisis, the same experts enumerated its causes; in hindsight, the reasons for the crash seem painfully obvious. The hindsight bias is one of the most persistent fallacies of all. If a CEO becomes successful due to fortunate circumstances, he will, looking back, rate the probability of his success a lot higher than it actually was. One particularly blundering example: today, historians present the outbreak of the First World War as the all-but-inevitable product of mounting tensions. But back then, nobody would have dreamed of such an escalation.

It would have sounded too absurd. So why is the hindsight bias so perilous? Well, it makes us believe we are better predictors than we actually are, causing us to be arrogant about our knowledge and consequently to take too much risk. And not just with global issues: when a couple we know splits up, we announce in hindsight: "It was always going to go wrong; they were just so different." Fighting the hindsight bias is hard: studies have shown that people who are aware of it fall for it just as much as everyone else. One remedy remains: keep a journal. Write down your predictions — for political changes, your career, your weight, the stock market and so on.

Then, from time to time, compare your notes with actual developments. You will be amazed at what a poor forecaster you are. See also Fallacy of the Single Cause ch.

Johann Sebastian Bach composed numerous works.


How many there were I will reveal at the end of this chapter. How much confidence should we have in our own knowledge? Psychologists Howard Raiffa and Marc Alpert, wondering the same thing, interviewed hundreds of people, asking them to estimate unknown quantities by naming a range they were almost certain contained the true value. The results were amazing: the true values fell outside the stated ranges far more often than the subjects' near-certainty implied. The researchers dubbed this phenomenon overconfidence.

We systematically overestimate our knowledge and our ability to predict — on a massive scale. The overconfidence effect does not deal with whether single estimates are correct or not.

Rather, it measures the difference between what people really know and what they think they know. Ask a professor to forecast, say, where a market will stand in five years; he cannot know, yet he will offer his forecast with certitude. Overconfidence does not stop at economics. Entrepreneurs and those wishing to marry also deem themselves to be different: they are convinced the dismal statistics do not apply to them. In fact, entrepreneurial activity would be a lot lower if overconfidence did not exist.

For example, every restaurateur hopes to establish the next Michelin-starred restaurant, even though statistics show that most close their doors after just three years. The return on investment in the restaurant business lies chronically below zero. Hardly any major projects exist that are completed in less time and at a lower cost than forecasted.

The list can be added to at will. Why is that? Here, two effects act in unison. First, you have classic overconfidence. Second, those with a direct interest in the project have an incentive to underestimate the costs, since a lean budget makes approval more likely. We will examine this strategic misrepresentation (chapter 89) later in the book.

What makes overconfidence so prevalent and its effect so confounding is that it is not driven by incentives; it is raw and innate.

No surprise to some readers: overconfidence is more pronounced in men than in women. Even more troubling: even self-proclaimed pessimists overrate themselves — just less extremely. Be sceptical of predictions, especially if they come from so-called experts. And with all plans, favour the pessimistic scenario. This way you have a chance of judging the situation somewhat realistically.

Back to the question from the beginning: Johann Sebastian Bach composed works that survived to this day. He may have composed considerably more, but they are lost. See also Illusion of Skill ch.

After receiving his Nobel Prize, Max Planck toured Germany, and wherever he was invited, he delivered the same lecture on new quantum mechanics. Over time, his chauffeur grew to know it by heart, and one day he suggested: "How about I do it for you in Munich? You can sit in the front row and wear my cap." Planck agreed, and that evening the chauffeur delivered the lecture. Later, a physics professor stood up with a question.

The driver recoiled for only a moment, then replied: "Never would I have thought that someone from such an advanced city as Munich would ask so simple a question. My chauffeur will answer it." There are two types of knowledge. First, we have real knowledge. We see it in people who have committed a large amount of time and effort to understanding a topic.

The second type is chauffeur knowledge — knowledge from people who have learned to put on a show. They reel off eloquent words as if reading from a script. Unfortunately, it is increasingly difficult to separate true knowledge from chauffeur knowledge. With news anchors, however, it is still easy. These are actors. Everyone knows it. And yet it continues to astound me how much respect these perfectly-coiffed script readers enjoy, not to mention how much they earn moderating panels about topics they barely fathom.

With journalists, it is more difficult. Some have acquired true knowledge. Often they are veteran reporters who have specialised for years in a clearly defined area. They make a serious effort to understand the complexity of a subject and to communicate it. They tend to write long articles that highlight a variety of cases and exceptions.

The majority of journalists, however, fall into the category of chauffeur. They conjure up articles off the tops of their heads, or rather, from Google searches. Their texts are one-sided, short, and — often as compensation for their patchy knowledge — snarky and self-satisfied in tone.

The same superficiality is present in business. Dedication, solemnity, and reliability are undervalued, at least at the top.

It comes down to your "circle of competence": what lies inside this circle you understand intuitively; what lies outside, you may only partially comprehend. But it is terribly important that you know where the perimeter is. How do you recognise the difference between real knowledge and chauffeur knowledge? There is a clear indicator: people with real knowledge know the limits of what they know and are willing to say "I don't know." From chauffeurs, we hear every line except this. See also Authority Bias ch.

Every day, a man stands in a city square wearing a red hat and waving his cap about. After five minutes he disappears. One day, a policeman comes up to him and asks: "What are you doing?" "I'm keeping the giraffes away." "But there aren't any giraffes here." "Well, I must be doing a good job, then."

A friend recently asked me to buy him a lottery ticket. I went to the store, checked a few boxes, wrote his name on it and paid.

As I handed him the copy of the ticket, he balked: "I wanted to do that." He wanted to pick the numbers himself, as if his choice could sway the draw. I explained that the numbers are purely random; he looked at me blankly. In casinos, most people throw the dice as hard as they can if they need a high number, and as gingerly as possible if they are hoping for a low number — which is as nonsensical as football fans thinking they can swing a game by gesticulating in front of the TV. The illusion of control is the tendency to believe that we can influence something over which we have absolutely no sway.

This was discovered by two researchers, Jenkins and Ward.

Their experiment was simple, consisting of just two switches and a light. The researchers could adjust when the switches were connected to the light and when not, and the subjects had to judge how much control they had. Even when the light flashed on and off at random, subjects were still convinced that they could influence it by flicking the switches. Or consider this example: a researcher placed people in sound booths and increased the volume until the subjects signalled him to stop. Some booths were fitted with a panic button for exactly this purpose.

The button was purely for show, but it gave participants the feeling that they were in control of the situation, leading them to withstand significantly more noise. Crossing the street in Los Angeles is a tricky business, but luckily, at the press of a button, we can stop traffic. Or can we? Many such crossing buttons are placebo buttons, connected to nothing at all. Such tricks are also designed into open-plan offices, where clever technicians create the illusion of control by installing fake temperature dials. This reduces energy bills — and complaints.

Central bankers and government officials employ placebo buttons masterfully. Take, for instance, the federal funds rate, which is an extremely short-term rate — an overnight rate, to be precise. Nobody understands why overnight interest rates can have such an effect on the market, but everybody thinks they do, and so they do. The same goes for pronouncements made by the Chairman of the Federal Reserve; markets move, even though these statements inject little of tangible value into the real economy.

They are merely sound waves. And still we allow economic heads to continue to play with the illusory dials.

It would be a real wake-up call if all involved realised the truth — that the world economy is a fundamentally uncontrollable system. Do you have everything under control? Probably less than you think. Do not think you command your way through life like a Roman emperor. Rather, you are the man with the red hat. Therefore, focus on the few things of importance that you can really influence.

For everything else: que sera, sera. See also Coincidence ch.

To fight a plague of rats, French colonial rulers in Hanoi offered a reward for every dead rat handed in. Yes, many rats were destroyed, but many were also bred specially for this purpose. The same thing happened with the Dead Sea Scrolls: when a finder's fee was promised for each new scroll, instead of lots of extra scrolls being found, existing ones were simply torn apart to increase the reward.

Similarly, in China in the nineteenth century, an incentive was offered for finding dinosaur bones. Farmers located a few on their land, broke them into pieces and cashed in.

Modern incentives are no better: suppose a board promises its managers a bonus for hitting agreed targets. And what happens? Managers invest more energy in trying to lower the targets than in growing the business. These are examples of the incentive super-response tendency. Credited to Charlie Munger, this titanic name describes a rather trivial observation: people respond to incentives by doing what is in their best interests; they respond to the incentive itself, not to the grander intention behind it. Good incentive systems comprise both intent and reward.

An example: in ancient Rome, engineers were made to stand underneath the arches of their bridges during the opening ceremonies, a powerful incentive to build solidly. Poor incentive systems, on the other hand, overlook and sometimes even pervert the underlying aim. For example, censoring a book makes its contents more famous, and rewarding bank employees for each loan sold leads to a miserable credit portfolio. Nobody wants to be the loser CEO in his industry.

Do you want to influence the behaviour of people or organisations? You could always preach about values and visions, or you could appeal to reason.

But in nearly every case, incentives work better. These need not be monetary; anything is useable, from good grades to Nobel Prizes to special treatment in the afterlife. For a long time I tried to understand what made well-educated nobles from the Middle Ages bid adieu to their comfortable lives, swing themselves up on to horses and take part in the Crusades.

And then it came to me: if they came back alive, they could keep the spoils of war and live out their days as rich men. If they died, they automatically passed on to the afterlife as martyrs — with all the benefits that came with it. It was win-win. Now imagine paying professionals for their time rather than for results: we would effectively be incentivising them to take as long as possible, right? So why do we do just this with lawyers, architects, consultants, accountants and driving instructors?

My advice: forget hourly rates and always negotiate a fixed price in advance. Be wary, too, of investment advisers endorsing particular financial products. They are not interested in your financial well-being, but in earning a commission on these products. Such products are often worthless; again, the vendors have their own interests at heart.

What is the old adage? "Never ask a barber whether you need a haircut." If a person's or an organisation's behaviour astonishes you, ask yourself what incentive system lies behind it. Most behaviour can be explained that way; the rest is passion, idiocy, psychosis or malice. See also Motivation Crowding ch.

A man suffered for years from chronic back pain. There were days when he felt like he could move mountains, and those when he could barely move. When the pain was at its worst — fortunately it happened only rarely — his wife would drive him to the chiropractor. The next day he would feel much more mobile and would recommend the therapist to everyone.

Another man, younger and with a respectable golf handicap of 12, gushed in a similar fashion about his golf instructor. Whenever he played miserably, he booked an hour with the pro, and lo and behold, in the next game he fared much better. A third man, an investment adviser, performed a little dance in the restroom whenever his stocks had performed especially badly. As absurd as it seemed, he felt compelled to do it, and afterward things generally improved. What links the three men is a fallacy: regression to mean. Suppose your region is experiencing a record period of cold weather. In all probability, the temperature will rise in the next few days, back toward the monthly average.

The same goes for extreme heat, drought or rain. Weather fluctuates around a mean. The same is true for chronic pain, golf handicaps, stock market performance, luck in love, subjective happiness and test scores. In short, the crippling back pain would most likely have improved without a chiropractor. The handicap would have returned to 12 without additional lessons. And the performance of the investment adviser would also have shifted back toward the market average — with or without the restroom dance.

Extreme performances are interspersed with less extreme ones. The most successful stock picks from the past three years are hardly going to be the most successful stocks in the coming three years. Knowing this, you can appreciate why some athletes would rather not make it on to the front pages of the newspapers: a cover story usually follows an exceptional run, and the inevitable return to normal form then looks like a jinx. Or consider a manager who sends his least motivated employees on a motivational course. The result? The next time he looks at motivation levels, the same people will not make up the bottom few — there will be others.

Was the course worth it? Hard to say, since the laggards' motivation would probably have drifted back toward its usual level anyway. The situation is similar with patients who are hospitalised for depression. They usually leave the clinic feeling a little better. It is quite possible, however, that the stay contributed absolutely nothing. Another example: the lowest-ranked schools in one district were enrolled in an expensive support programme. The following year, the schools had moved up in the rankings, an improvement that the authorities attributed to the programme rather than to natural regression to mean. Ignoring regression to mean can have destructive consequences, such as teachers or managers concluding that the stick is better than the carrot.
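Regression to the mean is easy to reproduce in a toy simulation (all numbers below are invented for illustration): give each student a stable ability plus per-test luck, select the worst performers on one test, and their average rises on the next test with no intervention at all.

```python
import random

random.seed(42)  # deterministic toy data

N = 10_000
ability = [random.gauss(50, 10) for _ in range(N)]   # stable skill
test1 = [a + random.gauss(0, 10) for a in ability]   # skill + luck
test2 = [a + random.gauss(0, 10) for a in ability]   # skill + fresh luck

# Select the bottom 10% on test 1: the students who get "castigated".
cutoff = sorted(test1)[N // 10]
bottom = [i for i in range(N) if test1[i] < cutoff]

mean1 = sum(test1[i] for i in bottom) / len(bottom)
mean2 = sum(test2[i] for i in bottom) / len(bottom)

# With no intervention at all, the bottom group's average rises on test 2.
print(f"bottom decile: test 1 mean {mean1:.1f}, test 2 mean {mean2:.1f}")
```

Because luck contributes half of the score variance in this setup, the selected group recovers roughly half of its shortfall on the second test, with no scolding required.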

For example, following a test the highest performing students are praised, and the lowest are castigated.

In the next exam, other students will probably — purely coincidentally — achieve the highest and lowest scores. Thus, the teacher concludes that reproach helps and praise hinders. A fallacy that keeps on giving. See also Problem with Averages ch.

Imagine that a million monkeys speculate on the stock market. They buy and sell stocks like crazy and, of course, completely at random. What happens? After one week, about half of the monkeys will have made a profit and the other half a loss.

The ones that made a profit can stay; the ones that made a loss you send home. In the second week, one half of the monkeys will still be riding high, while the other half will have made a loss and are sent home. And so on. After ten weeks, about 1,000 monkeys will be left — those who have always invested their money well. After twenty weeks, just one monkey will remain — this one always, without fail, chose the right stocks and is now a billionaire.
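The halving arithmetic can be checked directly. A minimal sketch, treating "about half" as exactly half and starting from 2^20 (about 1.05 million) monkeys:

```python
# Start with 2**20 monkeys (~1.05 million); each week the losing half
# is sent home, so the population is cut in two.
monkeys = 2 ** 20
for week in range(1, 21):
    monkeys //= 2  # the half that made a loss goes home
    if week == 10:
        survivors_week_10 = monkeys

print(survivors_week_10)  # 1024 -- "about 1,000 monkeys" after ten weeks
print(monkeys)            # 1 -- the lone "infallible" monkey after twenty
```

No skill is involved anywhere in the loop; one perfect track record is guaranteed by the elimination scheme alone.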

How does the media react? Journalists will pounce on this monkey to uncover his "success principles". And they will find some: perhaps he sits in another corner of the cage. Or, maybe he swings headlong through the branches, or he takes long, reflective pauses while grooming.

He must have some recipe for success, right? How else could he perform so brilliantly? Spot-on for twenty weeks — and that from a simple monkey? The monkey story illustrates the outcome bias: we tend to evaluate decisions on the basis of their results rather than on the quality of the decision process. This fallacy is also known as the historian error. A classic example is the Japanese attack on Pearl Harbor.


Should the military base have been evacuated or not? Only in retrospect do the warning signals appear so clear; at the time, there was a plethora of contradictory signals. Some pointed to an attack; others did not. To assess the quality of the decision, we must use the information available at the time, filtering out everything we know about it post-attack (particularly the fact that it did indeed take place).

Another experiment: suppose you must evaluate the performance of three heart surgeons. To do this, you ask each to carry out a difficult operation five times. With surgeon A, no one dies. With surgeon B, one patient dies. With surgeon C, two die. How do you rate the performance of A, B and C? If you think like most people, you rate A the best, B the second best, and C the worst. You can guess why that is rash: five operations are far too small a sample for the outcomes to mean much. You can only really judge a surgeon if you know something about the field, and then carefully monitor the preparation and execution of the operation.

In other words, you assess the process and not the result. Alternatively, you could employ a larger sample, if you have enough patients who need this particular operation. What stands out when you do the arithmetic: in just five operations, even an average surgeon stands a good chance of a spotless record, and a good one can be unlucky twice. To assess the three surgeons purely on the basis of the outcomes would be not only negligent but also unethical.
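The small-sample point can be made precise with a binomial calculation. The 5% per-operation mortality rate below is an invented figure for illustration; the shape of the result is what matters: even a thoroughly average surgeon has a good chance of five deathless operations in a row.

```python
from math import comb

def p_deaths(n_ops: int, deaths: int, p_mortality: float) -> float:
    """Binomial probability of exactly `deaths` deaths in `n_ops` operations."""
    return comb(n_ops, deaths) * p_mortality ** deaths * (1 - p_mortality) ** (n_ops - deaths)

P_MORTALITY = 0.05  # assumed average per-operation mortality (hypothetical)
for d in range(3):
    print(f"P({d} deaths in 5 ops) = {p_deaths(5, d, P_MORTALITY):.3f}")
# Under this assumption, P(0 deaths) comes out near 0.77: surgeon A's
# spotless record is entirely compatible with merely average skill.
```

With a sample of five, the outcomes mostly reflect luck; only a much larger sample, or a direct assessment of the process, separates skill from chance.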

A bad result does not automatically indicate a bad decision and vice versa. So rather than tearing your hair out about a wrong decision, or applauding yourself for one that may have only coincidentally led to success, remember why you chose what you did. Were your reasons rational and understandable? See also Sunk Cost Fallacy ch.

The sole topic of conversation for the past two months has been bathroom tiles. Rarely have I seen my sister in such anguish. Amazon, the Internet bookseller, has two million titles available. Nowadays, people are bombarded with options: hundreds of mental disorders, thousands of different careers, even more holiday destinations and an infinite variety of lifestyles.

There has never been more choice. When I was young, we had three types of yogurt, three television channels, two churches, two kinds of cheese (mild or strong), one type of fish (trout) and one telephone, provided by the Swiss Post. The black box with the dial served no other purpose than making calls, and that did us just fine. In contrast, anyone who enters a phone store today runs the risk of being flattened by an avalanche of brands, models and contract options.

And yet, selection is the yardstick of progress. It is what sets us apart from planned economies and the Stone Age. Yes, abundance makes you giddy, but there is a limit. When it is exceeded, a surfeit of choices destroys quality of life. The technical term for this is the paradox of choice. In his book of the same title, psychologist Barry Schwartz describes why this is so.

First, a large selection leads to inner paralysis. To test this, a supermarket set up a tasting stand offering twenty-four varieties of jelly. Customers could try as many as they liked and then buy them at a discount. The next day, the owners carried out the same experiment with only six flavours. They sold ten times more jelly on day two. Why? With such a wide range, customers could not come to a decision, so they bought nothing.

The experiment was repeated several times with different products. The results were always the same. Second, a broader selection leads to poorer decisions.

If you ask young people what is important in a life partner, they reel off all the usual desirable qualities. But do they actually take these criteria into account when choosing someone? In the past, a young man from a village of average size could choose among maybe twenty girls of similar age with whom he went to school.

He knew their families and vice versa, leading to a decision based on several well-known attributes. Nowadays, in the era of online dating, millions of potential partners are at our disposal. It has been proven that the stress caused by this mind-boggling variety is so great that the male brain reduces the decision to one single criterion: physical attractiveness. Finally, large selection leads to discontent. How can you be sure you are making the right choice when options surround and confound you?

The answer is: you cannot. The more choice you have, the more unsure and therefore dissatisfied you are afterward. So, what can you do? Think carefully about what you want before you inspect existing offers. Write down these criteria and stick to them rigidly. Also, realise that you can never make a perfect decision. Aiming for this, given the flood of possibilities, is a form of irrational perfectionism.

Yes, even in terms of life partners. Only the best will do? In this age of unlimited variety, rather the opposite is true: a "good enough" choice is the new optimum. See also Decision Fatigue ch.

Joe Girard is considered the most successful car salesman in the world. His tip for success: make customers believe, genuinely, that you like them. Each month he sends a greeting card to every one of his former customers. Just one sentence salutes them: "I like you." The liking bias means this: the more we like someone, the more inclined we are to buy from or help that person. Still, the question remains: what makes one person more likeable than another? According to research, we see people as pleasant if (A) they are outwardly attractive, (B) they are similar to us in terms of origin, personality or interests, and (C) they like us.

Consequently, advertising is full of attractive people. Similarity works, too: shared origins, shared attitudes, shared interests — in short, the more similar the better. Mirroring is a standard technique in sales to get exactly this effect.

If the buyer speaks very slowly and quietly, often scratching his head, it makes sense for the seller to speak slowly and quietly, and to scratch his head now and then too. That makes him likeable in the eyes of the buyer, and thus a business deal is more likely. Here factor C comes into play: flattery. Compliments work wonders, even if they ring hollow as a drum. So-called multilevel marketing (selling through personal networks) works solely because of the liking bias.

Though there are excellent plastic containers in the supermarket for a quarter of the price, Tupperware generates an annual turnover of two billion dollars. Why? The friends who host the Tupperware parties meet the second and third congeniality standards perfectly. Aid agencies employ the liking bias to great effect. Their campaigns use beaming children or women almost exclusively.

Never will you see a stone-faced, wounded guerrilla fighter staring at you from billboards — even though he also needs your support. Conservation organisations also carefully select who gets the starring role in their advertisements. Have you ever seen a fundraising poster for spiders, worms, algae or bacteria? They are perhaps just as endangered as pandas, gorillas, koalas and seals — and even more important for the ecosystem.

But we feel nothing for them. The more human a creature acts, the more similar it is to us, the more we like it.

The bone skipper fly is extinct? Too bad. Politicians, too, are maestros of the liking bias. Depending on the make-up and interests of an audience, they emphasise different topics, such as residential area, social background or economic issues. And they flatter us: each potential voter is made to feel like an indispensable member of the team: "Your vote counts!" A friend who deals in oil pumps told me how he once closed an eight-figure deal for a pipeline in Russia. "Bribery?" I asked. He shook his head. "We chatted away until we came to the topic of sailing. It turned out that both of us — the buyer and me — were die-hard dinghy fans."

From that moment on, he liked me; I was a friend. So the deal was sealed. Amiability works better than bribery.

A few years ago I bought a used car. Although it had a few miles on the odometer, it looked in perfect condition. The next day, I took it out for a spin and stopped at a gas station, where another customer offered to buy the car from me on the spot, well above the price I had paid. I politely declined. Only on the way home did I realise how ridiculous I was to have said no. If I were thinking purely rationally, I would have sold the car immediately. We consider things to be more valuable the moment we own them. In other words, if we are selling something, we charge more for it than what we ourselves would be willing to spend.

To probe this, psychologist Dan Ariely conducted the following experiment: he raffled off tickets to a major basketball game among his students, then asked those without tickets how much they would pay for one, and those with tickets how much they would sell theirs for. The owners demanded many times more than the would-be buyers were willing to pay. The simple fact of ownership makes us add zeros to the selling price.

In real estate, the endowment effect is palpable. Sellers become emotionally attached to their houses and thus systematically overestimate their value. They balk at the market price, expecting buyers to pay more — which is completely absurd, since this excess is little more than sentimental value. Richard Thaler performed an interesting classroom experiment at Cornell University to measure the endowment effect.

He distributed coffee mugs to half of the students and told them they could either take the mug home or sell it at a price they could specify; the other half were asked what they would pay for one. In other words, Thaler set up a market for coffee mugs. You would expect about half of the mugs to change hands, but the result was much lower than that: because owners demanded roughly twice what buyers were willing to pay, hardly any mugs were traded. We can safely say that we are better at collecting things than at casting them off. Not only does this explain why we fill our homes with junk, but also why lovers of stamps, watches and pieces of art part with them so seldom. Amazingly, the endowment effect affects not only possession but also near-ownership.

A person who bids until the end of an auction gets the feeling that the object is practically theirs, thus increasing its perceived value. The would-be owner is suddenly willing to pay much more than planned, and any withdrawal from the bidding is perceived as a loss — which defies all logic. The same holds for near-misses elsewhere: if you are rejected early in a hiring process, you shrug it off. However, if you make it to the final stages of the selection process and then receive the rejection, the disappointment can be much bigger — irrationally so.

Therefore, do not cling to things. Consider your property as something that the "universe" has bestowed on you temporarily. Keep in mind that it can recoup this, or more, in the blink of an eye. See also House-Money Effect ch.

One evening, the members of a church choir were due to meet for rehearsal. For various reasons, they were all running behind. The pianist wanted to be there 30 minutes early, but he fell into a deep sleep after dinner. When the church exploded, not one of them had yet arrived. The blast was heard all around the village. Miraculously, nobody was killed. The fire chief traced the explosion back to a gas leak, even though members of the choir were convinced they had received a sign from God.

Hand of God or coincidence? Something similar happened to me recently: I found myself thinking of an old friend, Andy, for no particular reason. Suddenly the phone rang. I picked it up and, lo and behold, it was Andy. But, telepathy or coincidence? Or consider how Intel once came upon a rival's confidential information; the way it happened is remarkable. Two men named Mike Webb, one of whom worked for Intel, were staying in the same hotel in California, and checked out on the same day. After they had left, the hotel accepted a package for Mike Webb at reception. It contained confidential documents about the AM chip, and the hotel mistakenly sent it to Mike Webb of Intel, who promptly forwarded the contents to the legal department.

How likely are stories like that? The Swiss psychiatrist C. G. Jung saw in such coincidences the workings of an unknown force, which he called synchronicity. But how should a rationally minded thinker approach these accounts? Preferably with a piece of paper and a pencil. Consider the first case, the explosion of the church. Draw four boxes to represent the potential events. The first possibility is what actually took place: the choir members were late, and the church exploded. The other three: the choir was late and nothing happened; the choir was on time and the church exploded; the choir was on time and nothing happened. Pay special attention to how often the last case has happened: at millions of choir practices around the world, everyone has shown up on time and no church has blown up. Suddenly, the story has lost its unimaginable quality.

So, no: it was no divine intervention. And anyway, why would God want to blow a church to smithereens? What a ridiculous way to communicate with your worshippers! The phone call deserves the same treatment. And it must not be just Andy: I know hundreds of people, and sooner or later one of them is bound to call just as I happen to be thinking of him. We tend to stumble when estimating probabilities. In sum: improbable coincidences are precisely that: rare but possible events. It is not surprising when they finally happen. What would be more surprising would be if they never came to be.
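The "amazing phone call" yields to the same pencil-and-paper treatment. The figures below are invented for illustration (200 acquaintances, and a 1-in-100,000 chance on any given day that a particular friend calls just as you think of him); the point is that across many friends and many days, at least one such coincidence becomes more likely than not.

```python
def p_at_least_one(p_single: float, trials: int) -> float:
    """Chance that an event with per-trial probability p_single
    happens at least once in `trials` independent trials."""
    return 1 - (1 - p_single) ** trials

# Hypothetical figures: 200 friends x 365 days, 1-in-100,000 per friend per day.
p = p_at_least_one(1 / 100_000, 200 * 365)
print(f"{p:.3f}")  # -> 0.518: more likely than not within a single year
```

The individual event stays vanishingly rare; it is the sheer number of opportunities that makes "spooky" coincidences routine.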

Suppose you sense that the plan on the table is a bad one, but you do not want to be the one who destroys the team's consensus. So you keep your mouth shut for another day. When everyone thinks and acts like this, groupthink is at work: motions are passed that each individual group member would have rejected if no peer pressure had been involved. Groupthink is a special branch of social proof, a flaw that we discussed in chapter 4. The U.S. government had decided to support an invasion of Cuba by anti-Castro exiles. Two days after taking office, President Kennedy was informed about the secret plan to invade Cuba.

Three months later, a key meeting took place at the White House in which Kennedy and his advisers all voted in favour of the invasion. However, nothing went as planned: the Cuban air force sank the first two supply ships, and the next two turned around and fled back to the U.S. On the third day, the survivors were taken into custody and sent to military prisons. That such an absurd plan was ever agreed upon, never mind put into action, is astounding.

All of the assumptions that spoke in favour of invasion were erroneous. For example, it was expected that, in an emergency, the brigade would be able to hide in the Escambray mountains and carry out an underground war against Castro from there.

A glance at the map shows that this refuge was many miles away from the Bay of Pigs, with an insurmountable swamp in between. And yet Kennedy and his advisers were among the most intelligent people ever to run an American government. What went wrong between January and April of that year? Psychology professor Irving Janis has studied many such fiascos. He concluded that they share a common pattern: members of a close-knit group cultivate team spirit and unanimity, and no one wants to be the naysayer who destroys team unity.

Finally, each person is happy to be part of the group; expressing reservations could mean exclusion from it. The business world is no stranger to groupthink. A classic example is the fate of the world-class airline Swissair, where a group of highly paid consultants rallied around the former CEO and, bolstered by the euphoria of past successes, developed a high-risk expansion strategy (including the acquisition of several European airlines) that ultimately brought the company down.

If you ever find yourself in a tight, unanimous group, you must speak your mind, even if your team does not like it. Question tacit assumptions, even if you risk expulsion from the warm nest. And if you lead a group, appoint someone as devil's advocate: she will not be the most popular member of the team, but she might be the most important.

See also Social Proof ch. Which of two games of chance do you play? If you win the first game, it changes your life completely; the second offers a far more modest prize. The probability of winning is one in many millions in the first game, and one in 10,000 in the second game. So which do you choose? Our emotions draw us to the first game, even though the second is ten times better, objectively considered (expected win = prize times probability).
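The expected-value arithmetic can be made concrete with hypothetical figures (the prizes and odds below are assumptions for illustration; the chapter's exact numbers are not reproduced here):

```python
def expected_value(prize: float, probability: float) -> float:
    """Expected win = prize times probability of winning."""
    return prize * probability

# Hypothetical game 1: a life-changing jackpot with tiny odds.
game1 = expected_value(10_000_000, 1 / 100_000_000)
# Hypothetical game 2: a modest prize with much better odds.
game2 = expected_value(10_000, 1 / 10_000)

print(game1, game2)  # the modest game has ten times the expected win
```

Emotionally the jackpot dominates; arithmetically, under these assumed numbers, the second game's expected win is ten times larger.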

Therefore, the trend is towards ever-larger jackpots (Mega Millions, Mega Billions, Mega Trillions) no matter how small the odds are. In a classic experiment, participants were divided into two groups. The members of the first group were told that they would definitely receive a small electric shock; those in the second group were told there was only a chance they would be shocked.

The results were, well, shocking: participants in both groups were equally stressed. Next, the researchers announced a series of reductions in the probability of a shock for the second group. The result: no measurable change; anxiety levels stayed put. This illustrates that we respond to the expected magnitude of an event (the size of the jackpot or the amount of electricity), but not to its likelihood.

The proper term for this is neglect of probability, and it leads to errors in decision-making. We invest in start-ups because the potential profit makes dollar signs flash before our eyes, but we forget or are too lazy to investigate the slim chances of new businesses actually achieving such growth.

Similarly, following extensive media coverage of a plane crash, we cancel flights without really considering the minuscule probability of crashing (which, of course, is the same before and after such a disaster). Many amateur investors compare their investments solely on the basis of yield, ignoring the risk involved. But then again, we have no natural feel for risk, so we often turn a blind eye to it.

Back to the experiment with the electric shocks: only when the probability reached zero did group B respond differently from group A. Now suppose a river has two equally large tributaries, and you must choose how to clean them up: method A reduces a sizeable risk of dying from contaminated water by several percentage points, while method B eliminates a much smaller risk completely. So, method A or B? If you are like most people, you will favour B, the zero-risk option, even though method A is three times as good. This fallacy is called the zero-risk bias. A classic example of this is the U.S. Delaney Clause, a food law instituted to achieve zero risk of cancer by banning carcinogenic additives. The ban sounds good at first, but it ended up leading to the use of more dangerous (but non-carcinogenic) food additives.
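To see why the non-zero-risk option can win, here is a sketch with assumed figures (the summary omits the original percentages; the 5% to 2% and 1% to 0% numbers below are illustrative assumptions):

```python
# Assumed illustrative figures, not taken from the text:
POPULATION = 100_000                        # people exposed to each tributary

RISK_A_BEFORE, RISK_A_AFTER = 0.05, 0.02    # method A: 5% -> 2%
RISK_B_BEFORE, RISK_B_AFTER = 0.01, 0.00    # method B: 1% -> 0% (zero risk)

saved_by_a = POPULATION * (RISK_A_BEFORE - RISK_A_AFTER)
saved_by_b = POPULATION * (RISK_B_BEFORE - RISK_B_AFTER)

# Method B feels better because it reaches zero, but method A
# prevents three times as many deaths under these assumptions.
print(saved_by_a, saved_by_b)
```

The zero in method B is emotionally seductive; the subtraction shows that what matters is how much risk is removed, not whether the remainder is exactly zero.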

Demanding zero risk in food production would be equally absurd: each farm would have to function like a hyper-sterile computer-chip factory, and the cost of food would increase a hundredfold. Economically, zero risk rarely makes sense. One exception is when the consequences are colossal, such as a deadly, highly contagious virus escaping from a biotech laboratory. We have no intuitive grasp of risk, and thus we distinguish poorly between different threats. The more serious the threat and the more emotional the topic (such as radioactivity), the less reassuring a reduction in risk seems to us.

An irrational response, but a common one.

Not long ago I was visiting a friend. We sat trying to make conversation while her three children grappled with one another on the floor.


Suddenly I remembered that I had brought some glass marbles with me — a whole bag full. I spilled them out on the floor, in the hope that the little angels would play with them in peace. Far from it: Among the countless marbles there was just one blue one, and the children scrambled for it.

All the marbles were exactly the same size and shiny and bright. But the blue one had an advantage over the others — it was one of a kind. I had to laugh at how childish children are!

When I heard that Google would launch its own email service, with access by invitation only at first, I was dead set on getting an account. In the end I did. But why do we get trapped in such thinking habits? Well, it is safe to say that most people discredit information that is not in line with their beliefs. In fact, when individuals face people who disagree with their opinions, they dismiss them as poorly informed idiots.

Nowadays the problem is even bigger, since all around the Internet you can find material that reinforces your confirmation bias, whatever its flavour. The next time you search the web hoping to find voices that support your opinion, keep in mind that search engines tailor the results to you, so you are unlikely to stumble upon data that contradicts your views.

Furthermore, we constantly face story bias as well. Story bias exists because people have a hard time remembering bare facts, so they explain the world through stories. As engaging as stories may be, they distort reality.

One of the stories we tell ourselves is that we are intelligent and can make the right predictions based on our knowledge. However, we tend to overestimate ourselves and the amount of information we have. The world is changing all the time, and most of the time we know less than we think we do. Failing to face this ego-shattering fact leads to underestimating the costs and length of projects.

Fix this error by becoming a pessimist when estimating the time and cost of any future undertaking. Furthermore, people often confuse the message with the messenger: they judge the credibility of a message by who conveys it. Ever since ancient times, people have blindly believed figures who hold some sort of authority. What you need to do is question everything around you, as well as yourself. Understand that no one can know everything, so invest in your circle of competence and do not spread your attention all over the place.

Having limits is human; the sooner you accept this, the happier you will be.

1. Outcome Bias
2. Loss Aversion and Fear of Regret
3. Alternative Blindness

Outcome Bias

Outcome bias is the tendency to assess decisions based on the results they give. However, a bad result does not necessarily mean that a decision was bad.

Every result is a sum of both decisions and external factors such as luck or timing.

Loss Aversion and Fear of Regret

Gaining something will make you happy, but losing the same thing will make you much more distressed.
