Thinking, Fast and Slow (English)

INTRODUCTION

In this summary, you will find the most important ideas from Kahneman's bestseller, Thinking, Fast and Slow. Do you want to know why this psychologist received the Nobel Prize in Economic Sciences? Then keep reading.

Before we start, we would like to make something clear. The word heuristics is important and should be clarified before we begin. Although it looks a bit daunting, it is actually pretty simple: heuristics are rules and mental shortcuts that people use to process information more efficiently. They are like a set of presumptions that usually lead to good predictions.

The most important idea you will be able to draw from this book is that the human intellect has two "modes" of functioning.

The first mode is fast and intuitive, but sometimes irrational and hasty. The second is rational, slow, calculating, and prudent. The point of this book is not to downgrade one mode and praise the other; rather, the goal is to point out the good as well as the bad sides of each.

Let's start with the rational side. One might jump to the conclusion that the rational side is always superior to the intuitive one. However, this is not always the case. The exact differences are explained in detail later in the text, but for now it is enough to say that the intuitive side saves our energy in everyday, routine situations, when there is little need to employ the slow, calculating mode. Of course, sometimes we need to sit down and think carefully about a problem. Those are the situations in which we employ our rational thinking.

Finally, the rational, slow half and the intuitive, fast one are not necessarily rivals. They often work together: for example, "raw" data can be gathered and organized by the fast, intuitive system, after which the more careful, slower side takes over.

Heuristics and biases will be the main focus of this summary. Heuristics are sets of rules that, in most situations, lead us to satisfying results. They are not always relevant, however, and we sometimes make very silly mistakes by relying on them. The fast system leans heavily on heuristics and biases, and because it is unable to experience doubt, it presumes that these hypotheses are always and inevitably true. Unfortunately, heuristics and biases sometimes backfire, and that is the moment when the slow system should take over. This book will help you recognize the situations in which the usual presumptions no longer work and slower, deeper processing is needed.

FAST AND SLOW THINKING

Anchoring is a heuristic we employ when we are given a reference point. For example, when someone asks you, "Did Nelson Mandela spend more or less than 30 years in prison?" you may say "I don't know" and guess. But, most interestingly, if someone then asks you to estimate the number of years Nelson Mandela spent in prison, you will most likely name a number around 30, because the first question biased you in that direction. Now, everything works well when anchors are not set randomly. In the Nelson Mandela example, the anchor was not random, because Mandela really did spend close to 30 years (27, to be exact) in prison. In that case, the anchor helps you give the best estimate possible. Let's look at another situation, one in which the anchoring heuristic is used against you. Some people like to go around pawn shops and try to sell things they deem valuable. Steve, for example, did this often, and sometimes brought home significant amounts of money. This time, however, he stumbled upon a very greedy pawn-shop owner, a genius at haggling.

Steve brought in an item that looked almost trivial, the kind of trifle you usually find in an attic: a little figurine of a soldier. It looked interesting, and Steve wanted to see if he could get a few dollars for it. So he brought it into the pawn shop and started negotiating a price. Out of the blue, the owner asked him, "What do you think, does this little figurine cost more or less than 25 dollars?" Steve was intrigued, and jokingly said that it must cost more than 25 dollars. The owner didn't respond to this and simply asked how much he wanted for the little toy. Steve was a little taken aback, but he mumbled something like "30 dollars." The owner paid it without trying to haggle the price down at all. That looked a bit suspicious, but Steve took the money and left. The next day, he saw the same toy in the shop with a price tag of 200 dollars. It became obvious that the owner had tricked him into believing the real price of the toy was around 25 dollars, and this deceitful anchor ultimately fooled Steve.

Availability works by the rule, "If you can think of it, it must be important." Like any other heuristic the fast system relies on, the availability heuristic is very useful most of the time. For example, when somebody asks you, "What's the biggest city in Europe?" you may instantly say "London," not because you know for sure that it is the biggest city in Europe, but simply because it came to mind first. And you wouldn't be far off, because London is the second biggest city in Europe, after Moscow.

This is a situation in which the availability heuristic works in your favor. On the other hand, there are numerous examples of this heuristic catching people off guard and rushing them into unnecessary, irrational conclusions. Large plane crashes are among the most shocking and most heavily covered events in the world. For days and weeks after such a catastrophe, the news is still full of reports about it. And when you ask people, especially those who regularly watch the news, about the likelihood of car crashes versus plane crashes, they tend to underestimate the probability of a car crash and overestimate that of a plane crash.

This is because pictures of plane crashes are more readily available in their memories, and it is much easier to retrieve those images than images of car crashes, even though car crashes are far more common and deadly.

Let's look at another example. We all know that sharks are dangerous. Even setting aside the negative impact Spielberg's film "Jaws" had on beliefs about sharks, we can illustrate the availability heuristic with people's perception of sharks' deadliness compared to other animals. Most of us would make a great mistake if we tried to assess the likelihood of a shark attack. As in the first example, because shark attacks are so dramatic, graphic, and bloody, they are pushed by news agencies, since they attract more viewers. Because of this, the memory of a shark attack sits ready in our brains, ultimately making us greatly overestimate the probability of a shark attacking us, when we are actually more likely to die from falling plane parts.

The sunk-cost fallacy is, simply put, a situation in which people continue to invest in a failed asset, even though it is obvious that the first investment was irrational and that every subsequent one is sheer lunacy. Otherwise very sane people often do this to avoid regret and the feeling of having made a mistake. It is, of course, a way to salvage a little pride and keep the ego intact.

There is even a term, "the Concorde fallacy," named after the real-life situation surrounding the then-revolutionary Concorde airplane, a joint project of the UK and French governments. Both governments continued investing in the project even after it became obvious that it was economically unsustainable and that too few passengers wanted to fly on Concorde airplanes. Because even the initial investments in the project were astronomical, it was almost impossible for high officials to admit failure and abort the project altogether.

Framing is one of the most interesting heuristics and biases, and one widely employed by news agencies and media magnates. One of the best examples of framing is when we slightly alter a question to emphasize or attenuate whatever we want. For example, people respond very differently to these two essentially identical questions:
1. "Would you consent to a medical procedure that had a 90% chance of survival?"
2. "Would you consent to a medical procedure that had a 10% chance of death?"

The first way of asking the question makes the listener focus on the good side. Because of this, when we ask people whether they would accept an operation with a 90% chance of success, they are more likely to accept it.

With the second question, on the other hand, we emphasize the bad side: the mortality rate. People become focused on the negative characteristics, and are thus more likely to reject the operation. In short, the way we ask a question matters and can alter the response we get.

Let's take another example. Kris loved going to parties, but he was always a bit skeptical before going anywhere, because he wanted to "get the most out of it." His friends found it difficult to coax him into going with them until, eventually, a solution occurred to them. They simply asked him to come to the club and talked about how much fun they had had the last time they went out, choosing not to mention the bad things that had happened.
