Amazon.com: Customer reviews: Thinking, Fast and Slow

Customer reviews

4.6 out of 5 stars · 33,508 global ratings
5 star: 75%
4 star: 15%
3 star: 6%
2 star: 2%
1 star: 2%
Thinking, Fast and Slow
by Daniel Kahneman
How customer reviews and ratings work

Customer reviews, including product star ratings, help customers learn more about the product and decide whether it is the right product for them.

To calculate the overall star rating and percentage breakdown by star, we don’t use a simple average. Instead, our system considers things like how recent a review is and whether the reviewer bought the item on Amazon. It also analyzes reviews to verify trustworthiness.

Learn more about how customer reviews work on Amazon

Top positive review

Graham H. Seibert
5.0 out of 5 stars - Annotations on Kahneman's table of contents - a survey of logic and illogic
Reviewed in the United States on March 15, 2012
When you come late to the party, writing the 160th review, you have a certain freedom to write something as much for your own use as for other readers, confident that the review will be at the bottom of the pile.

Kahneman's thesis is that the human animal is systematically illogical. Not only do we mis-assess situations, but we do so following fairly predictable patterns. Moreover, those patterns are grounded in our primate ancestry.

The first observation, giving the title to the book, is that eons of natural selection gave us the ability to make a fast reaction to a novel situation. Survival depended on it. So, if we hear an unnatural noise in the bushes, our tendency is to run. Thinking slow, applying human logic, we might reflect that it is probably Johnny coming back from the Girl Scout camp across the river bringing cookies, and that running might not be the best idea. However, fast thinking is hardwired.

The first part of the book is dedicated to a description of the two systems, fast and slow. Kahneman introduces them in his first chapter as system one and system two.

Chapter 2 talks about the human energy budget. Thinking is metabolically expensive; 20 percent of our energy intake goes to the brain. Moreover, despite what your teenager tells you, dedicating energy to thinking about one thing means that energy is not available for other things. Since slow thinking is expensive, the body is programmed to avoid it.

Chapter 3 expands on this notion of the lazy controller. We don't invoke our slow thinking, system two machinery unless it is needed. It is expensive. As an example, try multiplying two two-digit numbers in your head while you are running. You will inevitably slow down. NB: Kahneman uses the example of multiplying two digit numbers in your head quite frequently. Most readers don't know how to do this. Check out "The Secrets of Mental Math" for techniques. Kahneman and myself being slightly older guys, we probably like to do it just to prove we still can. Whistling past the graveyard - we know full well that mental processes slow down after 65.

Chapter 4 - the associative machine - discusses the way the brain is wired to automatically associate words with one another and concepts with one another, and a new experience with a recent experience. Think of it as the bananas-vomit chapter. Will you think of vomit the next time you see a banana?

Chapter 5 - cognitive ease. We are lazy. We don't solve the right problem, we solve the easy problem.

Chapter 6 - norms, surprises, and causes. A recurrent theme in the book is that although our brains do contain a statistical algorithm, it is not very accurate. It does not understand the normal distribution. We are inclined to expect more regularity than actually exists in the world, and we have poor intuition about the tail ends of the bell curve. We have little intuition at all about non-Gaussian distributions.

Chapter 7 - a machine for jumping to conclusions. He introduces a recurrent example. A ball and bat together cost $1.10. The bat costs one dollar more than the ball. How much does the ball cost? System one, fast thinking, leaps out with an answer which is wrong. It requires slow thinking to come up with the right answer - and the instinct to distrust your intuition.
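The bat-and-ball arithmetic can be checked in a few lines. This sketch uses only the numbers from the puzzle itself; the point is that the correct answer falls out of simple algebra, not intuition:

```python
# Bat-and-ball puzzle: ball + bat = 1.10 and bat = ball + 1.00.
# Substituting gives 2 * ball + 1.00 = 1.10, so the ball costs 5 cents,
# not the 10 cents that fast thinking shouts out.
total = 1.10
difference = 1.00
ball = round((total - difference) / 2, 2)
bat = round(ball + difference, 2)
print(ball, bat)  # 0.05 1.05
```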

Chapter 8 - how judgments happen. Drawing parallels across domains. If Tom was as smart as he is tall, how smart would he be?

Chapter 9 - answering an easier question. Some questions have no easy answer. "How do you feel about yourself these days?" is harder to answer than "Did you have a date last week?" If the date question is asked first, it primes an answer for the harder question.

Section 2 - heuristics and biases

Chapter 10 - the law of small numbers. In the realm of statistics there is a law of large numbers. The larger the sample size, the more accurate the statistical inference from measuring them. Conversely, a small sample size can be quite biased. I was in a study abroad program with 10 women, three of them over six feet. Could I generalize about the women in the University of Maryland student body? Conversely, I was the only male among 11 students and the only one over 60. Could they generalize anything from that? In both cases, not much.

Chapter 11 - anchors. An irrelevant notion is a hard thing to get rid of. For instance, the asking price of a house should have nothing to do with its value, but it does greatly influence bids.

Chapter 12 - the science of availability. If examples come easily to mind, we are more inclined to believe the statistic. If I know somebody who got mugged last year, and you don't, my assessment of the rate of street crime will probably be too high, and yours perhaps too low. Newspaper headlines distort all of our thinking about the probabilities of things like terrorist attacks. Because we read about it, it is available.

Chapter 13 - availability, emotion and risk. Continuation.

Chapter 14 - Tom W's specialty. This is about the tendency for stereotypes to override statistics. If half the students in the university are education majors, and only a tenth of a percent study mortuary science, the odds are overwhelming that any individual student is an education major. Nonetheless, if you ask about Tom W, a sallow gloomy type of guy, people will ignore the statistics and guess he is in mortuary science.

Chapter 15 - less is more. Linda is described as a very intelligent and assertive woman. What are the odds she is a business major? The odds that she is a feminist business major? Despite the mathematical impossibility, most people will think that the odds of the latter are greater than the former.
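The "mathematical impossibility" here is the conjunction rule of probability: the chance of two things both being true can never exceed the chance of either one alone. The probabilities below are hypothetical, chosen only to illustrate the rule:

```python
# Conjunction rule: whatever numbers we pick, P(A and B) <= P(A).
# All probabilities here are made up for illustration.
p_business = 0.20                  # P(Linda is a business major)
p_feminist_given_business = 0.30   # P(feminist | business major)

# P(feminist AND business major) = P(business) * P(feminist | business)
p_feminist_business = p_business * p_feminist_given_business

print(p_feminist_business <= p_business)  # True, for any choice of inputs
```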

Chapter 16 - causes trump statistics. The most important aspect of this chapter is Bayesian analysis, which is so much second nature to Kahneman that he doesn't even describe it. The example he gives is a useful illustration.
* 85% of the cabs in the city are green, and 15% are blue.
* A witness identified the cab involved in a hit and run as blue.
* The court tested the witness' reliability, and the witness was able to correctly identify the correct color 80% of the time, and failed 20% of the time.
First, to go to the point. Given these numbers, most people will assume that the cab in the accident was blue because of the witness testimony. However, if we change the statement of the problem so that there is a 20% chance that the identification of the color as blue was wrong, but 85% of the cabs involved in accidents are green, people will overwhelmingly say that the cab in the accident was green. The problems are mathematically identical, but the opinion is different.
Now the surprise. The correct answer is that there is a 41% chance that the cab involved in the accident was blue. Here's how we figure it out from Bayes theorem.
If the cab was blue, a 15% chance, and correctly identified, an 80% chance, the combined probability is .15 * .8 = .12, a 12% chance
If the cab was green, an 85% chance, and incorrectly identified, a 20% chance, the combined probability is .85 * .2 = .17, a 17% chance
Since the cab had to be either blue or green, the total probability of it being identified as blue, whether right or wrong, is .12 + .17 = .29. In other words, this witness could be expected to identify the cab as blue 29% of the time whether she was right or wrong.
The chances she was right are .12 out of .29, or 41%. I recommend that you cut and paste this, because Bayes' theorem is cited fairly often and is kind of hard to understand. It may be simple for Kahneman, but it is not for his average reader, I am sure.
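The arithmetic above drops straight into code. This sketch just restates the review's numbers (15%/85% base rates, 80% witness accuracy) as Bayes' theorem:

```python
# Bayes' theorem for the cab problem.
p_blue, p_green = 0.15, 0.85   # base rates of cab colors in the city
p_correct = 0.80               # witness identifies the true color correctly

# Two ways the witness can say "blue": the cab was blue and she was right,
# or the cab was green and she was wrong.
said_blue_and_blue = p_blue * p_correct           # 0.12
said_blue_and_green = p_green * (1 - p_correct)   # 0.17
p_said_blue = said_blue_and_blue + said_blue_and_green  # 0.29

# Posterior: of all "blue" identifications, the share that are actually blue.
posterior_blue = said_blue_and_blue / p_said_blue
print(round(posterior_blue, 2))  # 0.41
```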

Chapter 17 - regression to the mean. If I told you I got an SAT score of 750 you could assume that I was smart, or that I was lucky, or some combination. The average is only around 500. The chances are a little bit of both, and if I take the test a second time I will get a lower score, not because I am any stupider but because your first observation of me wasn't exactly accurate. This is called regression to the mean. It is not about the things you are measuring; it is about the nature of measurement instruments. Don't mistake luck for talent.
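Regression to the mean falls out of any model where a score is true ability plus luck. A quick simulation makes it visible; the distribution parameters here are invented purely for illustration:

```python
import random

random.seed(0)
# Each test score = true ability + per-test luck. Illustrative parameters:
# abilities centered at 500 (sd 80), test-to-test noise of sd 60.
ability = [random.gauss(500, 80) for _ in range(100_000)]
first = [a + random.gauss(0, 60) for a in ability]
second = [a + random.gauss(0, 60) for a in ability]

# Students who scored 750+ the first time were both able AND lucky;
# on the retest only the ability persists, so the group average drops.
high_scorers = [(f, s) for f, s in zip(first, second) if f >= 750]
avg_first = sum(f for f, _ in high_scorers) / len(high_scorers)
avg_second = sum(s for _, s in high_scorers) / len(high_scorers)
print(avg_second < avg_first)  # True: the retest average regresses
```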

Chapter 18 - taming intuitive predictions. The probability of the occurrence of an event which depends on a number of prior events is the cumulative probability of all those prior events. The probability of a smart grade school kid becoming a Rhodes scholar is a cumulative probability of passing a whole series of hurdles: studying hard, excelling in high school, avoiding drink and drugs, parental support and so on. The message in this chapter is that we tend to overestimate our ability to project the future.
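Chaining hurdles multiplies probabilities, which is why long-shot predictions deserve heavy discounting. The per-stage numbers below are hypothetical, and the sketch assumes the stages are independent:

```python
# Probability of clearing every hurdle = product of per-hurdle probabilities
# (assuming independence). Values are made up for illustration.
hurdles = {
    "studies hard": 0.5,
    "excels in high school": 0.3,
    "avoids drink and drugs": 0.8,
    "has parental support": 0.6,
}
p_all = 1.0
for p in hurdles.values():
    p_all *= p   # shrinks at every stage

print(round(p_all, 3))  # 0.072 - far below any single-stage probability
```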

Part three - overconfidence

Chapter 19 - the illusion of understanding. Kahneman introduces another potent concept, "what you see is all there is," hereafter WYSIATI. We make judgments on the basis of the knowledge we have, and we are overconfident about the predictive value of that observation. To repeat his example, we see the tremendous success of Google. We discount the many perils which could have totally derailed the company along the way, including the venture capitalist who could have bought it all for one million dollars but thought the price was too steep.

Chapter 20 - The illusion of validity. Kahneman once again anticipates a bit more statistical knowledge than his readers are likely to have. The validity of a measure is the degree to which an instrument measures what it purports to measure. You could ask a question such as whether the SAT is a valid measure of intelligence. The answer is, not really, because performance on the SAT depends quite a bit on prior education and previous exposure to standardized tests. You could ask whether the SAT is a valid predictor of performance in college. The answer there is that it is not very good, but nonetheless it is the best available predictor. It is valid enough because there is nothing better. To get back to the point, we are inclined to assume measurements are more valid than they are, in other words, to overestimate our ability to predict based on measurements.

Chapter 21 - intuitions versus formulas. The key anecdote here is about a formula for predicting the quality of a French wine vintage. The rule-of-thumb formula beat the best French wine experts. Likewise, mathematical algorithms for predicting college success are at least as successful as, and much cheaper than, long interviews with placement specialists.

Chapter 22 - expert intuition, when can we trust it? The short answer: in situations in which prior experience is quite germane to new situations, there is some degree of predictability, and the environment provides feedback so that the experts can validate their predictions. He would trust the expert intuition of a firefighter; there is some similarity among fires, and the fireman learns quickly from his mistakes. He would not trust the intuition of a psychiatrist, whose mistakes may not show up for years.

Chapter 23 - the outside view. The key notion here is that people within an institution, project, or any endeavor tend to let their inside knowledge blind them to things an outsider might see. We can be sure that most insiders in Enron foresaw nothing but success. An outsider, having seen more cases of off-balance-sheet accounting and the woes it can cause, would have had a different prediction.

Chapter 24 - the engine of capitalism. This is a tour of decision-making within the capitalist citadel. It should destroy the notion that there are CEOs who are vastly above average, and also the efficient markets theory. Nope. The guys in charge often don't understand, and more important, they are blind to their own lack of knowledge.

Part four - choices

This is a series of chapters about how people make decisions involving money and risk. In most of the examples presented there is a financially optimal alternative. Many people will not find that alternative because of the way the problem is cast and because of the exogenous factors. Those factors include:

Marginal utility. Another thousand dollars is much less important to a millionaire than to a wage slave.

Chapter 26 - Prospect theory: The bias against loss. Losing $1000 causes pain out of proportion to the pleasure of winning $1000.

Chapter 27 - The endowment effect. I will not pay as much to acquire something as I would demand if I already owned it and were selling.

Chapter 28 - Bad Events. We will take unreasonable risk when all the alternatives are bad. Pouring good money after bad, the sunk cost effect, is an example.

Chapter 29 - The fourfold pattern. High risk, low risk, win, lose. Human nature is to make choices which are not mathematically optimal: buying lottery tickets and buying unnecessary insurance.

Chapter 30 - rare events. Our minds are not structured to assess the likelihood of rare events. We overestimate the visible ones, such as tsunamis and terrorist attacks, and ignore the ones of which we are unaware.

Chapter 31 - Risk policies. This is about systematizing our acceptance of risk and making policies. As a policy, should we buy insurance or not, recognizing that there are instances in which we may override the policy. As a policy, should we accept the supposedly lower risk of buying mutual funds, even given the management fees?

Chapter 32 - keeping score. This is about letting the past influence present decisions. The classic example is people who refuse to sell for a loss, whether shares of stock or a house.

Chapter 33 - reversals. We can let a small negative outweigh a large positive. One cockroach in a crate of strawberries.

Chapter 34 - Frames and reality. How we state it. 90% survival is more attractive than 10% mortality.

Part V. Two selves: Experience and memory

Our memory may be at odds with our experience at the time. Mountain climbing and marathon running are sheer torture at the time, but the memories are exquisite. We remember episodes such as childbirth by the extreme of pain, not the duration.

Life decision: do we live life for the present experience, or the anticipated memories? Are we hedonists, or Japanese/German tourists photographing everything to better enjoy the memories?
3,205 people found this helpful

Top critical review

ahall
3.0 out of 5 stars - Repetitive
Reviewed in the United States on October 3, 2017
First, for reasons explained below I would not buy this as an audio book.

I have mixed feelings about this book for various reasons. The first 200 pages (Parts 1 and 2) are heavily focused on the author trying to convince the reader that it is better to think statistically rather than instinctively / intuitively. After citing countless studies to support his premise, the author (very briefly) in Chapter 21 admits that “formulas based on statistics or on common sense” are both good ways to develop valid algorithms – doesn’t common sense fit into instinct or intuition? Later in the same chapter the author concedes that intuition adds value, but only to the extent that the individual bases it on sufficient research. To me, the way most of the book was written, especially in Parts 1 and 2, was a little over the top. The chapters are short, and each one cites at least one study that the author or someone else performed. It becomes example after example, and redundant. The beginning chapters read as if the author put a group of journal articles together to develop part of the book. Don’t get me wrong, many of the studies are really interesting and I find them very helpful; I just believe it became a little redundant. There is also evidence that many of the studies referenced in this book could not be reproduced, casting further doubt on the evidence supporting the author’s premise.

Furthermore, the book is very interactive with the reader, and some parts are a little condescending. For example, in the Introduction, the author poses a question asking whether a personality description suggests the person in question is a farmer or a librarian. Rather than allowing that the multitude of readers may come up with different responses, the author states “Did it occur to you that there are more than 20 male farmers.” While I understand where the author was going with the question, he presumed that readers would answer only one way, and this recurs throughout the book. Another example, in Chapter 16, assumed that the reader came up with the wrong answer and even stated that the most common answer to this question is wrong; however, the author does not explain how to come up with the correct answer.

Since this book is very interactive, I wouldn’t purchase the audio book. I have both the hard copy and the audio book and noticed that there were a few mistakes between the two. Sometimes the mistake was quite minimal, such as words being flip-flopped, but at the end of Chapter 17 the author asks a question which requires some thought and work by the reader. The total in the audiobook was completely off: instead of stating the total as 81 million (as in the hard copy), the audio book read it as 61 million, and the total for another part of the same example was 67.1 million in the audio book instead of 89.1 million as the hard copy stated.

All in all, a good part of the book is intriguing. The author has clearly conducted extensive research throughout his career and was able to present much of it in this book in a form comprehensible to non-econ and non-psychology readers.
752 people found this helpful

33,508 total ratings, 6,326 with reviews


From the United States

Graham H. Seibert
5.0 out of 5 stars Annotations on Kahneman's table of contents - a survey of logic and illogic
Reviewed in the United States 🇺🇸 on March 15, 2012
Verified Purchase
When you come late to the party, writing the 160th review, you have a certain freedom to write something as much for your own use as for other readers, confident that the review will be at the bottom of the pile.

Kahneman's thesis is that the human animal is systematically illogical. Not only do we mis-assess situations, but we do so following fairly predictable patterns. Moreover, those patterns are grounded in our primate ancestry.

The first observation, giving the title to the book, is that eons of natural selection gave us the ability to make a fast reaction to a novel situation. Survival depended on it. So, if we hear an unnatural noise in the bushes, our tendency is to run. Thinking slow, applying human logic, we might reflect that it is probably Johnny coming back from the Girl Scout camp across the river bringing cookies, and that running might not be the best idea. However, fast thinking is hardwired.

The first part of the book is dedicated to a description of the two systems, the fast and slow system. Kahneman introduces them in his first chapter as system one and system two.

Chapter 2 talks about the human energy budget. Thinking is metabolically expensive; 20 percent of our energy intake goes to the brain. Moreover, despite what your teenager tells you, dedicating energy to thinking about one thing means that energy is not available for other things. Since slow thinking is expensive, the body is programmed to avoid it.

Chapter 3 expands on this notion of the lazy controller. We don't invoke our slow thinking, system two machinery unless it is needed. It is expensive. As an example, try multiplying two two-digit numbers in your head while you are running. You will inevitably slow down. NB: Kahneman uses the example of multiplying two digit numbers in your head quite frequently. Most readers don't know how to do this. Check out "The Secrets of Mental Math" for techniques. Kahneman and myself being slightly older guys, we probably like to do it just to prove we still can. Whistling past the graveyard - we know full well that mental processes slow down after 65.

Chapter 4 - the associative machine - discusses the way the brain is wired to automatically associate words with one another and concepts with one another, and a new experience with a recent experience. Think of it as the bananas vomit chapter. Will you think of next time you see a banana?

Chapter 5 - cognitive ease. We are lazy. We don't solve the right problem, we solve the easy problem.

Chapter 6 - norms, surprises, and causes. A recurrent theme in the book is that although our brains do contain a statistical algorithm, it is not very accurate. It does not understand the normal distribution. We are inclined to expect more regularity than actually exists in the world, and we have poor intuition about the tail ends of the bell curve. We have little intuition at all about non-Gaussian distributions.

Chapter 7 - a machine for jumping to conclusions. He introduces a recurrent example. A ball and bat together cost $1.10. The bat costs one dollar more than the ball. How much does the ball cost? System one, fast thinking, leaps out with an answer which is wrong. It requires slow thinking to come up with the right answer - and the instinct to distrust your intuition.

Chapter 8 - how judgments happen. Drawing parallels across domains. If Tom was as smart as he is tall, how smart would he be?

Chapter 9 - answering an easier question. Some questions have no easy answer. "How do you feel about yourself these days?" Is harder to answer than "did you have a date last week?" If the date question is asked first, it primes an answer for the harder question.

Section 2 - heuristics and biases

Chapter 10 - the law of small numbers. In the realm of statistics there is a law of large numbers. The larger the sample size, the more accurate the statistical inference from measuring them. Conversely, a small sample size can be quite biased. I was in a study abroad program with 10 women, three of them over six feet. Could I generalize about the women in the University of Maryland student body? Conversely, I was the only male among 11 students and the only one over 60. Could they generalize anything from that? In both cases, not much.

Chapter 11 - anchors. A irrelevant notion is a hard thing to get rid of. For instance, the asking price of the house should have nothing to do with its value, but it does greatly influence bids.

Chapter 12 - the science of availability. If examples come easily to mind, we are more inclined to believe the statistic. If I know somebody who got mugged last year, and you don't, my assessment of the rate of street crime will probably be too high, and yours perhaps too low. Newspaper headlines distort all of our thinking about the probabilities of things like in and terrorist attacks. Because we read about it, it is available.

Chapter 13 - availability, emotion and risk. Continuation.

Chapter 14 - Tom W's specialty. This is about the tendency for stereotypes to override statistics. If half the students in the University area education majors, and only a 10th of a percent study mortuary science, the odds are overwhelming that any individual student is an education major. Nonetheless, if you ask about Tom W, a sallow gloomy type of guy, people will ignore the statistics and guess he is in mortuary science.

Chapter 15 - less is more. Linda is described as a very intelligent and assertive woman. What are the odds she is a business major? The odds that she is a feminist business major? Despite the mathematical impossibility, most people will think that the odds of the latter are greater than the former.

Chapter 16 - causes trump statistics. The most important aspect of this chapter is Bayesian analysis, which is so much second nature to Kahneman that he doesn't even describe it. The example he gives is a useful illustration.
* 85% of the cabs in the city are green, and 15% are blue.
* A witness identified the cab involved in a hit and run as blue.
* The court tested the witness' reliability, and the witness was able to correctly identify the correct color 80% of the time, and failed 20% of the time.
First, to go to the point. Given these numbers, most people will assume that the cab in the accident was blue because of the witness testimony. However, if we change the statement of the problem so that there is a 20% chance that the blue identification of the color was wrong, but 85% of the cabs involved in accidents are green, people will overwhelmingly say that the cab in the accident was a green madman. The problems are mathematically identical but the opinion is different.
Now the surprise. The correct answer is that there is a 41% chance that the cab involved in the accident was blue. Here's how we figure it out from Bayes theorem.
If the cab was blue, a 15% chance, and correctly identified, an 80% chance, the combined probability is .15 * .8 = .12, a 12% chance
If the cab was green, an 85% chance, and incorrectly identified, a 20% chance, the combined probability is .85 * .2 = .17, a 17% chance
Since the cab had to be either blue or green, the total probability of it being identified as blue, whether right or wrong, is .12 + .17 = .29. In other words, this witness could be expected to identify the cab as blue 29% of the time whether she was right or wrong.
The chances she was right are .12 out of .29, or 41%. Recommend that you cut and paste this, because Bayes theorem is cited fairly often, and is kind of hard to understand. It may be simple for Kahneman, but it is not for his average reader, I am sure.

Chapter 17 - regression to the mean. If I told you I got an SAT score of 750 you could assume that I was smart, or that I was lucky, or some combination. The average is only around 500. The chances are little bit of both, and if I take a test a second time I will get a lower score, not because I am any stupider but because your first observation of me wasn't exactly accurate. This is called regression to the mean. It is not about the things you are measuring, it is about the nature of measurement instruments. Don't mistake luck for talent.

Chapter 18 - taming intuitive predictions. The probability of the occurrence of an event which depends on a number of prior events is the cumulative probability of all those prior events. The probability of a smart grade school kid becoming a Rhodes scholar is a cumulative probability of passing a whole series of hurdles: studying hard, excelling in high school, avoiding drink and drugs, parental support and so on. The message in this chapter is that we tend to overestimate our ability to project the future.

Part three - overconfidence

Chapter 19 - the illusion of understanding. Kahneman introduces another potent concept, "what you see is all there is," thereinafter WYSIATI. We make judgments on the basis of the knowledge we have, and we are overconfident about the predictive value of that observation. To repeat their example, we see the tremendous success of Google. We discount the many perils which could have totally derailed the company along the way, including the venture capitalist who could have bought it all for one million dollars but thought the price was too steep.

Chapter 20 - The illusion of validity. Kahneman once again anticipates a bit more statistical knowledge than his readers are likely to have. The validity of a measure is the degree to which an instrument measures what it purports to measure. You could ask a question such as whether the SAT is a valid measure of intelligence. The answer is, not really, because performance on the SAT depends quite a bit on prior education and previous exposure to standardized tests. You could ask whether the SAT is a valid predictor of performance in college. The answer there is that it is not very good, but nonetheless it is the best available predictor. It is valid enough because there is nothing better. To get back to the point, we are inclined to assume measurements are more valid than they are, in other words, to overestimate our ability to predict based on measurements.

Chapter 21 - intuitions versus formulas. The key anecdote here is about a formula for predicting the quality of a French wine vintage. The rule of thumb formula beat the best French wine experts. Likewise, mathematical algorithms for predicting college success are as least as successful, and much cheaper, than long interviews with placement specialists.

Chapter 22 - expert intuition, when can we trust it? The short answer to this is, in situations in which prior experience is quite germane to new situations and there is some degree of predictability, and also an environment which provides feedback so that the experts can validate their predictions. He would trust the expert intuition of a firefighter; there is some similarity among fires, and the firemen learns quickly about his mistakes. He would not trust the intuition of a psychiatrist, whose mistakes may not show up for years.

Chapter 23 - the outside view. The key notion here is that people within an institution, project, or any endeavor tend to let their inside knowledge blind them to things an outsider might see. We can be sure that most insiders in Enron foresaw nothing but success. An outsider, having seen more cases of off-balance-sheet accounting and the woes it can cause, would have had a different prediction.

Chapter 24 - the engine of capitalism. This is a tour of decision-making within the capitalist citadel. It should destroy the notion that there are CEOs who are vastly above average, and also the efficient markets theory. Nope. The guys in charge often don't understand, and more important, they are blind to their own lack of knowledge.

Part four - choices

This is a series of chapters about how people make decisions involving money and risk. In most of the examples presented there is a financially optimal alternative. Many people will not find that alternative because of the way the problem is cast and because of the exogenous factors. Those factors include:

Marginal utility. Another thousand dollars is much less important to a millionaire than a wage slave.

Chapter 26 - Prospect theory: The bias against loss. Losing $1000 causes pain out of proportion to the pleasure of winning $1000.

Chapter 27 - The endowment effect. I will not pay as much to acquire something as I would demand if I already owned it and were selling.

Chapter 28 - Bad Events. We will take unreasonable risk when all the alternatives are bad. Throwing good money after bad, the sunk-cost effect, is an example.

Chapter 29 - The fourfold pattern. High risk, low risk, win, lose. Human nature is to make choices which are not mathematically optimal: buying lottery tickets and buying unnecessary insurance.
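One driver of the fourfold pattern is probability weighting: small chances are overweighted and near-certainties underweighted. A hedged sketch using the weighting function and gamma estimate from Tversky and Kahneman's 1992 paper (illustrative parameters, not figures from the book itself):

```python
# Probability-weighting sketch behind the fourfold pattern. The functional
# form and gamma = 0.61 are from Tversky & Kahneman (1992); illustrative only.

def weight(p, gamma=0.61):
    """Decision weight a Human attaches to an objective probability p."""
    num = p ** gamma
    return num / (num + (1 - p) ** gamma) ** (1 / gamma)

print(round(weight(0.01), 3))  # a 1% chance is felt as ~5.5%: lottery tickets sell
print(round(weight(0.99), 3))  # a 99% chance is felt as ~91%: needless insurance sells
```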

Chapter 30 - rare events. Our minds are not structured to assess the likelihood of rare events. We overestimate the visible ones, such as tsunamis and terrorist attacks, and ignore the ones of which we are unaware.

Chapter 31 - Risk policies. This is about systematizing our acceptance of risk and making policies. As a policy, should we buy insurance or not, recognizing that there are instances in which we may override the policy? As a policy, should we accept the supposedly lower risk of buying mutual funds, even given the management fees?

Chapter 32 - keeping score. This is about letting the past influence present decisions. The classic example is people who refuse to sell for a loss, whether shares of stock or a house.

Chapter 33 - reversals. We can let a small negative outweigh a large positive. One cockroach in a crate of strawberries.

Chapter 34 - Frames and reality. How we state it. 90% survival is more attractive than 10% mortality.

Part V. Two selves: Experience and memory

Our memory may be at odds with our experience at the time. Mountain climbing or marathon running are sheer torture at the time, but the memories are exquisite. We remember episodes such as childbirth by the extreme of pain, not the duration.

Life decision: do we live life for the present experience, or the anticipated memories? Are we hedonists, or Japanese/German tourists photographing everything to better enjoy the memories?
3,205 people found this helpful


John B. Robb
5.0 out of 5 stars Required reading for educated people, but falls short as a model of mind
Reviewed in the United States 🇺🇸 on August 31, 2015
Verified Purchase
This is an invaluable book that every person who considers him/herself educated should read - even study. Indeed, it is a scandal that mastering the material in this book isn't considered an essential component of a high school education.

The author was awarded the Nobel in Economics for his work on what he calls decision theory, or the study of the actual workings of the typical human mind in the evaluation of choices, and the book itself presents the findings of many decades of psychological studies that expose the endemic fallacious thinking that we are all prone to, more or less. The lives of all of us could be improved by lessons learned from this book, not just individually, through self-education, but also on the large scale, if the large scale decision makers in this society in and out of government could be educated as well. In fact, it is largely because these large scale decision makers are no better than the rest of us in their ability to think straight and plan well, that society is as screwed up as it is, and that essentially all of its institutions are diseased and corrupt. The lesson there, however, is that decision making needs to be returned to the individual - that the powers that be need to be deprived of their powers to mess up the lives of the rest of us.

Despite the many virtues of this book - it is well-written, engaging, and its academic author is reasonably restrained about his tribe's tendency to blather in abstractions - it is a bit disappointing at the very end, when the author proves unable to synthesize all his material into a comprehensive theory of the thinking and deciding mind - or at least into a set of carefully formulated principles that provide a succinct summary of human thinking, both typical and ideal.

Kahneman uses throughout a construct that implies that we are of two minds: System 1 is the fast-thinking, intuitive mind, prone to jumping to conclusions; while System 2 is the slow-thinking, analytical mind, brought into play, if at all, only to critique and validate the conclusions that we have jumped to. System 2, we are told, is lazy, and it often just rubber-stamps the snap judgements of System 1 or, if pressed, rationalizes them, instead of digging critically as well as constructively into the complex underpinnings of the material and sorting them out as best it can.

Instead of working this construct up into a comprehensive model of mind, K merely uses it as a loose schema for representing the kinds of thinking thought to underlie the results derived from the many psychological experiments that he here reports on. This neglect raises the question at many points as to just how well the experimenters have really understood the thinking that underlies the behavior of their subjects. But this, I am sorry to say, is a weakness of virtually all psychological experimentation, which is still just beginning to come to grips with the complexity and varieties of cognitive style of the human mind.

What Kahneman does do, however, is to provide convenient labels for many characteristic types of fallacious thinking, although again, the exact role of System 1 and System 2, and their interaction, is inadequately explicated. Instead, towards the end of the book, another, somehow related, but nominally independent theme is developed: the disturbing divergence between the experiencing self and the remembering self. This is in itself such an interesting and important idea, so pregnant with both psychological and philosophical implications, that it could have used a fuller treatment, and again, there is no coherent integration of this theme with the System1/System2 construct.

The idea here is that our present experience includes our most salient memories of previous experiences - for example the highlights of past vacations, or out of the ordinary episodes of our lives. Somewhat surprisingly, though, what we remember is a systematic distortion of the actual experience. Our memory collapses the duration of various aspects of our experience and highlights only the peak moment(s) and the final moments, perhaps with a nod toward the initial presentation of the experience. And this systematic distortion of the actual experience in all its fullness can lead us to make irrational and detrimental choices in deciding whether to repeat the experience in the future. Thus, a bad ending to an otherwise wonderful experience can spoil the whole thing for us in memory, and cause us to avoid similar experiences in the future, even though by simply anticipating and improving the ending we might make the whole experience as wonderful as most of the original was. Likewise, subjects in experiments involving either long durations of pain, or much shorter episodes of pain with a higher peak, were consistently more averse to the latter than the former - that is, they overemphasized the way these presentations ended as a factor in judging them as a whole.
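The peak-end rule being summarized can be stated in a few lines; the pain ratings below are invented, patterned after the cold-water experiments Kahneman reports:

```python
# Minimal sketch of the peak-end rule and duration neglect: an episode is
# remembered roughly as the average of its worst moment and its final
# moment, with total duration largely ignored. Pain ratings are invented.

def remembered_pain(episode):
    """Peak-end approximation of how an episode will be remembered."""
    return (max(episode) + episode[-1]) / 2

short_trial = [4, 6, 8]        # ends at peak pain
long_trial = [4, 6, 8, 5, 3]   # strictly more total pain, but a milder ending

assert sum(long_trial) > sum(short_trial)                          # more suffering...
assert remembered_pain(long_trial) < remembered_pain(short_trial)  # ...remembered as less
```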

These are important findings that go to the heart of the question of how best to steer our course through life, but here is the only attempt at integration of this remembering vs experiencing self theme, and the System 1/System 2 theme, that I find in the final chapter, Conclusions:

"The remembering self is a construction of System 2. However, the distinctive features of the way it evaluates episodes and lives are characteristics of our memory. Duration neglect and the peak-end rule originate in System 1 and do not necessarily conform to the values of System 2".

There is more here, but it merely repeats the earlier analysis of the relevant experiments.

No evidence is presented as to the respective roles of System 1 and System 2 with respect to the laying down of memory, to its decay, or with respect to a recently discovered phenomenon: memory reconsolidation. Nor is any account taken of what has been learned, much of it in recent decades, about the interactions between short-, intermediate-, and long-term memory, or about the radically different modalities of episodic (picture-strip) and semantic (organized, abstracted) memory. Consequently, Kahneman's vague reference to the "characteristics of our memory" essentially ducks the question of what the remembering self is. I think that at best, the finding that the original experience is replaced by an abstract predicated on peak-end bias is an exaggeration, though there's no question that "duration neglect" is in operation, and a good thing too - unless K means by "duration neglect" not just the stretches of minimally changing experience (which have little memorial significance anyway) but even the consciousness of how long the edited-out parts were (a distinction never made in Chapter 35, where the theme of the remembering vs. the experiencing self is first taken up).

Speaking for myself anyway, I have a much fuller memory of my most important experiences than Kahneman seems to indicate. Naturally the highlights are featured, but what I tend to remember are representative moments that I took conscious note of at the time, as though making a psychological photograph. I remember these moments also because I bring them up from time to time when I'm thinking about that experience. For example, I'm thinking now of a long distance race I did in 2014 (a very tough half-marathon, with almost 2000' of climbing). I remember: the beginning section as well as the ending section; each of the rest stations; certain moments of each of the major hill climbs; at least one moment from each of the descents; and a number of other happenings during the almost three hour event. For me in this race, the peak experience occurred right at the end, when I all but collapsed, yet managed to stagger to the finish line. That ending does naturally come first to mind as a representation of the entire event, but it is merely the culmination of a long and memorable experience with many moving parts, and if I want, my remembering self can still conjure up many other moments, as well as a clear sense of the duration of each of the sections of the course.

Over many years most memories fade, and it's certainly reasonable to suppose that in extreme cases, where they are all but forgotten, only a single representative moment might be retained. However, if we can say anything for sure about memory it is this: we remember what we continue to think of and to use, and we do that precisely because this material has continuing importance to us. The recent research in memory reconsolidation tells us that when we do bring up memories only occasionally, we reinforce them, but we also edit and modify them to reflect our current perspectives, and sometimes we conflate them with other seemingly related knowledge that we've accrued. We are thus prone to distort our own original memories over time, in some cases significantly, but we may still retain much more of the original experience than just the peak and the end, and if we do reinterpret our memories in the light of more recent experience, that's not necessarily a bad thing. In any case, the memories that occur in the present may be said to be a joint project of the experiencing as well as the remembering self, which rather erodes the whole Two Selves concept that Kahneman first posited.

I do not mean to criticize the valuable evidential material in the book, and in general I think that Kahneman, and the other researchers and thinkers whom he quotes, have drawn reasonable conclusions from the experiments they report on. But ultimately the book, and the fields of psychology and brain neurophysiology generally, suffer in both coherence and meaningfulness because they aren't predicated on a more comprehensive theory of mind. It's the old story in science, first formulated by Karl Popper in his 1935 book, The Logic of Scientific Discovery: unless we approach the data with an hypothesis in mind - unless, indeed, we seek out data likely to be relevant to a particular hypothesis - we're not going to make any enduring progress in understanding that data in a comprehensively meaningful way, let alone be able to make falsifiable deductions about elements of the system for which we have at present no data. Popper's quotation from the German philosopher Novalis comes to mind - "Theories are nets: only he who casts will catch."

In the final, "Conclusions", chapter, K caricatures the abstract economists' model of homo economicus (man as a rational optimizer of his utility), contrasting it with the more sophisticated and experientially grounded model of psychologists such as himself. In keeping with his penchant for framing (or spinning) his presentation favorably to his own perspective, he calls the economists' model "Econ", and his own "Human". In fact, "Econ" was never meant to represent man in all his humanity, and Kahneman's Economics Nobel, recognizing his decision theory contributions to economics, was preceded by many other Nobels to economists who had been expanding the concept of the economic actor into psychological territory for decades. In fact, the essential view of the Austrian economists dating from the 1920s (von Mises, Hayek, and their predecessors) is that economics is in the end wholly dependent on psychology because it is predicated on the unknowable, unquantifiable subjective value preferences of humans, acting individually and in concert. Cautious generalizations can perhaps be made about human psychology in general, but I think that on the whole the Austrians have been a bit wiser in their restraint than Kahneman and his many, and mostly lesser, pop psychology compatriots have proved in their often sensationalist extrapolations from lab experiments.

Here is an example, I think, of Kahneman over-reaching. He speaks repeatedly of the laziness of System 2 and its foot-dragging reluctance to get involved in the thinking process, but in the real world snap judgements are good enough for immediate purposes, and the better part of rationality may be to go with one's fast-thinking, intuitive System 1: indeed, Kahneman acknowledges this himself in passing, both in his beginning and his ending, but this isn't enough to counterbalance the overall argument of his book.

Kahneman also, in his final chapter, speculatively extends his findings into the political sphere (his liberal Democratic Party bias has already been made clear by gratuitous and somewhat annoying usage of salient modern politicians in examples), but not to any great effect.

Kahneman advocates "libertarian paternalism" consisting of government programs that people are enrolled in automatically unless they opt out by checking a box on forms - thus manipulating the presentation frame so as to trick them into signing on to what some government bureaucrat thinks is good for them. Of course, as long as people are allowed to opt out, one can't call the choice here anything but libertarian, though to be consistent with their socialist mores, liberals like Kahneman really ought to object to such practices as being manipulative advertising. This libertarian finds nothing objectionable about the way such a choice might be presented - after all, the average man, if adequately educated and prepared for the real world, should have no trouble seeing through the frames. What is not only paternalistic, but totalitarian in spirit, is the extortion of taxpayer money to finance such government programs in the first place.

Somehow, it fails to occur to Kahneman that most people could be trained to recognize and avoid fallacious thinking during all those years of enforced and mostly wasteful schooling - just as most people can be trained to recognize the Müller-Lyer illusion for what it is. IMO every high school graduate should be required to learn to recognize and avoid the paradigm cases of fallacious thinking presented in Thinking, Fast and Slow, and this material could profitably be expanded to cover the many rhetorical tricks used by the manipulators and spinmeisters, both public and private, who batten off of our society. With such training in critical thinking, and with the reintroduction of enough honesty and rigor to bring high school graduates up to the 12th-grade reading and writing proficiencies that were routine in the 1950s, the need for college as life preparation would be altogether obviated, and most young people could avoid wasting their early years in college piling up debt, and get on with their work and/or their self-education, as they chose.
49 people found this helpful


Paul F. Ross
4.0 out of 5 stars A scientist's look at how we think
Reviewed in the United States 🇺🇸 on January 14, 2012
Verified Purchase

Review of Kahneman's Thinking, fast and slow by Paul F. Ross

A friend called Kahneman's book to my attention. I purchased it immediately and put it at the top of my stack of "to be read" books. Having finished the read just a day ago, writing my impressions immediately is important to catching the details of my observations, the yin and yang of this book. The work, indeed, captures my high interest. Kahneman, psychologist at Princeton University, is known for his work on understanding people as they interact with "the economy," work Kahneman did in concert with Amos Tversky. The work won the 2002 Nobel prize in economics for having persuaded economists that humans, contrary to the assumptions long a part of economists' theories of people as economic actors, do not always consider all the information available to them and make choices in their own individual best interests with respect to wealth.

_____________________________________________________________________________________
Kahneman, Daniel Thinking, fast and slow 2011, Farrar, Straus and Giroux, New York NY, vi + 499 pages
_____________________________________________________________________________________

The difference between Econs and Humans, these labels adopted from today's economist Richard Thaler, is one of the continuing themes of Kahneman's story. Daniel Bernoulli (d. 1782, he and Adam Smith working at the same time ... there being not a single reference in Kahneman to Adam Smith) proposed that people make their economic choices so as to maximize the decision-maker's own wealth. Kahneman sees too many economists today, including several from the highly regarded (University of) Chicago School, as accepting this erroneous view of economic actors (as Econs) and failing to understand that people (as Humans) are influenced in their choices by many circumstances well beyond the decision's outcome as it affects the decision-maker's wealth. This book is Kahneman's explanation for the general reader about the work which won the Nobel prize.

In thirty-nine chapters divided into five parts, Kahneman makes distinctions between Econs and Humans, between associative memory that thinks fast using simplified relationships (he calls it System 1) and analytical mental processes which think slowly and, prodded into doing work, can cope with complexity (he calls it System 2), between heuristics and biases in thinking, between choices as Econs make choices and choices as Humans make choices, between the influence of the decision's frame and the details of the choice itself, between broadly framed decision-making and narrowly focused decision-making. The publisher reprints as appendices the 1974 paper in Science by Tversky and Kahneman (Judgment under uncertainty: heuristics and biases) and the 1984 paper in American Psychologist by Kahneman and Tversky (Choices, values, and frames) which were key steps in winning widespread recognition for their work. "Prospect theory," described in their 1984 paper, was a step in their thinking that showed the sharp distinction Humans make between gains and losses. Kahneman and Tversky can be classified among the behavioral economists who do experiments with people in choice-making exercises to uncover the regularities and irregularities in, and the influences upon, people's decisions.

I thoroughly enjoy this work by Kahneman (Tversky died in 1996) especially because it shows so distinctly the observed-in-experiments, evidence-based foundations for psychology and psychology's claim to understanding human behavior. I enthusiastically recommend the read for those thoughtfully interested in people and people-effects on our individual lives as well as our shared lives in organizations, governments, and cultures. Walk up and down the aisles of the local large bookstore looking for psychology and one finds self help books that present "ways to think about" our lives, implying by their very structure and presentation that all one needs to do to understand self and others is to take thought. That's pure nonsense, yet that's the modal view - actually it's almost the only view - of how psychological science gets built. For those willing to read the footnotes, Kahneman thoroughly undoes this view. Psychology, as all psychologists should know, is built upon empirical investigations of human behavior. It is something of a bitter joke that it should take two psychologists to persuade economists that human beings make economic decisions using inputs well beyond the wealth-maximizing rule posited by Bernoulli in the 18th century. Yet, as Kahneman states so clearly in his chapter titled Conclusions, there are still many economists who continue to use the Econ model of the human being as the individual economic actor. The subject of economics is taught in universities with Econ as the decision-making universal actor for all humankind. The Econ model is simply incomplete as a model for the individual actor and obviously even has its difficulties in accurately forecasting group behavior as we are learning through hard lessons in our inadequately understood attempts in Keynesian economics following the 2008-2009 economic downturn, efforts we make as we grope for economic recovery.

Begin reading Kahneman's book by first reading the chapter on Conclusions, the last in the book. While that may prove a bit difficult because of the unfamiliar concepts presented as key words or phrases (Econs, Humans, System 1, System 2, prospect theory, utility theory, etc. etc.), the reader can quickly learn some of the handles that Kahneman uses and, particularly, the view of politics and society from which he thinks and writes. Then, of course, read the Conclusions again as you finish the book.

Very much admiring this work by the behavioral economists and by the neurobiologists in exploring human behavior with respect to economic choices, I'm also critical. Psychologists and thoughtful readers need to know not only the achievements of this work but also its shortfalls ... thus perhaps avoiding frustrations in the reading and errors of overgeneralization for the findings.

First, one of the rules of science is that it seek the simplest possible explanations for the phenomena being explored. Kahneman has the difficult task of speaking from and to both psychological and economics backgrounds, sometimes also adding neuroscience. In taking on that task, he refers to concepts and data from all three disciplines. Sometimes the concepts are closely related, almost duplicates. Needing to communicate with two or three highly specialized audiences, Kahneman uses the several terminologies and offers the reader no charts or tables listing, classifying, and simplifying the concepts. A tour through the book's index turns up ...

affect heuristic, anchoring index, associative coherence, associative memory, availability cascades, baseline predictions, base rates, broad framing, broken-leg rule, causal base rates, causal stereotypes, cognitive ease, cognitive illusion, cognitive strain, coherence, competition neglect, confirmation bias, conjunction fallacy, decision utility, decision weights, decorrelated errors, default options, disposition effect, duration neglect, duration weighting, econs, ego depletion, emotional coherence, endowment effect, evaluability hypothesis, expectation principle, expected utility theory, experienced utility, flowers syllogism, focusing illusion, fourfold pattern, halo effect, happiness, hedonimeter, hubris hypothesis, ideomotor effect, indifference map, joint evaluations, judgment heuristics, law of small numbers, less-is-more pattern, loss aversion, loss aversion ratio, mental shotgun, narrative fallacy, narrow framing, negativity dominance, neuroeconomics, norm theory, one-sided evidence, optimistic bias, outcome bias, peak-end rule, planning fallacy, possibility effect, precautionary principle, preference reversals, pretentiousness language, priming, probability neglect, professional stereotypes, prospect theory, psychopathic charm, rational-agent model, reciprocal priming, recognition-primed decision, reference class forecasting, retrievability of instances, risk assessment, risk aversion, risk seeking, selves, single evaluations, somatic marker hypothesis, structured settlements, sum-like variables, sunk-cost fallacy, System 1, System 2, theory-induced blindness, utility theory, validity, vividness, wealth, well-being, WYSIATI

... and this is indeed a complex mapping of what is intended to be a joined and simplified configuration explaining human thinking particularly as it is exercised in decisions about economic choices. I respect Kahneman's intent. His characterization of System 1 and System 2 is an important and simplified communication. I think he could do an even better job at simplifying what he is teaching, particularly by using tables and other means for classifying, grouping, his concepts. Of course, other bright people have similar problems. The physicists have their Standard Model of seventeen particles and forces classified as fermions and bosons, quarks and leptons, and that is not simple enough to allow the rest of us to keep firmly in mind what these seventeen particles-forces and several classifications mean ... nor does it explain mass, astronomy's dark matter, or even gravity! Communicating science simply often is no easy task. However it is an essential task if science is to have any use beyond mental stimulation. This reader demands that science, in its most valuable form, be useful. Communicating science simply is a task that must be mastered on the way to making one's science maximally useful.

Kahneman, as do the behavioral economists generally, is reporting experiments that have been run, primarily, with college students as participants ... and not even with a representative group of college students. They come heavily from UC Berkeley, U of Oregon, Michigan U ... very selective places where Kahneman and Tversky have taught and their graduate students have done work in the mode of their teachers. A few years ago I entered a conversation with a respected colleague, a colleague much better informed than am I on these matters, at The Ohio State University. "Psychology built upon studies with college sophomores must be questioned," I insisted. "Not so," he responded. "The work frequently generalizes well to the general population." The work Kahneman highlights so well in this book is work built heavily upon experiments run with college students as the participants. Despite these assurances from very able scholars, I cannot brush from my memory the very impressive account that Herrnstein and Murray (1994) presented in their controversial The bell curve which followed a representative cohort of American youth through a rather lengthy part of their life span (maybe age 12 to age 42 ... I don't recall accurately and, as I write now, am setting aside the task of looking it up). The work showed that life as experienced by youth at the high end of the bell curve on intelligence was a very, very different experience from life experienced by youth at the low end. Those at the low end seemed to find life a blooming, buzzing confusion and they got into lots of trouble. Their minds worked differently from the minds of their peers who found themselves to be college eligible and, in many instances, got that education. The evidence was overwhelmingly clear in life's outcomes (current income, marriage, need for and use of social support networks, use of drugs, crime rates, divorce, time in jail, . . .). 
The people in this longitudinal study had not yet reached the ends of their lives! Having taken the Herrnstein and Murray findings to heart, I think I might be relatively safe in transferring what I learn in Kahneman's book to the professionals and executives I have met in working with organizational leaders, but I'm still not convinced, despite my colleague's assurance from his superior and more current knowledge, that findings learned with college students as the participants can be assumed to be true for all human beings.

The behavioral and management sciences (psychology and economics included) receive a tiny fraction of the world's R&D funding. They are regarded by the college educated populations of the world as second class sciences, at best, and probably not sciences at all. Kahneman's book is a stout statement very much, and appropriately, in the face of this non-knowledge. But to win public support and public funding, to have what is known by these sciences at work in reducing problems that the world faces every day, to see these sciences contribute practices that enhance human well-being, it is absolutely essential that what the sciences know be communicated to opinion leaders - those who escaped college without having learned much about these sciences - in a way that is accurate, clear, and understandable. As shown by the list of ideas from the index of Kahneman's book, I think Kahneman has not yet formed the clearest and most communicable framework for his core ideas and that he, like the rest of us, needs to spend more time practicing communication with able people (non-scientist students and adults) and converting what he learns into written form. Our behavioral and management sciences have much they already know that could significantly relieve problems and add immensely to well-being. Accomplishing those prospective outcomes depends upon us scientists communicating what we know to audiences like those Kahneman is addressing in this book.

Read this book. You'll find you've profited.

Bellevue, Washington
19 December 2011

References

Herrnstein, Richard J. and Murray, Charles The bell curve: Intelligence and class structure in American life 1994, The Free Press, New York NY

Kahneman, Daniel Thinking, fast and slow 2011, Farrar, Straus and Giroux, New York NY

Copyright (c) 2011 by Paul F. Ross All rights reserved.
8 people found this helpful


David Zetland
4.0 out of 5 stars Long but FULL of insights for economists (and regular folks :)
Reviewed in the United States 🇺🇸 on July 10, 2014
Verified Purchase
Daniel Kahneman deserved to win the Nobel Prize in Economics for his contributions to "behavioral economics." He and collaborator Amos Tversky (who died before he could receive the award) used psychological insights to explain "irrational" behavior that economists had preemptively (and unrealistically) dismissed. Their most important contributions concern "cognitive bias" (we focus on some -- not all -- costs and benefits) and "prospect theory" (we put more weight on potential losses than potential gains).

I read Kahneman's 2011 book over several months because it was long (499 pages) and thoroughly repetitive.

My top-line recommendation that that you read this insightful book, but I suggest you take a chapter per day (or week) to allow yourself time to digest -- and experience -- the ideas. (Alternatively, print this review and read one note per day! :)

Here are some notes on Kahneman's ideas:

Kahneman suggests that we process decisions by instinct (System 1 thinking, or "guts") or after consideration (System 2 thinking, or "brains"). The important point is that each system is right for some situations but not others. Order food that "feels right" but don't buy a car that way. A car (or job or house) decision involves many factors that will interact and develop over years. We cannot predict all these factors, but we can give them appropriate weights with care.
Salespeople appeal to your guts when they want you to trust them. You should rely on brains to evaluate their promises.
We make better gut decisions when we're happy but worse ones when we're sad or angry.
Kahneman says we often fail to look beyond "what we see is all there is" when considering a situation. This leads to misdirected gut responses. (Nassim Taleb's Fooled by Randomness addresses this bias.) People over-estimate the risk of violent death because the media loves exotic, bloody stories.
People judge "competence" (strong, trustworthy) over "likability" when evaluating candidates on looks -- and many voters do choose candidates based on looks.
People believe that a beneficial technology's risk is lower and that low risk technology brings more benefits. This may explain why most people don't care about the risks of driving cars (far more dangerous than flying in airplanes) or using cell phones. It also suggests that policy changes (e.g., higher prices for water) will be more acceptable when they are small and reversible. After the sky does not fall, the "low risk" strategy can be expanded.
The measuring stick of risk (relative to what?) affects people's perceptions of risk.
Bayesian reasoning: (1) anchor your judgement on the probability of an outcome (given a plausible set of repetitions), then (2) question the accuracy of your belief in that probability as outcomes appear. Put differently, take a stand and reconsider it as new data arrive.
People will pay more for a "full set of perfect dishes" than for the same set with extra damaged dishes -- violating the "free disposal" assumption of economic theory (we can always dump excess). This bias explains why a house with freshly painted, empty rooms will sell for more than one with fresh paint but old furniture.
Stereotyping is bad from a social perspective, but we should not ignore the information included in group statistics. Looking from the other direction, people are far TOO willing to assume the group behaves as one individual. (When I was traveling, I learned "not to judge a person by their country nor a country by one person.")
Passing the buck: People "feel relieved of responsibility when they know others have heard the same request for help." This fact explains the importance of putting one person in charge, asking that person for a decision, and setting a deadline to evaluate the decision's impact.
"Regression to the mean" happens when average performance replaces a "hot streak." It's not caused by burnout; it's caused by statistics. (Try to get a "hot streak" in coin flips.)
"Leaders who have been lucky are not punished for taking too much risk... they are credited with flair and foresight" [p204]. Two of three mutual funds underperform the market in any given year, but lucky managers (and their investors) cling to their "illusion of skill."
Successful stock traders find undervalued companies, not good companies whose shares may already be overpriced.
Philip Tetlock interviewed 284 people "who made their living commenting or offering advice on economic and political trends." Their predictions could have been beaten by dart-throwing monkeys -- even within their specializations. They offered excuses to explain their "bad luck" (see Note 8).
"Errors of prediction are inevitable because the world is unpredictable" [p220].
Algorithms are statistically superior to experts when it comes to diagnosing medical, psychological, criminal, financial and other events in "uncertain, unpredictable" domains. See my paper on real estate markets [pdf].
Simpler statistics are often better. Forget multivariate regressions. Use simple weights. For example: Marital stability = f (frequency of lovemaking - frequency of quarrels).
"Back-of-envelope is often better than an optimally weighted formula and certainly better than expert judgement" [p226].
Good (trustworthy) intuition comes from having enough time to understand the regularities in a "predictable environment," e.g., sports competition. "Intuition cannot be trusted in the absence of stable regularities in the environment" [p241].
The "planning fallacy" might lead one to believe the best-case prediction when events do not follow the best case path. Use less optimistic weights -- and read this book.
Overoptimism explains lawsuits, wars, scientific research and small business startups. Leaders tend to be overoptimistic, for better or worse. (Aside: I think men are more optimistic than women, which is why they discover more and die more often.)
Want to plan ahead? "Imagine it's one year in the future and the outcome of the plan was a complete disaster. Write a debrief on that disaster." This is useful because there are more ways to fail than succeed.
Our attitudes towards wealth are affected by our reference point. Start poor, and it's all up; start rich, and you may be disappointed. (If you own a house, decide whether you use the purchase price or its "value" during the bubble.) You're much happier going from $100 to $200 than from $800 to $900.
The asymmetry of losses/gains in prospect theory explains why it's harder for one side to "give up" the exact same amount as the other side gains. This explains the durability of institutions -- for better or worse -- and why they rarely change without (1) outside pressure of bigger losses or (2) huge gains to compensate for losses. It also explains why it's hard for invaders to win.
Economists often fail to account for reference points, and they dislike them for "messing up" their models. Economists whose models ignore context may misunderstand behavior.
We give priority to bad news, which is why losing $100 does not compensate for winning $100. Hence, "long-term success in a relationship depends on avoiding the negative more than seeking the positive" [p302].
People think it's fairer to fire a $9/hr worker and hire a $7/hr worker than reduce the wages of the $9/hr worker. That may not be a good way to go.
"The sunk cost fallacy keeps people too long in poor jobs, unhappy marriages and unpromising research projects" [p345].
"The precautionary principle is costly, and when interpreted strictly it can be paralyzing." It would have prevented "airplanes, air conditioning, antibiotics, automobiles..."
Framing and anchoring affect our perspectives. The American preference for miles per gallon (instead of liters per 100km) means they cannot accurately compare fuel efficiency among cars. This is not an accident as far as US car companies are concerned. (Another non-accident is raising fuel economy standards instead of gas taxes.)
People may choose a vacation according to what they PLAN to remember rather than what they will experience. That may be because we remember high and low points but forget their duration.
"The easiest way to increase happiness is to control use of your time. Can you find more time to do the things you enjoy doing?" (I have the freedom to write this review, but it gets tedious after 3 hours...)
"Experienced happiness and life satisfaction are largely determined by the genetics of temperament," but "the importance that people attached to income at age 18 anticipated their satisfaction with their income as adults" [pp400-401]. I am fortunate, I think, to have started life with low expectations. That makes it easier for me to make 1/3 the money in Amsterdam that I would in Riyadh because it's definitely better to be "poor" in Amsterdam.
That said, "the goals people set for themselves are so important to what they do and how they feel that... we cannot hold a concept of well-being that ignores what people want" [p402].
"Adaptation to a situation means thinking less and less about it" [p405].
[Paraphrased from p412]: Our research has not shown that people are irrational. It has clarified the shape of their rationality, which creates a dilemma: should we protect people against their mistakes or limit their freedom to make them? Seen from the other side, we may think it easier to protect people from the quirks of "guts" and laziness of "brains." (Hence my support for a ban on advertising.)
"Brains" may help us rationalize "guts" but they can also stop foolish impulses -- when we acknowledge the limits to our reason and the information we rely on.
"Gut" feelings can guide us well if we can tell the difference between clear and complicated circumstances.
"An organization is a factory that manufactures judgements and decisions" [p417]. It's important, therefore, to balance between its "gut" and "brain" functions.
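The Bayesian-reasoning note above ("take a stand and reconsider it as new data arrive") can be sketched numerically. This is a minimal illustration, not anything from the book: the 0.75 heads bias, the 50/50 prior, and the flip sequence are all invented for the example.

```python
# Minimal sketch of Bayesian updating: anchor on a prior, then revise the
# belief as each outcome arrives. All numbers here are invented.
def bayes_update(prior, p_data_if_h1, p_data_if_h0):
    """Posterior P(H1 | data) from Bayes' rule with two hypotheses."""
    num = prior * p_data_if_h1
    return num / (num + (1 - prior) * p_data_if_h0)

# H1: the coin is biased toward heads with P(heads) = 0.75.
# H0: the coin is fair with P(heads) = 0.5.  Prior: 50/50.
belief = 0.5
for flip in "HHTHHHHTHH":  # data arriving one observation at a time
    if flip == "H":
        belief = bayes_update(belief, 0.75, 0.5)
    else:
        belief = bayes_update(belief, 0.25, 0.5)

print(round(belief, 3))  # belief in the biased-coin hypothesis after 10 flips
```

Eight heads in ten flips pushes the belief from 0.5 to roughly 0.86 -- a stand taken and then reconsidered, one observation at a time.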

Bottom Line: I give this book FOUR STARS. Skip psychology and read it to understand yourself and others.
9 people found this helpful


Halordain
5.0 out of 5 stars How to reconcile our two mental systems, and capitalize on the best each has to offer
Reviewed in the United States 🇺🇸 on June 15, 2020
Verified Purchase
Thoughtful, applicable, insightful, entertaining. I enjoyed dissecting the numerous thought experiments and studies for merit that I could apply to my everyday thinking.

System 1 and System 2 form the backbone of this book.

System 1 is impulsive; it provides heuristic and intuitive guesses and reactions to stimuli without prompting, and it can't be turned off. What You See Is All There Is (WYSIATI). But it's also responsible for remarkably well-tuned intuitions, fast information-processing, "muscle memory," pattern-matching, intensity matching, face & situation recognition, and much of what makes human minds human. Author Daniel Kahneman points out situations in which System 1 is systematically poor, including statistics (like the difference between 0.01% and 0.001%), random events, and weighting time & duration in retrospect.

System 2 is more calculated, requires devoted effort and concentration in rationalizing & making decisions. It assigns value to past events, keeps score, questions bias, and carries out more intensive, deliberate calculations involving more data. It's like fetching data from main memory, rather than relying on the cache.

Kahneman's prose is rich with examples and experimental case studies, from visceral phenomena like pleasure vs. pain tolerance, and the tendency to derive the general from the specific rather than the specific from the general (the statistic), to logical fallacies like optimism in planning, misjudging statistics, underestimating sampling variation, and sunk costs. He gives simple, easy-to-remember names to many of the phenomena he observed in studies, like the peak-end rule, which describes how humans tend to judge the pain or pleasure of a past event based on an average of how it ended and the peak of the experience, neglecting duration. He also provides solutions and checks & balances that can help us reduce bias where possible. For example, to mitigate the planning fallacy, he writes:

1.) Identify an appropriate reference class.
2.) Obtain the statistics of the reference class. Use the statistics to generate a baseline prediction.
3.) Use specific information about the case to adjust the baseline prediction, if there are particular reasons to expect the optimistic bias to be more or less pronounced in this project than in others of the same type.

The author's life experiences, from serving as an evaluator in the Israeli defense forces to publishing papers as a professor, inform many of the studies and give them a human element and interest to which I could easily relate. I also appreciate his emotion, adding exclamation points to the observations that surprised him, or proved him wrong. He writes in an unassuming, humble, curious way that made each anecdote or cited study a joy to read.

Finally, I thought he -- and his editors -- divided the book brilliantly. Chapters are short, usually 8 to 16 pages, and they always concentrate on some nugget or fallacy that can later be referenced by a single term, like "anchoring," "regression to the mean," "the fourfold pattern," or "the halo effect." He builds up knowledge brick by brick, experiment by experiment, so that after a few nights of reading you begin to recognize fallacies and patterns in everyday life by the terms Kahneman has assigned to them.

My favorite part of each chapter is the end: a short section of quotations that describe and use key terms from the chapter, in an everyday, relatable way.

There are too many topics and quotes to list them all. Some of my favorites:

"The best we can do is a compromise: learn to recognize situations in which mistakes are likely and try harder to avoid significant mistakes when the stakes are high. The premise of this book is that it is easier to recognize other people's mistakes than our own."

"Self-control requires attention and effort."

"His System 1 constructed a story, and his System 2 believed it. It happens to all of us."

"They didn't want more information that might spoil their story. WYSIATI."

"We often compute much more than we want or need. I call this excess computation the mental shotgun. It is impossible to aim at a single point with a shotgun because it shoots pellets that scatter, and it seems almost equally difficult for System 1 not to do more than System 2 charges it to do."

"Money-primed people become more independent than they would be without the associative trigger."

"He was asked whether he thought the company was financially sound, but he couldn't forget that he likes their product."

"Our aim in the negotiation is to get them anchored on this number."

"Let's make it clear that if that is their proposal, the negotiations are over. We do not want to start there."

"When the evidence is weak, one should stick with the base rates."

"They added a cheap gift to the expensive product, and made the whole deal less attractive. Less is more in this case."

"System 1 can deal with stories in which the elements are causally linked, but it is weak in statistical reasoning."

"The experiment shows that individuals feel relieved of responsibility when they know that others have heard the same request for help."

"We can't assume that they will really learn anything from mere statistics. Let's show them one or two representative individual cases to influence their System 1."

"But those with the most knowledge are often less reliable. The reason is that the person who acquires more knowledge develops an enhanced illusion of her skill and becomes unrealistically overconfident."

"The question is not whether these experts are well trained. It is whether their world is predictable."

"The research suggests a surprising conclusion: to maximize predictive accuracy, final decisions should be left to formulas, especially in low-validity environments."

"In this view, people often (but not always) take on risky projects because they are overly optimistic about the odds they face."

"A well-run organization will reward planners for precise execution and penalize them for failing to anticipate difficulties, and for failing to allow for difficulties that they could not have anticipated --the unknown unknowns."

"She is the victim of a planning fallacy. She's assuming a best-case scenario, but there are too many different ways for the plan to fail, and she cannot foresee them all."

"He weighs losses about twice as much as gains, which is normal."

"Think like a trader! You win a few, you lose a few."

"Decision makers tend to prefer the sure thing over the gamble (they are risk averse) when the outcomes are good. They tend to reject the sure thing and accept the gamble (they are risk seeking) when both outcomes are negative."

"We are hanging on to that stock just to avoid closing our mental account at a loss. It's the disposition effect."

"The salesperson showed me the most expensive car seat and said it was the safest, and I could not bring myself to buy the cheaper model. It felt like a taboo tradeoff."

"Did he really have an opportunity to learn? How quick and how clear was the feedback he received on his judgments?"

"We want pain to be brief and pleasure to last. But our memory, a function of System 1, has evolved to represent the most intense moment of an episode of pain or pleasure (the peak) and the feelings when the episode was at its end. A memory that neglects duration will not serve our preference for long pleasure and short pains."

There are many, many more. Read the book for yourself and enjoy the wisdom!
18 people found this helpful


Herbert Gintis
4.0 out of 5 stars Lots of Truth, Too Much Hype
Reviewed in the United States 🇺🇸 on November 27, 2011
Verified Purchase
I was privileged to have Daniel Kahneman for many years as a member of my research group "MacArthur Network on the Origin and Nature of Norms and Preferences," where I came to appreciate his dedication, intelligence, and friendship. At last Kahneman has written a book for the public, and it is already a widely discussed best seller. I am convinced that the contributions of Kahneman and his coauthor Amos Tversky are fundamental and lasting, but I am concerned that his work not be seen as conveying the general message that "people are irrational."

I was motivated to write these hasty comments by the review of the book by Jim Holt in the New York Times Book Review (11/27/2011), p. 16ff. Holt writes "Although Kahneman draws only modest policy implications... others... go much further. [David Brooks, NY Times editorial writer], for example, has argued that Kahneman and Tversky's work illustrates `the limits of social policy'; in particular, the folly of government action to fight joblessness and turn the economy around." Of course, nothing of the sort follows from Kahneman and Tversky's work.

I know Kahneman and Tversky's work (hereafter KT) very well, but I am going through this book slowly, so I will add to my comments as they accumulate.

Psychologists have known for many years that humans make systematic visual mistakes, called "optical illusions." This does not elicit the general pronouncement that humans systematically err in their visual judgments. The same should apply to KT's results: the results are correct, but they should not be sloppily interpreted as saying that people are generally illogical and error-prone decision-makers.

First main point: I believe KT's work does not at all suggest that people are poor at making logical inferences. The experiments that might suggest this are generally misinterpreted. A particularly pointed example is the famous Linda the Bank Teller problem, first analyzed in Amos Tversky and Daniel Kahneman, "Extensional versus Intuitive Reasoning: The Conjunction Fallacy in Probability Judgment," Psychological Review 90 (1983): 293-315. Subjects are given the following description of a hypothetical person named Linda: "Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice and also participated in antinuclear demonstrations." The subjects were then asked to rank-order eight statements about Linda according to their probabilities. The statements included the following two: "Linda is a bank teller" and "Linda is a bank teller and is active in the feminist movement."

More than 80% of the subjects---graduate and medical school students with statistical training and doctoral students in the decision science program at Stanford University's business school---ranked the second statement as more probable than the first. This seems like a simple logical error because every feminist bank teller is also a bank teller. However, there is another interpretation under which the subjects are correct in their judgments. Let p and q be properties that every member of a population either has or does not have. The standard definition of "the probability that member x is p" is the fraction of the population for which p is true. But an equally reasonable definition is the probability that x appears in a random sample of the subset of the population for which p is true. According to the standard definition, the probability of p and q cannot be greater than the probability of p. But according to the second, the opposite inequality can hold: x might be more likely to appear in a random sample of individuals who are both p and q than in a random sample of the same size of individuals who are p.
In other words, the probability that a randomly chosen bank teller is Linda is probably much lower than the probability that a randomly chosen feminist bank teller is Linda. Put another way, the probability that a randomly chosen member of the set "feminist bank tellers" is Linda is greater than the probability that a randomly chosen member of the set "bank tellers" is Linda.

I believe my interpretation is by far the more natural. Moreover, why would the experimenters have included information about Linda's college behavior unless it were relevant? This behavior is completely irrelevant under KT's interpretation of probability, but wholly pertinent under a "conditional probability" interpretation. The latter can be colloquially restated as "the conditional probability that an individual is Linda given that she is a feminist bank teller is higher than the conditional probability that an individual is Linda given that she is a bank teller."
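The two readings of the Linda problem can be separated with a toy population. The counts below are invented purely for illustration -- they are not from the 1983 paper or the book.

```python
# Invented counts: among 1,000 bank tellers, 50 are also feminists; 10 of
# the 1,000 match Linda's description, 9 of whom are feminist tellers.
bank_tellers = 1000
feminist_tellers = 50
linda_like_among_tellers = 10
linda_like_among_feminist_tellers = 9

# Standard reading: the conjunction ("feminist bank teller") can never be
# more probable than the plain category ("bank teller").
p_teller_and_feminist = feminist_tellers / bank_tellers   # always <= 1.0

# Second reading: how likely is a random member of each SET to be Linda-like?
p_linda_given_teller = linda_like_among_tellers / bank_tellers
p_linda_given_feminist_teller = linda_like_among_feminist_tellers / feminist_tellers

print(p_linda_given_teller, p_linda_given_feminist_teller)  # 0.01 vs 0.18
```

Under the second reading the inequality reverses (0.18 > 0.01), which is exactly the point: the subjects may be answering a different, and quite reasonable, question.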

Second Main Point: Many of the examples of irrationality given by KT are not in any way irrational. Consider, for example, the chief investment officer described by Kahneman (p. 12). This man invested tens of millions of dollars in Ford Motor Company stock after having visited an automobile show and having been impressed with the quality of the current offering of Ford vehicles. Kahneman says, "I found it remarkable that he had apparently not considered the one question that an economist would call relevant: Is Ford stock currently underpriced?" In fact, there is no objective measure of a stock being "underpriced," and no known correlation between any measure of "being underpriced" and subsequent performance on the stock market. Moreover, the executive may not have revealed all of the reasoning involved in his decision, but rather only a "deciding factor" after other considerations had been factored in.

Third Main Point: We have long known that people do not generally act in their own best interest. We have weakness of will, we procrastinate, we punish ourselves for things that are not our fault, we act thoughtlessly and regret our actions yet repeat them, we become addicted to cigarettes and drugs, we become obese even though we would like to be thin, we pay billions of dollars for self-help books that almost never work. KT have not added much, if anything, to our understanding of this array of bizarre behaviors. Of course, they do not claim otherwise. However, commentators regularly claim that this behavior somehow contradicts the "rational actor model" of economic theory, which it does not in any way. Economic theory explores the implications of human choice behavior without claiming that the choices people make are in some sense prudent or even desirable to the decision-maker (we cannot choose our preferences).

Economic theory is in general supportive of the notion that people should get what they want, but it has included the notion of "merit goods" that society values or disvalues for moral or practical reasons that override consumer sovereignty. For instance, we regulate pharmaceuticals, we prohibit racial discrimination in public places, and we outlaw markets in body parts.

Fourth Main Point: KT are right on target in asserting that people make massive errors in interpreting statistical arguments (e.g., the base rate fallacy, or the interpretation of conditional probabilities). This has nothing to do with "illogicality" or "irrationality," but rather with the complexity of the mathematics itself. For instance, KT have shown that physicians routinely fail to understand what the statistical accuracy of lab tests means---the fact that a test is 95% accurate is compatible with the fact that it is wrong 95% of the time (or any other percentage, depending on the incidence of the condition being tested for).
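The lab-test point is ordinary Bayes arithmetic. The figures below (a 1-in-1,000 incidence and 95% sensitivity/specificity) are assumed for illustration, not taken from the book.

```python
# A "95% accurate" test (95% sensitivity, 95% specificity -- assumed) for a
# condition with a 1-in-1,000 incidence: most positives are false positives.
prevalence = 0.001
sensitivity = 0.95   # P(test positive | has condition)
specificity = 0.95   # P(test negative | no condition)

true_positives = prevalence * sensitivity            # sick and flagged
false_positives = (1 - prevalence) * (1 - specificity)  # healthy but flagged

ppv = true_positives / (true_positives + false_positives)  # P(sick | positive)
print(round(ppv, 3))  # at this incidence, a positive result is ~98% likely wrong
```

A positive result here means less than a 2% chance of actually having the condition -- the "95% accurate" label and the error rate of a positive result are entirely different quantities.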

The psychologist Gerd Gigerenzer has shown that if conditional probabilities are reinterpreted as frequencies, people have no problem interpreting their meaning (see the discussion "Risk School" in Nature 461, 29, October 2009). Gigerenzer has been promoting the idea that trigonometry be dropped from the high school math sequence (no one uses it except surveyors, physicists, and engineers) and probability theory be added. This sounds like a great idea to me.

Of course, if people do not do well at formal statistical analysis, how are we to defend the rational actor model, which is thoroughly Bayesian and implicitly assumes people are infinitely capable statistical decision-makers? The answer is not to abandon the rational actor model, which in general has had exceptional explanatory power---see my book, The Bounds of Reason (Princeton 2009), and my review of Ken Binmore's Rational Decisions (Economic Journal, February 2010). Rather, I believe the answer lies in replacing the subjective prior assumption of the rational actor model with a broader assumption: that individuals make decisions within networks of minds characterized by distributed cognition, much as social insects do, except of course on a much higher level, using language instead of pheromones, with the cultural construction of iconic rather than pheromonic signals. But that is a subject to be explored in the future. One of the payoffs of KT's research is to make clear how insufficient the standard economic model of decision-making under (radical) uncertainty really is.
162 people found this helpful


Gary Colwill
5.0 out of 5 stars Leaping to Conclusions
Reviewed in the United States 🇺🇸 on May 7, 2020
Verified Purchase
Very interesting concepts and results - very informative. However, I think that some conclusions are not totally conclusive. One example is the "Linda" experiment in Chapter 15, wherein a fictional person named Linda is described as she was as a student, then the subject is given a list of possibilities for Linda's current situation and is asked to rank them in order of probability. Two of the choices are "bank teller" and "bank teller who is a feminist".

Clearly the second is a subset of the set in which the first is found, so someone who takes this list at face value will recognize that the first is more likely than the second just by definition. However, a large majority of participants choose the second selector as the more likely from a probability point of view. The experimenters conclude that this is an example of "representativeness" and of ignoring basic principles of probability, or of confusing probability with plausibility or coherence.

It's true that the result illustrates a fault in the logic of most participants, but there can be other explanations for why this fault occurs. In this example, it occurs to me, as one who answered wrongly and then thought about why I did so, that specifying two categories that are so similar could cause one to infer something about the less specific category that is not actually true. "Bank teller," in light of "bank teller who is a feminist," might automatically be read by many participants as "bank teller who is not a feminist," without the participant even realizing he or she has made this change to the selector. That's what I think I did, and, though an error, it is not one of representativeness. A very different mechanic was in place for me here. I would guess that this also applied to many other participants.

In other words, the selectors are presented as if they are on the same level - no hierarchy. Is she a school teacher, or a bookstore clerk, or an insurance salesperson, or a bank teller, or a bank teller who is a feminist? I think System 1 sees that most of these are either/or choices, but two seem to be this and also possibly that, which is hierarchical so doesn't really fit in with the rest of the list. System 1 might think of modifying the "bank teller" selector to "bank teller that is not a feminist", thus making the selector list much more uniform (and while it's making these changes, why not change "is active in the feminist movement" into "is active in the feminist movement and is not a bank teller"). And since this is automatic and System 1 never explains itself, most people will never be able to explain why they made such a seemingly obvious mistake. Maybe...if so, then this possible explanation also applies to all the subsequent refinements of the Linda experiment, in my uneducated and unqualified view.

To go further, another experiment is presented in which there is a die with 4 green faces and 2 red faces, and the participant is given a list of three outcomes, all of which are unlikely because they have more reds than greens, and the participant is supposed to choose the one she would bet on to win $25. Here are the choices:

1. RGRRR
2. GRGRRR
3. GRRRRR

The word "probability" is left out completely, as are any verbalized descriptions, to control for the possibility that participants are misunderstanding what is meant by probability. Even with this attempt at control, the result is that participants choose 2 more often than 1 or 3, even though clearly if one chooses 1, one wins the $25 whenever the result is RGRRR or GRGRRR (because the latter contains the former). Right. But what if System 1 sees that two of the three selectors have 6 results and only one has 5, and so rules, in its automated way, that 1 is invalid? The rules seem to imply that there are six rolls of the die, and if we bet on 1, we will lose (the probability that I will get only 5 results from 6 rolls of the die is exactly zero percent! - well, maybe more than 0% if there's a perfectly timed earthquake or other cataclysm) - an unspoken rule is detected such that the bet is on the full 6 rolls. Given that the only two valid selectors are 2 and 3, then 2 is the more probable choice. Maybe...but this, of course, depends on System 1 also ignoring the detail that there would be 20 die rolls - and we've already learned that System 1 is wont to do this for the sake of expediency.
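The ranking the experiment relies on can be checked directly. This sketch is just the arithmetic of the example (P(G) = 2/3, P(R) = 1/3 on each independent roll), not anything from the book's text.

```python
from fractions import Fraction

# Die with 4 green faces and 2 red faces: P(G) = 2/3, P(R) = 1/3 per roll.
def seq_prob(seq):
    """Exact probability of rolling this exact face sequence."""
    p = Fraction(1)
    for face in seq:
        p *= Fraction(2, 3) if face == "G" else Fraction(1, 3)
    return p

for choice in ("RGRRR", "GRGRRR", "GRRRRR"):
    print(choice, float(seq_prob(choice)))

# Choice 2 literally contains choice 1, so it can never be more probable.
assert seq_prob("GRGRRR") < seq_prob("RGRRR")
```

The exact values are 2/243, 4/729, and 2/729: adding the extra G to the front of choice 1 multiplies its probability by 2/3, so choice 2 is strictly worse even though it "looks" more representative of a green-heavy die.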

In both cases above, System 1 might have chosen to modify the scenario - in other words, System 1 might have substituted the original question with one that is easier to answer, which is a heuristic technique of System 1 that is discussed at length earlier in the book.

The point is that either way, there has been an error, but the mechanics of that error may not be what they appear to be. That's my thought anyway. I think this also applies to many other cases. Or maybe not...

The heart attack and Borg experiments were very good ones as well, but my System 2 is now too depleted of energy to think of alternative conclusions, or maybe just too lazy. :-). These are strong enough that I may just eventually conclude that my second guessing above is merely a narrative fallacy on my part.

Anyway, very fascinating book, and very revealing.
4 people found this helpful


Dr. Chuck Chakrapani
5.0 out of 5 stars A brilliant book by a brilliant mind. BE SKEPTICAL ANYWAY.
Reviewed in the United States 🇺🇸 on November 17, 2011
Verified Purchase
Back in 1994, Massimo Piattelli-Palmarini, Director of the Institute of San Raffaele in Milan, Italy, wrote a charming little book about common cognitive distortions called Inevitable Illusions. It is probably the very first comprehensive summary of behavioral economics intended for a general audience. In it, he predicted that the two psychologists behind behavioral economics - Amos Tversky and Daniel Kahneman - would win the Nobel prize. I didn't disagree with the sentiment, but wondered how in the world they were going to get it, since these two were psychologists and there is no Nobel prize in psychology. I didn't think there was much chance of them winning the Nobel Prize in economics. I was wrong and Piattelli-Palmarini was right. Kahneman won the Nobel prize in Economic Sciences. (Tversky, unfortunately, had passed away prematurely by then.) Just as Steve Jobs, who was not in the music industry, revolutionized it, the non-economists Kahneman and Tversky have revolutionized economic thinking. I have known Kahneman's work for quite some time and was quite excited to see that he was coming out with a non-technical version of his research. My expectations for the book were high and I wasn't disappointed.

Since other reviewers have given an excellent summary of the book, I will be brief in my summary but review the book more broadly.

The basic thesis of the book is simple. In judging the world around us, we use two mental systems: Fast and Slow. The Fast system (System 1) is mostly unconscious and makes snap judgments based on our past experiences and emotions. When we use this system we are as likely to be wrong as right. The Slow system (System 2) is rational, conscious and slow. They work together to provide us a view of the world around us.

So what's the problem? They are incompatible, that's what.

System 1 is fast, but easily swayed by emotions, and can as easily be wrong as right. You buy more cans of soup when the display says "Limit 12 per customer". We are on autopilot with this system. System 1 controls an amazing array of behavior. System 2 is conscious, rational and careful but painfully slow. It's distracted and hard to engage. These two systems together provide a backdrop for our cognitive biases and achievements.

This very well written book will enlighten and entertain the reader, especially if the reader has not been exposed to the full range of research relating to behavioral economics.

This book serves as an antidote to Malcolm Gladwell's Blink. Although Gladwell never says that snap judgments are infallible and cannot badly mislead us, many readers got a different message. As the Royal Statistical Society's Significance magazine put it, "Although Gladwell's chronicle of cognition shows how quick thinking can lead us both astray and aright, for many readers Blink has become a hymn to the hunch." While Kahneman does show how "fast thinking" can lead to sound judgments, he also notes how it can lead us astray. This point is made much more clearly and deliberately in Kahneman's book.

All my admiration for the brilliance and creativity of Kahneman (and Tversky) does not mean that I accept 100% of their thesis. Consider this oft-quoted study. Linda is 31 years old, single, outspoken, and very bright. As a student, she was deeply concerned with the issues of discrimination and social justice, and she also participated in anti-nuclear demonstrations. Which is more probable?
1. Linda is a bank teller.
2. Linda is a bank teller and is active in the feminist movement.
Eighty-five percent of test subjects chose the second option, that Linda was a bank teller and active in the feminist movement. Kahneman's interpretation is that this opinion is wrong because the probability of a (random) woman being a bank teller is greater than that person's being a bank teller AND a feminist. What Kahneman overlooks here is that what most people answered may not be the question that was asked. The respondents may not have been concerned with mathematical probabilities, but could rather be responding to the question in reverse: is it more likely for a current activist to have been an activist in the past, compared to others in the profession? A more formal and theoretically better argued rebuttal of some of Kahneman's hypotheses can be found in the works of Gerd Gigerenzer.
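The conjunction rule behind Kahneman's reading is worth stating explicitly: for any events A and B, P(A and B) = P(A) × P(B given A) ≤ P(A), no matter what the numbers are. A two-line sketch (the probabilities are made up purely for illustration):

```python
# Conjunction rule: P(A and B) <= P(A), whatever the numbers are.
p_teller = 0.05                 # illustrative P(Linda is a bank teller)
p_feminist_given_teller = 0.95  # illustrative P(feminist | bank teller)

p_both = p_teller * p_feminist_given_teller
assert p_both <= p_teller       # always holds: the second factor is <= 1
print(p_both, p_teller)         # the conjunction can never be more probable
```

This is why option 2 cannot be more probable than option 1, however representative the description makes Linda seem; the dispute Gigerenzer raises is about whether subjects are actually answering a probability question at all.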

Kahneman notes that even top performers in business and sports tend to revert to the mean in the long run. As a result, he attributes success largely to luck. I'm not so convinced of this. There can be alternative explanations. People who achieve a high degree of success are also exposed to a high degree of failure, and the reversion to the mean may be attributable to this possible mirror effect. Spectacular success may go with spectacular failure, and run-of-the-mill success may go with run-of-the-mill failure. Eventually everyone may revert to the mean, but the ride can be very different. Chance may not account for that.
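For comparison, Kahneman's luck-based account can be illustrated with a minimal skill-plus-noise model; any alternative explanation would need to be distinguishable from this baseline. All numbers below are illustrative assumptions, not from the book:

```python
import random

# Each performer has a fixed skill; each season's result is skill plus
# independent noise ("luck"). Regression to the mean falls out for free.
random.seed(0)

N = 10_000
skills = [random.gauss(0, 1) for _ in range(N)]
season1 = [s + random.gauss(0, 1) for s in skills]
season2 = [s + random.gauss(0, 1) for s in skills]

# Take the top 1% of season-1 performers and see how they do in season 2.
top = sorted(range(N), key=lambda i: season1[i], reverse=True)[: N // 100]
avg1 = sum(season1[i] for i in top) / len(top)
avg2 = sum(season2[i] for i in top) / len(top)

# avg2 falls well below avg1 (the lucky noise does not repeat),
# yet stays above the population mean of 0 (the skill component does).
print(avg1, avg2)
```

In this model the regression happens with no "mirror effect" at all, which is the force of Kahneman's point; whether real careers fit the model is the open question the reviewer is raising.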

Another concern is that much of the work was done in artificial settings (read: college students). While much of what we learned can perhaps be extended to the real world, it is doubtful that every generalization will work in practice. Some may find Kahneman's endorsement of "libertarian paternalism" unacceptable. More importantly, when applied to the real world it has not always been found to work.

In spite of these comments, this book is written carefully and in a rather humble tone. I also appreciated Kahneman's generous and unreserved acknowledgement of Tversky's contributions and his conviction that, had he been alive, Tversky would have been the co-recipient of the Nobel Prize. My cautionary comments probably have more to do with the distortions that might arise from those who uncritically generalize the findings to contexts to which they may not be applicable. As mentioned earlier, the wide misinterpretation of Gladwell's Blink comes to mind.

Nevertheless, Thinking Fast and Slow is a very valuable book by one of the most creative minds in psychology. Highly recommended. For a more complete and critical understanding, I also recommend the writings of the critics of behavioral economic models such as Gerd Gigerenzer.

PS. After I published this review, I noticed an odd coincidence between Thinking Fast and Slow and Inevitable Illusions that I mentioned in my opening paragraph. Both books have white covers, with an image of a sharpened yellow pencil with an eraser top. How odd is that?
1,432 people found this helpful


Herbert Beigel
5.0 out of 5 stars Observations
Reviewed in the United States 🇺🇸 on January 16, 2012
Verified Purchase
I am deeply skeptical of psychological studies that purport to extrapolate from the general to the particular as to human behavior. I think that approach runs afoul of what Dr. Kahneman criticizes, substituting an easy answer for the hard question. The whole analysis of prospect theory and loss aversion seems to me flawed in that respect. We often resort to group averages to cover up our lack of understanding as to the physical reasons an individual brain operates the way it does. Witness the alcoholic or the gambling addict, who pay no attention to loss aversion.

When I was in college and took sociology and psychology courses, the joke on campus was that both disciplines could be described by the phrase: "The science of what everybody knows." Dr. Kahneman likes to say that the results of the experiments he describes are "surprises," but I really don't think so. From my perspective, using my, as Dr. Kahneman describes it, System 1, I found most of the results of the studies he described unsurprising and a match for my intuition (or counter-intuition). As another reviewer commented, Dr. Kahneman's studies are flawed in a few ways: (a) they extrapolate from general averages to a particular; (b) he defines intuitive based on those averages, rather than on individual brain chemistry and how a particular individual may think; (c) he designs experiments on loss aversion that guarantee a certain result, because the students in the experiment do not get the benefit of the law of large numbers. Having presented these criticisms, however, I think his book makes a valuable contribution to people who read it and want a better understanding of how most people think.

The problem I have with Dr. Kahneman's approach is that it is a classic example of that in which he finds fault: substituting the easier, irrelevant question for the harder question. Psychology experiments such as those he cites are substitutes for an understanding of the physical brain functions that cause an individual to behave in a certain way. Neither gambling addicts nor alcoholics nor serial murderers exhibit any loss aversion whatsoever.

So, to cite one particular example, here is what I wrote to the authors of a study Dr. Kahneman cites in support of loss aversion and his prospect theory.

Dear Dr. Schweitzer and Dr. Pope:

I read with interest your study after reading Thinking, Fast and Slow.

I have some questions/observations on which I would very much appreciate your comments.

By way of background, I have been a trial lawyer for 40 years and have a substantial interest in human behavior and loss aversion and related concepts of risk.

After reading your article I remain a bit skeptical about the connection between your results and loss aversion, at least insofar as the latter is viewed as non-rational. I should admit I have a bias against psychological or any kind of group behavior that attempts to establish a causative link with individual behavior, but having admitted my bias, I have the following to say:

1. Did your study examine the statistical comparison in the last nine holes of a tournament by a tournament leader who is ahead by four strokes or more, or any other number in the lead? Since, for example, the odds of the second-place player proceeding to four under par for the last nine holes seem low, might it not be rational for the leader to avoid bogeys, whether putting or conservative play in general is concerned?

2. Some golfers (perhaps many) are less concerned about winning than about ending up in the money. The tour is replete with examples of top annual money winners having won few if any tournaments. For those golfers who are not intensely interested in winning, their putting may be differently motivated.

3. I'd be interested if you looked at the following correlations:

a. Percentage record in your study by individual golfer versus wins and where they finished on the money list;

b. The individual golfer's putting performance as related to drives in fairway, or greens in regulation;

c. None of the statistics currently promoted by the PGA, e.g. drives in fairway, sand saves, putting, greens in regulation, etc. seem to correlate well with winnings.

4. I noticed that some of the golfers with the lowest percentage difference in your study were relatively unsuccessful in winning tournaments, e.g. Sergio Garcia, Justin Leonard, John Daly, whereas Tiger Woods, with a fairly high or average percentage, was the top winner by far until a couple of years ago.

5. As with the turnover ratio in football, is there a correlation between the putting percentages you found and winning? In football, there is a high cost to pay for turnovers (in field position and lost opportunity to score, for example), which may be why coaches stress the importance of avoiding risk by engaging in less aggressive play (carrying the ball with one hand enhances the runner's ability to avoid tackles; or, the longer the pass, perhaps the greater the chance for an interception, or conversely, the greater the chance for a big gain).

6. A golfer, as a young player, often participates mostly in match play. This may "train" the player to think in terms of avoiding bogeys at all cost.

7. Did you analyze the casual player, who is not playing competitively? He may be more aggressive in trying for birdies. Your acknowledgment that the percentage difference declines in the last two rounds seems to indicate what I would call the "match play" mindset, because birdies may become more important during the last two rounds, having a larger psychological effect on a close competitor who sees the birdie on the scoreboard.

8. Downhill putts are generally considered more difficult than uphill putts; did you analyze those statistics? Your division of the green into segments may be too broad.

9. Golfers often say that putting past the hole offers a better chance of making the next putt, because they immediately see the correct line. Does your study contradict that approach?

10. Is the prospect theory of loss aversion undermined, at least a little, by the relative lack of success that appears to come to an aggressive player, e.g. Phil Mickelson's early relative lack of success until he played more conservatively? Is there a relationship between the high-percentage putt players and conservative play in general? (But see John Daly - aggressive - vs. Justin Leonard - conservative.) Or does aggressive play from tee to green correlate with conservative play on the green?

11. Is there a relationship to examine between the leader after two rounds and the ultimate winner?

I have other similar observations, but I suppose the ones above all come down to wondering whether you should take into account each golfer's overall play before deciding whether a particular birdie-vs.-par putting percentage is consistent with loss aversion theory in general.

Thank you in advance for reading this email. I certainly found Dr. Kahneman's book and studies, and your own study, thought-provoking.

Sincerely,

Herb Beigel


Book Shark
VINE VOICE
5.0 out of 5 stars Brilliant!
Reviewed in the United States 🇺🇸 on January 12, 2014
Verified Purchase
Thinking, Fast and Slow by Daniel Kahneman

“Thinking, Fast and Slow” is a fascinating look at how the mind works. Drawing on knowledge acquired from years of research in cognitive and social psychology, Nobel Prize winner Dr. Daniel Kahneman delivers his magnum opus on behavioral economics. This excellent book focuses on three key distinctions: between the automatic System 1 and the effortful System 2, between the conception of agents in classical economics and in behavioral economics, and between the experiencing and the remembering selves. This enlightening 512-page book is composed of thirty-eight chapters and broken out into the following five Parts: Part I. Two Systems, Part II. Heuristics and Biases, Part III. Overconfidence, Part IV. Choices, and Part V. Two Selves.

Positives:
1. Award-winning research. A masterpiece of behavioral economics knowledge. Overall accessible.
2. Fascinating topic in the hands of a master. How the mind works. The biases of intuition, judgment, and decision making.
3. Excellent format. Each chapter is well laid out and ends with a Speaking of section that summarizes the content via quotes.
4. A great job of defining and summarizing new terms. "In summary, most of what you (your System 2) think and do originates in your System 1, but System 2 takes over when things get difficult, and it normally has the last word."
5. Supports findings with countless research. Provides many accessible and practical examples that help readers understand the insightful conclusions.
6. A great job of letting us know what we know and to what degree. "It is now a well-established proposition that both self-control and cognitive effort are forms of mental work."
7. You are guaranteed to learn something. Countless tidbits of knowledge throughout this insightful book and how it applies to the real world. "The best possible account of the data provides bad news: tired and hungry judges tend to fall back on the easier default position of denying requests for parole. Both fatigue and hunger probably play a role."
8. The differences of Systems 1 and 2 and how they function with one another. "System 1 is impulsive and intuitive; System 2 is capable of reasoning, and it is cautious, but at least for some people it is also lazy." "System 1 is gullible and biased to believe, System 2 is in charge of doubting and unbelieving, but System 2 is sometimes busy, and often lazy."
9. Important recurring concepts like WYSIATI (What You See Is All There Is). "You surely understand in principle that worthless information should not be treated differently from a complete lack of information, but WYSIATI makes it very difficult to apply that principle."
10. Understanding heuristics and biases. "The strong bias toward believing that small samples closely resemble the population from which they are drawn is also part of a larger story: we are prone to exaggerate the consistency and coherence of what we see. The exaggerated faith of researchers in what can be learned from a few observations is closely related to the halo effect, the sense we often get that we know and understand a person about whom we actually know very little. System 1 runs ahead of the facts in constructing a rich image on the basis of scraps of evidence. A machine for jumping to conclusions will act as if it believed in the law of small numbers. More generally, it will produce a representation of reality that makes too much sense."
11. Paradoxical results for your enjoyment. "People are less confident in a choice when they are asked to produce more arguments to support it."
12. Understanding how our brains work, "The world in our heads is not a precise replica of reality; our expectations about the frequency of events are distorted by the prevalence and emotional intensity of the messages to which we are exposed."
13. Wisdom. "'Risk' does not exist 'out there,' independent of our minds and culture, waiting to be measured. Human beings have invented the concept of “risk” to help them understand and cope with the dangers and uncertainties of life. Although these dangers are real, there is no such thing as 'real risk' or 'objective risk.'" Bonus. "To be useful, your beliefs should be constrained by the logic of probability."
14. You will learn lessons that are practical. "Rewards for improved performance work better than punishment of mistakes."
15. An interesting look at overconfidence. "Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance." "Remember this rule: intuition cannot be trusted in the absence of stable regularities in the environment."
16. Have you ever had to plan anything in your life? Meet the planning fallacy. "This may be considered the single most important piece of advice regarding how to increase accuracy in forecasting through improved methods. Using such distributional information from other ventures similar to that being forecasted is called taking an “outside view” and is the cure to the planning fallacy."
17. A very interesting look at Econs and Humans. "Economists adopted expected utility theory in a dual role: as a logic that prescribes how decisions should be made, and as a description of how Econs make choices."
18. Prospect theory explained. "The pain of losing $900 is more than 90% of the pain of losing $1,000. These two insights are the essence of prospect theory."
19. Avoiding poor psychology. "The conclusion is straightforward: the decision weights that people assign to outcomes are not identical to the probabilities of these outcomes, contrary to the expectation principle. Improbable outcomes are overweighted—this is the possibility effect. Outcomes that are almost certain are underweighted relative to actual certainty. The expectation principle, by which values are weighted by their probability, is poor psychology."
20. Great stuff on well being.
21. An excellent Conclusions chapter that ties the book up comprehensively.
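The prospect-theory ratio quoted in item 18 can be checked numerically with the power-form value function for losses, v(x) = -λ·(-x)^α. The α ≈ 0.88 and λ ≈ 2.25 parameters below are the commonly cited Tversky-Kahneman estimates, used here as assumptions for illustration:

```python
# Prospect-theory value function for losses: v(x) = -lam * (-x) ** alpha.
# alpha < 1 gives diminishing sensitivity; lam > 1 gives loss aversion.
alpha, lam = 0.88, 2.25   # commonly cited Tversky-Kahneman estimates

def value(x):
    """Subjective value of a loss x (x < 0)."""
    return -lam * (-x) ** alpha

ratio = value(-900) / value(-1000)
# lam cancels, so ratio == (900/1000) ** alpha, roughly 0.91 -- i.e.
# losing $900 hurts more than 90% as much as losing $1,000.
print(ratio)
```

With any α < 1 the ratio exceeds 0.9, which is exactly the diminishing-sensitivity claim the book makes.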

Negatives:
1. Notes not linked up.
2. No formal separate bibliography.
3. Requires an investment of time. Thankfully, the book is worthy of your time.
4. The book overall is very well-written and accessible but some topics are challenging.
5. Wanted more clarification on how Bayes's rules work.

In summary, a masterpiece on behavioral economics. Dr. Kahneman shares his years of research and provides readers with an education on how the mind works. It requires an investment of your time, but it is well worth it. A tremendous Kindle value; don't hesitate to get this book. I highly recommend it!

Further suggestions: "Subliminal" by Leonard Mlodinow, "Incognito" by David Eagleman, "Switch" by Chip and Dan Heath, “Drive: The Surprising Truth about What Motivates Us” by Daniel H. Pink, “Blink” by Malcolm Gladwell, “The Power of Habit” by Charles Duhigg, “Quiet: The Power of Introverts in a World That Can't Stop Talking” by Susan Cain, "The Social Animal" by David Brooks, "Who's In Charge?" by Michael S. Gazzaniga, "The Belief Instinct" by Jesse Bering, "50 Popular Beliefs that People Think Are True" by Guy P. Harrison, "The Believing Brain" by Michael Shermer, "Predictably Irrational" by Dan Ariely, "Are You Sure?" by Ginger Campbell, and "Mistakes Were Made (But Not by Me)" by Carol Tavris.
4 people found this helpful


