Tuesday, August 27, 2013

Stay Positive: Kahneman's Thinking, Fast and Slow

One of the best books I've read in a while is Daniel Kahneman's Thinking, Fast and Slow.

It's a book that synthesizes all kinds of important research from psychology, economics, and other social sciences, much of it connecting to decision making and what college professors might call "critical thinking." I'm interested in what he says about our "System 1" and "System 2" thinking, and the research makes you question how "rational" and open-minded you really are. If you work in any kind of organization, Kahneman's book is a must-read. 

I'm not doing a book review, but what I want to do here is present a litany of quotations (without the quotation marks) from the book that are going to make it into my commonplace book:
  • The best we can do is a compromise: learn to recognize situations in which mistakes are likely and try harder to avoid significant mistakes when the stakes are high.
  • We normally avoid mental overload by dividing our tasks into multiple easy steps, committing intermediate results to long-term memory or to paper rather than to easily overloaded working memory. We cover long distances by taking our time and conduct our mental lives by the law of least effort. 
  • Too much concern about how well one is doing in a task sometimes disrupts performance by loading short-term memory with pointless anxious thoughts. 
  • ...many people are overconfident, prone to place too much faith in their intuitions. 
  • Those who avoid the sin of intellectual sloth could be called "engaged." They are more alert, more intellectually active, less willing to be satisfied with superficially attractive answers, more skeptical about their intuitions. 
  • Studies of priming effects have yielded discoveries that threaten our self-image as conscious and autonomous authors of our judgments and our choices.
  • Anything that makes it easier for the associative machine to run smoothly will also bias beliefs. A reliable way to make people believe falsehoods is frequent repetition, because familiarity is not easily distinguished from truth. 
  • How do you know that a statement is true? If it is strongly linked by logic or association to other beliefs or preferences you hold, or comes from a source you trust and like, you will feel a sense of cognitive ease. 
  • Robert Zajonc dedicated much of his career to the study of the link between the repetition of an arbitrary stimulus and the mild affection people eventually have for it. Zajonc called it the mere exposure effect.
  • "Familiarity breeds liking. This is mere exposure effect." 
  • The operations of associative memory contribute to a general confirmation bias.
  • The tendency to like (or dislike) everything about a person--including things you have not observed--is known as the halo effect.
  • Whether you state them or not, you often have answers to questions that you do not completely understand, relying on evidence that you can neither explain nor defend. 
  • We are pattern seekers, believers in a coherent world, in which regularities ... appear not by accident but as a result of mechanical causality or of someone's intention.
  • "The emotional tail wags the dog." ~Jonathan Haidt
  • There is a deep gap between our thinking about statistics and our thinking about individual cases. Statistical results with a causal interpretation have a stronger effect on our thinking than noncausal information. But even compelling causal statistics will not change long-held beliefs or beliefs rooted in personal experience. On the other hand, surprising individual cases have a powerful impact and are a more effective tool for teaching psychology because the incongruity must be resolved and embedded in a causal story. 
  • ... it is natural for System 1 to generate overconfident judgments, because confidence, as we have seen, is determined by the coherence of the best story you can tell from the evidence at hand. Be warned: your intuitions will deliver predictions that are too extreme and you will be inclined to put far too much faith in them. 
  • The tendency to revise the history of one's beliefs in light of what actually happened produces a robust cognitive illusion. 
  • The sense-making machinery of System 1 makes us see the world as more tidy, simple, predictable, and coherent than it really is. 
  • Facts that challenge such basic assumptions--and thereby threaten people's livelihood and self-esteem--are simply not absorbed. The mind does not digest them.
  • We know that people can maintain an unshakable faith in any proposition, however absurd, when they are sustained by a community of like-minded believers. 
  • Emotional learning may be quick, but what we consider as "expertise" usually takes a long time to develop.
  • The associative machine is set to suppress doubt and to evoke ideas and information that are compatible with the currently dominant story.
  • In other words, do not trust anyone--including yourself--to tell you how much you should trust their judgment. 
  • ...the two basic conditions for acquiring a skill: an environment that is sufficiently regular to be predictable and an opportunity to learn these regularities through prolonged practice.
  • Expertise is not a single skill; it is a collection of skills, and the same professional may be highly expert in some of the tasks in her domain while remaining a novice in others. 
  • The planning fallacy is only one of the manifestations of pervasive optimistic bias. Most of us view the world as more benign than it really is, our own attributes as more favorable than they truly are, and the goals we adopt as more achievable than they are likely to be. We also tend to exaggerate our ability to forecast the future, which fosters optimistic overconfidence.
  • The evidence suggests that optimism is widespread, stubborn, and costly. 
  • The main obstacle is that subjective confidence is determined by the coherence of the story one has constructed, not by the quality and amount of the information that supports it. 
  • He [Gary Klein] labels his proposal the premortem. The procedure is simple: when the organization has almost come to an important decision but has not formally committed itself, Klein proposes gathering for a brief session a group of individuals who are knowledgeable about the decision. The premise of the session is a short speech: "Imagine that we are a year into the future. We implemented the plan as it now exists. The outcome was a disaster. Please take 5 to 10 minutes to write a brief history of that disaster."
  • The premortem has two main advantages: it overcomes groupthink that affects many teams once a decision appears to have been made, and it unleashes the imagination of knowledgeable individuals in a much-needed direction. As a team converges on a decision--and especially when the leader tips her hand--public doubts about the wisdom of the planned move are gradually suppressed and eventually come to be treated as evidence of flawed loyalty to the team and its leaders. The suppression of doubt contributes to overconfidence in a group where only supporters of the decision have a voice. The main virtue of the premortem is that it legitimizes doubts. Furthermore, it encourages even supporters of the decision to search for possible threats that they had not considered earlier.
  • The errors of a theory are rarely found in what it asserts explicitly; they hide in what it ignores or tacitly assumes. 
  • I call it theory-induced blindness: once you have accepted a theory and used it as a tool in your thinking, it is extraordinarily difficult to notice its flaws. 
  • The brains of humans and other animals contain a mechanism that is designed to give priority to bad news.
  • Animals, including people, fight harder to prevent losses than to achieve gains. In the world of territorial animals, this principle explains the success of defenders. 
  • Amos [Tversky] had little patience for these efforts; he called the theorists who tried to rationalize violations of utility theory "lawyers for the misguided." We went in another direction. We retained utility theory as a logic of rational choice but abandoned the idea that people are perfectly rational choosers. We took on the task of developing a psychological theory that would describe the choices people make, regardless of whether they are rational. In prospect theory, decision weights would not be identical to probabilities.
  • ...people expect to have stronger emotional reactions (including regret) to an outcome that is produced by action than to the same outcome when it is produced by inaction. 
  • The most "rational" subjects--those who were the least susceptible to framing effects--showed enhanced activity in the frontal area of the brain that is implicated in combining emotion and reasoning to guide decisions. 
  • Tastes and decisions are shaped by memories, and the memories can be wrong. The evidence presents a profound challenge to the idea that humans have consistent preferences and know how to maximize them, a cornerstone of the rational-agent model. 
  • Some aspects of life have more effect on the evaluation of one's life than on the experience of living. Educational attainment is an example. More education is associated with higher evaluation of one's life, but not with greater experienced well-being. Indeed, at least in the United States, the more educated tend to report higher stress.
  • The acquisition of skills requires a regular environment, an adequate opportunity to practice, and rapid and unequivocal feedback about the correctness of thoughts and actions.

5 comments:

Babe Runner said...

Man, I have GOT to read this book.

Quintilian B. Nasty said...

It's a makeup read for the mediocre Intro to Psych course I took in college.

It's worth your time--really interesting and accessible.

Dr. K said...

While I can see the advantage of "premortem" thinking, my experience on UGS suggests one can get too much of a good thing.

But there does seem to be a wealth of useful information and insight in this volume.

Quintilian B. Nasty said...

Sure, the premortem could be overused. No doubt.

Dr. K said...

But I've shared the list with the president of our union, who said "exactly" to point #1. Thanks for making this available.