Thinking, fast and slow

by Daniel Kahneman

Hardcover, 2011

Status

Checked out
Due Jun 29, 2020

Call number

BF 441 .K238 2011

User reviews

LibraryThing member JollyContrarian
This is a monster book packed with fascinating insights about how our cognitive systems process and render information. Its starting premise is that we have two discrete "systems" for mental processing. Daniel Kahneman, a cognitive psychologist who transformed himself into a Nobel Prize-winning behavioural economist, gives these the Dr. Seussian labels "System 1 and System 2".

System 1 is fast. It makes snap judgments on limited information: it manifests itself in the "fight or flight" reflex. System 2 is more deliberative: courtesy of this, one meditates on eternal verities, solves quadratic equations and engages in subtle moral argument. Though this is interesting enough, their interaction is more fascinating still. System 1 is lightweight, efficient and self-initiates without invitation; bringing System 2 to bear on a conundrum requires effort and concentration.

This talk of snap judgments calls to mind Malcolm Gladwell's popular but disappointing "Blink: The Power of Thinking Without Thinking". Kahneman's account, rooted in decades of controlled experiment, is a far more rigorous explanation of what is going on, and is able to explain why some snap judgments are good, and others are bad. This conundrum, unanswered in Gladwell's book, is Daniel Kahneman's main focus of enquiry.

It also invokes another popular science classic: Julian Jaynes' idea of the "Bicameral Mind" - wherein large aspects of our daily existence that we consider conscious really are not. Driving by rote to the office, playing a musical instrument - these are also mental processes, I imagine Kahneman would say, undertaken by System 1. Jaynes was widely viewed as a bit of an eccentric: Kahneman's work suggests he may have been right on the money.

It gets interesting for Kahneman where the division of labour between the systems isn't clear cut. System 1 can and does make quick evaluations even where System 2's systematic analysis would provide a better result (these are broadly the "bad" snap judgments of Gladwell's Blink). But System 2 requires dedicated mental resource (in Kahneman's ugly expression, it is "effortful"), and our lazy tendency is to substitute (or, at any rate, stick with) those "cheaper" preliminary judgments where it is not obviously erroneous to do so (and by and large, it won't be, as System 1 will have done its work). Kahneman's shorthand for this effect is WYSIATI: What You See Is All There Is.

Kahneman invites the reader to try plenty of experiments aimed at illustrating the reader's own fecklessness, and these hit their mark: it is distressing to repeatedly discover you have made a howling error of judgment, especially when you knew you were being tested for it. This has massive implications for those who claim group psychology can be predicted on narrow logical grounds. The latter half of Thinking, Fast and Slow focusses more on our constitutional inability to rationally adapt to probabilities and soundly wallops the notion of Homo Economicus, the rational chooser each of us imagines ourselves to be. This is where Kahneman's Nobel Prize-winning Prospect Theory gets full run of the paddock.

Kahneman draws many lessons (which, by his own theory, doubtless will go unheeded) for scientists, economists, politicians, traders and business managers: "theory-induced blindness"; how we become (irrationally) risk tolerant when all our options are bad and risk averse when all our options are good, and how we systematically underweight high probability outcomes relative to actual certainty. For those with nerves of steel there's a real arbitrage to be exploited here.

This long book is a rich (if "effortful") store of information and perspective: it is not news that our fellow man tends not to be as rational as we like to think he is, but we are strongly inclined to exclude present company from such judgments. Kahneman is compelling that we are foolish to do so: this is a physiological "feature" of our constitution, and the "enlightened" are no more immune. This is a valuable and sobering perspective.
LibraryThing member JonArnold
This isn’t a standard pop science book. From the cover and title you could be fooled into thinking it is, but ultimately it’s far deeper reaching and better evidenced than anything from Malcolm Gladwell or the Freakonomics team. Deeper but equally as accessible, assuming depth doesn’t put you off.

The title’s a tad deceptive as Kahneman admits up front – the fast and slow systems of our mind he posits are simply a metaphor to aid understanding. It’s also actually a far wider examination than simply ‘how we think’. What it does do is look at ‘fast’ thinking (what we might term intuition) and how it can lead us into tricking ourselves thanks to conscious or unconscious biases, and also how our deeper thinking mind interacts with that. By the end you’ll be wondering exactly how rational any of us are (answer: none of us are very rational, even at the best of times). And it’s always amusing to see the economic mantra of people always being rational undermined, even before ‘information asymmetry’ is taken into account.

Eye-opening, but be prepared for this one to take plenty of time to read and absorb.
LibraryThing member Pennydart
I first encountered the work of Daniel Kahneman and his collaborator Amos Tversky when I was in graduate school. A student of artificial intelligence, I was keenly interested in the question of how to design systems that can behave “rationally” despite their computational limits. I was thinking about computer systems, but of course human beings are also systems that have limits on the amount of “computation”—or at least thinking—that they can do. The adage of economists that rational behavior consists in doing the actions that do the most to further one’s goals given one’s beliefs simply didn’t seem to make a lot of sense when considered from a computational perspective: weighing one’s options in light of one’s goals and beliefs takes time, and if you try to weigh all your options, the world will change before you’ve finished. As various people have said, human beings are not “frictionless deliberators.”

Enter Kahneman and Tversky, whose work showed that people deviate in systematic ways from the standard model of rational thinking. The key insight is that the deviations are systematic—and they’re systematic in ways that accord with having finite reasoning capabilities. It’s not that people are irrational—Kahneman bristles at that description of his results. Rather it’s that the model of rational thinking developed by the economists may be fine as a normative model, but can’t work as a descriptive one.

Kahneman and Tversky collaborated on psychological studies of reasoning for more than a decade, writing hugely influential papers, including their 1974 Science article “Judgment under Uncertainty: Heuristics and Biases,” which is amongst the most highly cited papers in economics, and which is reproduced in full in “Thinking, Fast and Slow.” Tversky died in 1996 at the age of 59, but Kahneman went on to win the Nobel Prize in Economics for their joint work in 2002. Indeed, “Thinking, Fast and Slow” is a joy to read not only for the science it contains, but also, secondarily, for the touching accounts he provides of his long collaboration with Tversky.

This is Kahneman’s magnum opus: an overarching review of 50 years of research that he conducted, first with Tversky, and subsequently with other leading social scientists. It’s remarkably accessible, even when he’s describing somewhat arcane points of psychology. Amongst the many cognitive biases he describes are these:

• Seemingly little things that make it easier to process information—the size or darkness of the font in text, for example—have a powerful effect on the likelihood that you will believe or approve of the information.

• We readily adopt a principle of WYSIATI—what you see is all there is. We neglect the possibility that our conclusions may be biased by having only incomplete information.

• We tend to apply a halo effect in assessing other people’s character or capabilities: we make specific assessments based on an overall impression of them, and the errors that this induces are compounded by WYSIATI.

• When we’re trying to answer a challenging question, we often substitute an easier one, without even realizing it.

• Our numeric judgments are biased by “anchoring effects” in which we are swayed by sometimes irrelevant numbers that we encounter during our reasoning process.

• We often assess the quality of an experience by what happens at the end of it.

These are just six of the dozens of biases that Kahneman describes. For each bias, he presents the sometimes astonishing psychological studies that support it. For example, the last bias was uncovered in an experiment with a group of patients who underwent colonoscopies in the early 1990s, when anesthesia was not well administered for these procedures. One group of patients had a few extra minutes added to the procedure at the end, during which they experienced a relatively reduced level of pain. The patients in that group felt that the experience was overall less unpleasant, even though they’d experienced all the pain of the other group plus some more!

There’s an overarching coherence to Kahneman’s work: he explains human decision-making, and the cognitive biases it includes, in terms of two systems: a quick-and-dirty Malcolm-Gladwell-like System 1, which, when needed, provides information and tentative conclusions to a more effortful, deliberative System 2. However, while you might be tempted to think that the errors produced by the types of cognitive biases listed above are solely the result of System 1, that would be incorrect. In fact, the connection between “accurate” reasoning and the two systems is more complex, and even when humans rely on System 2, they are still not able to behave in the “fully rational” ways that would be specified by classical economics. It takes some careful reading of Kahneman’s book to sort this out, but on reflection one realizes that even with System 2 the economist’s rationality can’t be fully obtained. How could it be, for creatures with finite brains?
LibraryThing member browner56
What does it say about the state of an academic discipline when an experimental psychologist wins the Nobel Prize for Economic Sciences? To someone whose formal training in the field was based almost exclusively on the “rational man” model, that event symbolized nothing short of a dramatic paradigm shift that has transformed economic theory and practice over the past two decades. Once considered pure heresy in the profession, the heuristics-and-biases-based decision theories that underpin behavioral economics have by now become a standard part of the modern economist’s toolkit. Daniel Kahneman, along with his late colleague Amos Tversky, is widely regarded as being responsible for much of that evolution.

‘Thinking, Fast and Slow’ is a remarkable book that summarizes much of the considerable body of extant research into the way people actually make their decisions and judgments. Much of the story that emerges is that we are often susceptible to making logical errors when we assess and process information, whether due to our tendency to "anchor" on irrelevant facts, take shortcuts in the face of difficult tasks, or because of overconfidence in our ability to appraise rare events properly. What makes Kahneman’s discussion particularly insightful is his use throughout the book of the metaphorical two-system process underlying human thought: System 1, which makes fast and intuitive judgments, and System 2, the more logical and deliberate process that, among other things, tries its best to regulate the emotional responses of System 1. As the author explains, System 1 is the star of the show inasmuch as “fast thinking” is the source of so many of the cognitive biases that mark our decision making.

Although there is a lot to savor in the entire volume, I was especially intrigued by the chapters in which Kahneman summarizes the development of Prospect Theory. As much as anything else he has done in his lengthy career, it is this contribution for which he was recognized by the Nobel committee. It also provides the conceptual resolution for many of the problems that have plagued the expected utility approach to making rational judgments under uncertainty for more than 200 years. The author illustrates the main issues with the following thought experiment involving the choices made in two different decisions:

Decision 1: Choose between
A. Sure gain of $240
B. 25% chance to gain $1000 and 75% chance to gain $0

Decision 2: Choose between
C. Sure loss of $750
D. 75% chance to lose $1000 and 25% chance to lose $0

The vast majority of people facing these decisions choose A (i.e., a sure gain that is less than the expected value of the gamble) and D (i.e., the possibility of avoiding any loss). That is, unlike expected utility theory, Prospect Theory recognizes that people approach situations involving gains and losses differently—they are risk averse in the domain of gains and, driven by loss aversion, risk seeking in the domain of losses. However, to see why these choices can be construed as being “irrational”, consider a third decision:

Decision 3: Choose between
E. 25% chance to win $240 and 75% chance to lose $760
F. 25% chance to win $250 and 75% chance to lose $750

Clearly, no one would choose E in this case, which is equivalent to F minus $10. However, E can be shown to be identical to having selected A and D in Decisions 1 and 2! Thus, among other things, how information about a decision is presented (i.e., “framed”) can have a material impact on the resulting choice, something that traditional economic theory cannot explain.
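One way to check that equivalence is simply to enumerate the combined outcomes of the paired choices. The short Python sketch below is an illustration added here (not taken from the book): it adds the sure amount chosen in one decision to every outcome of the gamble chosen in the other.

```python
# Illustrative check: combining the popular choices A and D reproduces
# prospect E, while combining B and C reproduces the better prospect F.

def combine(sure_amount, gamble):
    """Add a sure payoff to every (probability, payoff) outcome of a gamble."""
    return [(p, sure_amount + x) for p, x in gamble]

A = 240                          # Decision 1: sure gain of $240
B = [(0.25, 1000), (0.75, 0)]    # Decision 1: 25% gain $1000, 75% gain $0
C = -750                         # Decision 2: sure loss of $750
D = [(0.75, -1000), (0.25, 0)]   # Decision 2: 75% lose $1000, 25% lose $0

print(combine(A, D))  # [(0.75, -760), (0.25, 240)] -- identical to E
print(combine(C, B))  # [(0.25, 250), (0.75, -750)] -- identical to F
```

Since F pays $10 more than E in every state of the world, the popular pattern of choosing A and D is dominated by the unpopular pattern B and C, which is exactly the framing point the reviewer describes.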

There is so much to like about ‘Thinking, Fast and Slow’ that it is hard for me to be critical about it in any way. However, if there is a shortcoming in the book—and indeed with the experimental approach that Kahneman adopts in much of his research—it is that while constructing decision environments in which people fail to make rational choices about, say, buying stocks or bonds is not difficult, it is very hard to show how these “mistakes” aggregate to the point that the prices of those securities become biased away from their intrinsic values. Of course, the fact that there remain problems to be solved is hardly a shortcoming of any single volume and Kahneman’s collected body of work does a marvelous job of pointing us in a very fruitful direction.
LibraryThing member mbmackay
This is a wonderful book. The basic premise is that the human brain has evolved from an animal brain by adding the cerebral cortex (the 'rational' bit) to the pre-existing brain. We now have two 'systems': the quick, intuitive one from the underlying brain, and the slower, reasoning capacity from the human part of the brain. As Kahneman amply and elegantly demonstrates, we use the intuitive part of our brain much more than we realise, and the reasoning part is lazy, used much less than we would otherwise imagine.
The result is usually fine - when a leopard is leaping out of a tree at you, you don't want to be carefully, and slowly, analysing the prospects of the threat being an optical illusion. However, and this is where Kahneman is so good in his original thinking, his experiments, and his written explanations, there are many instances in modern life, and as homo economicus, where the quick and dirty response may not be so good. And, probably more importantly, the economic theory based on rational economic choices is thus baseless.
He has much more in this book - which I will go back and re-read.
A wonderful book, by a gifted writer and original thinker.
Read July 2012
LibraryThing member bragan
Daniel Kahneman is a psychology professor and the winner of a Nobel Prize in economics. In this book he delves into how the human mind works when we're evaluating situations, solving problems, and making decisions. In particular, he talks about two kinds of mental processing we employ. One, which he refers to as "System 1" (a phrase he emphasizes is simply a convenient label for a particular kind of thinking, not a physical thing that exists in your brain), works quickly, draws on associations between the problem before us and things we've seen before, simplifies complex problems to make them easier to handle, and produces results that just feel right. It's what we might like to call intuition. "System 2," on the other hand, proceeds slowly, logically, and analytically, for instance, when you're solving a complicated math problem. System 2 can be used to double-check the intuitions provided by System 1, but it can also accept the conclusions System 1 feeds it and use those in its analysis. Or it can simply leave things to System 1 and never kick in at all.

Both systems are useful, even crucial, in their proper domains. Without System 2, we'd never pass calculus, and without System 1, we'd be like the proverbial centipede who couldn't walk because he couldn't keep track of what all his legs were doing. But relying on System 1 can lead us astray in all kinds of ways. And Kahneman shows us many of the ways it does that, as well as exploring the difference between real people and the idealized economists' model of human beings as perfectly rational agents invariably acting in their own best self-interest.

I thought most of this book was just really fantastic. I'd read a fair bit about this sort of topic before, and was a little afraid that it'd just hash over a lot of familiar ground for me, but, while I did get to be smug at having already developed the critical thinking skills not to fall for some of the trick questions designed to expose our irrationality, there was a lot here that I found really worthwhile and interesting, either because I was learning new things or because the already familiar concepts were just so wonderfully expressed.

I will say that there was one multi-chapter section of the book I felt slightly less satisfied with. Mostly, that involved a lot of examination of how people say they would respond to an offered gamble or deal, and whether their responses are rational or not. There was definitely some good stuff in these chapters, but I found them much less interesting, and somewhat harder to get through than the others. In part, that's probably because the way economics types approach these kinds of problems always kind of irritates me. Even when they're trying really hard, they always seem to me to fail to appreciate how real people feel about real money, treating it as some kind of contextless shiny thing that's just abstractly nice to have. It tends to leave me wanting to grab them by the lapels and say things like, "Of course the possibility of losing money that you already have feels more significant than the possibility of winning the same amount. People fear losing money for the same reason they fear losing blood. You have a limited amount of it, can only replenish it so fast, and you need that stuff to live." I mean, come on, guys.

Still, that may just be a personal quirk. Overall, it's an extremely valuable book, one that teaches some important, perhaps even utterly critical, lessons on the ways in which we can all be very, very wrong about things while being convinced we're completely, unquestionably right. Which is a perspective and a level of self-awareness that we desperately need more of in the world right now. I'd certainly like to force every politician to read it.
LibraryThing member annbury
A truly amazing book, which has changed the way I think about many things -- notably, about the way I think, but also about economics, which was my profession for forty years. As other reviewers have noted, the book contrasts a slow "rational" mind with a fast "intuitive/instinctive" mind. Kahneman goes through many experiments which demonstrated this duality, and which also demonstrated how much of our actions, choices and beliefs are determined not by the slow rational process, but by virtually instantaneous intuitive processes. That's important in any context, and Kahneman's fast/slow theory is very convincing. It is also alarming; if people are far less driven by rational thought than they think they are, how are we as a species to make rational decisions in a very difficult world?

For economics, of course, Kahneman's theory has turned the world on its head, because economics is based on the assumption of human rationality. I had always been aware that this was a major simplification, and in recent years had been increasingly interested in short excursions into behavioral economics (I was a bank economist, not an academic). Not until I read this book, however, did I really understand that the assumption of rationality is so far from the truth as to make it useless, indeed, misleading.

This book is clear and doesn't demand specialist knowledge, but it does demand close attention: I started out listening to it, but abandoned the effort, and have been very slowly reading it (with pen in hand) over the past month. It has been well worth the effort.
LibraryThing member EmreSevinc
Can you understand the modern world you live in without having any idea about the following terms and the concepts they convey: 'inflation', 'unemployment', 'advertisement', 'capitalism', 'liberalism', 'democracy', 'civil rights', 'energy'? In order to think about some field, to understand some aspects of our daily lives, and to communicate about them to others, we need the terms describing them. Just like the terms given in the first sentence of this paragraph, the terms that became a part of our daily communication, we need terms and concepts to understand how our mind works in the modern world and what kind of pitfalls we face while we're trying to make thousands of decisions and form ideas every day.

So it is time that we learn about 'anchoring effects', 'narrow framing', 'excessive coherence', 'endowment effect', 'planning fallacy', 'the illusion of validity', and many other aspects of decision making and thinking, so that we can understand the processes we encounter every day much better.

You'll have a difficult time after this book because it will probably make you think slowly and question many of the decisions you made in the past, and the ones you are to make in the future. As the Nobel laureate in economics and one of the most cited psychologists / cognitive scientists of all time, Daniel Kahneman hardly needs an introduction. If you have ever read a book or an article about decision making, behavioral economics or cognitive psychology in the last 15-20 years, you have either read something inspired by his studies, or a criticism of him.

"Thinking, Fast and Slow" takes the reader on a very gentle tour during which he or she will see the pitfalls of inner workings of the mind. Kahneman's text is very fluent, one might even say a 'page turner', and without diving into deep and obscure details of academic journals, it gives a very good overview of one of the most important and radical research programs of the 20. century. For the curious and skeptic readers, the book contains many references to the original articles, books and discussions. But the power of the narrative comes from the crystal clear explanation of many interesting, yet very simple experiments. Unless you are already well versed in the field, you are going to come up with very fast and intuitive answers to many of the described questions in the book and you will probably be baffled as the author will go on to dissect the reasoning of yours, which even you were not aware of.

I am personally thankful to Daniel Kahneman for having given me the necessary tools to analyze my own decision-making processes and those of others. His book will definitely be one of the references that I'll keep returning to, and I have already started to create a list of the articles and books he gives in the detailed 'Notes' section at the end of the book.
LibraryThing member StephenBarkley
Riddle me this:

"If it takes 5 machines 5 minutes to make 5 widgets, how long would it take 100 machines to make 100 widgets"—100 minutes or 5 minutes (65)?

If you answered "100 minutes," you're not alone ... and you're dead wrong. Think about it. In Thinking, Fast and Slow, Kahneman explains why.
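For anyone who wants the arithmetic behind the riddle spelled out, here is a minimal sketch (assuming, as the puzzle intends, that each machine works independently at a constant rate):

```python
# 5 machines making 5 widgets in 5 minutes means each widget costs
# 5 machine-minutes of work (25 machine-minutes / 5 widgets).
machine_minutes_per_widget = (5 * 5) / 5

# 100 machines making 100 widgets: total work divided by machines available.
time_needed = 100 * machine_minutes_per_widget / 100
print(time_needed)  # 5.0 -- the intuitive answer of 100 minutes is wrong
```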

We have two types of thinking systems (thus, the title of the book). System one is intuitive and answers quickly. System two requires more thought and answers slowly. Both systems are valuable and necessary. Kahneman's spent his lifetime studying these systems and has developed and published many experiments over the years (including the one above) which exploit the flaws in our systems.

In Thinking, Fast and Slow, Kahneman helps us to recognize when our minds let us down (i.e. narrative fallacies, planning fallacies, WYSIATI, etc.) and gives us the tools to recognize our own errors.

This book is detailed, thorough, and absolutely fascinating. Kahneman walks the reader through many of the test scenarios he developed over the years. Even if you prepare yourself for the "trick" and try to answer correctly, human nature wins out. It's certainly a good dose of humility!

Many of these experiments were carried out with his friend and colleague, Amos Tversky, to whom the book is dedicated. The friendship between them and their mutual fascination with how the mind works make this book on psychology border on memoir at times.

Read and be fascinated at your incredibly powerful and deeply flawed mind!
LibraryThing member neurodrew
Thinking, Fast and Slow
Daniel Kahneman
Nov 24, 2011

I think this is, unqualifiedly, a great book, not so much for its writing, although that is very good, but for its subject matter. The author won a Nobel for his experiments and thinking about decision making under uncertainty, developing what he calls prospect theory. The book is difficult to summarize, revealing as it does the results of many experiments showing how intuitive thinking (the author calls it System 1), specialized for rapid decisions and for abstracting data from background, often gets the decision wrong when compared to the slower, reasoned effort that is required (System 2). The intuitive decision system can be influenced by priming (one is more likely to respond in a certain way having been recently exposed to words or pictures favoring that response), and by neglect of logic and statistical thinking. The Linda problem illustrates that a logically less likely outcome (an individual being in two categories at once rather than in one) is preferred because of a compelling description. Most people also neglect Bayes' theorem, failing to recognize prior probability, and most people do not appreciate regression to the mean as an explanation for random events. The problem of small samples and loss aversion are also very powerful influences on decision making. I recommend this book to everyone who makes financial decisions, and to those with a scientific outlook on life.
LibraryThing member oldman
A Nobel Prize book for Economics, this was a "not to put down" book for me. The consistent logical thought process of the book was particularly appealing, as was its basic premise - humans think at two speeds, fast and slow. Well worth reading again.
LibraryThing member jorgearanda
Kahneman summarizes several research advances in psychology mostly having to do with the interplay between the two modes of thinking in the title: the fast (impulsive, guided by heuristics, efficient but predictably weak under some circumstances) and the slow (rational, controlled, lazy). It's really exciting science, and it is well presented. Unfortunately the book also comes with fairly unnecessary rambles, detail, and repetition, and it is much longer than it needed to be. It also comes with a strange fixation on altering the gossip habits and watercooler conversations of its readers, for which they get a convenient list of stock phrases at the end of each chapter to use in the office in case they want to sound like pedantic dorks.

My biggest concern with this book, however (and perhaps with Kahneman's approach to psychology), is that it tries, over and over, to fit the world of our experience into a statistical model. In this he gets the relationship exactly backwards. The world is not random, it is causal, though complex. It turns out that, often, this complexity manifests as seemingly random occurrences, and in these cases, a statistical analysis can help us understand it better. But we should not forget that we use the concept of randomness out of ignorance, not because it explains the world appropriately. It is merely a tool. Confusing it with a description of the world works sometimes, but it also leads to some questionable assertions. An example of this, out of many, is the "hot hand" discussion. Basketball scoring seems random, but it's naive to claim that it *is* random, and that therefore any considerations about the different states of players and teams at different times are foolish. But as I said above, the book still reports on pretty exciting science in a compelling manner.
LibraryThing member Daniel.Estes
My wife frequently ribs me about all the monotonous non-fiction I prefer to read, but even I couldn't make it through this one. I don't see how Thinking, Fast and Slow became a thing. It's an interesting concept, unnaturally stretched to over 500 pages, and set in some of the driest writing I've ever read for a bestseller.
LibraryThing member Parthurbook
Over-rated. As is often the case, a really interesting central idea, explained in a ten thousand word essay, wrapped in several hundred pages of justification.
LibraryThing member vguy
No surprise that H. sapiens is subject to emotionally driven prejudice, snap judgement and leaping to biassed conclusions, or that rationality is hard work, and not always present when we imagine it is. The striking thing in this book is the detail of how all this works, the trivial factors that can influence us, and the well-researched evidence that backs it. Example of the trivial: opinions change not only when we are in a good or bad mood, but the mood itself can be simulated by holding a pencil in the teeth (smile! Go for it!) or just frowning (misery! Work it out cautiously!). Interesting too that we are more rational/logical when worried - we take more precautions and think - but not when tired (rational "System 2" is fundamentally lazy); how experts also get it wrong (overconfidence), but nonetheless expertise can be learned and give snap judgements that are valid; how readily we dismiss contrary evidence... and much more. Not a light read, though aimed at the lay reader; my own belief in my rational powers was duly humbled by many of the examples and thought experiments, most of which I got wrong or gave up on. Kahneman has a genial touch of humility and warmth, admits his own errors and teaches us from them, as well as from his successes. Lovely that he gets a Nobel for economics without knowing anything about economics. A kindly old Jewish uncle full of wisdom and sympathy for human weakness.
LibraryThing member paulsignorelli
Daniel Kahneman's engaging and well-documented treatise on how we make accurate and inaccurate decisions is, in some ways, reminiscent of what Dan Ariely ("Predictably Irrational"), Malcolm Gladwell ("Blink"), and others have explored in terms of how irrational our decisions can sometimes be. It also takes us much deeper into understanding, by experiencing what he is describing, the ways we trick ourselves into thinking we are better at decision-making than we actually are. At the heart of Kahneman's work is what he calls System 1 and System 2 thinking--shorthand terms for the ways we approach decision-making (sometimes quickly, sometimes after engaging in intellectual efforts requiring plenty of work). In example after example, we see how fear and inaccurate perceptions we assume are true govern our decisions. And if we apply these lessons to workplace learning and performance (staff training), we can easily see not only the need to identify and correct misperceptions early in the learning process, but also the need to instill in ourselves and our learners an awareness of how easily we can be affected and influenced by stressful, emotional situations and inaccurate sources of information with little regard to their veracity.
LibraryThing member the.ken.petersen
I thoroughly enjoyed this book. I must make clear, however, before getting into the meat of a review, that this is not my area of expertise (if such exists!).

Daniel Kahneman examines the human thought process and splits it into an imaginary, but useful for understanding, two-function system. He suggests that System 1 comes up with an instant answer to questions wherever possible: is this person whom I am seeing for the first time a 'nice' person? Am I going to enjoy eating this plate of food of a type I have never previously tried? and that System 2 is largely called in when System 1 cannot provide an instant response, i.e. 17 x 24, or to provide back up for the views expressed by System 1.

Kahneman does not simply state his beliefs; all the way through the book there are statements to assess, and I found that I was led into illogical thinking on every predicted occasion. He shows that our rational thought is not always rational and that remembered experience can differ from the perception at the time. He also shows that fear of loss is a greater driver than anticipation of gain, to an extent whereby it can have a negative effect upon our life.

This book is one of those rare tomes, written by an expert but not clouded with phrases designed to be understood by another expert but guaranteed to defeat the interested amateur. As stated earlier, this is not my field and, whilst there is undoubtedly more in this book than I was able to extract, none the less, I am a wiser person for the reading.

In his conclusion, Kahneman pushes this on from interesting facts about how we think to the effect that this information should have upon the way we are governed. Should our rulers react to what we say is influencing our views, or, through the expertise of those able to tease out the root of dissatisfaction, take a superior, 'We know what you really want' attitude? This, of course, leads to the danger of superior rulers ignoring our views because they know what we require better than we do.

Do I know myself better for reading this book? That is a difficult question to answer. Perhaps the best way to do so is to say that I am more aware of the weakness of some of my views: whether that will make it possible to engage System 2 more often and reduce the number of ill-considered opinions which I hold is another matter.
LibraryThing member RobertDay
I bought a copy of this book on the recommendation of a colleague in the software testing community, and I struggled with it until bailing out at around the 65% mark. The basic premise, that we have a bicameral mind with two different ways of thinking, and that we rely on the first way of thinking most of the time, which is fine when it's right but not so good when it's wrong, is important and needs saying. So much of our thinking about things that are complicated seems severely influenced by what Kahneman describes as System 1 thinking, which jumps to conclusions and takes the easy way out. We certainly seem to be living in a System 1 world right now.

Kahneman then goes on to describe all the different sorts of biases that can fool us. This is an important area for software testers in particular, because these biases influence the way we look at software applications under test, and the assumptions we make about how a particular application works, ought to work, or how it will be used by people out in the Real World who just want to open the software and use it without any further thought or preparation, like any other simple tool, from the stone axe onwards. However, computer software is just that bit more complex than the stone axe, and that's where the problems start.

So far for the book, so good. But I started running into problems with it from the outset. I rapidly came to the conclusion that someone, most likely the publisher, dumbed it down. (It took me a few days to discover the notes at the back because someone decided that it would be better to take all the referencing out of the text – but without the referencing, the book often reads to me like pseudo-science because of the way Kahneman keeps saying “Studies have shown…” or “Scientists in San Francisco found…”; without knowing that there actually IS a solid, valid reference behind these statements, they look like the sorts of things pseudo-scientists say to “prove” that you can extract sunbeams from cucumbers).

I did find the text rather old-fashioned; it read like a 1970s psychology textbook, and indeed that's when Kahneman and his collaborator Amos Tversky did a lot of their initial work. In any case, I got as far as Chapter 16 and then seriously considered abandoning the book. But I rested it for a few days and then went back to it, which seemed to coincide with what looked like a change in direction in the text, to a more anecdotal style. But that was something of a false dawn, because Kahneman then dived into analyses of risk and gaming, and we ended up with a series of examples of questions like "Would you rather have a 50% chance of winning $50 and a 10% chance of having $10 taken away from you, or a 60% chance of winning $10 and a 35% chance of winning $85?" and after the fourth or fifth example of that - which seemed to occupy much of the rest of the book - I gave up. This is not something I regularly do.

Partly, I suspect I may not be the book’s intended audience; I found myself challenging too many of his examples and I saw through the perspective exercise in chapter 9 (Figure 9 – page 100 in my UK paperback edition) and was then amused to see that the author recognises that “experienced photographers have the skills of seeing the drawing as an object…” and I AM such a photographer!

Or perhaps I was applying my tester’s mindset to the problems, which may be over-thinking them, trying to find real-world solutions instead of just letting my own Systems 1 and 2 battle it out between them.

I did find the text excessively US-centric, to the point where I complained loudly over one question that was put as an example quite early on: “How many murders are committed in the state of Michigan?”, to which I replied “No idea – I’d usually Google that one.” Well, of course, the question OUGHT to have been “What is your estimate of the annual number of murders in the state of Michigan?”; but that aside, then Kahneman saying “Well, of course you only thought of Michigan and forgot that Detroit is in Michigan and so has its tremendous number of murders counted in the state-wide total” just struck me as the sort of geographical bias we try to eliminate when doing testing work, and indeed seemed to expose the very biases he went on to discuss later on in the book. There are other examples but this was the worst.

But Chapters 14 and 15 brought me to a juddering halt with two examples of what testers call “testing personas”, invented characters who are used to represent typical real-world users. The first, “Tom W”, is based around a set of assumptions about IT systems developers from the 1970s! I’ve worked in IT for twenty-five years, and the sort of stereotyping that Kahneman bases his expectations on died out long ago, certainly in the organisations and companies I’ve worked in. And then we had the “Linda problem”. Kahneman set up a fictional character, Linda, with a fairly detailed backstory and life circumstances, but then when people say “Yes, Linda could be a feminist bank teller”, he says that is the wrong answer! I’m sorry, I got angry with him at that point. I’ve met plenty of Lindas (of both genders) with strongly-held political beliefs that drive their existence, and they are quite capable of holding down comparatively menial jobs. If anything, their political beliefs support them in their jobs and give them a focus outside of those jobs that helps them cope. It was at this point that I realised that Kahneman was applying economic criteria to his cases. Thinking of the probabilities of the quantum of feminist bank tellers as a proportion of all bank tellers, the likelihood of Linda being a feminist bank teller was arrived at statistically. Yet Kahneman created her with a backstory where those feminist values would be sufficiently important to her for her to hold to her feminism – in the real world. Kahneman’s explanation at the end of chapter 15, that the sort of objections I raise actually aren’t relevant to the argument he’s trying to make, just irritated me more. Are we supposed to be reading this book to find out interesting facts about human nature when working with human beings and their artifacts, or just to admire how clever Daniel Kahneman is?

(His ‘less is more’ example in that chapter – can you charge more for a tea service with fewer pieces, all of which are perfect, or for one with more pieces, a significant number of which are imperfect - made me smile because if the author had had much experience of actually selling things in sets, he would have realised that a smaller but complete set is worth more than a larger, but incomplete and/or flawed set, especially to more discerning customers.)

Perhaps I was reading it too fast. One of the blurbs on the back of the UK edition says “Buy it fast. Read it slowly.”; and indeed the person who recommended it to me in the first place stretched his reading of it out over a series of weeks. As I said, perhaps I’m not the book's intended audience. Perhaps I’ll just take it to the office and see if any of my other testing colleagues want to have a try at it.
LibraryThing member justindtapp
Kahneman is a Nobel laureate whose work I was already familiar with, particularly after having read the book Nudge. Thinking Fast and Slow (TFAS) explains the nuts and bolts behind behavioral economics, and how, as psychologists, Kahneman and Amos Tversky helped shatter certain foundations of microeconomics-- namely utility theory and the way economists assume rationality and preferences work among people making choices.

Since studying economics, I've striven to be "rational," to think like an "Econ" as opposed to an ordinary human. (By "rational" I mean the definition of "having internally consistent beliefs.")
This book helped illustrate how impossible that is, even for the most self-aware. It helped me think about my own cognitive biases and the emotions behind some of my decision-making.

Every chapter is similar, which makes it rather monotonous. A hypothesis springing from a real-life observation, an explanation of various (often amusing) experiments done, some conclusions, and then a "how to apply this discovery or concept to everyday conversation/decision making."

The book starts by differentiating our thinking into two "systems," which become our friends throughout the book. System 1 is the instinct, gut reaction system. System 2 is the slower, processing system. If I write "2 + 2," your System 1 immediately says "4." But if I say "2 x (3 x 5) / (4 x 44)" you have an instant physiological reaction-- your pupils dilate, your heart rate increases, and your System 2 goes to work figuring it out. Kahneman points out many illogical errors our System 1 is prone to make, and how to address them. This is a crucial concept for leaders and decision-makers.

For example, System 1 often falls for the "halo effect," assigning multiple positive attributes to someone just because we like one particular attribute of them. You see someone giving a speech with a warm smile-- perhaps it subliminally reminds you of your uncle. You feel he must be a nice guy, an honest guy, a good family man-- someone you can trust. But all you know about him is his smile. Choosing to vote for him on your assumptions is illogical. But, politicians through the ages have gotten elected on little more than their physical attributes.
This is one reason I don't watch political speeches and debates often-- I prefer to read them afterwards. This helps eliminate the halo effect from the person's body language, looks, smile, audience reaction, etc.

System 2 is "lazy," according to Kahneman-- it requires focused effort. Perhaps the above helps explain why "Stocks with pronounceable trading symbols (like KAR or LUNMOO) outperform those with tongue-twisting tickers like PXG." Our System 1 likes simple things. (Kahneman says that finance is "an entire industry built on the illusion of skill," eagerly reminding us that return of index funds beat returns of actively-managed portfolios.)
System 1 also falls for the fallacy that human intuition is better than algorithms based on statistical data. This is the old Money Ball debate. Algorithms beat humans, get over it. System 1 is too optimistic. Optimism seems hardwired into the human psyche and helps explain our capitalist system as well as Westward expansion. Entrepreneurs rate their chances of success much higher than historical odds. If they were relying purely on their System 2 to make the decision to enter the market, they might never enter as rational justification falls away quickly.

Kahneman won the Nobel for prospect theory, which he explains in detail. Two identical propositions presented differently lead to different preferences from the responder, something that shouldn't happen if people are rational calculators, as standard economics assumes. People are loss-averse, proven time and again in experiments. If I offer you $50 for certain or a 75% chance to win $100, you should logically take the chance (your expected payout is $75 compared to the $50)-- but most people don't, they'd rather take the sure thing. The pain of losing the sure thing would outweigh the potential gain. This loss aversion shows up in various aspects of life. Studies have found that golfers putt better when going for par than they do for birdie-- they don't want to bogey.

"If in his best years Tiger Woods had managed to putt as well for birdies as he did for par, his average tournament score would have improved by one stroke and his earnings by almost $1 million per season."

It seems obvious to me that Kahneman doesn't have any kids, as he makes observations that any parent would see as obvious. Most of his experiments involved adults, and it would have been interesting to see how kids behave in similar situations.

There are too many examples of cognitive biases and fallacies to list here, but one section that was new to me was the idea of there being an "experiencing self," and a "remembering self."

"What we learn from the past is to maximize the qualities of our future memories, not necessarily of our future experience. This is the tyranny of the remembering self."

You often don't remember the wonderful game your NCAA basketball team played, or the fact that the best player went 10-12 from the field, an excellent percentage. What you remember is that he missed the shot at the buzzer that would have given your team a win-- a memory that still stings. You disregard the 24 other shots your team missed from the field that would have also made the difference-- that last memory is what "lost the game."

People's evaluation of overall well-being is dependent upon recent events and what stands out in memory. A fascinating experiment revealed:

"Adding 5 'slightly happy' years to a very happy life caused a substantial drop in evaluations of the total happiness of that life."

People would rather go out "on top" than see a significant decline in performance, happiness, or health before going out. It's a tragedy when an athlete plays too long as a diminished version of his former self-- he's ruined the memories of so many, and diminished his previous accomplishments. This is logically nonsense: his previous records haven't changed, only our current evaluation of him.

How the "remembering self" handles regret was particularly important to me. When making a major decision, factor in what the reduction in your happiness will be if it doesn't work out the way you hope. Add that to the cost/benefit analysis of making the decision.

It's not all gloom and doom, Kahneman tells us what research says about "happiness," which is tricky to define. Income above $75,000 in the U.S. has rapidly diminishing marginal returns to happiness. More education, likewise, correlates with more stress and less happiness.

"Religious participation also has relatively greater favorable impact on both positive affect and stress reduction than on life evaluation...Surprisingly, however, religion provides no reduction of feelings of depression or worry."

Marriage, in itself, is an overall wash because it improves some areas of satisfaction while worsening others. Kids bring satisfaction, but only if the time spent with them isn't a chore-- driving them all over town to their many activities, for example, can be much more stressful than satisfying.

Research has shown Kahneman that:

"It is only a slight exaggeration to say that happiness is the experience of spending time with people you love and who love you."

Kahneman recommends switching from passive leisure, like watching TV, to more "active leisure," like socializing with friends, exercising, and doing other non-obligatory activities with people you enjoy being around.

I give this book 4.5 stars out of 5. A very worthwhile read.
LibraryThing member HadriantheBlind
I admitted a bit of doubt when I first started this - the very concepts of Thinking, Fast and Slow are evident to the student who has had Psych 101 - there are two basic modes of thinking. Automatic processing, which is described as System 1, which is easy, non-attentive, intuitive thinking, and controlled processing, or System 2 - the 'attentive', reasoned, detail-oriented part of the mind. There are also some basic principles, such as heuristics ('shortcuts' of thinking) and biases.

Yet this bald litany of basic facts does not describe the whole contents of the book. Far from it. The real meat of the book comes over the next 20 or so chapters, and details many real social and economic applications, with many helpful examples and citations, drawn from respected and well-tested sources. This resourceful and detailed compilation of our best and worst behaviors in decision making is a fine book, and worth reading for those who are fascinated with our behaviors.
LibraryThing member ajlewis2
This is one of the most detailed books, outside of textbooks, that I've read. It is organized very well. I leave it to you to look for a summary to get an idea of the contents. I listened to an audio version, but if you seriously want to absorb the material, I recommend a text version. I would like to give a better idea about the contents, but it really is too exhaustive. I can only say that it is well done and is not one of those repetitive self-help books. I heard many points with detailed studies showing me that most of us indeed have faults in the way we think. These faults are part of our being human and much of our "fast" thinking makes it possible for us to survive. I heard that basically just learning about these problem areas will not actually cause a change in how I make decisions for the most part. At the same time, I am encouraged to at least understand that the biases are there in our thinking so that I may be willing to entertain the thought that I might be wrong. I am maybe a bit more alert to the ways that my decision-making may even be manipulated.

I was especially interested in the two selves: the remembering self and the experiencing self. This section alone was reason to read the book.
LibraryThing member LaPhenix
Brimming with fantastic information and fascinating ideas that are applicable to everyday life! This book made me reconsider information I take for granted and reevaluate perspective in general.
LibraryThing member flydodofly
Revealing insight into different ways our mind works. I have already had fun observing others and myself making decisions similar to the ones in the research, and could immediately use newly gained knowledge to reconsider the logic of my thinking. Excellent stuff, and probably something kids should be given the opportunity to observe at an early age, to have better chances to make the right decisions.
LibraryThing member keylawk
Kahneman won the Nobel Prize in Economics, and this book carefully analyses studies involving human decision-making.

While the conclusions are more "bad news" for the Austrian/Chicago school of economics, and the libertarian model is destroyed by the facts presented here [411 ff], the book is far from ideological in presentation. The author claims to cringe when his work is credited with the demonstration that human choices are irrational, when in fact their research only showed that Humans are not well described by the libertarian "rational-agent" model. That model is the foundation for libertarian public policy; it is theoretical and wrong. In fact, the admiration for the myth of market "efficiency" in allocating goods is wrong. The title and drift of Milton Friedman's book, "Free to Choose", are shown to be unsupportable.

Freedom is never "free", the chooser is rarely informed, and society needs protection from predators who exploit weakness. The data supports what is called "libertarian paternalism" by the economist Richard Thaler and the jurist Cass Sunstein in their 2008 book, "Nudge". The flagship example of their behavioral policy is "Save More Tomorrow", sponsored by the US Congress in a coalition between liberals and conservatives. This is the policy now largely characterizing the Obama Administration. [414] Obama appointed Sunstein to serve as administrator for the Office of Information and Regulatory Affairs, described in the 2010 Report of the Office of Management and Budget.

Humans do not make rational or good decisions without help. Informed and unintrusive regulations and standards, with enforcement against piratical assaults, can provide the help that is needed to maintain a free market.
LibraryThing member stretch
So Thinking Fast and Slow is not my typical sort of popular science. In fact, if it weren't for the great reviews it got on CR and all the hype after its publication, I probably would've passed on this one. I'm glad I didn't, but I still found this book to be outside of my wheelhouse. There are a lot of tremendous insights into decision making and real-world economic theory to be gleaned from these pages. (I have more highlights from this book than any other on my Kindle.) The first two parts of the book are fascinating psychology. There is just so much to think about from these chapters alone to make a very good book. However, the latter half of part 3, when Kahneman begins to delve more into our decision making on a more economical level, is where I began to think his conclusions were a bit too simple. While luck is an important factor in outcomes, Kahneman's examples and evidence, when put under the microscope, don't really support his conclusion that blind luck is just as good as or superior to experts. It was just too neat to be believed, and from what I've found the relationship of randomness in our lives is profoundly more complicated. And part 4, while initially very interesting, became very repetitive. I guess there is only so much you can do with abstract economic theory before it runs out of steam.

What I really respect about Thinking Fast and Slow is that Kahneman doesn't shy away from the science. The science is upfront and the story is secondary. This is something Gladwell and the like should take note of. There are no easy conclusions here, no big takeaway you can sell to others. Instead it is a series of logical conclusions that build upon the whole to form a theory. This isn't a self-improvement book disguised as science.

Publication

New York : Farrar, Straus and Giroux, 2011.

Description

In this work the author, a recipient of the Nobel Prize in Economic Sciences for his seminal work in psychology that challenged the rational model of judgment and decision making, has brought together his many years of research and thinking in one book. He explains the two systems that drive the way we think. System 1 is fast, intuitive, and emotional; System 2 is slower, more deliberative, and more logical. He exposes the extraordinary capabilities, and also the faults and biases, of fast thinking, and reveals the pervasive influence of intuitive impressions on our thoughts and behavior. He reveals where we can and cannot trust our intuitions and how we can tap into the benefits of slow thinking. He offers practical and enlightening insights into how choices are made in both our business and our personal lives, and how we can use different techniques to guard against the mental glitches that often get us into trouble. This author's work has transformed cognitive psychology and launched the new fields of behavioral economics and happiness studies. In this book, he takes us on a tour of the mind and explains the two systems that drive the way we think and the way we make choices.

Media reviews

The replication crisis in psychology does not extend to every line of inquiry, and just a portion of the work described in Thinking, Fast and Slow has been cast in shadows. Kahneman and Tversky’s own research, for example, turns out to be resilient. Large-scale efforts to recreate their classic findings have so far been successful. One bias they discovered—people’s tendency to overvalue the first piece of information that they get, in what is known as the “anchoring effect”—not only passed a replication test, but turned out to be much stronger than Kahneman and Tversky thought. Still, entire chapters of Kahneman’s book may need to be rewritten.
"It is an astonishingly rich book: lucid, profound, full of intellectual surprises and self-help value. It is consistently entertaining and frequently touching..."
Thinking, Fast and Slow is nonetheless rife with lessons on how to overcome bias in daily life.

Original publication date

2011-10-25

Physical description

499 p.; 24 cm