

Thinking, Fast and Slow by Daniel Kahneman (ISBN: 9780141033570), from desertcart's Book Store.

Review: When is an intuition a simplifying heuristic or an expert solution? That is the cue of recognition, nigh a formula, Grasshopper!

Stemming from the author's Nobel Prize-winning scholarly research on the simplifying short-cuts of intuitive thinking (systematic errors) and on decision making under uncertainty, both published in the journal Science, the book is a series of thought experiments that sets out to counter the prevailing rational-agent model of the world (Bernoulli's utility errors): that humans have consistent preferences and know how to maximise them. Instead, Prospect Theory shows that important choices are prone to the relativity of shifting reference points (context) and to the formulation of inconsequential features of a situation, such that human preferences struggle to become reality-bound. In particular, our decisions are susceptible to heuristic (short-cutting) or cognitive-illusory biases - an inconsistency built into the design of our minds: for example, the 'duration neglect' of time (the less-is-more effect) when a story is recounted by the Remembering Self, as opposed to sequentially by the Experiencing Self. Prospect Theory is based on the well-known dominance of threat/escape (negativity) over opportunity/approach (positivity), a natural tendency or hard-wired disposition towards risk aversion that Kahneman's grandmother would have acknowledged. Today this bias is explored by behavioural economics (psychophysics) and the science of neuroeconomics - in trying to understand what a person's brain does while they make safe or risky decisions.
It would appear that there are two species of Homo sapiens: those who think like "Econs" - who can compare broad principles and processes 'across subjects', like spread bettors (broad framing) in trades of exchange; and "Humans", who are swayed optimistically or pessimistically, in terms of conviction and fairness, by attachments to material usage (narrow framing) and a whole host of cognitive illusions - to name but a very few: the endowment effect, the sunk-cost fallacy and entitlement. Kahneman argues that these two different ways of relating to the world are heavily predicated on a fundamental split in the brain's wet-ware architecture, delineated by two complementary but opposing perspectives. System 1 is described as the Inside View: "fast", HARE-like intuitive thought processes that jump to best-case scenarios and plausible conclusions based on recent events and current context (priming), using automatic perceptual memory reactions or simple heuristic intuitions and substitutions. These are usually affect-based associations or prototypical intensity matches (trying to compare different categories, e.g. apples or steak?). System 1 is susceptible to emotional framing and prefers the sure choice over the gamble (risk averse) when the outcomes are good, but tends to accept the gamble (risk seeking) when all outcomes are negative. System 1 is 'frame-bound' to descriptions of reality rather than reality itself and can reverse preferences based on how information is presented, i.e. it is open to persuasion. Therefore, instead of truly expert intuitions, System 1 thrives on correlations of coherence (elegance), certainty (conviction) and causality (fact) rather than evidential truth. System 1 has a tendency to believe, to confirm (a well-known bias), and to infer or induce the general from the particular (causal stereotype).
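The pattern the reviewer describes - risk averse over gains, risk seeking over losses - falls out of the Prospect Theory value function. A minimal sketch in Python; the parameter values are Tversky and Kahneman's published 1992 estimates, used here purely for illustration (the shape of the curve, not the exact numbers, is the point):

```python
# Sketch of the Prospect Theory value function: concave for gains,
# convex and steeper for losses (loss aversion). Parameters are the
# 1992 Tversky & Kahneman estimates, used only for illustration.

def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Subjective value of a gain or loss x relative to a reference point."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** beta)

# Losses loom larger than gains: losing 100 hurts roughly twice as
# much as gaining 100 pleases.
print(value(100))    # subjective value of a 100 gain
print(value(-100))   # subjective value of a 100 loss (larger in magnitude)
```

Because the gain branch is concave, a sure 50 feels better than a 50% shot at 100; because the loss branch is convex, a 50% shot at losing 100 feels better than a sure loss of 50 - exactly the preference reversal described above.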
System 1 does not compute base rates of probability, the influence of random luck, mere statistics as correlation (the decorrelation error) or regression to the mean (the causality error). System 1's weakness is the brain's propensity to succumb to overconfidence and hindsight in the resemblance, coherence and plausibility of the flimsy evidence of the moment, acronymically termed WYSIATI (What You See Is All There Is), at the expense of System 2 probability. To succumb is human, as so humbly shown throughout the book, and no profession, institution, MENSA level or social standing is exempt. Maybe Gurdjieff was right when he noticed that the majority of humans are sheep-like. System 2, on the other hand, is the Outside View that attempts to factor in Rumsfeld's "unknown unknowns" by using realistic baselines of reference classes. It makes choices that are 'reality-bound' regardless of the presentation of facts or emotional framing, and can be regarded as "slow", RAT-like controlled focus and energy-sapping intention: the kind used in effortful integral, statistical and complex reasoning with distributional information based on probability, uncertainty and doubt. However, System 2 is also prone to error, especially in the service of System 1, and even though with application it has the capability not to confuse mere correlation with causation and to deduce the particular from the general, it can be blocked when otherwise engaged, indolent or full of pride! As Kahneman puts it, "the ease with which we stop thinking is rather troubling", and what may appear to be compelling is not always right, especially when the ego - the executive regulator of will power and concentration - is depleted of energy, or conversely when it is in a good mood of cognitive ease (not stress) deriving from situations of 'mere exposure' (repetition and familiarity).
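The base-rate neglect mentioned above is easy to make concrete with a toy Bayes calculation. The numbers here are hypothetical, chosen only to show the effect: a "90% accurate" test for a condition with a 1% base rate. System 1 hears "90%"; Bayes' rule does not:

```python
# Toy base-rate illustration (hypothetical numbers, not from the book).
p_condition = 0.01        # base rate: 1% of the population has the condition
p_pos_given_cond = 0.90   # test sensitivity
p_pos_given_no = 0.10     # false-positive rate

# Total probability of a positive result, then Bayes' rule.
p_pos = p_pos_given_cond * p_condition + p_pos_given_no * (1 - p_condition)
p_cond_given_pos = p_pos_given_cond * p_condition / p_pos

print(f"P(condition | positive test) = {p_cond_given_pos:.1%}")
# Far below the intuitive "90%": most positives come from the large
# healthy majority, which System 1 ignores.
```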
Experiments have repeatedly shown that cognitive aptitude and self-control are directly correlated, and biases of intuition are in constant need of regulation, which can be hard work - such as uncovering one's outcome bias (part hindsight bias, part halo effect), based on the cognitive ease with which one lays claim to causal 'narrative fallacies' (Taleb) rather than "adjusting" to statistically random events born of luck! So do not expect a fun and "simples" read if you want clarity into how impulses become voluntary actions, and how impressions, feelings and inclinations so readily become beliefs, attitudes and intentions (when endorsed by System 2). The solution? Kahneman makes the special plea that our higher-minded intuitive statistician, System 2, take over the art of decision-making and wise judgement in "accurate choice diagnosis" to minimise the "errors in the design of the machinery of cognition." We should learn to recognise situations in which significant mistakes are likely, making the time and putting in the analytical effort to avoid them, especially when the stakes are high - usually when a situation is unfamiliar and there is no time to collect more information. 'Thinking, Fast and Slow' practically equips the reader with sufficient understanding to approach reasoning situations with a certain amount of logic, in order to balance and counter our intuitive illusions. For example, recognising the Texas sharpshooter fallacy (the decorrelation error) or deconstructing a representativeness heuristic (stereotype) in one's day-to-day affairs should be regarded as a reasonable approach to life by any yardstick, scientific or not. In another example, the System 2 objectivity of a risk policy is one remedy against the System 1 biases inherent in the illusions of optimists who think they are prudent, and of pessimists who become overly cautious and miss out on positive opportunities - however marginal a proposition may appear at first.
One chapter, "Taming Intuitive Predictions", is particularly inspiring when it comes to correcting faulty thinking. It explores a reasonable procedure for systematic bias in significant decision-making situations where there is only modest validity (the validity illusion), especially between subjects. For example, when one has to decide between two candidates, be they job interviewees or start-up companies, as so often happens the evidence is weak but the emotional impression left by System 1 is strong. Kahneman recommends that when WYSIATI strikes we be very wary of System 1's neglect of base rates and insensitivity to the quality of information. The law of small numbers states that there is more chance of an extreme outcome in a small sample: the candidate who performs well at first, on the least evidence, has a tendency not to keep this up over the longer term (once employed), due to the vagaries of talent and luck - i.e. there is a regression towards the mean. The candidate with the greater referential proof but less persuasive power on the day is the surer bet in the long term. Yet how often in life is the short-term effect chosen over the long-term bet? Possibly a cheeky, pertinent example here is the choice of Moyes over Mourinho as the recently installed Man Utd manager! A good choice or a bad choice? Many examples are shown of statistical algorithms in low-validity environments (the Meehl pattern) showing up failed real-world assumptions, revealing in the process the illusion of using skill and hunches to make long-term predictions. Many of these concern the clinical predictions of trained professionals, some of which serve important selection criteria in interviewing practices of great significance.
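The law of small numbers and regression to the mean described above can be shown with a small simulation (a toy sketch, not anything from the book): give several candidates identical true ability, pick the one who looks best on a few noisy observations, and retest that candidate on a fresh sample.

```python
import random

# Toy simulation: 5 candidates of equal true ability (0.0) are each
# judged on 3 noisy observations; the standout is then retested.
# The standout usually looks worse second time: regression to the mean.
random.seed(42)

def noisy_score(n_obs):
    """Mean of n noisy observations of the same underlying ability."""
    return sum(random.gauss(0, 1) for _ in range(n_obs)) / n_obs

trials = 10_000
regressed = 0
for _ in range(trials):
    first = [noisy_score(3) for _ in range(5)]      # small-sample first round
    best = max(range(5), key=lambda i: first[i])    # the "impressive" candidate
    if noisy_score(3) < first[best]:                # independent retest
        regressed += 1

print(f"standout candidate scored worse on retest in "
      f"{regressed / trials:.0%} of trials")
```

By symmetry the retest beats the first-round maximum only when it happens to be the largest of six equivalent draws, so the standout regresses roughly five times out of six, despite every candidate being identical underneath.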
Flawed stories from the past that shape our views of the present world and expectations of the future are very seductive, especially when combined with the halo effect and with making global evaluations rather than specific ratings. For example, belief in the latest blockbusting management tool adopted by a new CEO has been statistically shown to yield at best a 10% improvement over random guesswork. In another example, a leadership group challenge used to select Israeli army leaders from cadets, in order to reveal their "true natures", produced inaccurate forecasts after observing only one hour of their behaviour in an artificial situation - this was put down to the illusion of validity via the representativeness heuristic and non-regressive weak evidence. Slightly more worryingly, the same can be said for the illusory skills of selling and buying stock persistently over time. It has been shown that there is a narrative being played out in the minds of traders: they think they are making sensible, educated guesses, when the exposed truth is that their long-term predictive success is based on luck - a fact deeply ingrained in the culture of the industry, with false credit being "taken" in bonuses! Kahneman pulls no punches about the masters of the universe, and I am inclined to believe in the pedigree of his analysis! According to Kahneman, so-called experts - and he is slightly derisive in his use of the term - in trying to justify their ability to assess masses of complexity as a host of mini-skills, can produce unreliable judgements, especially long-term forecasts (e.g. the planning fallacy), due to the inconsistency of extreme contexts (low- or zero-validity environments without regular practice) - a System 1 type error.
Any final decision should be left to an independent person armed with the assessment of a simple equally weighted formula, which is shown to be more accurate than letting the interviewer, who is susceptible to personal impression and "taste", also make the final decision (see wine vintage predictions). The best an expert can do is anticipate the near future using cues of recognition, and then know the limits of their validity, rather than score random hits based on subjectively compelling intuitions that are false. "Practice makes perfect", as the well-known saying goes, though the heuristics of judgement (coherence, cognitive ease and overconfidence) are invoked in low-validity environments by those who do not know what they are doing (the illusion of validity). Looking at other similar books on sale, "You Are Not So Smart" by David McRaney, for example, is a more accessible introduction to the same subject, but it clearly rests on the giant shoulders of Kahneman, who with his erstwhile colleagues would appear to have informed the subject area in every conceivable direction. It is hard to do justice to such a brilliant book without a rather longish review. This is certainly one of the top ten books I have ever read for the benefits of rational perseverance and real-world knowledgeable insight, and it seems to be part of a trend - a rash of human-friendly Econ (System 2) research emanating from the USA at the moment. For example, 2013 Nobel Prize-winning economics research by R. Shiller demonstrates that there are predictable regularities in asset markets over longer time periods, while E. Fama makes the observation that there is no predictability in the short run. In summary, "following our intuitions is more natural, and 'somehow' more pleasant than acting against them", and we usually end up with the products of our extreme predictions, i.e.
overly optimistic or pessimistic, since they are statistically non-regressive, taking no account of a base rate (probability) or of regression towards the mean (self-correcting fluctuations in scores over time). The slow, steady pace of the TORTOISE might be considered the right pace for our judgements, but we are prone not to give them the necessary time and perspective in a busy and obtuse world. The division of labour, and conflict, between the two Systems of the mind can lead either to cognitive illusion (i.e. prejudice/bias) or, if we are lucky, to wise judgement in a synthesis of intuition and cognition (called TORTOISE thinking by Dobransky in his book Quitting the Rat Race). Close your eyes and imagine the future ONLY after a disciplined collection of objective information - unless, of course, you happen to have expert recognition, which is referred to in Gladwell's book on the subject, Blink, but then your eyes are still open and liable to be deceived. Kahneman's way seems so much wiser, but harder nonetheless. The art and science of decision-making just got so much more interesting in the coming world of artificial intelligence!

Review: Present Company Included

This is a monster book packed with fascinating insights about how our cognitive systems process and render information. Its starting premise is that we have two discrete "systems" for mental processing. Daniel Kahneman, a cognitive psychologist who transformed himself into a Nobel Prize-winning behavioural economist, gives these the Dr. Seussian labels "System 1" and "System 2". System 1 is fast. It makes snap judgments on limited information: it manifests itself in the "fight or flight" reflex. System 2 is more deliberative: courtesy of this, one meditates on eternal verities, solves quadratic equations and engages in subtle moral argument. Though this is interesting enough, their interaction is more fascinating still.
System 1 is lightweight, efficient and self-initiates without invitation; bringing System 2 to bear on a conundrum requires effort and concentration. This talk of snap judgments calls to mind Malcolm Gladwell's popular but disappointing "Blink: The Power of Thinking Without Thinking". Kahneman's account, rooted in decades of controlled experiment, is a far more rigorous explanation of what is going on, and is able to explain why some snap judgments are good and others are bad. This conundrum, unanswered in Gladwell's book, is Daniel Kahneman's main focus of enquiry. It also invokes another popular science classic: Julian Jaynes' idea of the "Bicameral Mind" - the notion that there are large aspects of our daily existence which we consider conscious but which really are not. Driving by rote to the office, playing a musical instrument - these, I imagine Kahneman would say, are also mental processes undertaken by System 1. Jaynes was widely viewed as a bit of an eccentric; Kahneman's work suggests he may have been right on the money. It gets interesting for Kahneman where the division of labour between the systems isn't clear cut. System 1 can and does make quick evaluations even where System 2's systematic analysis would provide a better result (these are broadly the "bad" snap judgments of Gladwell's Blink). But System 2 requires dedicated mental resource (in Kahneman's ugly expression, it is "effortful"), and our lazy tendency is to substitute (or, at any rate, stick with) those "cheaper" preliminary judgments where it is not obviously erroneous to do so (and by and large it won't be, as System 1 will have done its work). Kahneman's shorthand for this effect is WYSIATI: What You See Is All There Is. Kahneman invites the reader to try plenty of experiments aimed at illustrating his fecklessness, and these hit their mark: it is distressing to repeatedly discover you have made a howling error of judgment, especially when you knew you were being tested for it.
This has massive implications for those who claim group psychology can be predicted on narrow logical grounds. The latter half of Thinking, Fast and Slow focusses more on our constitutional inability to adapt rationally to probabilities, and soundly wallops the notion of Homo Economicus, the rational chooser each of us imagines ourselves to be. This is where Kahneman's Nobel Prize-winning Prospect Theory gets full run of the paddock. Kahneman draws many lessons (which, by his own theory, will doubtless go unheeded) for scientists, economists, politicians, traders and business managers: "theory-induced blindness"; how we become (irrationally) risk tolerant when all our options are bad and risk averse when all our options are good; and how we systematically underweight high-probability outcomes relative to actual certainty. For those with nerves of steel there's a real arbitrage to be exploited here. This long book is a rich (if "effortful") store of information and perspective: it is not news that our fellow man tends not to be as rational as we like to think he is, but we are strongly inclined to exclude present company from such judgments. Kahneman is compelling that we are foolish to do so: this is a physiological "feature" of our constitution, and the "enlightened" are no more immune. This is a valuable and sobering perspective. Olly Buxton
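The review's point that we "systematically underweight high probability outcomes relative to actual certainty" corresponds to Prospect Theory's probability-weighting function. A hypothetical sketch, using Tversky and Kahneman's 1992 estimate of gamma = 0.61 for gains (for illustration only; other parameterisations exist):

```python
# Sketch of the Prospect Theory probability-weighting function:
# small probabilities are overweighted, large ones underweighted
# relative to certainty (the certainty effect). gamma = 0.61 is the
# 1992 Tversky & Kahneman estimate for gains, used for illustration.

def weight(p, gamma=0.61):
    """Decision weight attached to an objective probability p."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

for p in (0.01, 0.50, 0.95, 1.00):
    print(f"p = {p:.2f} -> decision weight {weight(p):.2f}")
# A 95% chance is treated as worth noticeably less than 95%, while
# certainty (p = 1.00) keeps its full weight - the gap is the
# "arbitrage" the reviewer alludes to.
```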




| Best Sellers Rank | 394 in Books; 1 in Psychological Schools of Thought; 7 in Scientific Psychology & Psychiatry; 10 in Higher Education of Biological Sciences |
| Customer Reviews | 4.5 out of 5 stars 46,914 Reviews |
J**E
When is an intuition a simplifying heuristic or an expert solution? That is the cue of recognition, nigh a formula Grasshopper!
Stemming from the author's Nobel prize winning scholarly research on the simplifying short-cuts of intuitive thinking (systematic errors) and then decision making under uncertainty both published in Science Journal, the book is a series of thought experiments that sets out to counter the prevailing rational-agent model of the world (Bernoulli's utility errors) that humans have consistent preferences and know how to maximise them. Instead, Prospect Theory shows that important choices are prone to the relativity of shifting reference points (context) and formulations of inconsequential features within a situation such that human preferences struggle to become reality-bound. In particular our decisions are susceptible to heuristic (short-cutting) or cognitive illusory biases - an inconsistency that is built in to the design of our minds, for example, the 'duration neglect' of time (less is more effect) in recounting a story by the Remembering Self, as opposed to sequentially by the Experiencing Self. Prospect Theory is based on the well known dominance of threat/escape (negativity) over opportunity/approach (positivity) as a natural tendency or hard wired response towards risk adversity that Kahneman's grandmother would have acknowledged. Today this bias is explored by behavioural economics (psychophysics) and the science of neuroeconomics - in trying to understand what a person's brain does while they make safe or risky decisions. It would appear that there are two species of homo sapiens: those who think like "Econs" - who can compare broad principles and processes 'across subjects', like spread betters (broad framing) in trades of exchange; and "Humans" who are swayed optimistically or pessimistically in terms of conviction and fairness by having attachments to material usage (narrow framing) and a whole host of cognitive illusions, e.g. to name but a very few: the endowment effect, sunk cost fallacy and entitlement. 
Kahnmann argues that these two different ways of relating to the world are heavily predicated by a fundamental split in the brain's wet-ware architecture delineated by two complementary but opposing perspectives: System 1 is described as the Inside View: it is "fast" HARE-like intuitive thought processes that jump to best-case scenario and plausible conclusions based on recent events and current context (priming) using automatic perceptual memory reactions or simple heuristic intuitions or substitutions. These are usually affect-based associations or prototypical intensity matches (trying to compare different categories, e.g. apples or stake?). System 1 is susceptible to emotional framing and prefers the sure choice over the gamble (risk adverse) when the outcomes are good but tends to accept the gamble (risk seeking) when all outcomes are negative. System 1 is 'frame-bound' to descriptions of reality rather than reality itself and can reverse preferences based on how information is presented, i.e. is open to persuasion. Therefore, instead of truly expert intuitions System 1 thrives on correlations of coherence (elegance), certainty (conviction) and causality (fact) rather than evidential truth. System 1 has a tendency to believe, confirm (well known bias), infer or induce the general from the particular (causal stereotype). System 1 does not compute base rates of probability, the influence of random luck or mere statistics as correlation (decorrelation error) or the regression to the mean (causality error). System 1's weakness is the brain's propensity to succumb to over-confidence and hindsight in the resemblance, coherence and plausibility of flimsy evidence of the moment acronymically termed WYSIATI (What You See Is All There Is) at the expense of System 2 probability. To succumb is human as so humbly shown throughout the book which has no bounds to profession, institution, MENSA level or social standing. 
Maybe Gurdjieff was right when he noticed that the majority of humans are sheep-like. System 2 on the other hand is the Outside View that attempts to factor in Rumsfeld's "unknown unknowns" by using realistic baselines of reference classes. It makes choices that are 'reality-bound' regardless of presentation of facts or emotional framing and can be regarded as "slow" RAT-like controlled focus and energy sapping intention, the kind used in effort-full integral, statistical and complex reasoning using distributional information based on probability, uncertainty and doubt. However System 2 is also prone to error especially in the service of System 1 and even though it has the capability with application not to confuse mere correlation with causation and deduce the particular from the general, it can be blocked when otherwise engaged, indolent or full of pride! As Kahneman puts it "...the ease at which we stop thinking is rather troubling" and what may appear to be compelling is not always right especially when the ego - the executive regulator of will power and concentration - is depleted of energy, or conversely when it is in a good mood of cognitive ease (not stress) deriving from situations of 'mere exposure' (repetition and familiarity). Experiments have repeatedly shown that cognitive aptitude and self-control are in direct correlation, and biases of intuition are in constant need of regulation which can be hard work such as uncovering one's outcome bias (part hindsight bias and halo effect) based on the cognitive ease with which one lays claim to causal 'narrative fallacies' (Taleb) rather than "adjusting" to statistical random events born out of luck! So.. Do not expect a fun and "simples" read if you want clarity in to how impulses become voluntary actions and impressions and feelings and inclinations so readily become beliefs, attitudes and intentions (when endorsed by System 2). The solution.. 
Kahneman makes the special plea that our higher-minded intuitive statistician of System 2 take over the art of decision-making and wise judgement in "accurate choice diagnosis" to minimise the "errors in the design of the machinery of cognition." We should learn to recognise situations in which significant mistakes are likely by making the time and putting in the analytical effort to avoid them especially when the stakes are high - usually when a situation is unfamiliar and there is no time to collect more information. 'Thinking Fast and Slow' practically equips the reader with sufficient understanding to approach reasoning situations applying a certain amount of logic in order to balance and counter our intuitive illusions. For example recognising the Texas sharp shooter fallacy (decorrelation error) or de-constructing a representative heuristic (stereotype) in one's day-to-day affairs should be regarded as a reasonable approach to life even by any non-scientific yard stick. In another example, the System 2 objectivity of a risk policy is one remedy against the System 1 biases inherent in the illusion of optimists who think they are prudent, and pessimists who become overly cautious missing out on positive opportunities - however marginal a proposition may appear at first. One chapter called "Taming Intuitive Predictions" is particularly inspiring when it comes to corrections of faulty thinking. A reasonable procedure for systematic bias in significant decision-making situations where there is only modest validity (validity illusion) especially in-between subjects is explored. For example, when one has to decide between two candidates, be they job interviewees or start up companies as so often happens the evidence is weak but the emotional impression left by System 1 is strong. Kahneman recommends that when WYSIATI to be very wary of System 1's neglect of base rates and insensitivity to the quality of information. 
The law of small numbers states that there is more chance of an extreme outcome with a small sample of information in that the candidate that performs well at first with least evidence have a tendency not to be able to keep up this up over the longer term (once employed) due to the vagaries of talent and luck, i.e. there is a regression towards the mean. The candidate with the greater referential proof but less persuasive power on the day is the surer bet in the long term. However, how often can it be said that such a scenario presents itself in life, when the short term effect is chosen over the long term bet? Possibly a cheeky pertinant example here is the choice of Moyes over Mourinho as the recently installed Man Utd manager! A good choice of bad choice? There are many examples shown in low validity environments of statistical algorithms (Meehl pattern) showing up failed real world assumptions revealing in the process the illusion of skill and hunches to make long-term predictions. Many of these are based on clinical predictions of trained professionals, some that serve important selection criteria of interviewing practices which have great significance. Flawed stories from the past that shape our views of the present world and expectations of the future are very seductive especially when combined with the halo effect and making global evaluations rather than specific ratings. For example one's belief in the latest block busting management tool adopted by a new CEO has been statistically shown to be only a 10% improvement at best over random guess work. Another example of a leadership group challenge to select Israeli army leaders from cadets in order to reveal their "true natures" produced forecasts that were inaccurate after observing only one hour of their behaviour in an artificial situation - this was put down to the illusion of validity via the representation heuristic and non regressive weak evidence. 
Slightly more worryingly, the same can be said for the illusory skills of selling and buying stock persistently over time. It has shown that there is a narrative being played within the minds of the traders: they think they are making sensible educated guesses when the exposed truth is that their success in long term predictability is based on luck - a fact that is deeply ingrained in the culture of the industry with false credit being "taken" in bonuses!! Kahneman pulls no punches about the masters of the universe and I am inclined to believe in the pedigree of his analysis!! According to Kahneman so-called experts - and he is slightly derisive in his use of the term - in trying to justify their ability to assess masses of complexity as a host of mini-skills can produce unreliable judgements, especially long term forecasts (e.g planning fallacy) due to the inconsistency of extreme context (low or zero-validity environments with non regular practice) - a System 1 type error. Any final decision should be left to an independent person with the assessment of a simple equally weighted formula which is shown to be more accurate than if the interviewer also makes a final decision who is susceptible to personal impression and "taste"..(see wine vintage predictions). The best an expert can do is anticipate the near future using cues of recognition and then know the limits of their validity rather than make random hits based on subjectively compelling intuitions that are false. "Practice makes perfect" is the well known saying though the heuristics of judgement (coherence, cognitive ease and overconfidence) are invoked in low validity environments by those who do not know what they are doing (the illusion of validity). 
Looking at other similar books on sale, "You are Not So Smart" for example by David McRaney is a more accessible introduction to the same subject but clearly rests on Kahneman's giant shoulders who with his erstwhile colleagues would appear to have informed the subject area in every conceivable direction. It is hard not to do justice to such a brilliant book with a rather longish review. This is certainly one of the top ten books I have ever read for the benefits of rational perseverance and real world knowledgeable insights and seems to be part of a trend or rash of human friendly Econ (System 2) research emanating out of the USA at the moment. For example, recently 2013 Nobel winning economics research by R Shiller demonstrates that there are predictable regularities in assets markets over longer time periods, while E Fama makes the observation that there is no predictability in the short run. In summary, "following our intuitions is more natural, and 'somehow' more pleasant than acting against them" and we usually end up with products of our extreme predictions, i.e. overly optimistic or pessimistic, since they are statistically non-regressive by not taking account of a base rate (probability) or regression towards the mean (self correcting fluctuations in scores over time). The slow steady pace of the TORTOISE might be considered the right pace to take our judgements but we are prone not to give the necessary time and perspective in a busy and obtuse world. The division of labour and conflict between the two Systems of the mind can lead to either cognitive illusion (i.e. prejudice/bias) or if we are lucky wise judgement in a synthesis of intuition and cognition (called TORTOISE thinking by Dobransky in his book Quitting the Rat Race). 
Close your eyes and imagine the future ONLY after a disciplined collection of objective information unless of course you happen to have expert recognition, which is referred to in Gladwell's book on subject called Blink, but then your eyes are still open and liable to be deceived. Kahneman's way seems so much wiser but harder nonetheless. The art and science of decision-making just got so much more interesting in the coming world of artificial intelligence!
O**N
Present Company Included
This is a monster book packed with fascinating insights about how our cognitive systems process and render information. Its starting premise is that we have two discrete "systems" for mental processing. Daniel Kahneman, a cognitive psychologist who transformed himself into a Nobel Prize-winning behavioural economist, gives these the Dr. Seussian labels "System 1" and "System 2". System 1 is fast. It makes snap judgments on limited information: it manifests itself in the "fight or flight" reflex. System 2 is more deliberative: courtesy of this, one meditates on eternal verities, solves quadratic equations and engages in subtle moral argument. Though this is interesting enough, their interaction is more fascinating still. System 1 is lightweight, efficient and self-initiates without invitation; bringing System 2 to bear on a conundrum requires effort and concentration. This talk of snap judgments calls to mind Malcolm Gladwell's popular but disappointing "Blink: The Power of Thinking Without Thinking". Kahneman's account, rooted in decades of controlled experiment, is a far more rigorous explanation of what is going on, and is able to explain why some snap judgments are good and others are bad. This conundrum, unanswered in Gladwell's book, is Daniel Kahneman's main focus of enquiry. It also invokes another popular science classic: Julian Jaynes' idea of the "Bicameral Mind", wherein large aspects of our daily existence that we consider conscious really are not. Driving by rote to the office, playing a musical instrument: these too are mental processes, I imagine Kahneman would say, undertaken by System 1. Jaynes was widely viewed as a bit of an eccentric; Kahneman's work suggests he may have been right on the money. It gets interesting for Kahneman where the division of labour between the systems isn't clear cut. 
System 1 can and does make quick evaluations even where System 2's systematic analysis would provide a better result (these are broadly the "bad" snap judgments of Gladwell's Blink). But System 2 requires dedicated mental resource (in Kahneman's ugly expression, it is "effortful"), and our lazy tendency is to substitute (or, at any rate, stick with) those "cheaper" preliminary judgments where it is not obviously erroneous to do so (and by and large, it won't be, as System 1 will have done its work). Kahneman's shorthand for this effect is WYSIATI: What You See Is All There Is. Kahneman invites the reader to try plenty of experiments aimed at illustrating the reader's own fecklessness, and these hit their mark: it is distressing to repeatedly discover you have made a howling error of judgment, especially when you knew you were being tested for it. This has massive implications for those who claim group psychology can be predicted on narrow logical grounds. The latter half of Thinking, Fast and Slow focusses more on our constitutional inability to rationally adapt to probabilities and soundly wallops the notion of Homo Economicus, the rational chooser each of us imagines ourselves to be. This is where Kahneman's Nobel Prize-winning Prospect Theory gets full run of the paddock. Kahneman draws many lessons (which, by his own theory, will doubtless go unheeded) for scientists, economists, politicians, traders and business managers: "theory-induced blindness"; how we become (irrationally) risk tolerant when all our options are bad and risk averse when all our options are good; and how we systematically underweight high-probability outcomes relative to actual certainty. For those with nerves of steel there's a real arbitrage to be exploited here. 
This long book is a rich (if "effortful") store of information and perspective: it is not news that our fellow man tends not to be as rational as we like to think he is, but we are strongly inclined to exclude present company from such judgments. Kahneman is compelling that we are foolish to do so: this is a physiological "feature" of our constitution, and the "enlightened" are no more immune. This is a valuable and sobering perspective. Olly Buxton
E**Y
Starts well
This book starts by being intriguing and stimulating, and deserves to be read. The chapters are short, the writing is clear, the arguments are supported by examples of behavioural studies, and each chapter usefully ends with a few colloquial statements that sum up what has been said. However, halfway through, the book appears to lose itself, whether by fault of verbosity, repetition, loss of structure of argument, fewer references to work by others, or just sheer volume, I’m not sure. In any case, the reader finishes the book thinking of the cited example of the scratch at the end of a disc that dominates the memory of the whole. Where was the editor in all this? The first part of the book is worth it, however. It turns out that we possess two types of thinking, default System 1 (fast) and System 2 (slow). Between them they determine how we react and make decisions. The book presents many behavioural studies concerning the relationship between psychology and economics, and the competition between these two disciplines to explain people’s actions and decisions, including a brief mention of the new discipline of neuroeconomics. According to the author, we are primitive in the art of prediction. We lack methodology. We suffer from illusions of validity. Why does one person decide to sell a stock while another decides to buy it? The evidence shows that more active stock traders have worse results. Studies show that forecasts by doctors, investment advisors, sports analysts, politicians, economists and myriad other professionals don’t compare favourably with machine prediction. I would add the element of ‘destructive cleverness’, whereby people tend to apply their expertise from other, non-relevant areas to a problem that exists within a different set of givens, contributing factors, and noise factors that need to be properly appreciated. 
People perhaps tend to think out of the box too often because of lack of familiarity with a subject, and are invariably inconsistent in doing so. The art of decision making needs to be demystified. It needs to be transformed into a more scientifically and factually based procedure. Might justice be better delivered by computer? Planning fallacies include over-optimism about costs and time, due to taking only the inside view and failing to refer to reference classes. Optimism is the life blood of entrepreneurs, yet only 35% of businesses in the USA survive 5 years. There is the notion of optimistic martyrs: firms that fail, market fodder if you like, yet signal new markets to more qualified competitors. Potentially dangerous groupthink can be moderated by carrying out a pre-mortem: asking participants to write a reason why a project might have failed (the discipline of FMEA in industrial language). We learn of hedonometry, which quantifies pleasure and pain. We have a less unfavourable memory of pain if it tails off at the end. Our memory of an experience may not be the same as the experience itself (the example of a scratch at the end of a record). There is a difference between the experiencing and remembering selves. In summing up someone’s life we are over-influenced by how it ends. One survey of women using an Unpleasantness Index scored child-rearing at double the level of watching TV, which was at the same level as socialising. Being alone is more pleasurable than the presence of an immediate boss. Increasing focus is being placed on the measurement of well-being. Above a certain salary (quoted as $75,000 in high-cost areas), affluence does not improve a feeling of well-being, possibly, it is argued, because richer people no longer have the opportunity to enjoy the small pleasures of life (bars of chocolate) in the same way. The author focuses on System 1, for which the cardinal rule is WYSIATI – what you see is all there is. 
It is impulsive and intuitive, and our minds appear over-influenced by bias and spin. It is rarely indifferent to emotional words (a ‘survival chance’ of 90% is preferred to a ‘mortality rate’ of 10%). It jumps to conclusions, and can even govern important decisions depending upon how a problem is presented. Intuition requires training in skills (a top-class chess player requires about 10,000 hours of practice). Reminding people of their mortality increases the appeal of authoritarian ideas. There is the Lady Macbeth Effect, whereby people who feel their soul is stained have the desire to clean themselves. The Florida Effect was illustrated when students who had been encouraged to think of words related to old age walked more slowly down a corridor. When we place a pen crosswise in our mouth, thereby forcing a smile, we tend to think more favourably of things. The Halo Effect occurs, for example, when people like a president’s politics because they like his voice and appearance. In the Availability Cascade, biases, popular reactions and exaggerated fears (often influenced by the media) influence policy. We are unduly worried about unlikely events, for example when a teenage daughter is late home at night. Terrorism speaks directly to System 1 even though, even in the worst cases, it may be responsible for nowhere near the number of deaths caused by car accidents. In decisions involving numbers, there is an Anchor Effect whereby a suggested value influences our decision. Hindsight bias causes us to blame the intelligence services for 9/11. System 1, in effect, tries to make sense of the world, stereotyping, making the world predictable and explicable and overestimating predictability. It even breeds overconfidence. It averages instead of adds: people will assign less value to a larger set of dinner crockery that contains some broken items than to a smaller set of the same quality with no broken items. 
We tend to rationalise the past in order to predict the future. We suffer from theory-induced blindness. The Endowment Effect provokes an aversion to loss and determines economic behaviour. We may be prepared to sell, but only at a higher price than we would buy (the ratio is higher in the USA than the UK). People who are poor see a small amount of income as a reduced loss rather than a gain. The brain has a rapid mechanism to detect threats, but no such thing for good events. The negative trumps the positive. A single cockroach will destroy the appeal of a bowl of cherries, but a single cherry will have no effect on a bowl of cockroaches. A stable marital relationship has been found to require at least a 5:1 ratio of good interactions to bad ones. A friendship that may take years to develop can be destroyed with a single action. Golfers try harder to avoid a bogey than to gain a birdie. In relation to rational probability, our decisions are skewed negatively near 100% and positively near 0%. People attach value to gains or losses rather than to wealth. System 1 makes us overweigh improbable outcomes unless we have prior experience. We tend to overestimate our chances and overweigh estimates. People tend to be risk averse over potential gains and risk seeking over potential losses. The Sunk Cost fallacy keeps people too long in poor jobs, unhappy marriages and unpromising research projects. We tend to reinvest in a project in which we are already implicated, even if the prospects have deteriorated, rather than divert our effort into a more promising venture, so as not to be part of a failure. We are reluctant to cut our losses. We tend to be risk-averse, and environmental and safety laws, for example, are set up to protect us, yet such laws might have prevented the development of the airplane, X-rays and open-heart surgery. We prepare ourselves for the feeling of regret. We avoid being too hopeful about a potential football win. 
People more readily forgo a discount than pay a surcharge, even when the end result is identical. System 2 includes rational thinking and reasoning, but it is inherently lazy. To counteract the negative effects of System 1 determining a choice, we can require ourselves to produce more arguments to support it. Disbelieving something is hard work. Governments should make decisions based solely on hard facts and statistics, as opposed to popular reactions.
C**X
Perfect
A great book; still reading it, as there is a lot to absorb
R**N
Provokes deep thought, but it's frightening.
This is an interesting read. At first I was a little irritated by the author's habit of telling me how I was feeling but I soon realised why he does it. He wants his readers to understand that the mistakes other people make are not just weaknesses in other people, but in ALL people. I have a few niggles with the book. Some are tiny. He uses examples which rely on the reader knowing things about academic structures in the USA. Also, conversely, he refers to a specific type of election in the UK when there are in fact no elections of that type. He has, in effect, used US jargon to describe UK structures. This may help Americans to understand, but it grates for non-Americans. A more serious niggle is that I disagree strongly with his analysis of the "Linda" case, and one or two others. However, the points he makes are usually very sound and well supported by evidence. Not only that, the experiments he describes are wonderfully clever designs and reveal very interesting stuff about how the human mind works. These techniques are sometimes his own, or things he developed with Amos (whom he obviously misses a great deal), and sometimes from other researchers. I picked up from the book some of the excitement these researchers feel in their efforts to understand what makes people tick. The book is frightening in two ways. It forces me to confront the fragility of my own mind and the ease with which I can be diverted and deluded by quite simple events. It also makes me gloomy about the human race: how can we deal with big issues like war and poverty when our reasoning is so prone to error? But it is good to be challenged in that way and I heartily recommend the book to anyone interested in understanding themselves and their limitations.
B**G
Still a great read, but needs a new edition
For reasons I can’t remember, I didn’t read this pop psychology title at the time of its 2011 publication. (I was really surprised it was that recent: in my mind it was about 20 years old.) Had I done so, I would have loved it. I used to hoover up these books describing all the ways our brains mislead us (even though I found it difficult to remember the vast swathes of different effects and the many biases being described). And there’s still a lot to enjoy here. But… it’s impossible now to read a book like this, based as it is on a whole host of small and/or poorly sampled experiments, without being all too aware of the replication crisis. For example, Kahneman’s chapter on priming has been described as a 'train wreck', resting on a set of experiments that have almost all been discredited. Not only does this concern apply where you happen to know such details, it also prompts suspicion (surely a psychological effect that Kahneman would be able to write about) when I’m presented with findings where I don’t know how good the trial was. For example, we’re told of an experiment in which participants were presented with two lists of characteristics, three good, three bad. These were applied to two 'people', one with the good attributes listed first, the other leading with the bad ones. Apparently, because of the halo effect, when the good ones were closer to the name, people thought the person was better. But surely this would also apply if people assumed the common convention of putting more significant attributes earlier in a list? There's a reason that questionnaires shuffle the order of choices when respondents are asked to pick out a few: early ones are given priority. But that has nothing to do with closeness in layout to the thing being considered. You could put the name at the end, after the list of attributes, and still get the same effect. 
This is an entertaining read of its kind, though, as is often the case with a book looking at psychological biases, it covers too many, and after a while they get absorbed into a mental mush. But the replication crisis demands a 'start again from scratch, please' rating. I'd be interested to see a new edition taking this into account.
N**Y
What You See Is All There Is...
Excellent book. It has really left me stumped. This is by no means light reading. It is a wonderful and enlightening voyage through the human mind and consciousness, topics which never fail to intrigue me. Overall, Kahneman seems to have developed the System 1 and System 2 thought processes. I call System 1 originating creations, or Primaries. System 2 I refer to as Secondaries. Secondaries are re-examinations of originating creations. They often change the way we think and perceive, and although most of us tend to operate through a System 2 consciousness, it is far from foolproof, as you will fast discover when reading this book. At the time of writing this review I am still only half way through the book, so I intend to return with a more thorough review in time. However, the book invites the reader to assess their own thinking in clear and unambiguous language and is a superb introduction to self psychology or possibly spiritualistic examination. As Kahneman says, WYSIATI: What You See Is All There Is. To appreciate the essence of WYSIATI you really need to read the book, as it would take a book to explain it here. But the essence is that what we see tends to get reinterpreted by us with the assistance of outside forces (including System 2 thinking), such that we 'very often' ignore the reality with which we are faced and construct a new reality which better suits our preferences. The book covers immense areas of daily life, from assessments of news and people through to routine decision making and interactions. As an investor I originally bought this to help in assessing the financial markets. It is without doubt in my top 10 must-reads for anyone interested in investing, politics, human associations, theory, psychology, academia or indeed simply the daily life routine. I can honestly say it will change the way you view your own thinking and it will shed light on how other people think. 
UPDATE: well, I am now about 75% through this gem of a book and even more impressed than before. Kahneman has given me a lot to think about, and it all fits in neatly with my own studies on spiritualism and dimensions. I think he comes close to being able to follow this book up with the next stage, which is an analysis of the origins of System 1 and System 2 thinking. Although the book explores the two concepts of thought, it does not address 'where' these emanate from in sufficient detail. Is that because he does not want to preach, or perhaps he is still working on that theory? Anyway, more on that later. But for now I came across a fabulous piece of advice on page 264 that I have to share: the PreMortem. This is advocated by Gary Klein and in a nutshell it goes as follows. When an organisation has come to an important decision but has not formally committed itself, Klein proposes gathering for a brief session a group of individuals who are knowledgeable about that decision. The premise of the session is a short speech: "Imagine that we are a year into the future. We implemented the plan as it now exists. The outcome was a disaster. Please take 5-10 minutes to write a brief history of the disaster." Now apply that to any decision you make in your own life, be it starting a new business venture, buying a house, getting married, or indeed divorced; the insights that this wee little exercise can produce should improve productivity and decision making manyfold for almost anyone. As I stated before, this book should help any reader (who can understand the concepts) to stop and assess the thinker rather than the thought. I regret to say that I will be back again with another update soon...! Sorry to bore you.
T**E
What you see is not all there is
The important premise of this thought-provoking book is that the human mind is far less rational than we are led to believe. This is partly due to evolutionary pressures and the dichotomy that exists within the way we think. This conclusion, Kahneman proceeds to assert, represents a major challenge to well-accepted models of human behaviour such as the rational agent theory of decision making and the expected utility theory of economics. Kahneman argues that, far from being rational and consistent, the human mind is prone to biases and heuristics (short cuts) in the manner in which it thinks about and creates models of the world. He proceeds to create a lexicon of such faults in our thinking, with the worthy aim of creating a metalanguage which can be used to recognise, discuss and ultimately limit the effect of such biases and heuristics in the decision-making processes of individuals, institutions and societies. Thus the reader is introduced to many labels for concepts with which they are familiar, or of which they have had experience, without previously having the vocabulary to adequately think about and discuss them. Without a common language to label and discuss the mind's shortcomings, it is impossible to effectively identify inconsistencies in thinking and seek ways of overcoming them. Notions of 'substitution', 'cognitive ease', 'accessibility', 'associative memory', 'loss aversion', 'peak-end effect', 'duration neglect' and 'priming' may initially sound arcane but are brilliantly elicited from the reader's own experiences, compelling them to question their own beliefs and the internal consistency of their values. Crucially, Kahneman suggests ways of overcoming mental biases and heuristics, e.g. the use of the 'outside view' to overcome the 'planning fallacy', and self-imposed regression to the mean when predicting financial or performance success. 
Concepts are introduced and revisited throughout the book, being expertly cross-referenced to new ideas, thus consolidating and deepening the reader's understanding. Particularly effective in this respect is the use of reflective questions and humorous sound bites at the end of each chapter. Perhaps the only fault of this book is the lack of a glossary to which the reader can continually refer. Whilst the contribution of Prospect Theory to economics and decision making is self-evident, Kahneman interestingly raises many political and philosophical considerations towards the end of the book. Perhaps most notable is his notion of the 'experiencing self' versus the 'remembering self', and the implications for medical practice, for instance. The breakdown of the rational agent model also has major implications for politicians, notably those in favour of a free market approach to economic growth and welfare provision. This book has much to advise politicians and public servants on such diverse issues as how to encourage saving for old age, organ donation, and how to frame questions in referendums.