Two Systems - On Investment
- BedRock
- Sep 4, 2020
- 12 min read
Updated: Apr 8, 2024
Thinking, Fast and Slow was probably the best book I read in the first half of 2020. Here are some of the notes I took after reading it that are worth sharing.
Systematic errors are known as biases, and they recur predictably in particular circumstances.
As we navigate our lives, we normally allow ourselves to be guided by impressions and feelings, and the confidence we have in our intuitive beliefs and preferences is usually justified. But not always. We are often confident even when we are wrong.
We called this reliance on the ease of memory search the availability heuristic.
People tend to assess the relative importance of issues by the ease with which they are retrieved from memory, and this is largely determined by the extent of coverage in the media. Frequently mentioned topics populate the mind even as others slip away from awareness. In turn, what the media choose to report corresponds to their view of what is currently on the public’s mind. It is no accident that authoritarian regimes exert substantial pressure on independent media.
Two systems:
System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control.
System 2 allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration.
The often-used phrase “pay attention” is apt: you dispose of a limited budget of attention that you can allocate to activities, and if you try to go beyond your budget, you will fail. It is the mark of effortful activities that they interfere with each other, which is why it is difficult or impossible to conduct several at once. You can do several things at once, but only if they are easy and undemanding.
Everyone has some awareness of the limited capacity of attention. Intense focusing on a task can make people effectively blind, even to stimuli that normally attract attention.
In summary, most of what you (your System 2) think and do originates in your System 1, but System 2 takes over when things get difficult, and it normally has the last word.
System 1 has biases, however: systematic errors that it is prone to make in specified circumstances. One further limitation of System 1 is that it cannot be turned off.
Because System 1 operates automatically and cannot be turned off at will, errors of intuitive thought are often difficult to prevent. Biases cannot always be avoided, because System 2 may have no clue to the error. Even when cues to likely errors are available, error can be prevented only by the enhanced monitoring and effortful activity of System 2. As a way to live your life, however, continuous vigilance is not necessarily good, and it is certainly impractical. Constantly questioning our own thinking would be impossibly tedious, and System 2 is much too slow and inefficient to serve as a substitute for System 1 in making routine decisions.
Attention and Effort
The response to mental overload is selective and precise: System 2 protects the most important activity, so it receives the attention it needs; “spare capacity” is allocated second to other tasks.
As you become skilled in a task, its demand for energy diminishes (practice makes perfect!). Talent has a similar effect: highly intelligent individuals need less effort to solve the same problems.
In the economy of action, effort is a cost, and the acquisition of skill is driven by the balance of benefits and cost. Laziness is built deep into our nature.
Mental tendency: One of the significant discoveries of cognitive psychologists in recent decades is that switching from one task to another is effortful, especially under time pressure.
Time pressure is another driver of effort. The most effortful forms of slow thinking are those that require you to think fast.
We normally avoid mental overload by dividing our tasks into multiple easy steps, committing intermediate results to long-term memory or to paper rather than to an easily overloaded working memory. We cover long distances by taking our time and conduct our mental lives by the law of least effort. (That is why a computer can be of such help to us; a small sketch of the idea follows below.)
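As a loose illustration of that note (not anything from the book), here is a minimal Python sketch of the same strategy: break a long calculation into small, easy steps and commit each intermediate result to an external "notebook" instead of holding everything in working memory. The function name, parameters, and the compound-growth example are all hypothetical choices made for illustration.

```python
# Hypothetical illustration: do a long calculation in small steps and write
# each intermediate result down, instead of keeping everything "in your head".

def compound_growth(principal, rate, years, notebook=None):
    """Grow `principal` at `rate` for `years`, noting each year's value."""
    notebook = {} if notebook is None else notebook
    value = principal
    for year in range(1, years + 1):
        value *= 1 + rate                 # one small, easy step
        notebook[year] = round(value, 2)  # commit the intermediate result "to paper"
    return value, notebook

final_value, notes = compound_growth(10_000, 0.07, 5)
print(round(final_value, 2))  # 14025.52
print(notes)                  # {1: 10700.0, 2: 11449.0, 3: 12250.43, 4: 13107.96, 5: 14025.52}
```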
The lazy controller
Self-control and deliberate thought apparently draw on the same limited budget of effort.
This is how the law of least effort comes to be a law.
The evidence suggests that you would be more likely to select the tempting chocolate cake when your mind is loaded with digits. System 1 has more influence on behavior when System 2 is busy, and it has a sweet tooth. People who are cognitively busy are more likely to make selfish choices, use sexist language, and make superficial judgements in social situations.
An effort of will or self-control is tiring (that is why studying goes against our nature); if you have had to force yourself to do something, you are less willing or less able to exert self-control when the next challenge comes around. The phenomenon has been named ego depletion. In one experiment described in the book, the emotional effort in the first phase reduced participants' ability to withstand the pain of sustained muscle contraction, and ego-depleted people therefore succumbed more quickly to the urge to quit.
Activities that impose high demands on System 2 require self-control, and the exertion of self-control is depleting and unpleasant.
One of the main functions of System 2 is to monitor and control thoughts and actions “suggested” by System 1, allowing some to be expressed directly in behavior and suppressing or modifying others.
Intelligence is not only the ability to reason; it is also the ability to find relevant material in memory and to deploy attention when needed. The extent of deliberate checking and search is a characteristic of System 2, which varies among individuals.
The associative machine
David Hume reduced the principles of association to three: resemblance, contiguity in time and place, and causality.
In the current view of how associative memory works, a great deal happens at once. An idea that has been activated does not merely evoke one other idea. It activates many ideas, which in turn activate others. Furthermore, only a few of the activated ideas will register in consciousness; most of the work of associative thinking is silent, hidden from our conscious selves.
Priming phenomena arise in System 1, and you have no conscious access to them.
Cognitive ease
Cognitive strain is affected by both the current level of effort and the presence of unmet demands.

Remember that System 2 is lazy, and that mental effort is aversive. If possible, the recipients of a message want to stay away from anything that reminds them of effort, including a source with a complicated name. What psychologists believe is that all of us live much of our life guided by the impressions of System 1, and we often do not know the source of these impressions.
The mere exposure effect:
The effect of repetition on liking is a profoundly important biological fact, and it extends to all animals. To survive in a frequently dangerous world, an organism should react cautiously to a novel stimulus, with withdrawal and fear. Survival prospects are poor for an animal that is not suspicious of novelty. However, it is also adaptive for the initial caution to fade if the stimulus is actually safe. The mere exposure effect occurs because the repeated exposure of a stimulus is followed by nothing bad. Such a stimulus will eventually become a safety signal, and safety is good.
Ease, Mood, and Intuition
The essence of creativity: creativity is associative memory that works exceptionally well.
Mood evidently affects the operation of System 1: when we are uncomfortable and unhappy, we lose touch with our intuition. Good mood, intuition, creativity, gullibility, and increased reliance on System 1 form a cluster. At the other pole, sadness, vigilance, suspicion, an analytic approach, and increased effort also go together. A happy mood loosens the control of System 2 over performance: when in a good mood, people become more intuitive and more creative but also less vigilant and more prone to logical errors.
Here again, as in the mere exposure effect, the connection makes biological sense. A good mood is a signal that things are generally going well, the environment is safe, and it is all right to let one’s guard down. A bad mood indicates that things are not going very well, there may be a threat, and vigilance is required. Cognitive ease is both a cause and a consequence of a pleasant feeling.
Assessing Normality
The main function of System 1 is to maintain and update a model of your personal world, which represents what is normal in it. The model is constructed by associations that link ideas of circumstances, events, actions, and outcomes that co-occur with some regularity, either at the same time or within a relatively short interval.
Things appear normal because they recruit the original episode, retrieve it from memory, and are interpreted in conjunction with it.
A machine for jumping to conclusions
Jumping to conclusions is efficient if the conclusions are likely to be correct, the costs of an occasional mistake are acceptable, and the jump saves much time and effort. Jumping to conclusions is risky when the situation is unfamiliar, the stakes are high, and there is no time to collect more information. There are circumstances in which intuitive errors are probable, and these may be prevented by a deliberate intervention of System 2.
When uncertain, System 1 bets on an answer, and the bets are guided by experience. The rules of the betting are intelligent: recent events and the current context have the most weight in determining an interpretation. When no recent event comes to mind, more distant memories govern.
Conscious doubt is not in the repertoire of System 1; it requires maintaining incompatible interpretations in mind at the same time, which demands mental effort. Uncertainty and doubt are the domain of System 2.
Bias to believe and confirm
Unbelieving is an operation of System 2.
The moral is significant: when System 2 is otherwise engaged, we will believe almost anything. System 1 is gullible and biased to believe, System 2 is in charge of doubting and unbelieving, but System 2 is sometimes busy, and often lazy. Indeed, there is evidence that people are more likely to be influenced by empty persuasive messages, such as commercials, when they are tired and depleted.
The confirmation bias
The operations of associative memory contribute to general confirmation bias.
A deliberate search for confirming evidence, known as positive test strategy, is also how System 2 tests a hypothesis. And the confirmatory bias of System 1 favors uncritical acceptance of suggestions and exaggeration of the likelihood of extreme and improbable events.
Exaggerated emotional coherence (halo effect)
The tendency to like (or dislike) everything about a person – including things you have not observed – is known as the halo effect.
First impressions matter
Real evidence is missing, and the gap is filled by a guess that fits one’s emotional response.
In other situations, evidence accumulates gradually, and the interpretation is shaped by the emotion attached to the first impression.
The initial traits in the list change the very meaning of the traits that appear later. The sequence in which we observe characteristics of a person is often determined by chance. Sequence matters, however, because the halo effect increases the weight of first impressions, sometimes to the point that subsequent information is mostly wasted.
The reason for consistency
The consistency I had enjoyed earlier was spurious; it produced a feeling of cognitive ease, and my System 2 was happy to lazily accept the final grade. By allowing myself to be strongly influenced by the first question in evaluating subsequent ones, I spared myself the dissonance of finding the same student doing very well on some questions and badly on others. Grading the questions independently revealed that uncomfortable inconsistency.
What you see is all there is (WYSIATI)
There is a remarkable asymmetry between the ways our mind treats information that is currently available and information we do not have.
An essential design feature of the associative machine is that it represents only activated ideas. Information that is not retrieved (even unconsciously) from memory might as well not exist. System 1 excels at constructing the best possible story that incorporates ideas currently active, but it does not (cannot) allow for information it does not have.
The measure of success for System 1 is the coherence of the story it manages to create. The amount and quality of the data on which the story is based are largely irrelevant. When information is scarce, which is a common occurrence, System 1 operates as a machine for jumping to conclusions.
The combination of a coherence-seeking System 1 with a lazy System 2 implies that System 2 will endorse many intuitive beliefs, which closely reflect the impressions generated by System 1.
It is the consistency of the information that matters for a good story, not its completeness. Indeed, you will often find that knowing little makes it easier to fit everything you know into a coherent pattern.
WYSIATI facilitates the achievement of coherence and of the cognitive ease that causes us to accept a statement as true. It explains why we can think fast, and how we are able to make sense of partial information in a complex world.
Much of the time, the coherent story we put together is close enough to reality to support reasonable action. However, there is a long and diverse list of biases of judgment and choice that are caused by WYSIATI.
Overconfidence: As the WYSIATI rule implies, neither the quantity nor the quality of the evidence counts for much in subjective confidence. The confidence that individuals have in their beliefs depends mostly on the quality of the story. We often fail to allow for the possibility that evidence that should be critical to our judgment is missing: what we see is all there is. Furthermore, our associative system tends to settle on a coherent pattern of activation and suppresses doubt and ambiguity.
Other examples include framing effects, base-rate neglect, etc.
How judgements happen
System 2 receives questions or generates them: in either case it directs attention and searches memory to find the answer. System 1 operates differently. It continuously monitors what is going on outside and inside the mind, and continuously generates assessments of various aspects of the situation without specific intention and with little or no effort. These basic assessments play an important role in intuitive judgement, because they are easily substituted for more difficult questions; this is the essential idea of the heuristics and biases approach. An intention of System 2 to answer a specific question or evaluate a particular attribute of the situation automatically triggers other computations, including basic assessments.
System 1 has been shaped by evolution to provide a continuous assessment of the main problems that an organism must solve to survive: How are things going? Is there a threat or a major opportunity? Is everything normal? Should I approach or avoid? The questions are perhaps less urgent for a human in a city environment than for a gazelle on the savannah, but we have inherited the neural mechanisms that evolved to provide ongoing assessments of threat level, and they have not been turned off. Situations are constantly evaluated as good or bad, requiring escape or permitting approach. Good mood and cognitive ease are the human equivalents of assessments of safety and familiarity.
System 1 represents categories by a prototype or a set of typical exemplars; it deals well with averages but poorly with sums. The size of the category, the number of instances it contains, tends to be ignored in judgements of what I will call sum-like variables.
The mental shotgun
System 1 carries out many computations at any one time. Some of these are routine assessments that go on continuously. I call this excess computation the mental shotgun.
An intention to answer one question evoked another, which was not only superfluous but actually detrimental to the main task.
Substituting questions
If a satisfactory answer to a hard question is not found quickly, System 1 will find a related question that is easier and will answer it. I call the operation of answering one question in place of another substitution.
We asked ourselves how people manage to make judgments of probability without knowing precisely what probability is. We concluded that people must somehow simplify that impossible task, and we set out to find how they do it. Our answer was that when called upon to judge probability, people actually judge something else and believe they have judged probability. System 1 often makes this move when faced with difficult target questions, if the answer to a related and easier heuristic question comes readily to mind. But the heuristics that I discuss are not chosen; they are a consequence of the mental shotgun, the imprecise control we have over targeting our responses to questions.
The mental shotgun makes it easy to generate quick answers to difficult questions without imposing much hard work on a lazy System 2.
Something is still missing from this story: the answers need to be fitted to the original questions. Another capability of System 1, intensity matching, is available to solve that problem.
Of course, System 2 has the opportunity to reject this intuitive answer, or to modify it by incorporating other information. However, a lazy System 2 often follows the path of least effort and endorses a heuristic answer without much scrutiny of whether it is truly appropriate.
The affect heuristic
People let their likes and dislikes determine their beliefs about the world.
If you dislike any of these things, you probably believe that its risks are high and its benefits negligible.
The primacy of conclusions does not mean that your mind is completely closed and that your opinions are wholly immune to information and sensible reasoning. Your beliefs, and even your emotional attitude, may change (at least a little) when you learn that the risk of an activity you disliked is smaller than you thought.
In the context of attitudes, however, System 2 is more of an apologist for the emotions of System 1 than a critic of those emotions—an endorser rather than an enforcer. Its search for information and arguments is mostly constrained to information that is consistent with existing beliefs, not with an intention to examine them. An active, coherence-seeking System 1 suggests solutions to an undemanding System 2.
To be continued...