Lessons from the CIA: Best ideas from Richards J. Heuer
The following is a compendium of 44 lessons I learnt from CIA veteran Richards J. Heuer's book "Psychology of Intelligence Analysis".
Introduction:
1. There is a common human tendency by which newly acquired information is analyzed, evaluated and processed through the lens of the existing analytical model, instead of being used to reassess the premises of the model itself.
2. There is no such thing as too much information or expertise. However, information and expertise are necessary but not sufficient conditions for intelligence analysis.
3. Think not only about the information, but also about the lenses we use to assess that information. Most errors come from the latter.
4. One key to good analysis is the use of the Analysis of Competing Hypotheses (ACH). It sets alternative hypotheses in competition with each other to see which ones survive.
5. Analysts often neglect the possibility of deception because they do not see evidence of it. However, if deception is well executed, they should expect no evidence of it. Absence of evidence is not evidence of absence.
Thinking about thinking:
6. Herbert Simon introduced the concept of bounded rationality: we are rational, but only within our mental models. People construct their own version of reality based on:
- Content: Information from the senses
- Channel: The mental models (simplifications of reality) they use to analyze that information
- Result: Their own understanding of reality
Perception: Why can't we see what is there to be seen?
7. Perception is an active rather than passive process: we construct reality rather than record it. It is a process of inference. We tend to perceive what we expect to perceive!
8. Initial exposure to a blurred stimulus interferes with accurate perception later on: the greater the ambiguity of that initial stimulus, the greater the impact of preconceptions on the final evaluation.
Memory: How do we remember what we know?
9. There are three memory processes that compose the overall memory system: (i) Sensory Information Storage, (ii) Short-Term Memory, (iii) Long-Term Memory. While ST memory is physiologically limited, LT memory has no limit on the amount of information it can handle. However, there is a bottleneck: the process of storing information into it and retrieving information from it.
10. Memories are stored as patterns of connections between neurons. A schema is any pattern of relationships among data stored in memory. This is key, as the number of bits of information at your disposal depends on your schemas. We need schemas and mental models (simplifications of reality) to assess information and improve our memory. We need categories to remember things. If information does not fit into what we know or think we know, we will have greater difficulty processing it. However, in wicked domains, schemata can be dangerous, as having an open mind becomes increasingly important…
Strategies for analytical judgement:
11. Judgement is arriving at conclusions or decisions on the basis of indications and probabilities when the facts are not clearly established.
12. Three strategies for generating and evaluating hypotheses:
- Situational logic: What are the actor's goals, and which means do they think will be the most efficient?
- Application of theory: Theory is just a generalization of practice. Theory economizes thought.
- Comparison: We need a historical data set to select the best analogy, but also depth of knowledge to understand the differences…
13. Our energy is finite but the possibilities are infinite, so many times we will select the first alternative that looks good enough. This is called "satisficing". It entails the problem of selective perception: we only see what we are looking for, and we overlook what is not included in our search strategy.
14. We should always try to adopt the conceptual strategy of seeking to refute, rather than confirm, our hypotheses. ALWAYS LOOK FOR DISCONFIRMING EVIDENCE, NOT FOR CONFIRMING EVIDENCE. Many apply this principle to statistical analysis but fail to do so in their more conceptual daily life.
Do we really need more information?
15. The illusion of knowledge: Once you have enough information, additional information does not improve accuracy; it only increases confidence (many times leading to overconfidence).
16. Mosaic Theory of Intelligence: Small pieces of information are put together like a puzzle so that analysts can perceive a clear picture of reality. The problem is that analysts typically form the picture first and then select the pieces to fit! Remember that it is easier to improve analysis than to improve data collection.
Keeping an open mind:
17. When writing and stuck on a paragraph, say out loud what you are trying to communicate; speaking and writing activate different regions of the brain.
18. Intelligence analysis is often limited by unconscious, self-imposed constraints or cages of the mind.
19. Some techniques for improving perception:
- Thinking backwards: What needs to happen 3/6/12 months beforehand for this outcome to occur? This is about going from whether something may happen to how it may happen (or what I need to believe for it to happen).
- Crystal ball: Imagine someone with a crystal ball tells you an outcome you do not expect or want. What is the most likely reason? Which assumption was wrong?
- Role playing or devil's advocate: Only works if one person is "living the role". It helps to see things in a new light.
20. There is a difference between strategic assumptions and tactical indicators. Thinking that Japan was not going to attack the USA in 1941 was a strategic assumption. A report showing that Japan was moving part of its fleet was a tactical indicator. We should challenge our strategic assumptions more in the face of tactical indicators.
21. Four principles for creative thinking:
A) Deferred judgement: Separate idea generation from critical thinking - idea generation should be unconstrained!!!
B) Quantity leads to quality: Increase the number of ideas
C) No self-imposed constraints
D) Cross-fertilization of ideas: Always try to combine ideas with each other
Analysis of Competing Hypotheses:
22. The Analysis of Competing Hypotheses requires an analyst to explicitly identify all the reasonable hypotheses and have them compete against each other. The analyst has to compare the hypotheses at the same time; the problem with not doing so is that analysts will often obtain evidence in favor of their hypothesis, even though that evidence is also consistent with alternative hypotheses! You don't get closer to the truth, even if in your mind you think you are… If an item of evidence is consistent with all the hypotheses, it has no diagnostic value.
23. Absence of evidence is not evidence of absence. It is necessary to distinguish hypotheses that have been disproven from those that are simply unproven! By its nature, deception is a hypothesis that tends to remain unproven, never disproven! As said before, if deception is well executed, we should expect no evidence of it.
Whenever someone says "there is no evidence that…", the analyst should ask: "If the hypothesis is true, can I realistically expect to see evidence of it?"
24. What is not reported is as important as what is reported. (Something I learnt years ago dealing with all kinds of personalities in the corporate world is that many times "lack of information is the best information".)
Some practical (sarcastic) examples:
- If you ask for feedback on somebody and they do not say the person in question is "highly competent, a great person, etc.", most of the time you can assume the person in question is actually bad!
- How can you know if someone works at Goldman / McKinsey? If they do not mention it when they introduce themselves, it is because they do not work there.
- If someone has "Investment Banking" or "Investment Banking Analyst" in their bio on LinkedIn without mentioning the firm, chances are they do not work for a top-tier firm (if there is such a thing, at least according to conventional standards)
25. Steps for intelligence analysis: (i) Consider all possible hypotheses. (ii) Look for pieces of information with diagnostic value. (iii) Seek evidence to refute hypotheses. A minimal sketch of this process is below.
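For those who like to see the mechanics, here is a toy ACH matrix in Python (the hypotheses, evidence items and scores are invented for illustration): evidence that is consistent with every hypothesis is flagged as non-diagnostic, and hypotheses are ranked by how much evidence contradicts them, not by how much supports them.

```python
# Minimal sketch of an Analysis of Competing Hypotheses (ACH) matrix.
# Hypotheses, evidence items and scores are invented for illustration.
# "C" = consistent with the hypothesis, "I" = inconsistent, "N" = neutral.

HYPOTHESES = ["H1: routine exercise", "H2: deception", "H3: imminent attack"]

# Each item of evidence is scored against every hypothesis, in HYPOTHESES order.
MATRIX = {
    "troop movements near the border": ["C", "C", "C"],  # fits all -> no diagnostic value
    "no logistics build-up observed":  ["C", "C", "I"],
    "state media unusually quiet":     ["N", "C", "I"],
}

def rank_hypotheses(matrix, hypotheses):
    """Rank hypotheses by fewest inconsistencies: we seek to refute, not confirm."""
    inconsistencies = [0] * len(hypotheses)
    for evidence, scores in matrix.items():
        if all(score == "C" for score in scores):
            print(f"Non-diagnostic (consistent with all hypotheses): {evidence}")
            continue
        for i, score in enumerate(scores):
            if score == "I":
                inconsistencies[i] += 1
    return sorted(zip(hypotheses, inconsistencies), key=lambda pair: pair[1])

for hypothesis, count in rank_hypotheses(MATRIX, HYPOTHESES):
    print(f"{count} inconsistencies -> {hypothesis}")
```

The surviving hypotheses are the ones with the least contradicting evidence, mirroring step (iii) above.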
25. "Doubt is not a pleasant state, but certainty is a ridiculous one" - Voltaire
Biases in Evaluation of Evidence:
27. Cognitive biases are mental errors caused by our simplified information-processing system.
28. The vividness criterion: The impact of information on the human mind is only imperfectly related to its true value as evidence. Direct interaction with the country/person/phenomenon you are trying to analyze can be insightful but also deceptive. Some CIA agents never interacted with citizens of the country they were analyzing, because doing so could distort the perceived importance of the information received. Anecdotal evidence may be insightful but also dangerous!
29. The "man who" syndrome: What happens when N=1 anecdotal evidence distorts reality!
30. Distorted evidence can persist and impact perception even when it has been refuted: impressions formed on the basis of deceptive evidence will still affect current perception even once that evidence has been discredited. Once information rings the bell, the bell cannot be un-rung.
Biases in Perception of Cause and Effect:
31. People overestimate the extent to which countries pursue coherent, coordinated plans, and thus they overestimate their own ability to predict their behavior. My guess is that the same happens many times in corporate strategy, or when people think in terms of MBA-type game theory for their daily-life decisions.
32. Be careful with historians, as many times they use their own imagination to construct a coherent story out of fragmented data. One can find a pattern / causal explanation in any random data set.
33. A fundamental error made by intelligence analysts is to over-attribute relevance to internal factors (people's intrinsic motivations) while underestimating the effect of external factors and the situation (randomness). Don't be fooled by randomness.
34. History is, by and large, a record of what people did, not what they failed to do… and this brings limitations.
Biases in Estimating Probabilities:
35. The availability rule: When thinking about the probability of an event, we do not consider the full data set, but only the data we most easily remember. And the ease with which things come to mind is affected by many factors unrelated to actual probabilities.
36. When making a quick, gut judgement, we are more likely to fall into the availability bias. The mere act of constructing a detailed scenario makes it more readily available, thus biasing the perceived probability.
37. Anchoring: Even when an analyst has made an estimate before, that previous analysis works as an anchor for further analysis. Awareness of the anchoring bias alone is not enough. One way to try to overcome it is to start the problem from scratch.
38. There is a problem with intelligence reports due to the difference between objective and subjective probability: if the people who read intelligence reports replaced subjective probability statements (e.g., somewhat likely, very unlikely, almost impossible, very probable, etc.) with their understanding of what the author intended to say, there would be massive differences between individuals. We should standardize expressions of uncertainty!
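Heuer discusses Sherman Kent's proposal to do exactly that: attach agreed numeric ranges to each verbal expression. A toy standard could be as simple as a lookup table; the ranges below only loosely approximate Kent's scale and are my own illustrative choice.

```python
# A toy standard for expressions of uncertainty, loosely following Sherman
# Kent's "words of estimative probability". The exact cut-offs below are
# approximate and chosen for illustration only.

ESTIMATIVE_PROBABILITY = {
    "almost certain":       (0.87, 0.99),
    "probable":             (0.63, 0.87),
    "chances about even":   (0.40, 0.60),
    "probably not":         (0.20, 0.40),
    "almost certainly not": (0.02, 0.13),
}

def interpret(expression: str) -> str:
    """Translate a verbal probability statement into an agreed numeric range."""
    low, high = ESTIMATIVE_PROBABILITY[expression]
    return f'"{expression}" means a probability between {low:.0%} and {high:.0%}'

print(interpret("probable"))  # "probable" means a probability between 63% and 87%
```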
39. Base rate fallacy: The base rate fallacy is the phenomenon by which we tend to ignore statistical information in favor of case-specific information. Heuer offers a good, detailed example from the Vietnam War; a simplified version of it is sketched below:
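The figures below follow the spirit of Heuer's fighter-plane example, but treat the exact numbers as illustrative: a pilot identifies an attacking fighter as Cambodian with 80% reliability, yet only 15% of fighters in the area are Cambodian.

```python
# Base rate fallacy, illustrated with a Bayes computation in the spirit of
# Heuer's Vietnam fighter example (exact figures are illustrative):
# 85% of fighters in the area are Vietnamese and 15% Cambodian (base rates);
# a pilot identifies the attacker as Cambodian, and such identifications
# are assumed correct 80% of the time.

p_cambodian = 0.15       # base rate: P(Cambodian)
p_vietnamese = 0.85      # base rate: P(Vietnamese)
p_id_c_given_c = 0.80    # P(identified as Cambodian | actually Cambodian)
p_id_c_given_v = 0.20    # P(identified as Cambodian | actually Vietnamese)

# Bayes' theorem: P(Cambodian | identified as Cambodian)
posterior = (p_id_c_given_c * p_cambodian) / (
    p_id_c_given_c * p_cambodian + p_id_c_given_v * p_vietnamese
)
print(f"P(actually Cambodian | identified as Cambodian) = {posterior:.2f}")  # ~0.41
# Despite the 80%-reliable identification, the attacker is still more likely
# Vietnamese: the base rate dominates the case-specific evidence.
```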
Hindsight Biases:
40. Three ways hindsight bias works: (i) Analysts overestimate the accuracy of their past judgements. (ii) Intelligence consumers underestimate how much they learned from intelligence reports. (iii) Overseers claim, after the fact, that results were much more foreseeable than they really were.
41. The problem with cognitive biases is that, just like optical illusions, they remain compelling even after we become aware of them.
42. It is much easier after the fact to sort the relevant from the irrelevant. After the event, of course, a signal is always crystal clear.
Improving intelligence analysis:
43. If you find yourself thinking you already know the answer, ask yourself what would cause you to change your mind; then look for that information.
44. Alternative mindsets: three questions
(i) What implicit assumptions has the analyst made that are not discussed in the draft?
(ii) What alternative hypotheses have been rejected? For what reasons?
(iii) What could make the analyst change his mind?