Austhink is a critical thinking research, training and consulting group specializing in complex reasoning and argumentation. 

Sound Analysis Lies in Reassessing Mindsets

A version of this essay appeared in the Australian Financial Review, 14 Feb 2004

Amid the welter of commentary about intelligence and weapons of mass destruction in recent months, a persistent question has been: are the intelligence agencies competent? Inquiries are being launched or demanded, left and right. Whether in Washington, London or Canberra, the key agencies are under scrutiny.

It is a good time to ask some searching questions. Unfortunately, the politically charged atmosphere is not especially conducive to the right questions being asked, or the right answers being found.

A particularly ill-considered criticism of the CIA, published on February 3 by New York Times columnist David Brooks, deserves singling out because it exhibits a mentality to which all too many people are prone and which, if given its head, could do a great deal more harm than good.

In his piece headed "The CIA: Method and Madness", he claimed the CIA's analysis was not politicised, but deplored its (alleged) obsession with scientific and rational analysis as the source of its errors, and called for a stiff dose of intuition and imagination at CIA headquarters.

His prescription is to bring in people who have a decided aura of the unscientific: "When it comes to understanding the world's thugs and menaces, faster than I'd trust a conference-load of game theorists or risk-assessment officers, I'd trust politicians, who, whatever their faults, have finely tuned antennae for the flow of events. I'd trust Mafia bosses, studio heads and anybody who has read a Dostoyevsky novel during the past five years."

Well, perhaps, if you want an intelligence service run like Tammany Hall, the Mafia or Hollywood. But if you want to avoid error, those may not be especially promising paths to go down.

Brooks went so far as to identify, on the CIA's own website, an e-book, Psychology of Intelligence Analysis, by Richards Heuer Jr, which he describes as "scientism in full bloom" and representative of the methodology that is the problem at Langley. He cannot have read it or thought about it very clearly if he believes this.

The problem at the CIA, I suggest, has not been slavish adherence to some dogmatic methodology, but rather the failure to inculcate in the agency the principles of critical thinking outlined in this very book.

In an illuminating foreword to the book, written in 1999, Douglas MacEachin, former deputy director (intelligence) at the CIA, asks: "How many times have we encountered situations in which completely plausible premises, based on solid expertise, have been used to construct a logically valid forecast - with virtually unanimous agreement - that turned out to be dead wrong?"

He doesn't answer his own question, but he's plainly aware of a litany of errors. Where he differs from Brooks is in his diagnosis of the problem. Here's what he wrote: "Too often, newly acquired information is evaluated and processed through the existing analytic model, rather than being used to reassess the premises of the model itself."

The kind of intuition Brooks calls for might introduce people who think outside the "existing analytic model", but they would have implicit analytic models of their own. If they deliver accurate forecasts, that's just lucky, because there is no clear basis for assuming that, as circumstances change, the invisible and unexamined premises of those models will continue to yield accurate forecasts. So, we'd quickly be back where we started.

It is making explicit, and critically examining, the premises themselves, the assumptions underlying mental models, that is the key to really sound analysis. Brooks implies that the CIA does too much of this and that this is what causes its errors. He could not be more mistaken.

MacEachin's testimony on this point is trenchant. He notes, first, that, far from being excessively wedded to critical analysis, "many CIA officers tend to react sceptically to treatises on analytic epistemology" - because it tends to offer models as generic answers to problems, when what is needed is fluidity of thinking in a policy-oriented world.

"But," he goes on, "that is not the main problem Heuer is addressing. What Heuer examines so clearly and effectively is how the human thought process builds its own models through which we process information. This is not a phenomenon unique to intelligence ... it is part of the natural functioning of the human cognitive process, and it has been demonstrated across a broad range of fields from medicine to stockmarket analysis." (emphasis added). In other words, it will be as true for politicians, Mafia bosses and studio heads as for country experts at the CIA or, say, Merrill Lynch.

"The commonly prescribed remedy for shortcomings in intelligence analysis and estimates - most vociferously after intelligence 'failures' - is a major increase in expertise," MacEachin went on to remark. But "the data show that expertise itself is no protection from the common analytic pitfalls that are endemic to the human thought process. A review of notorious intelligence failures demonstrates that the analytic traps caught the experts as much as anybody. Indeed, the data show that when experts fall victim to these traps, the effects can be aggravated by the confidence that attaches to expertise - both in their own view and the perception of others." (emphasis added).

These observations by MacEachin are spot on. Is the answer, then, to bring in people distinguished by their lack of expertise? Surely not. Rather, it is to bring in people whose expertise consists in re-examining mindsets, mental models, premises and assumptions; winkling them out from where they often hide and coaching the content experts themselves in seeing their reasoning and their world views in a new light. Bringing in the Mafia, I suggest, would be a very hit-and-miss affair - and the hits might not be the ones you'd really want.

Being able to see and critically examine one's own reasoning processes is more difficult than is intuitively evident. Strong experimental evidence suggests that, in fact, experts in various domains have a poor grasp of how they actually use evidence in making judgements.

They typically tend, in Heuer's words, to "overestimate the importance of factors that have only a minor impact on their judgement and underestimate the extent to which their decisions are based on a few major variables".

Those variables are often assumptions so deeply embedded in the expert's mental model as to be invisible to the critical eye. They come in below the radar, as it were. What happens, therefore, is that evidence is sought to confirm, or test, variables which turn out not to be crucial.

Moreover, an obsession with getting "all the facts" can compound this problem, by obscuring the underlying reasoning process and lending a false confidence to errant judgements grounded in unexamined premises.

Both at the CIA and here in Australia, if we want better intelligence analysis, we would do well to invest heavily in creating a cadre of analysts especially skilled in what might be dubbed "Heueristics", rather than handing over the house to inspired amateurs with no disciplined training in critically examining their own mental models or their own self-confidence. This is no easy task and Brooks is quite mistaken if he believes that it was long ago accomplished at the CIA.

It is, however, the task that 21st century intelligence organisations must tackle - for their own good and for ours.