Should government officials be licensed or pass a test before they can receive intelligence?

The debate about whether to believe intelligence reporting on Russian hacking of the Democratic National Committee (DNC) highlights a chronic problem. In response to the wildly wrong assessments of Saddam's WMD in Iraq a decade ago, steps were taken to improve collection, analysis, and reporting, including better training of analysts.

But there is no training for politicians, bureaucrats, and other recipients of intelligence reports. Most consumers, especially new political leaders, have no idea how intelligence is gathered or how intelligence analysis is produced. The types of reports, confidence levels, systemic weaknesses, biases, and processes are learned only gradually and ad hoc by political leaders over time. To make a valid judgment about the quality of a report, a consumer must have some understanding of how these things come to be.

As someone who has been involved in such matters for decades, here are a few observations that illustrate pitfalls for untrained recipients of intelligence:

There is a tendency to think that a highly classified report that few people can see is more important and more believable than a less classified report.

A report based on stolen secrets is deemed more valuable than one based on information gained openly. (Whether your enemy knows you have the information is a separate characteristic; it matters, but it does not necessarily relate to accuracy.)

There is a tendency to highlight the unusual or explosive. If a source in a bar in Beirut says he overheard someone say they had seen no WMD in Iraq, that may get reported, but it would not get much attention in Washington. But if a source reported overhearing someone say his cousin had a WMD container in Iraq, that account would go up the food chain very quickly. The information in the first report is of no better quality than that in the second.

Confidence levels are now usually attached to assessments, but does the reader really understand what "low," "medium," or "high" means?

What is the character of the source of the information? There are many different kinds of sources, each with its own weaknesses and potential for misinterpretation.

Transcripts of conversations appear very compelling, but they can be very misleading. Are they translations? Do they capture sarcasm? Imagine reading a transcript in which the discussants mention the "nuclear option." Coming from Saddam, that would have created quite a stir; coming from a Senator on the Hill, it is quite another matter.

For intelligence analysts making an assessment, the downside of over-estimating a threat is often less than that of under-estimating it.

It is very difficult to revise a high-confidence assessment downward to a lower-confidence one. It is awkward to explain that new information may have undermined a previous judgment.

There are a host of biases inherent in the system, and myriad peculiarities in how intelligence is gathered, sorted, and reported. Readers of these products can weigh intelligence judgments only if they have some understanding of how those judgments are created.

One useful thing the much-maligned Office of the Director of National Intelligence could do is create a short course on intelligence products and processes for consumers, and mandate that only those who have taken the course receive intelligence material.

The government spends a lot of time and money checking the backgrounds of people to be “cleared” to receive intelligence. That makes sense to protect sensitive information. Doesn’t it make sense to expend some additional effort to make sure the recipients understand the intelligence they are getting?

You need to pass a knowledge test to drive a car or carry a concealed weapon. Untrained use of intelligence can be very dangerous too.