As the Cold War began to heat up in the early sixties, the CIA became increasingly interested in the potential of ‘the computer’ for its work. A 1962 paper entitled ‘Some Far-out Thoughts on Computers’ was described by its author, Orrin Clotworthy, as a “Jules Verne look” at the issue. Clotworthy foresaw the need for continual analysis of properly weighed and measured factors to answer, quickly and accurately, the sorts of questions the intelligence community faced.

He proposed that blending the behavioural sciences with correctly exploited information could lead to predictions of how people and nations would act, in much the same way that physics and chemistry tell us what molecules and bodies in motion will do. Clotworthy also had a vision of what all this could mean for operational intelligence:

As a final thought, how about a machine that would send, via closed-circuit television, visual and oral information needed immediately at high-level conferences or briefings? Let’s say that a group of senior officers are contemplating a covert action program for Afghanistan. Things go well until someone asks “Well, just how many schools are there in the country, and what is the literacy rate?” No one in the room knows. (Remember, this is an imaginary situation). So the junior member present dials a code number into a device at one end of the table. Thirty seconds later, on the screen overhead, a teletype printer begins to hammer out the required data. Before the meeting is over, the group has been given, through the same method, the names of countries that have airlines into Afghanistan, a biographical profile of the Soviet ambassador there, and the Pakistani order of battle along the Afghanistan frontier. Neat, no?

Neat, indeed. And spooky, given that such systems are now widely used. Less the electromechanical typewriter, of course. One does not simply teletype into Mordor. Anyway…

Fast-forward to 2013, and we have the CIA’s Chief Technology Officer talking of the “Grand Challenges” that Big Data poses to The Company’s activities:

The security environment in which Gus Hunt operates is obviously very different to the one Clotworthy faced. Perhaps more significantly, for analysts at least, so is the data environment. As Hunt points out (at 10:07), the social, mobile and cloudy nature of information has completely changed the flow of data across the planet. It used to be a simple stream, emanating from the few towards the many via a restricted number of media. Now it is a complex, many-to-many distribution, which is far harder to take advantage of.

For Hunt, this raises four major challenges. First, getting the right data and separating the signal from the noise. It does not matter how sophisticated your analytical tools are if the data feeding them is irrelevant, wrong or incompatible.
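
To make that first hurdle concrete, here is a minimal sketch (my own illustration, not anything from Hunt's talk) of screening incoming records for relevance and basic integrity before any analysis happens; the field names and topic list are invented for the example:

```python
# Purely illustrative: screen incoming records for relevance and basic
# integrity before they reach the analytics stage. Field names and the
# RELEVANT_TOPICS list are invented for this example.

RELEVANT_TOPICS = {"aviation", "education", "diplomacy"}
REQUIRED_FIELDS = {"source", "timestamp", "topic", "body"}

def is_signal(record: dict) -> bool:
    """Keep only records that are complete, well-formed and on-topic."""
    if not REQUIRED_FIELDS.issubset(record):    # incompatible or malformed
        return False
    if record["topic"] not in RELEVANT_TOPICS:  # irrelevant
        return False
    if not record["body"].strip():              # empty body is just noise
        return False
    return True

def separate_signal(records: list[dict]) -> list[dict]:
    """Return only the records worth passing on to the analysts."""
    return [r for r in records if is_signal(r)]
```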

Second, using data to empower individuals. Not everyone can be a top-level data scientist, so any system and process needs to get the power of big data and analytics “into the hands of the average user”.
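
One way to read that, sketched below purely as an assumption on my part, is to hide the analytical machinery behind a single, simple call, so a non-specialist can get an answer without knowing how it was computed; the summarise() helper is hypothetical:

```python
# Illustrative only: wrap the analytics in one call so a non-specialist
# can ask a question without knowing how the answer is computed.
# summarise() and the statistics it returns are hypothetical.

from statistics import mean, median

def summarise(values: list[float]) -> dict:
    """The 'average user' supplies numbers; the headline figures come back."""
    return {
        "count": len(values),
        "mean": round(mean(values), 2),
        "median": round(median(values), 2),
        "min": min(values),
        "max": max(values),
    }

print(summarise([3.1, 4.7, 2.2, 5.9, 4.4]))
```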

Then, related to the issue of relevance, comes timeliness. Data has a time value in the same way that money does: it is worth more now than it will be in the future. ‘Real time’ is really the only relevant time at which to get your analysis distributed.
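
If you take the money analogy literally (my extrapolation, not Hunt's formula), you can even discount data by its age, much as you would discount future cash flows; the 24-hour half-life here is an arbitrary example figure:

```python
# A toy reading of the money analogy (an assumption, not Hunt's formula):
# discount the value of a piece of data by its age, so older intelligence
# counts for less. The 24-hour half-life is an arbitrary example figure.

import math

HALF_LIFE_HOURS = 24.0

def time_value(base_value: float, age_hours: float) -> float:
    """Exponentially discount value with age, as you would future money."""
    decay = math.log(2) / HALF_LIFE_HOURS
    return base_value * math.exp(-decay * age_hours)

print(time_value(100.0, 0))    # fresh data: 100.0
print(time_value(100.0, 24))   # a day old: about 50
print(time_value(100.0, 72))   # three days old: about 12.5
```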

Finally, ensuring that the context – the personal context – is right, whether that means personalising the data and the analytical services applied to it, or flexing the computational power being employed overall (through, for example, elastic computing).
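
As a rough illustration of the elastic computing idea (again, my own sketch, with made-up capacity figures), the amount of compute can be flexed to follow demand rather than being fixed in advance:

```python
# A sketch of the elastic computing idea only: flex the number of workers
# with the size of the job queue, scaling out as demand grows and back in
# as it shrinks. The capacity figure and limits are made up.

JOBS_PER_WORKER = 100            # assumed throughput of a single worker
MIN_WORKERS, MAX_WORKERS = 1, 50

def workers_needed(queued_jobs: int) -> int:
    """Match computational power to the current workload."""
    needed = -(-queued_jobs // JOBS_PER_WORKER)   # ceiling division
    return max(MIN_WORKERS, min(MAX_WORKERS, needed))

for queue in (40, 750, 12_000):
    print(queue, "queued jobs ->", workers_needed(queue), "workers")
```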

The relevance of these challenges is seeping out well beyond the walls of intelligence agencies’ HQs. Businesses of all sizes face exactly the same obstacles. And ensuring that the right technologies, processes and human resources are in place not just to face the challenges of Big Data, but also to exploit its opportunities, could well make the difference between thriving and struggling in our rebooted global economy.