The CIA stands by good old gumshoe spy tech. Yesterday, @CIA tweeted the agency’s affection for “Moleskine notebooks & mechanical pencils.” But the agency is also rightly proud of its growing capacity to anticipate future events with ever greater precision. It’s like sentiment analysis on steroids, and officials have been trumpeting their progress over the last several years at this week’s Fedstival (get it?) here in Washington, D.C.
INSTRUMENTATION OF THE GLOBE
That’s what CIA’s Deputy Director for Digital Innovation Andrew Hallman calls it, “instrumentation of the globe.” And he’s referring, in large part, to the incomprehensible amount of open source data out there, ready for analysis. If NGA is learning to effectively predict bad actors inside the agency using sentiment analysis on relatively minute amounts of data, imagine what the CIA is accomplishing with massive computing power sucking up big, big data, algorithming it to death, and spitting it back out.
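To make “sentiment analysis” concrete: at its crudest, it’s just scoring text against lists of loaded words. Here’s a toy sketch in Python; the lexicon, the sample text, and the `sentiment_score` function are all invented for illustration and bear no resemblance to what any agency actually runs.

```python
# Toy lexicon-based sentiment scorer: a radically simplified sketch of
# open-source text analysis. Lexicon and sample text are invented.
POSITIVE = {"calm", "stable", "agreement", "growth"}
NEGATIVE = {"unrest", "protest", "shortage", "violence"}

def sentiment_score(text: str) -> int:
    """Return (# positive hits) - (# negative hits) for the text."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

print(sentiment_score("reports of unrest and fuel shortage amid protest"))  # → -3
```

Real systems replace the hand-built lexicon with learned models, but the basic move is the same: turn raw text into a number you can track over time.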
I suspect it’s a lot more graceful than that. But that’s essentially what the CIA’s newest office, the Directorate of Digital Innovation, is doing. Nextgov’s Frank Konkel reports, “The agency . . . has significantly improved its ‘anticipatory intelligence,’ using a mesh of sophisticated algorithms and analytics against complex systems to better predict the flow of everything from illicit cash to extremists around the globe.”
Deep learning is a function of artificial intelligence. It’s about machines interpreting data to anticipate outcomes and recommend—or give—responses. The CIA has been pursuing this technology for more than two decades that we know of. Today, deep learning is about computers surveying large volumes of data and then, by identifying otherwise imperceptible trends that have preceded particular events like revolutions, civil wars, or coups, making very educated guesses about what’s coming. It’s like those old Zoltar fortune-telling machines in beachfront arcades that I used to trust. Not really.
Sentiment analysis is about the same thing, but on a relatively tiny scale, and without the profound outcomes the CIA seeks. Anticipatory intelligence, as Hallman puts it, is about the agency taking “what we know from social sciences on the development of instability, coups and financial instability, and take what we know from the past six or seven decades” and combining that with what’s happening now in order to anticipate future events. According to Konkel, Hallman reports that the agency has “been able to improve our forecast to the point of being able to anticipate the development of social unrest and societal instability . . . as near as three to five days out.” Like Zoltar did, only more slowly.
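The shape of that forecasting idea—baseline from history, signal from the present—can be sketched in a few lines. This is a hypothetical illustration only: the incident counts, the smoothing method, and the thresholds are all invented, and the real systems Hallman describes are vastly more sophisticated.

```python
# Hypothetical "anticipatory" warning: flag the first day an exponentially
# weighted average of daily incident counts exceeds a historical baseline
# by some margin. All numbers here are invented for illustration.
def ewma(values, alpha=0.5):
    """Exponentially weighted moving average of a series."""
    avg = values[0]
    out = [avg]
    for v in values[1:]:
        avg = alpha * v + (1 - alpha) * avg
        out.append(avg)
    return out

def first_warning(counts, baseline, margin=2.0):
    """Index of the first day whose smoothed count exceeds baseline + margin,
    i.e. the earliest heads-up this toy model can give, or None."""
    for day, smoothed in enumerate(ewma(counts)):
        if smoothed > baseline + margin:
            return day
    return None

daily_incidents = [1, 2, 1, 3, 6, 9, 14]  # invented daily counts
print(first_warning(daily_incidents, baseline=2.0))  # → 4
```

The point of the toy: knowing the historical baseline is what lets the rising signal register as anomalous a few days before the peak, which is the whole premise of a three-to-five-day heads-up.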
Back on June 1, 2002, President Bush told the West Point class of 2002, “If we wait for threats to fully materialize, we will have waited too long.” That means, more explicitly than implicitly, reacting before there’s a stimulus, reacting when we can only anticipate what may develop. The 2002 principle depends on 2016’s principle of anticipatory intelligence. Hallman imagines an ability to give decision-makers (deciders) “a clearer picture of events unfolding—or about to unfold . . . .”
That didn’t go so well in 2003. Let’s hope Hallman is appropriately skeptical. Even on our instrumented globe, patience is still a virtue.