The intelligence community doesn’t get enough credit for its contribution to the information age. When government and industry were still only tepidly considering the weird and alien concept of “computers,” the IC was charging forward, having immediately recognized the utility of processing power and its possible applications. Today, the spy world continues pushing the limits of what computers can do. Here are a few famous supercomputers used by the intelligence community.



Major General Ralph Canine, the founding director of the National Security Agency, wasn’t satisfied with his new agency’s progress. Enemy encryption was improving, and data collected by the agency was piling up. The technology required to keep pace with the growing needs of signals intelligence (SIGINT) and electronic intelligence (ELINT) was years away, at best. His engineers came to him with a proposal for a new computer that would theoretically deliver a hundred-fold increase in processing speed over the top-tier computers on the market. General Canine wanted a long-term solution, and he wanted audacious plans to make it happen. “Dammit,” he said, “I want you fellows to get a jump on those guys [computer companies]! Build me a thousand-megacycle machine! I’ll get the money!”

Because World War II-era generals practically had power over the laws of physics and the limits of human potential, Canine got what he wanted. President Eisenhower backed Project Lightning, a five-year plan to design and manufacture a thousand-megacycle machine, dubbed Harvest, for $25 million. (Adjusted for inflation, it would cost $200 million today.)

The model for Lightning was the Manhattan Project. As James Bamford reported, “Contractors on the project, believed to be the largest government-supported computer research program in history, included Sperry Rand, RCA, IBM, Philco, General Electric, MIT, University of Kansas, and Ohio State.” Each of the eight contractors focused on a different area of research. Together they produced a computer that remained in operation until 1976 and advanced the entire industry by decades. Their research is still paying dividends—Lightning research into practical applications for the Josephson junction would apply half a century later to the development of quantum computers.

In a bit of trivia: because the engineers developing the computer couldn’t use actual material collected by the NSA for testing, they ran copies of Time magazine through the system. It successfully abstracted the articles.



On Jeopardy!, an IBM supercomputer called Watson famously bested its human competitors. The Central Intelligence Agency uses a copy of Watson for its mass data analytics (which is geek-speak for “connecting the dots”). Details of how, exactly, the computer is employed are scarce. Our friends at National Security Counselors are involved in a Freedom of Information Act lawsuit to compel the CIA to be more forthcoming.

The Federal Bureau of Investigation has an interesting solution to the computing problem. Where Watson is a single, giant number cruncher, the FBI uses a “grid” of thousands of regular desktop computers to find criminals and terrorists. Whenever a Bureau computer isn’t in use, a program kicks on that pulls a chunk of data from a central server, crunches it, and returns the processed data for placement into the larger puzzle. Multiply that across thousands of idle machines, and you’ve got a supercomputer essentially for free. (This type of distributed computing is famously used by the SETI@home project to help find space aliens.)
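The pull-a-chunk, crunch-it, send-it-back loop at the heart of this kind of grid is simple enough to sketch. The snippet below is purely illustrative — the work queue, the hashing "analysis," and the function names are all invented for the example, not anything from the Bureau's actual system:

```python
# Illustrative sketch of a volunteer-style grid worker (hypothetical;
# the FBI's real system is not public). Each idle desktop repeatedly
# pulls a work unit, processes it, and returns the result.

import hashlib

def fetch_chunk(work_queue):
    """Stand-in for pulling the next work unit from a central server."""
    return work_queue.pop() if work_queue else None

def process(chunk):
    """Stand-in for the real analysis; here we just hash the chunk."""
    return hashlib.sha256(chunk.encode()).hexdigest()

def worker_loop(work_queue, results):
    """Run until the central queue is drained, like an idle desktop would."""
    while (chunk := fetch_chunk(work_queue)) is not None:
        results[chunk] = process(chunk)

# Simulate a central queue of work units and one idle desktop:
queue = ["record-1", "record-2", "record-3"]
results = {}
worker_loop(queue, results)
print(len(results))  # prints 3 -- every chunk processed exactly once
```

In a real grid, `fetch_chunk` and the result hand-off would be network calls, and the server would reassemble the returned pieces into the larger picture; the scaling comes from running this same loop on every idle machine at once.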



The most powerful computer of them all is located at a $2 billion complex in the middle of the Utah desert. The exact details of what can be found at the NSA’s Intelligence Community Comprehensive National Cybersecurity Initiative Data Center are hazy, though whatever is in there is really big. The facility consumes a staggering 65 megawatts of electricity—about half as much as the Large Hadron Collider—and since the NSA isn’t in the particle acceleration business, that probably means raw computing power. Reportedly, the data center houses a supercomputer codenamed Vesuvius, which might well be a quantum computer. Such a machine would be able to execute 100 undecillion (that’s a one with 38 zeroes behind it) calculations at once, which means it’s almost powerful enough to run the video game Crysis at maximum detail settings. It also means it could theoretically brute force PGP, a data encryption system with no known practical vulnerabilities. (Even with the most theoretically powerful conventional supercomputer, it would take roughly 10 trillion years to break PGP. For sake of comparison, the universe itself is only about 13.8 billion years old.)
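That 10-trillion-year figure holds up to back-of-the-envelope arithmetic. Assuming a 128-bit keyspace (the strength class of the symmetric ciphers PGP typically uses) and a conventional machine testing an optimistic 10^18 keys per second — both figures are my assumptions for illustration, not from the original reporting:

```python
# Back-of-the-envelope brute-force estimate. Assumptions: a 128-bit
# keyspace and a conventional machine testing 10**18 keys per second
# (roughly exascale, the top end of classical supercomputing).

keyspace = 2 ** 128                  # ~3.4e38 possible keys
keys_per_second = 10 ** 18           # optimistic exascale search rate
seconds_per_year = 60 * 60 * 24 * 365

years = keyspace / keys_per_second / seconds_per_year
print(f"{years:.1e} years")  # on the order of 1e13 -- ten trillion years
```

Exhausting the whole keyspace at that rate lands right around 10^13 years, which is where the "10 trillion years versus a 13.8-billion-year-old universe" comparison comes from.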



During World War II, the Army commissioned the first general-purpose electronic computer. Described to the press as a “giant brain” (how else would you describe a computer to a world that had never before seen one?), ENIAC cost $6 million in today’s dollars, weighed 30 tons, and took up 1,800 square feet, which is about the size of a house. Anecdotally, it used so much electricity that each time it was switched on, the lights in Philadelphia dimmed.

The system was developed to calculate artillery firing tables for the Ballistic Research Laboratory, but when scientists from the Manhattan Project found out about it, they co-opted the system to run calculations for the Bomb.

Today, panels from ENIAC can be seen at the Smithsonian.

D.B. Grady is the pseudonym of author David Brown. He is co-author of The Command: Deep Inside the President’s Secret Army (Wiley, 2012) and Deep State: Inside the Government Secrecy Industry (Wiley, 2013). He can be found on Twitter at @dbgrady.
