Cleared contractors are found in massive, hyper-modern companies situated in manicured industrial parks. They’re also found in small businesses that have been operating since World War II. No matter where they are or how large their operations, they will be affected by Artificial Intelligence (AI). What began as an almost magical way to collate data has become a means of analyzing, and even making predictions from, the collected information. What began as a way to streamline tedious employee work has, in some companies, become a robot that does the work itself. All of this comes at a price. For the secret, classified world, that price can be monumental.
I noticed one colleague who cherished his corner office. There, however, he identified one of his own weaknesses: he knew that if he faced the window, he would tend to daydream. So he turned his workstation around to face away from the window, the better to concentrate. I pointed out that anyone watching from outside could now observe his classified screen. “How?” he demanded. “We’re three floors up!” I handed him a company brochure showing some of the capabilities of today’s CCTV cameras. “And those are the good guys watching!” I commented. “What about those who secretly observe what you are writing and reading?” He turned his computer back around so the screen no longer faced the window. I hope he doesn’t daydream too much.
A simple introduction to how cameras, computers, and tireless eyes can spy on us is to ask yourself: How often am I recorded during the day? Who sees me? Who hears me? What can my company legally do to watch or monitor my every move, whom I talk to, and when? Remember all those old movies of masses of industrial workers clocking in and out at the factory? You do the same today whenever you swipe into a cleared area. Who authorizes the use of the data collected, and for what? How much of your email can your boss read? What can he do to monitor what you do during the day? What, if anything, of your work life is private? If you hold a security clearance, the expectation should be that none of your workday activities are outside the oversight and careful eye of government monitoring.
Access Control and Your Personal Identity
As citizens we face great challenges from AI, but as clearance holders we can only begin to anticipate the many issues ahead of us. As access control becomes more pervasive, under what circumstances are we compelled to agree to our identity being shared? Consider that facial identification is now the intended means of future physical access control. Your face becomes your access “badge.” In some plants, fingerprints already serve that role. So ask yourself: what level of protection is necessary? What happens to the many data points that make up your facial identity? This is a real concern.
Studies in China have shown that once a man has been facially identified, he cannot escape from any major city. Even a partial capture by a poor-quality CCTV camera can locate him once his facial ID has been entered. For police and counter-terrorist investigations, this is a godsend. For free people, it is worrisome. Consider that the data your company stores about you could be handed to researchers who can find you wherever you go. The implications for someone committing a crime are obvious. But what if you are a husband cheating on his wife? Is this something a company can give to a private investigator? Could a pre-employment screening suddenly encompass not just the information in a standard background check, but whatever is available through other online monitoring systems?
This touches only a single aspect of AI’s new field of research. China’s vast investment in machine learning devotes its largest share of funding to computer vision. The Russians have concluded that collecting data with AI saves them vast amounts of human effort in their disinformation campaigns to influence foreign governments. What might appear to a company to be a query from a legitimate firm seeking data for honest research could be manipulated by adversaries to attack our democracy. As companies continue to seek out more and more data, it is critical that the methods to protect it be equally robust.