The United States Air Force of the future may have fewer human pilots in the cockpits, as computer-controlled wingmen increasingly become a reality. Early designs envisioned drones that could operate alongside the Air Force’s future sixth-generation fighters. But efforts are now underway that could allow artificial intelligence (AI) and machine learning to take control of the jet fighters themselves.

This month, the Defense Advanced Research Projects Agency (DARPA) announced that in less than three years, AI algorithms developed as part of its Air Combat Evolution (ACE) program advanced significantly from controlling a simulated F-16 Fighting Falcon in an aerial dogfight on a computer screen to controlling an actual F-16 in flight.

A highly modified F-16 – known as the X-62A or VISTA (Variable In-Flight Simulator Test Aircraft) – successfully flew for more than 17 hours in early December at the Air Force Test Pilot School (TPS) at Edwards Air Force Base, California. Over the course of multiple flights conducted over several days, the tests demonstrated how AI agents can control a full-scale fighter jet.

“Thanks to the outstanding teamwork and coordination between DARPA, the Air Force Test Pilot School, the Air Force Research Laboratory, and our performer teams, we’ve made rapid progress in Phase 2 across all areas of the ACE program,” said Air Force Lt. Col. Ryan “Hal” Hefron, the DARPA program manager for ACE. “VISTA allowed us to streamline the program by skipping the planned subscale phase and proceeding directly to a full-scale implementation, saving a year or more and providing performance feedback under real flight conditions.”

DARPA performers EpiSci, PhysicsAI, Shield AI, and the Johns Hopkins Applied Physics Laboratory flew a number of different F-16 AI algorithms on the X-62A. The aircraft can be programmed to demonstrate the flight-handling characteristics of a variety of different aircraft types, and a recent upgrade with the System for Autonomous Control of Simulation (SACS) made VISTA an ideal platform for testing ACE’s autonomous F-16 AI agents.

While AI controlled the aircraft during the December flights, a safety pilot was on board the VISTA to take control if anything went awry. The tests included multiple sorties, from takeoff to landing.

ACE is now one of more than six hundred Department of Defense (DoD) projects incorporating AI. In 2018, the government committed to spending up to $2 billion on AI initiatives over the following five years, while $2.58 billion was spent on AI research and development last year.

Humans Will Remain in the Cockpit

This particular effort isn’t meant to fly aircraft such as the F-16 without a pilot. Instead, AI could free the human pilot to focus on larger battle-management tasks. In addition, as previously reported, other efforts are underway in which AI could act as a virtual co-pilot.

The United States Air Force and Merlin Labs have been developing software for the Lockheed Martin C-130J Hercules to fly with just a single pilot. This is meant to help reduce the crew size in the face of a global shortage of pilots.

The Army has also explored ways to utilize DARPA’s Aircrew Labor In-Cockpit Automation System (ALIAS) program, which allows pilots to focus on mission management rather than flight mechanics. The AI could assist pilots day or night and in a variety of difficult conditions, including contested, congested, and degraded visual environments.

Other AI Assistance

It isn’t just aircraft that could get some AI assistance. This month, the United States Army also shared images of an AI-driven target recognition system that is used on the M1 Abrams main battle tank (MBT). This targeting system could help increase the speed at which potential threats are spotted and engaged.

The AI target recognition system was tested during the five-week Project Convergence 2022 (PC22) exercises, which saw a number of advanced military prototypes employed in real-world situations. During the exercise, U.S. Army soldiers tested the Advanced Targeting and Lethality Aided System (ATLAS), which demonstrated a wide range of aided target acquisition, tracking, and reporting capabilities while operating in a realistic combat environment.

ATLAS has been in development for several years and was previously demonstrated to members of the press in 2020 at the Aberdeen Proving Ground in Maryland. It utilizes advanced sensors, machine learning algorithms, and a new touchscreen display to automate the process of finding and firing at targets, while further allowing crews to quickly respond to potential threats. Part of the system is a small Aided Target Recognition (AiTR) sensor mounted to the top of the turret, while other sensors can help supply additional information to the crew.

By automating what were previously manual tasks, ATLAS could help reduce end-to-end engagement times, helping ensure that an Abrams crew has an advantage on the modern battlefield. Though the Army won’t say how much faster a crew can react, reports suggest that an operator could engage three targets in the time it now takes to engage just one.

That could mean the difference between life and death for the crew.

This is an important point: some critics of AI-supported weapons worry about the threat such systems pose, but it now appears the technology will simply aid warfighters and help them come home alive.

Peter Suciu is a freelance writer who covers business technology and cyber security. He currently lives in Michigan. You can follow him on Twitter: @PeterSuciu.