A few months ago I was at Johns Hopkins University's Applied Physics Lab in suburban Maryland, where I serve as a senior fellow. A group of us — mostly retired four-star military officers — were there to witness a computer-simulated dogfight of a unique character: man against machine.
I was seated next to retired Admiral John Richardson, who until last fall had been chief of naval operations, the highest-ranking officer in the fleet. We were both skeptical that the artificial intelligence program piloting one of the virtual aircraft would be able to outfight the human pilot, call sign “Banger,” from the Air Force’s equivalent of the Navy’s legendary TOPGUN fighter-tactics instruction program.
It was a remarkable blend of software development, AI, modeling and simulation, combat-aircraft dynamics and controls, and advanced video production; it felt like watching a sporting event on ESPN. We observed a half-dozen runs, and Banger had his hands full, losing more often than not.
It was clear as the demonstration progressed that, over time, the AI entity — which was constantly in machine-learning mode — was not only improving, but becoming dominant. At the end of the demo, Richardson and I agreed that the AI would eventually beat the human every time. We consoled ourselves by agreeing that perhaps a Navy fighter pilot would have lasted longer than the Air Force's best. But inwardly, I doubted that would be the case.
Since then, the competition, called the AlphaDogfight Trials and run by the Pentagon’s Defense Advanced Research Projects Agency, has rolled along in a series of additional events, culminating last week in a five-event sweep by the AI against Banger and his simulated F-16. (Banger’s name is being withheld for security reasons, and perhaps to protect him from severe teasing by his squadron-mates.) The trials were a way to showcase the growing importance of AI to the warfighter, and they allowed commercial companies to enter AI competitors of their own.
The winner was Heron Systems Inc., a small Maryland firm that was the most aggressive and accurate of the eight competitors invited by Darpa. True, there are a fair number of caveats to the AI accomplishment: the computer had perfect real-time information, which is never the case in actual combat, and the human pilot was not flying a “real” plane but was operating in a simulator.
Yet it is an important moment, not unlike IBM’s Deep Blue computer defeating the Russian champion Garry Kasparov in chess in 1997, or an AI machine beating the Chinese Go master Ke Jie in 2017. Are the days of Goose and Maverick, the cinematic “Top Gun” pilots, numbered? And, more important, where is the AI competition between the US and China headed, given that a combat advantage could affect operations in the South China Sea and elsewhere?
First, there is still a big leap from a computer simulation in a lab to putting AI in charge of a $50 million jet and sending it into a dogfight, arguably the most complex task in military aviation.
Beyond resolving “fog of war” ambiguities, the engineering capability to have an AI system run a cockpit is still years away. Developing, testing and deploying a fully capable AI system will probably occur first in drones, then in logistics and refueling aircraft, and then in land-attack strike platforms, before moving into pure air-to-air fighter combat systems.
But the global competition in AI is fierce. Eric Schmidt, former chairman of Google, has been focusing on these issues on behalf of the Department of Defense for some time, and he’s told me that the edge the US once held over China is diminishing rapidly, from years to perhaps months. A huge concern is that the Chinese could leapfrog over all previous US research and development through their effective system of industrial espionage. Russia, likewise, is moving forward on AI capability, although it’s considerably behind the US and China. Lesser military powers including France, India, Israel, Saudi Arabia and the UK are also interested, as is Iran.
For the US, the implications of AI are perhaps approaching those of the Space Race of the 1960s. But the Pentagon cannot win the race on its own: It needs to find more and better ways to work cooperatively with Silicon Valley; enhance its cybersecurity, since all of these systems will be vulnerable to cyberattacks; and consider how to mesh AI with manned activities, particularly Special Operations forces. This will require the enormous, lumbering Defense Department to be innovative and nimble over the near term.
Go and chess are games. An AI program that defeats a human being is amusing copy on a slow news day. But when an AI program provides a real advantage in deeply complex combat operations, we need to pay closer attention, and recognize the challenges ahead. Banger will be flying for some years to come. But not forever.
Bloomberg