AI-powered Valkyrie drone designed for swarming China

The United States Air Force (USAF) has just tested an advanced autonomous drone showcasing cutting-edge technologies, an artificial intelligence-powered demonstration of how the US may fight an air war with China over Taiwan.

On July 25, the XQ-58A Valkyrie drone successfully carried out aerial combat tasks autonomously using new AI-driven software, The Warzone reported.

The test, launched from Eglin Air Force Base in Florida, lasted three hours and was part of a tiered approach in which algorithms are trained millions of times in simulation before moving on to further testing.
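
The USAF has not published its training pipeline, but the tiered approach it describes mirrors standard reinforcement-learning practice: fly an agent through millions of cheap simulated sorties and clear it for real flight testing only once its simulated performance passes a bar. The sketch below is purely illustrative, with toy classes and invented thresholds, and bears no relation to the actual AFRL software:

```python
import random
from collections import deque

class SimulatedSortie:
    """Toy stand-in for a high-fidelity air-combat simulation."""
    def run(self, policy) -> bool:
        # True means the policy "won" this simulated engagement.
        return random.random() < policy.skill

class Policy:
    """Toy stand-in for the learned flight-control/tactics model."""
    def __init__(self):
        self.skill = 0.1
    def update(self, won: bool):
        # Crude "learning": nudge skill upward after every simulated win.
        if won:
            self.skill = min(0.99, self.skill + 1e-5)

def train_in_simulation(episodes: int, promotion_win_rate: float):
    """Fly millions of cheap simulated sorties; clear the policy for
    real flight testing only if its recent win rate passes the bar."""
    policy, sim = Policy(), SimulatedSortie()
    recent = deque(maxlen=10_000)          # sliding evaluation window
    for _ in range(episodes):
        won = sim.run(policy)
        policy.update(won)
        recent.append(won)
    win_rate = sum(recent) / len(recent)
    return policy if win_rate >= promotion_win_rate else None

candidate = train_in_simulation(episodes=1_000_000, promotion_win_rate=0.7)
print("cleared for flight test" if candidate else "back to simulation")
```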

The drone drill reflected the USAF’s phased approach to develop, mature and build trust in AI-driven autonomous capabilities and to migrate them from the laboratory into more operational environments.

The US Air Force Research Laboratory developed the algorithms used for the test, which The Warzone noted is part of the Collaborative Combat Aircraft (CCA) program, a critical component of the so-called Next Generation Air Dominance (NGAD) modernization initiative.

The artificial intelligence/machine learning test established a multi-layer safety framework and solved a tactically relevant challenge during airborne operations, The Warzone reported.

While the USAF has not provided details about the specific tasks involved in the test, it did stress that there will be a human operator in the loop for employing highly advanced autonomous drones.
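
The service has not said what those safety layers actually are. Conceptually, though, a multi-layer safety framework places independent checks between the autonomy and the flight controls so that any single layer, including the human operator, can veto a commanded maneuver. The sketch below is only a schematic illustration, with invented layer names and limits:

```python
from dataclasses import dataclass

@dataclass
class Command:
    """A maneuver requested by the autonomy (illustrative fields only)."""
    altitude_ft: float
    bank_deg: float
    inside_test_range: bool
    operator_abort: bool = False

# Each layer independently inspects the command; any one can reject it.
def geofence_layer(cmd: Command) -> bool:
    return cmd.inside_test_range                 # stay inside the test range

def envelope_layer(cmd: Command) -> bool:
    return 1_000 <= cmd.altitude_ft <= 45_000 and abs(cmd.bank_deg) <= 80

def operator_layer(cmd: Command) -> bool:
    return not cmd.operator_abort                # the human can always abort

SAFETY_LAYERS = (geofence_layer, envelope_layer, operator_layer)

def release_to_flight_controls(cmd: Command) -> bool:
    """The autonomy's command reaches the flight controls only if
    every safety layer approves it."""
    return all(layer(cmd) for layer in SAFETY_LAYERS)

print(release_to_flight_controls(
    Command(altitude_ft=12_000, bank_deg=60, inside_test_range=True)))   # True
print(release_to_flight_controls(
    Command(altitude_ft=500, bank_deg=60, inside_test_range=True)))      # False
```

The key property of such a design is that the layers are independent: a fault in the autonomy cannot disable the geofence or the operator's abort.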

Asia Times noted in January 2022 that the emergence of loyal wingman drones such as the XQ-58A Valkyrie reflects a requirement for mass-produced and expendable aircraft to be used in a potential conflict with China.

Expendable drones give a numbers advantage to their operators, acting as mass decoys, a swarming force or a force multiplier complementing crewed aircraft.

An XQ-58A Valkyrie drone alongside F-35A and F-22 fighters. Photo: Facebook

Loyal wingman drones can also extend the sensor ranges of stealthy crewed aircraft such as the NGAD, F-35 and F-22, operating in areas deemed too dangerous for the latter due to advanced air defenses or aerial threats.

They can also extend weapons ranges by using onboard designators to mark targets while the launching crewed aircraft stays out of range of enemy air defenses and remains electronically silent.

The results of the July 25 test could speed the development of drone swarms, which may prove decisive in a Taiwan conflict.

In February 2023, Asia Times reported on the US Department of Defense's (DOD) low-profile Autonomous Multi-Domain Adaptive Swarms-of-Swarms (AMASS) project to develop autonomous drone swarms that can be launched from sea, air and land to overwhelm enemy air defenses.

AMASS aims to develop the capability to launch and command thousands of autonomous drones to destroy an enemy’s defenses and critical assets, including air defenses, artillery pieces, missile launchers, command and control posts and radar stations.

Drone swarms can flood enemy radar scopes with multiple targets, forcing defenders to waste limited missiles and ammunition and to reveal their positions, letting crewed platforms, armed drones and loitering munitions move in for the kill.

Machine learning and AI also allow drone swarms to look at targets from multiple angles, cross-check various targeting data streams and suggest the best point of attack.
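
None of that fusion logic is public, but the underlying idea can be sketched simply: combine independent sensor tracks of the same target, accept the fix only if the streams roughly agree, then rank candidate approach bearings by a simple score. Everything in this example, values included, is assumed for illustration:

```python
import math
from statistics import mean

def fuse_tracks(tracks):
    """Cross-check position estimates of one target from several drones;
    return the averaged fix only if the streams roughly agree."""
    xs, ys = [t[0] for t in tracks], [t[1] for t in tracks]
    centre = (mean(xs), mean(ys))
    spread = max(math.dist((x, y), centre) for x, y in tracks)
    if spread > 200.0:          # metres; streams disagree too much to trust
        return None
    return centre

def best_attack_bearing(candidate_bearings, defended_bearing):
    """Prefer the approach angle farthest from the target's defended arc."""
    def angular_gap(a, b):
        return abs((a - b + 180) % 360 - 180)
    return max(candidate_bearings, key=lambda b: angular_gap(b, defended_bearing))

# Three drones report slightly different fixes on the same radar site.
tracks = [(1000.0, 2010.0), (1012.0, 1995.0), (995.0, 2003.0)]
fix = fuse_tracks(tracks)
if fix is not None:
    bearing = best_attack_bearing([0, 90, 180, 270], defended_bearing=180)
    print(f"fused fix {fix}, attack from bearing {bearing}")
```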

Caitlin Lee and other writers noted in May 2023 for Aerospace America that AI will be a game changer for air combat since it reduces the risk to pilots’ lives and the cost of air superiority.

The writers argued that AI may eventually do everything that a human pilot can, noting that the US military is already experimenting with AI in dogfighting, the most challenging aspect of aerial combat.

Lee and the other writers cited the US Defense Advanced Research Projects Agency (DARPA) Air Combat Evolution program, which pitted a highly experienced fighter pilot against an AI-driven fighter in a series of simulated aerial contests.

They noted that the AI fighter scored cannon kills against the human pilot every time since it could aim its cannon with superhuman accuracy from seemingly impossible attack angles, outmatching the human pilot in a classical, close-range, turning dogfight.

However, Lee and the other writers noted that while AI excels in black-and-white situations, air combat in the real world will present many grey areas requiring human judgment for the foreseeable future.

They argued that no matter how fast AI advances, human judgment will always be needed to make high-risk decisions in dynamic air combat situations.

Artist’s concept of a drone swarm. Credit: C4ISRNET

A human-AI combination may be the ideal solution, as it combines human flexibility and moral judgment with the precision and reliability of automation.

Operator-in-the-loop systems architecture is still needed to avoid unintended incidents and assuage moral concerns about AI deciding whether to use lethal force. It may also prevent autonomous drones from going against their operators through flawed logic, software glitches or enemy interference.
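
In practice, an operator-in-the-loop architecture means the autonomy can find and propose engagements but cannot release a weapon on its own authority. The sketch below is a hypothetical illustration of that gate, not the USAF's actual design:

```python
from dataclasses import dataclass
from enum import Enum, auto

class Decision(Enum):
    APPROVE = auto()
    REJECT = auto()

@dataclass
class EngagementProposal:
    target_id: str
    confidence: float       # the autonomy's own identification confidence

def human_review(proposal: EngagementProposal) -> Decision:
    """Stand-in for the operator console: a person, not the software,
    makes the release decision."""
    answer = input(f"Engage {proposal.target_id} "
                   f"(confidence {proposal.confidence:.0%})? [y/N] ")
    return Decision.APPROVE if answer.strip().lower() == "y" else Decision.REJECT

def weapon_release(proposal: EngagementProposal) -> bool:
    """Weapon release happens only after explicit human approval,
    regardless of how confident the autonomy is."""
    if proposal.confidence < 0.9:
        return False                         # the autonomy itself declines
    return human_review(proposal) is Decision.APPROVE

released = weapon_release(EngagementProposal(target_id="SAM-07", confidence=0.95))
print("weapon released" if released else "engagement withheld")
```

Note that in this gate the autonomy's confidence can only withhold an engagement; authorizing one always requires the human.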

While the emergence of increasingly autonomous drones may herald a more significant shift toward the eventual “dronification” of warfare, humans will still be needed to devise crucial concepts, strategy and tactics.