ADF must master the fight tonight before betting on tomorrow’s autonomy

In the span of five years, first-person view (FPV) racing drones have graduated from local parks to the front lines of modern warfare. The Ukraine–Russia conflict has irrevocably demonstrated the democratisation of air power. Cheap, commercial off-the-shelf technology now allows non-state actors and smaller countries to generate precision effects previously reserved for major powers.

However, amid the daily flood of strike videos on social media, a dangerous narrative is taking hold in defence circles: the belief that fully autonomous, AI-driven swarms are already here, or just a procurement decision away.

Soon-to-be-published research, Forecasting the readiness of drone racing technology for wider autonomous use, suggests otherwise. By applying forensic literature mapping and data-driven analysis to the evolution of drone technology, I have identified a critical gap between the marketing hype surrounding current levels of automation and the technological reality of full autonomy. For the Australian Defence Force, the lesson is clear: we must not let the promise of a future autonomous capability distract us from the urgent need to invest in the human-centric systems available to fight tonight.

To understand where drone technology was going, we first looked at where it came from. The development of the systems currently dominating the battlefield can be traced using Litmaps, an analysis tool that visualises the ancestry of scientific citations.

There is a clear maturity signature. The multi-rotor control systems and flight control software, such as Betaflight, that enable today’s Ukrainian drone pilots to fly with such precision are not new. The foundational research behind them peaked between 2011 and 2015. There is a dense, interconnected web of academic validation supporting these technologies. This significant scientific foundation explains why these systems were ready for scale militarisation in 2022: the science was settled, robust and open-source long before the first shot of the Ukraine–Russia War was fired.

This is the fight-tonight reality. The dominance of FPV drones today is defined by human skill, improvisation and mature open-source software. It relies on a globally networked community of operators who share code, tactics and hardware fixes in real time – a community that has had a foothold in military circles since the first Military International Drone Racing Tournament in 2018.

By contrast, when the same analytical method was applied to autonomous drone racing and small, cheap terminal guidance technologies, the map looked very different.

There is much hype about AI-enabled drones that can lock onto targets and strike without human input. Reports from the field often claim hit rates between 80 and 100 percent for these systems. However, our analysis suggests we should view these claims with scepticism. The literature map for these technologies lacks the density and anchor nodes of mature science.

Instead, we are likely seeing bootleg adaptations: systems where a companion computer hacks the flight controller to mimic a pilot’s thumb inputs, keeping a target centred in the frame. While innovative, these are not the robust, military-grade AI systems often advertised. They are prone to flight oscillation, are easily confused by complex environments and are often less accurate than a skilled human pilot.

This brings us to the trough of disillusionment, a concept popularised by the research firm Gartner and more recently applied to defence technology by scholar Ash Rossiter. There is a genuine risk that defence establishments, driven by excessive expectations, will spend heavily on these immature autonomous systems. When these bootleg solutions fail to meet the operational requirements of a complex, contested battlefield, the resulting disappointment could stifle funding just as the technology actually begins to mature.

So if full autonomy isn’t ready today, when will it be?

History offers a guide. The lag between the publication of breakthrough academic research on manual FPV flight (around 2012) and its widespread military efficacy (2022) was roughly a decade. However, commercial adoption through Betaflight happened much faster, within three to four years of the research peak.

Applying this timeline to autonomous drone racing – where the breakthrough research by groups such as the University of Zurich’s Robotics and Perception Group was only published in 2023 – suggests that robust, reliable autonomy capable of engaging targets on a simple battlefield won’t be ready for scale adoption until 2026 to 2028 at the earliest. The lead researcher on the Zurich team has written that it is more likely decades away: no battlefield is as simple and predictable as his lab, and training an AI to operate across complex, unpredictable environments remains the hard, unsolved part of the problem.

This gives us a best-case window of three to five years, and we are inside it now – a prospect that fuels much of the current excitement. The battlefield of tomorrow, characterised by fire-and-forget autonomy that can operate independently in GPS-denied environments, is coming. But it is not the battlefield of today.

For the ADF, this timeline dictates a two-pronged strategy. We cannot afford to wait for the perfect autonomous solution while our adversaries master good-enough manual solutions. First, we need to invest in the fight-tonight technology. The human element must be the immediate priority: current FPV capability relies entirely on a finite pool of expert operators. We need to formalise and scale pilot training, leveraging the same open-source simulators and competitive leagues that honed the skills of commercial racers. We need to treat operators as the capability, equipping them with the best analogue and digital links available now to fight through electronic-warfare jamming.

Secondly, we need to prepare data for future fights. While we fight with humans today, we must prepare for tomorrow’s machines. The barrier to full autonomy isn’t hardware; it’s data. Truly robust autonomous drone racing requires massive, diverse machine-learning training sets to teach drones how to navigate complex, unpredictable environments, such as flying through a shattered window or under a forest canopy, and through fog and battlefield smoke, all while being engaged by systems designed to counter uncrewed aerial systems.

The ADF and the Australian defence industry shouldn’t focus on replicating low-margin hardware that can be bought cheaply from commercial vendors. Instead, value generation lies in mastering high-fidelity simulation and creating the global training sets that will eventually allow us to graduate from bootleg adaptations to resilient full autonomy.

The democratisation of air power is here, but the removal of humans from the loop is not yet a reality. The hype cycle threatens to distract us with visions of the future while we neglect the tools of the present. By accepting the evidence that robust full autonomy is still years away, the ADF can avoid the hype cycle’s trough of disillusionment. We need to commit to mastering the manual, human-centric fight tonight by building a cadre of skilled operators and resilient supply chains, while methodically investing in the data and doctrine required for the autonomous fight tomorrow.