Autonomous Divisions

Blended Categories enable unexpected exploits


Militaries typically adhere to doctrines of:

  • Power (massed units, large-effect weapons)

  • Mobility (speed, maneuvering, and encirclement)

  • Stealth (concealment, ambush)

Blitzkrieg tactics in WW2 were highly effective because they combined massed Power deployed dynamically through Mobility. Modern stealth bombers integrate all three elements, but are costly to build and operate. However, a new military doctrine is emerging:

  • Autonomy (swarms, expendability, harassment)

Autonomy entails "rush in, if 90% fail it doesn't matter". It enables long-ranged or lurking weapons for stay-behind ambushes, combining aspects of land and sea mines. Autonomous swarms can act as an expendable blitzkrieg, emerging suddenly and unexpectedly, operating stealthily in land, sea, air, or space.

The mass deployment of drone armies can rapidly shift the balance in a conflict, especially autonomous weapons with functions comparable to a human soldier or a manned military vehicle. The deployment of 50,000 drones per month into the Ukrainian battlespace, many of which cost as little as $6,000, is changing the face of conflict. Drones armed with inexpensive MANPADS-style warheads present an increasingly attractive alternative to conventional armies, especially when used defensively. Iron Dome spends roughly $100,000 on each interception of an $800 rocket. This asymmetry facilitates autonomous wild weasel-style tactics. Autonomous (potentially nuclear-powered) stealth carriers could deploy drones in various configurations, waiting weeks or months before sudden activation.
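The cost asymmetries above can be made concrete with some back-of-envelope arithmetic. The figures below are the rough numbers cited in the text, not verified procurement prices:

```python
# Illustrative cost asymmetry, using the approximate figures from the text.
interceptor_cost = 100_000   # approx. cost per Iron Dome interception (USD)
rocket_cost = 800            # approx. cost per incoming rocket (USD)
exchange_ratio = interceptor_cost / rocket_cost   # defender pays ~125x per kill

drones_per_month = 50_000    # reported monthly drone deployment in Ukraine
drone_unit_cost = 6_000      # low-end cost per drone (USD)
monthly_spend = drones_per_month * drone_unit_cost  # ~$300M per month

print(f"interception exchange ratio ≈ {exchange_ratio:.0f}:1")
print(f"monthly drone spend ≈ ${monthly_spend / 1e6:.0f}M")
```

Even at the low-end unit price, an army-scale drone campaign costs on the order of hundreds of millions of dollars per month, while each defensive interception costs the defender over a hundred times what the attacker spent.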

Together, these developments may make conventional war increasingly untenable. Ukraine has been an inflection point at which warfare pivoted from asymmetric, high-tech forces totally dominating the battlefield back to ground forces slugging it out with artillery and drone strikes. This stalemate, and the utter carnage of drone-driven conflicts that maim more often than they kill outright, will necessitate a transition to AI-driven warfighting.

However, reduced human agency in conflict could make atrocities more feasible: robotic Einsatzgruppen could be used to murder civilians, since extreme loyalty is no longer required and the risk of insider witnesses is lower. Policy organizations must ensure that humans always remain in the loop and that no machine autonomously decides to kill. This is especially difficult as ever more tasks are handed off to machines to finish the job autonomously, side-stepping issues of jamming and pilot error.


There could be blowback from a snap decision to ban or dissuade the use of AI weapons in war. For example, a conflict with fewer human warfighters may be a less trauma-inducing one, both physically and emotionally.

Autonomous systems provide powerful defensive mechanisms for smaller, vulnerable states against larger aggressors. An international ban may weaken good-faith actors while leaving bad-faith actors and terrorist groups unaffected or strengthened.

However, AI has applications in conflict beyond overt warfare. Fifth Generation Hybrid Warfare focuses on demoralizing enemies while inoculating one's own group. It undermines enemies through plausibly deniable means not directly linkable to any actor.

We live in a world where malware, including other neural networks, can now be secretly embedded in AI systems, potentially enabling covert unauthorized control at a later time, perhaps even a plausibly deniable false-flag operation. AI also enables Zersetzung-style demoralization and gaslighting attacks on persons of interest such as dissidents and influential foreign nationals.

Banning overt AI in conflict may drive evolution towards more devastating covert applications. While tragic, nations can recover from loss of soldiers or civilians. Recovering from demoralization attacks that foment permanent polarization and societal breakdown of trust may be impossible. These invisible weapons of mass destruction could lead to apocalyptic outcomes. This must be considered as a potential consequence of legislation against overt AI warfare.


The Ukraine war and allied weapons supply efforts (without deploying soldiers) present a potential loophole in international law. Legally, deploying soldiers differs greatly from deploying weapons. However, autonomous weapons are classified as weapons like rifles, creating a legal conundrum.

A race to the bottom in safety is possible, with little regard for civilians during and after engagements. The specter of autonomous weapons terrorizing populations is concerning, especially concealed ‘stay-behind’ semi-active area denial units which deploy latently. Such units may be designed to maim but not kill per se, thereby evading legislation on ‘lethal’ autonomous weapons.

Dual-use technologies such as ‘autonomous firefighting and rescue equipment’ or ‘weed-blasting laser drones’ might be provided for one purpose but rapidly repurposed.

Addressing insidious autonomous weapons applications is crucial as war becomes increasingly tacit and deniable – an invisible war of demoralization and infrastructure attacks. Future conflict will be two-pronged: clandestine attacks on civilians, and drone-oriented autonomous doctrine when things turn hot.


Another disruptive aspect is the inexpensive routine launch of large payloads into orbit. This may make kinetic bombardment ‘rods from God’ economically feasible, and deployable to orbit on short notice as an intimidation tactic, with yields comparable to a small nuclear weapon, yet without necessarily invoking Mutually Assured Destruction or Non-Proliferation Treaties. Even non-state actors could hold people to ransom by threatening to drop dense but ostensibly legitimate payloads (like tungsten parts or Radioisotope Thermoelectric Generator isotopes) onto precarious geological faults, dams, or major cities. Coordinated responses to such threats are important, hopefully in a manner less likely to risk Kessler Syndrome.
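The claimed yield of kinetic bombardment can be sanity-checked with a simple kinetic-energy estimate. The rod dimensions and impact velocity below are illustrative assumptions, not sourced weapon specifications; the result lands in the tens-of-tons-of-TNT range, i.e. the very low end of tactical nuclear yields rather than kiloton-class:

```python
# Back-of-envelope yield of a tungsten "rod from God".
# Assumed (not sourced): a 6 m x 0.3 m solid tungsten rod impacting at ~8 km/s.
import math

TUNGSTEN_DENSITY = 19_250       # kg/m^3
TNT_TON = 4.184e9               # joules per ton of TNT equivalent

length_m, diameter_m = 6.0, 0.3  # assumed rod dimensions
velocity_ms = 8_000.0            # assumed impact velocity (m/s)

volume = math.pi * (diameter_m / 2) ** 2 * length_m   # cylinder volume
mass = TUNGSTEN_DENSITY * volume                      # roughly 8 tonnes
energy_j = 0.5 * mass * velocity_ms ** 2              # kinetic energy
yield_tons_tnt = energy_j / TNT_TON

print(f"mass ≈ {mass:,.0f} kg, yield ≈ {yield_tons_tnt:,.0f} tons TNT")
```

Under these assumptions the impact releases on the order of 60 tons of TNT equivalent, comparable to the smallest historical tactical nuclear devices, which supports the framing of such payloads as strategically significant while remaining outside nuclear treaty definitions.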

Regulators and legislators must act now, as these weapons are creating a 21st-century version of trench warfare. The Geneva Conventions must also be urgently updated to protect civilians in these times, both from lethal autonomous weapons and from demoralization techniques.