Two Military Experts Propose Using Artificial Intelligence to Control Nuclear Weapons

Thursday, 05 September 2019 - 10:55AM
Artificial Intelligence
It doesn't take incredibly insightful or prolonged meditation to realize that the arms race that emerged from the ruins of World War II rests on a single assumption: if a country's leader knows that launching an attack on a rival country will provoke immediate and escalated retaliation, that leader will be disincentivized to do so, compelled (or so the assumption goes) to place the continued existence of his or her country and its people above all other considerations. The implied threat is this: you may succeed in killing many of us, but we will succeed in killing just as many or more of you. Even on a personal level, in the uglier parts of the world, the logic of mutually assured destruction gives even the most bellicose predators pause.

So far, it's worked out: at the time of this writing, we have still somehow managed to avoid nuclear war. Our country's enemies instead opt for asymmetric strategies and tactics to which we have not only proven ourselves vulnerable, but against which we have proven incapable of mounting an appropriate escalation of force in retaliation. Our political process may be completely compromised, but at least we're not having our eyes burned to ash in the vaporizing blast of Russian ICBMs crossing the Atlantic bearing Putin's signature. It could, as I am fond of saying, always be worse.

Nevertheless, the arms race persists, despite what you might have learned in your American history class. Spurred by the U.S. exit from the Intermediate-Range Nuclear Forces (INF) Treaty last month and the previously banned missile testing that followed, Putin declared today that Russia would return to making land-based cruise missiles with effective ranges of 310-3,400 miles. Moreover, the Federation of American Scientists reports that the U.S. and Russia own or control 93% of humanity's nuclear warheads, with Russia having a slightly larger stockpile, including hypersonic missiles which, according to experts, fly at five times the speed of sound and cannot be defended against. Rich Moore, a senior engineer at RAND, a global think-tank, had this to say about them:

"Hypersonic cruise missiles are powered all the way to their targets using an advanced propulsion system called a SCRAMJET. These are very, very fast. You may have six minutes from the time it's launched until the time it strikes."

In the event of a first strike by Moscow, the only recourse the U.S. would have is to unleash its own nuclear arsenal. That, of course, would require that 1) there was anyone left alive to do so, and 2) they were prepared to orient, decide, and act in the six minutes, likely less, that they would have before the first waves of fire broke.

To address that challenge, two American military experts – Dr. Adam Lowther, the Director of Research and Education at the Louisiana Tech Research Institute (LTRI), and Curtis McGiffin, an Associate Dean at the Air Force Institute of Technology's School of Strategic Force Studies – are suggesting that America place at least some of its nuclear arsenal under the control of Artificial Intelligence. Lowther and McGiffin explored the idea in an article in War On The Rocks entitled, "America Needs a 'Dead Hand,'" a title which suggests that the strategy represents the logical conclusion of deterrence through mutually assured destruction: a switch that is activated even if there's no human left to activate it. 

That conclusion is not, however, explicated: it's merely implied. Invoking AI's prowess in "processing vast amounts of information very quickly and assessing the pros and cons of alternative actions in a thoroughly unemotional manner," the authors suggest that "artificial intelligence may, to a small degree, mitigate the tyranny of attack-time compression and accelerate wartime decision-making."

Regardless of whether one agrees with either the stated idea or its logical conclusion (I think they're onto something, but lack imagination thanks to decades of institutionalization), the authors are correct in their assessment of America's positioning against countries like Russia and China.

"U.S. adversaries are not interested in maintaining the status quo," Lowther and McGiffin write. "They are actively working to change it. U.S. adversaries are working on their own fait accompli that will leave the United States in a position where capitulation to a new geostrategic order is its only option. The United States cannot allow that. The United States must re-examine its view of an old concept in light of fundamental technological change. Moving forward as if twentieth-century paradigms are still valid is not an option."

What's the worst that could happen, right?


Military Tech
Artificial Intelligence