Hasta la vista, baby

Dangers lurk at every corner whenever autonomous weapons are deployed, foremost of which is the potential for unintended escalation of conflict
John Henry Dodson

Peoples of the world have been engaged in wars since time immemorial, and only the methodologies of annihilating one another have changed, growing more lethal with each passing century until the development of nuclear weapons, capable of destroying Earth and everything on it many times over.

Picture that comical caricature of a caveman with a club in one hand, the other dragging a woman by the hair — one kidnapped from another household or taken during a raid on an opposing village. When man started lusting after another's possessions, so too was ingrained in his psyche the mantra "kill or be killed" or "protect one's own at all cost."

Initially, small-scale conflicts were fought mainly with spears, swords and bows, and the tactics employed were rudimentary — heavily reliant on sheer strength, numbers, barbarity and the so-called fear factor. War cries, face paint and the use of demonic or animalistic battle helmets, armor and gear were precisely intended to curdle the blood of enemies, striking fear into their hearts, the better to end a fight even before it started.

Fast-forward to the present and what once seemed implausible fiction — Arnold Schwarzenegger's Terminator movie series — has become a worrisome reality, with fully autonomous weapons being deployed on modern battlefields like Ukraine. The "rise of the machines" can no longer be denied and, with it, the possibility that they could soon turn against humans, their makers.

After clowning with comedians in an obvious Vatican PR job, Pope Francis buckled down to deathly serious business as he addressed the Group of Seven (G7) nations to call for a total ban on the development and deployment of “killer robots” that need little or no human intervention to do what they’ve been hardwired to accomplish — what else but kill.

The "lethal autonomous weapons," the Pontiff warned, present "new ethical and existential challenges to humanity." Powered by number-crunching micro-computers and the now pervasive artificial intelligence (AI), the killer robots (autonomous drone swarms, tanks, etc.) have upended anew the balance in favor of affluent and tech-savvy nations.

We saw it during George H.W. Bush's presidency when he led a coalition of nations to expel the Iraqi invaders from Kuwait, with the latter surrendering en masse to early types of drones after getting a taste of "death coming from the sky." Those Iraqis thought they were surrendering to fighter pilots, clueless that they were laying down their arms under the cold gaze of infrared and heat-sensitive cameras manned by geeks miles away, staring at computer monitors.

The beauty of these killer robots, if we can even call it that, is that they leverage technology like AI to operate independently, making real-time battlefield decisions once reserved for real tank commanders, real pilots and real people operating missile defense systems.

Just to name a few, the Turkish-made Kargu-2 drone, employed in the Libyan civil war, is capable of engaging targets autonomously; ditto with the Israeli Harpy drone system designed to loiter in airspace, search for radar signals and destroy them without human guidance.

Such systems throw the spotlight on the growing reliance on fully autonomous systems intended to reduce human oversight in critical combat operations. Friendly fire and battlefield mistakes have been made, and will continue to be made, by humans faced with the fog of war and the stresses of being under fire.

But the machines, even those purportedly aided by AI redundancies, can also be expected to muck up that universal challenge to any person, army or nation facing life-and-death situations: distinguishing friend from foe, especially when they are mere radar blips that could be misidentified as a Stratofortress bomber instead of a commercial airliner.

Dangers lurk at every corner whenever autonomous weapons are deployed, foremost of which is the potential for unintended escalation of conflict as the systems may misinterpret data, sparking unintended engagements.

Autonomous weapons could also lower the threshold for entering into conflicts, considering the relative ease of deploying such systems. Amid these concerns, Pope Francis underscored the moral imperative of ensuring human oversight in military operations, as the delegation of life-and-death decisions to machines undermines the ethical frameworks that have governed warfare for centuries.

Wars waged directly by humans — imagine two knights trying to unhorse each other — are bad enough, but conflicts are made much worse with technology.

Arnie, in his best cyborg voice, may just be telling us, “Hasta la vista, baby.”

Daily Tribune
tribune.net.ph