Monday, August 03, 2009

Robot Weapons

Detective Del Spooner: What if I'm right?

Lt. John Bergin: [sighs] Well, then I guess we're gonna miss the good old days.

Detective Del Spooner: What good old days?

Lt. John Bergin: When people were killed by other people.

—Memorable scene from I, Robot (2004)


The future is creeping up on us too quickly. Last week I read that a certain wine-tasting robot thinks humans taste like bacon. That was scary enough.

Then today I read something much more sobering: military killer robots could endanger civilians. Sounds like they already do:
"The next thing that's coming, and this is what really scares me, are armed autonomous robots," said Prof Sharkey speaking to journalists in London. "The robot will do the killing itself. This will make decision making faster and allow one person to control many robots. A single soldier could initiate a large scale attack from the air and the ground.

"It could happen now; the technology's there."

A step on the way had already been taken by Israel with "Harpy", a pilotless aircraft that flies around searching for an enemy radar signal. When it thinks one has been located and identified as hostile, the drone turns into a homing missile and launches an attack - all without human intervention.

Last year the British aerospace company BAE Systems completed a flying trial with a group of drones that could communicate with each other and select their own targets, said Prof Sharkey. The United States Air Force was looking at the concept of "swarm technology", which involved multiple drone aircraft operating together.

Flying drones were swiftly being joined by armed robot ground vehicles, such as the Talon SWORDS, which bristles with machine guns, grenade launchers, and anti-tank missiles.

However, it was likely to be decades before such robots possessed a human-like ability to tell friend from foe.

Even with human controllers, drones were already stacking up large numbers of civilian casualties.

As a result of 60 known drone attacks in Pakistan between January 2006 and April 2009, 14 al Qaida leaders had been killed, but so had 607 civilians, said Prof Sharkey.

The US was paying teenagers "thousands of dollars" to drop infrared tags at the homes of al Qaida suspects so that Predator drones could aim their weapons at them, he added. But often the tags were thrown down randomly, marking out completely innocent civilians for attack.

On a side note, those infrared tags are the missing piece of the CTTL puzzle that 60 Minutes never explained. The use of such tagging devices requires at least some human judgment. Human judgment may be shitty and clouded by other motivations, but I'll still never trust a computer to make better decisions -- not even one programmed with the Three Laws of Robotics as laid down by Isaac Asimov:
Law I: A robot may not injure a human being or, through inaction, allow a human being to come to harm.

Law II: A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.

Law III: A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

Though simple for a human to understand, those laws cannot be comprehended by computers without huge advances in artificial intelligence.
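
To see why, here's a toy sketch in Python. Every name in it is mine, and it's nobody's real control system; it just writes the three laws down as a priority cascade. Notice that the cascade itself is a handful of if-statements, while every predicate those if-statements call is an unsolved problem in artificial intelligence.

    # A toy sketch, not anyone's real control system: Asimov's Three Laws
    # as a priority cascade over a proposed action. All names are made up
    # for illustration.

    def harms_human(action) -> bool:
        # Law I, first half: would executing `action` injure a human being?
        # Requires human-level perception and prediction of consequences.
        raise NotImplementedError

    def inaction_allows_harm(action) -> bool:
        # Law I, second half: would doing `action` instead of something else
        # allow a human to come to harm? Requires reasoning over all the
        # alternatives the robot could have taken instead.
        raise NotImplementedError

    def disobeys_orders(action, orders) -> bool:
        # Law II: does `action` conflict with what a human ordered?
        # Requires understanding natural-language intent and authority.
        raise NotImplementedError

    def endangers_self(action) -> bool:
        # Law III: would `action` destroy the robot? The easiest of the
        # four predicates, and still far from trivial.
        raise NotImplementedError

    def permitted(action, orders) -> bool:
        """Return True only if `action` survives all three laws, in priority order."""
        if harms_human(action) or inaction_allows_harm(action):
            return False  # First Law vetoes everything, including orders
        if disobeys_orders(action, orders):
            return False  # Second Law yields only to the First
        if endangers_self(action):
            return False  # Third Law comes last
        return True

    if __name__ == "__main__":
        try:
            permitted("open fire", orders=["hold position"])
        except NotImplementedError:
            print("The hard part was never the if-statements.")

The cascade takes ten lines; the four stubs it calls are where those huge advances in artificial intelligence would have to go. But I'm veering off topic here. No military would even want that first law anyway.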

When robots fight our wars, with or without human intervention, and there is not a single human casualty on our side, what incentive will we have for peace?

Meanwhile, Cyclone Power Technologies wants to assure us that all its military robots are vegetarians. Well, that's a relief.
