A call to ban the development of Terminators

This is a guest post by FOSEP member Jaclyn Saunders. Thanks, Jaci, for your input!

As a grad student at UW with my entire family back on the East Coast, I spend my Thanksgivings sharing great food with friends. The friends are often other science/engineering types, which is a far cry from my childhood turkey meal company. Now my Thanksgiving conversations revolve around slightly different debates… like the recent call to institute an international weapons ban on killer robots. That’s right, folks: an official ban against the Terminator.

I apparently live in la-la land, because I was blissfully unaware that these killer bots exist (although they have not yet been used with full artificial intelligence capability). Human Rights Watch and Harvard Law School’s International Human Rights Clinic recently called for an international treaty to ban weapons systems that can kill on their own without prior human consent.

My super smart weapons expert friend chuckled at my astonishment at learning about the “bots” that patrol the demilitarized zone between North and South Korea. These bots can detect a human up to 2 km away; the current program requires human consent to shoot, but they could easily be reprogrammed to shoot on their own. Similar devices are also being used along the Israel-Gaza border.

The US also uses drones that require human consent prior to firing; however, the distance between a target 7,000 miles away and a pilot whose own safety is not at risk might produce a video-game-like atmosphere that desensitizes the pilot to the act of taking life in wartime.

Weapons expert friend acknowledged the risk and the need for human responsibility on the firing end. Without a human pulling the trigger, there is a loss of accountability. In the case of humanitarian violations, who would be responsible for war crimes potentially committed by a fully autonomous weapons system? Can you hold the Terminator accountable? Or do you hold those who programmed the bot accountable?

Supporters of these fully autonomous systems have some good points: they feel these systems can be safer than a human who, under the fear and stress of a wartime situation, may make poor decisions or react too slowly.

The realization that Terminator-style warfare is a reality and that my GI Joe is now less brawny action hero and more nerdy programmer was eye-opening. I was also surprised to learn that the basic control systems and artificial intelligence used in these weapons are often partially developed at academic institutions by graduate students like me. While I stuffed my face with turkey, all I could think was, “Where will I get a vat of lava to push a Terminator into when they inevitably turn on us?”

More news articles on the use of autonomous weapons systems: http://www.lawfareblog.com/2012/12/readings-autonomous-weapon-systems-and-their-regulation/

Human Rights Watch: http://www.hrw.org/

For more on the use of drones:

http://articles.washingtonpost.com/2011-04-24/world/35231816_1_reaper-aircraft-drones-air-force-predator

http://www.theglobeandmail.com/news/world/ban-urged-on-killer-robots/article5456209/

http://www.pbs.org/wnet/need-to-know/five-things/drones/12659/
