APLawrence.com -  Resources for Unix and Linux Systems, Bloggers and the self-employed

Killer Robots


2007/01/01

I was watching something on Discovery about "Ten Ways the World Will End". I only caught part of it, but number six or seven had us threatened by super-intelligent machines. I think they had Stephen Hawking saying something (probably related to his comments about machines dominating our world) and someone I didn't recognize waxing on about how computers could become hundreds of times more intelligent than humans. Stephen's not the only one to raise this spectre; Bill Joy and others have expressed similar concerns.

Yeah, yeah. We're a long, long way from that. I'm not saying there's nothing to worry about: the use of computers in weaponry will surely produce much more danger in warfare, but those are threats from other humans wielding advanced technology, not our machines turning against us. The danger there isn't really intelligence at all; it's just clever and fast algorithms.

Question: would you rather be stuck in a cage with an angry lion or an angry man? If you'd rather have the human to deal with, would you rather deal with a stupid person or someone quite bright? You might answer that you'll take the less intelligent choice in the hope of "out-smarting" them, but let's change the parameters a bit: now you have a choice of the two people, one quite bright and one not so bright, but you don't know whether they are angry or mean you any harm. Which would you choose then? I'd bet most of us would prefer to take our chances with the brighter person: they might be less likely to be a threat to us, and if they are, they might be willing to listen to reason. That's my feeling about intelligent machines: if they ever were "hundreds of times more intelligent", I doubt we'd have anything to fear.

However, that doesn't mean Bill Joy and the others are wrong. Self-replicating weapons, or even self-replicating devices that accidentally turn into threats, are extremely dangerous, but they are dangerous precisely because of their stupidity: a weapon that simply replicates, seeks and destroys could be extraordinarily effective with nearly no "intelligence". That's dangerous; a reasoning robot with an IQ of several thousand probably isn't.



Got something to add? Send me email.



2 comments






© Anthony Lawrence







Tue Jan 2 15:32:54 2007: 2793   anonymous



Not a problem just remember "Gort - Klatuu Verada Nicktoe"

It'll stop em dead in their tracks.



Wed Jan 3 02:41:00 2007: 2795   BigDumbDinosaur


And if all else fails, cutting off the power will do the trick. That or running it on Windows.

















