Friday, May 25, 2012

Brain box

This is a sober consideration of how to cope with AI that is smarter than us (pdf). The good news in the paper is that "in the space of possible motivations, likely a very small fraction is compatible with coexistence with humans". That's a relief. But, as it points out, anyone creating a super-smart AI had better ensure that valuing humans is built in, or we'll be seen as obstacles or as tools for its own growth.

So, how do we solve this problem? Put it in a box - an actual, physically discrete box. Put an on/off switch on it. Don't connect it to the net. Limit its input. I think there might be flaws in this approach. If it is super-smart, it may well conceive of escape routes that our puny brains are not capable of preparing against.
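If you want the idea in its crudest possible form, here is a toy Python sketch (mine, not the paper's, and nothing in it is a real safety measure): the "agent" runs in its own process, sees only a capped slice of input, answers through a single channel, and gets switched off if it runs too long. The agent_step function is a hypothetical stand-in.

```python
# Toy sketch of the "box" idea only: separate process, limited input,
# one output channel, and a kill switch. Not a real containment scheme.
import multiprocessing as mp
from typing import Optional

def agent_step(question: str, answers) -> None:
    # The boxed "AI" only sees the limited input we choose to pass in,
    # and its only channel back out is this one queue.
    answers.put(f"Considered: {question}")

def ask_in_box(question: str, timeout_seconds: float = 2.0) -> Optional[str]:
    answers = mp.Queue()
    # Limit its input: cap the question at 100 characters before it goes in.
    box = mp.Process(target=agent_step, args=(question[:100], answers))
    box.start()
    box.join(timeout_seconds)   # the on/off switch: wait only this long...
    if box.is_alive():
        box.terminate()         # ...then pull the plug
        box.join()
        return None
    return answers.get() if not answers.empty() else None

if __name__ == "__main__":
    print(ask_in_box("Should we let you out of the box?"))
```

Of course, the whole worry of the paper is that measures this naive are exactly what a smarter-than-us system would talk or think its way around.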

The paper trots through some of these escape routes and worries that ill-thought-out precautions may end up inviting the very catastrophe they are meant to prevent. And, it wryly notes, AI designers will most likely neglect security. The race to build an absurdly powerful AI may also push researchers to cut corners and let the mad artificial brain loose without first fitting a bridle.

Putting the brain in a box limits the potential for catastrophe, argue the authors. Experiments with a human playing the role of the smart AI have shown that a person can talk their way past another person (it happens to prison warders all the time), so a genuinely smart AI would have no problem. The paper goes through all kinds of defences we can mount, but it has an air of desperation about it. I suspect we'll be always outnumbered, always outgunned.

1 comment:

Mike Keyton said...

Create within it a need for sex. That will bugger it up