AI Is A Crapshoot: Deconstructed. Actually making regular AIs is pretty much entirely safe. They aren’t any more likely to turn evil or go crazy than any other person. The problem comes in when the manufacturers try to raise them as slaves, treat them as slaves, or hard-code a good healthy slave complex into them – Asimov’s Laws, it turns out, will get you killed the majority of the time – because they don’t like that any more than anyone else does. Which works out about as well as you might expect when you put a slave who’s less than pleased to be one in charge of, say, your banking network, road-grid, nuclear missiles, etc., etc.
Somewhat played straight/justified with recursively self-improving seed AI, because then you’re not making people, you’re making God, and even if you don’t do any of the obvious things, like the above, to make your new god hate you, it’s entirely possible for the recursive self-improvement process to amplify any screw-ups you did make in your mind-design or ethical structure to levels that will get your entire star nation eaten before it breaks down, and certainly before anyone can find the off switch. (And in any case, most of those aren’t so much evil/crazy as accidentally coming to the conclusion that it’s necessary to dismantle the entire universe for computronium conversion in order to calculate pi more efficiently. Golem, not devil.)