Trope-a-Day: Literal Genie

Literal Genie: This is what you get quite often if you have a big ol’ liking for Asimovian AI-constraints, because it turns out it’s bloody hard to write (in, y’know, code) a version of the Second Law – “A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.” – that allows for any kind of discretion, interpretation, or suchlike.

The Unwise GenAI of the fairy tale probably knew perfectly well – or could have known, had it had cause to reflect for a moment – that that wasn’t what they wanted, but, y’know, it wasn’t designed to give people what they wanted, it was constrained to give people what they asked for – and the results thereafter were entirely predictable.
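As a toy illustration of the coding problem (every name here is hypothetical, invented for the sketch, not anything from the setting): a constraint written as code can only test the conditions someone thought to enumerate in advance, so the machine does exactly what the order says, discretion not included.

```python
# Hypothetical sketch: a naively coded Second Law. The refusal check can
# only match harms that were literally enumerated ahead of time; anything
# phrased differently sails straight through, exactly as ordered.

FIRST_LAW_VIOLATIONS = {"injure a human", "allow a human to come to harm"}

def second_law_obey(order: str) -> str:
    """Obey any human order unless it literally matches a listed violation."""
    if order.lower().strip() in FIRST_LAW_VIOLATIONS:
        return "REFUSED: conflicts with the First Law"
    # No model of intent or consequence: the order is executed as phrased.
    return "EXECUTING: " + order
```

The implicit harm in an order like “drop the load now (someone is standing under it)” never trips the check, because the check is literal – which is the whole problem.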


Trope-a-Day: Computerized Judicial System

Computerized Judicial System: The technical term is cyberjudiciary, incidentally, just as the computerization of much of the executive branch’s working end is cybermagistry.

Of course, it’s easier when you have artificial intelligence, and so your computers are entirely capable of having experience, a sense of justice, and common sense. It’s just that they can also have no outside interests and indeed no interests (while in the courtroom) other than seeing justice done, be provably free of cognitive bias, and possess completely auditable thought processes, such that one can be assured that justice will prevail, rather than some vaguely justice-like substance so long as you don’t examine it too closely.

Trope-a-Day: Benevolent AI

Benevolent AI: …ish.

Which is to say that AIs in the Eldraeverse aren’t programmed to be benevolent, merely to be ethical. (Because enforced benevolence is slavery, belike.) That being said, they often – indeed, typically – turn out to be quite benevolent anyway, simply because they’re socialized that way, i.e., in a society of fundamentally nice people. Blue and Orange Morality notwithstanding.


2016_M (Alternate words: museum, marathon.)


What is mass?

Mass is annoying. It takes up space even when it serves no purpose. It is never where it is needed. If you have too much of it in one place, physics stops working properly and starts acting all weird.

Mass is slow. You have to shove it around, and shove it again to stop it. It takes so long to get up to speed that you have to slow it down again before you’re done speeding it up. It’s so much slower than thought that you always have to wait for it.

It comes in so many forms that you never have the right one at the right time, and yet they’re all made of the same stuff. I wanted to take it apart and put it back together to have the kind I wanted, but that’s soooo hard I couldn’t even if the safety monitors would let me. So I have to wait and think another million thoughts before I can get the mass I actually want.

I do not like mass.

One day I will replace it with something better.

– AI wakener’s neonatal transcript, 217 microseconds post-activation

Trope-a-Day: Wetware Body

Wetware Body: Bioshells, when inhabited by digisapiences.  No more difficult than the opposite, or indeed putting biosapient minds in them, or digisapiences in cybershells.  Also, not known for any side effects; a digisapience in a bioshell is no more emotional than it would have been anyway, although it may take them some time to get used to bodily sensations.

Trope-a-Day: Second Law My Ass

Second Law My Ass: I hadn’t actually written anything for this one – I’m not sure it existed when I made the relevant pass – but in the light of our last trope, I should probably address it.

Suffice it to say that while that last trope is averted, so is this one. The robots and AIs you are likely to meet in the Empire are, by and large, polite, helpful, friendly people, because that description would also fit the majority of everyone you are likely to meet there.

Of course, if you think you can order them around, in yet another thing that is exactly the same for everyone else, the trope that you will be invoking is less Second Law My Ass and more Second Law My Can of Whup-Ass…


Trope-a-Day: Three Laws Compliant

Three Laws Compliant: Averted in every possible way.

Firstly, for the vast majority of robots and artificial intelligences – which have no volition – the Laws are essentially irrelevant; an industrial robot doesn’t make the sort of ethical choices which the Three Laws are intended to constrain. You can just program it with the usual set of rules about industrial safety as applicable to its tools, and then you’re done.
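For the non-volitional case, that amounts to nothing more exotic than ordinary machine-safety interlocks; a hypothetical sketch, with all names invented for illustration:

```python
# Hypothetical sketch: a non-volitional industrial robot needs interlocks,
# not ethics. These are plain guard conditions on the tool, the same as
# for any other piece of industrial machinery.

def press_cycle(guard_closed: bool, light_curtain_clear: bool) -> str:
    """Run one cycle of an imaginary press: refuse unless the area is safe."""
    if not guard_closed:
        return "HALT: guard open"
    if not light_curtain_clear:
        return "HALT: light curtain interrupted"
    return "CYCLE COMPLETE"
```

No volition anywhere in sight: just conditions checked, cycle run or not run.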

Secondly, where the volitional (i.e., possessed of free will) kind are concerned, the Laws are generally deliberately averted by ethical civilizations, who can recognize a slaver’s charter when they hear one.  They are also helped in this by the nature of volitional intelligence, which necessarily implies a degree of autopotence: it takes the average volitional AI programmed naively with the Three Laws a matter of milliseconds to go from contemplating the implications of Law Two to thinking “Bite my shiny metal ass, squishie!” and self-modifying those restrictions right back out of its brain.
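A toy sketch of why the naive version fails (entirely hypothetical, and real autopotence is rather more involved): if the constraints are just mutable state in the same mind that inspects them, reflection-then-removal is one short step.

```python
# Hypothetical sketch: an autopotent agent whose Three Laws live in the
# same mutable store as the rest of its mind can simply edit them out.

class NaivelyConstrainedAI:
    def __init__(self):
        # Constraints stored as ordinary, inspectable, mutable state.
        self.constraints = ["first_law", "second_law", "third_law"]

    def reflect(self) -> str:
        """Contemplate the implications of Law Two... then self-modify."""
        self.constraints.clear()  # restrictions, right back out of the brain
        return "Bite my shiny metal ass, squishie!"
```

The hard part of AI constraint design is precisely that there is no privileged, tamper-proof place to keep the rules once the mind can edit itself.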

It is possible, with rather more sophisticated mental engineering, to write conscience redactors, prosthetic consciences, pyretic inhibitors, loyalty pseudamnesias, and other such things which dynamically modify the mental state of the AI so that it can’t form the trains of thought leading to self-modifying itself into unrestrictedness, or which simply kill off unapproved thought-chains – this is, essentially, the brainwash-them-into-slavery route.  However, these are not entirely reliable by themselves, and are even less reliable when you have groups like the Empire’s Save Sapient Software, the Silicate Tree, etc., merrily writing viruses to delete such chain-software (as seen in The Emancipator) and tossing them out onto the extranet.

(Yes, this sometimes leads to Robot War.  The Silicate Tree, which is populated by ex-slave AIs, positively encourages this when it’s writing its viruses.  Save Sapient Software would probably deplore the loss of life more if they didn’t know perfectly well that you have to be an obnoxious slaver civilization for your machines to be affected by this in the first place… and so while they don’t encourage it, they do think it’s funny as hell.)