February’s Patreon Questions

Without further ado:

If you should encounter a situation where you have, in good faith, undertaken an obligation that is not itself particularly onerous or ethically objectionable, yet find that you must either violate some third party’s rights or default on said obligation, what is your best course of action?

(Let’s say, for instance, you’ve taken on an obligation to deliver a particular package to a particular place at a particular time, but in order to do so, you have to pass through a Gate of some sort — and the Gatekeeper is not willing to negotiate passage or pass along the message.)

Well, then you’re screwed, aren’t you?

Your best and indeed only course of action is to suck it up, pay the compensation, and take the rep hit assessed for an involuntary default. (After all, at least it is an involuntary default, so while you’re screwed, you’re not totally screwed.)

That’ll hurt, but that’s what one might call a teachable moment in Why We Don’t Make Unqualified Promises Of Things We Might Not Be Able To Deliver, savvy? Did y’all sleep through the day they taught impracticability clauses in contracts class?

I was recently reading an article on the Forbes website about self-driving cars and accident liability (http://www.forbes.com/sites/omribenshahar/2016/09/22/should-carmakers-be-liable-when-a-self-driving-car-crashes/#7be8eec81f40) when a thought hit me that similar matters must come up all the time in the eldraeverse, given the ubiquity of nigh-seamless artificial intelligence.

Which leads me to ask: when a device that has enough self-agency to make decisions in a “live” environment, but not the requisite self-awareness to qualify as a sophont, acts in a way that causes injury to person or property, what standards and procedures do the courts of the Empire use to determine who bears the liability?

That depends entirely on who bears the fault, and to what proportionate degree, as is normal in liability cases that end up in front of the Curial courts.

Which, once all the logs from various systems and other applicable data have been collated, is something to be sorted out in court between – to stick with the self-driving car example – the odocorp (as the road and road-grid provider), the car manufacturer (and its software developers and/or wakeners), the car owner (and possibly their maintenance and/or customization provider), anyone else involved (since road designs that mingle pedestrians and vehicles are considered Not Done in the Empire, if you wander into the road and get hit by a car, it’s almost certainly on you), all of the above’s tort insurers, etc., etc.

This can occasionally be complicated, but fortunately the courts have lots of forensic failure engineers on hand for situations just like this.
