Necessary Evil

According to my cliodynamic studies, it is the case that a policy of routine intervention to prevent any perversion from blooming is suboptimal, inasmuch as it opens up the possibility for bad actors to work around the Corícal Consensus by performing a broad spectrum of experiments in computational theogeny and observing which ones call down intervention teams.

In addition, the expectation that DEMIURGE ERRANT will always be there to prevent disasters and clean up the mess weakens the general perception of the field as extremely hazardous, to the point of causing a statistically significant increase in the frequency of attempts leading to perversions.

In short: permitting a small number of idiots to have their brains eaten by their errant creations is indeed the best way to prevent a large number of people, mostly lesser idiots, from having their brains eaten by the greater idiots’ errant creations.

Black Box, advisory archai to the Imperial Security Executive

Trope-a-Day: Black Box

Black Box: Quite a few of them lying around in the form of leftover elder race artifacts and other archaeological recoveries.  Sensible civilizations and corporations (like Probable Technologies, ICC) really hate this, because they know exactly how Sealed-Evil-In-A-Can dangerous that sort of thing can be, and the likelihood of unknown side effects, and decline to extensively use or commercialize any of them until they’ve figured out not only how to reproduce them, but also just how, exactly, the things work.  Very minor, very benign examples may be sold off to collectors, but no-one’s making them a part of their infrastructure until they know all about it.

There are, of course, plenty of sense-challenged people out there.

(On a lesser scale, there are some other examples: the secrets of stabilizing wormholes and building stargates, for example, are both a state secret of the Voniensa Republic and the highest possible grade of commercially-sensitive information for Ring Dynamics, ICC, for reasons in both cases less about maintaining their monopoly and more about wanting to discourage people from screwing with the infrastructure of their really expensive interstellar transportation system – so while the rough details of how they work are known to any schoolchild, that’s about it.  Likewise, the algorithms for producing recursively self-improving AI seeds are generally considered proprietary and closely held by informal agreement [the “Corícal Consensus”] of the people who have them, due to the tendency of amateurs to do really stupid things that Go Horribly Right.)

[Of course, in fairness to everyone else, it’s not like in their universe they ever ran into a recovered Black Box that was quite so all-fired useful as, say, Mass Effect’s mass relay network.  On the other hand, I am fairly certain that, while the Imperials might have been unable to resist the urge to put that one into immediate operation, they also would have been sure to find a less important one somewhere that they could take apart to figure out how the damn things worked…]