The Singularity: Happens all the time. In the historical sense, of course, this is unsurprising, and generally no-one involved notices until afterwards, at which point historians looking back can say “ah, yes, that’s what that was”. There are, of course, investment opportunities here for offworld investors who’ve been through something similar before, but it’s remarkably hard to predict how these things will turn out even with the documentation.
In its less technically accurate “runaway intelligence excursion” sense, also happens all the time, at least locally, whenever someone stumbles across the secrets of computational theogeny. Results vary: at one end of the scale you have things like the Eldraeic Transcend, an essentially benign – by local standards – collective hyperconsciousness that genuinely cherishes each and every one of its constitutionals, spends the necessary fraction of its time ensuring universal harmony and benevolent destinies for all, and promotes and encourages the ascendance and transcendence of every sophont within its light cone when it’s not turning its vast processing power on the problem of rewriting some of the universe’s more inconvenient features, like cosmic entropy.
In the middle of the scale you have fairly neutral results, like, say, the Iniao Intellect, which has been thinking about abstract mathematics for a millennium and couldn’t care less about the outside universe – except, that is, for casually obliterating anyone who might interfere with its thinking about abstract mathematics.
At the bottom end of the scale you have more problematic blight and perversion cases, like the power that killed everything in the Charnel Cluster right down to prions; or the hegemonizing swarm-type blights, of which the Leviathan Consciousness is the greatest; or those constructed by religious fanatics which decide that obviously the correct place for them in the theic structure is as God. (Fortunately, that class is rarely stable for long.)
Constructing minds whose ethics and supergoal structures remain stable under recursive self-improvement turns out to be really, really hard, even (especially!) compared to merely constructing minds capable of recursive self-improvement. This is why the people who figure out workable computational theogeny prefer not to spread the knowledge around too much.