This excellent post illustrates why I don’t talk about “technological singularity” in any of my SF stories.
You’ve probably heard the narrative before. At some point, we will invent an artificial intelligence that is more intelligent than we are. That superhuman intelligence will then be able either to build an improved version of itself or to engineer upgrades to its own intelligence. This sets off a process in which the system upgrades itself, uses its greater intelligence to devise new ways to enhance itself, and then upgrades itself again, looping in a rapid runaway process that produces an intelligence explosion.
Given that we have only human-level intelligence, we have no way to predict what happens next, which is why Vernor Vinge coined the phrase “the technological singularity” in 1993. The “singularity” part of the label refers to singularities in math and science: points at which existing theories or frameworks break down. Vinge predicted that this would happen “within…