Sudden emergence
In AI takeoff, sudden emergence is the hypothesis that AI development will follow a trajectory of mostly inconsequential systems, followed by a sudden jump to extremely capable, transformative systems. Another way to phrase sudden emergence is as "a discontinuity to (rather than from) AGI". The term is used in contrast to the explosive aftermath hypothesis, since it is often unclear which of the two someone means when saying "hard takeoff".
History
The term was coined by Ben Garfinkel in "On Classic Arguments for AI Discontinuities".[1]