Counterfactual of dropping a seed AI into a world without other capable AI

From Issawiki

"Well yeah, if we dropped a superintelligence into a world full of humans. But realistically the rest of the world will be undergoing intelligence explosion too. And indeed the world as a whole will undergo a faster intelligence explosion than any particular project could; to think that one project could pull ahead of everyone else is to think that, prior to intelligence explosion, there would be a single project innovating faster than the rest of the world combined!" [1]

"Christiano and other current notions of continuous takeoff are perfectly consistent with the counterfactual claim that, if an already superhuman ‘seed AI’ were dropped into a world empty of other AI, it would undergo recursive self-improvement." [2]

"When we build AGI we will be optimizing the chimp-equivalent-AI for usefulness, and it will look nothing like an actual chimp (in fact it would almost certainly be enough to get a decisive strategic advantage if introduced to the world of 2018)." [3]

Eliezer: "I'm not sure to what degree you think it's likely, but you do seem to be conceding much more probability that there is, in principle, some program where if it was magically transmitted to us, we could take a modern day large computing cluster and turn it into something that could generate what you call content a million times faster." "In other words, the question, I'm trying to separate out the question of, "How dumb is this thing, [points to head] how much smarter can you build an agent, if that agent were teleported into today's world, could it take over?" versus the question of "Who develops it, in what order, and were they all trading insights or was it more like a modern-day financial firm where you don't show your competitors your key insights, and so on, or, for that matter, modern artificial intelligence programs?"" [4]

There is something here similar to the error of "let's try to get WBE first, since that's safer". Yes, _if_ you could magically get whole brain emulation (WBE), that would be safer, but by pushing on WBE you also inadvertently push on de novo/neuromorphic AGI. Similarly here: _if_ you could magically jump straight to "the first de novo AGI", then that AGI could rapidly self-improve and gain capabilities. But before we get to that point, we would already have built slightly worse "AGIs" along the way, so the seed AI would not in fact be dropped into a world empty of other capable AI.

See also