Missing gear for intelligence

From Issawiki

'''Missing gear for intelligence''' (also called '''[[one wrong number problem]]''', '''step function''', '''understanding is discontinuous''', '''payoff thresholds argument''') is an argument for a [[Discontinuous takeoff|discontinuity]] in AI takeoff. Unlike a [[secret sauce for intelligence]], the missing gear argument does not require that the final part of AI development be a huge conceptual breakthrough.
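The "step function" / payoff-threshold intuition can be sketched with a toy model (a hypothetical illustration, not from the article; the component names are made up): if capability requires all of a fixed set of parts, payoff stays near zero until the last part is in place, so smooth incremental development still yields a discontinuous capability curve.

```python
# Toy model of the "missing gear" / payoff-threshold intuition.
# Assumption for illustration only: capability requires ALL of a fixed
# set of components, so payoff is a step function of progress even
# though development itself proceeds one small increment at a time.

REQUIRED_COMPONENTS = {"memory", "planning", "world_model", "learning"}

def capability(components_built: set) -> float:
    """Payoff is 0.0 until every required component is present, then jumps to 1.0."""
    return 1.0 if REQUIRED_COMPONENTS <= components_built else 0.0

# Incremental development: one component added per time step.
timeline = ["memory", "planning", "world_model", "learning"]
built = set()
for t, part in enumerate(timeline, start=1):
    built.add(part)
    # Only the final step flips capability from 0.0 to 1.0.
    print(t, part, capability(built))
```

The point of the sketch is that the discontinuity comes from the payoff structure, not from the final component being conceptually harder than the others.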
 
In ''Intelligence Explosion Microeconomics'' (IEM), [[Eliezer]] writes: "If the nearest competitor was previously only seven days behind, these seven days have now been amplified into a technological gulf enabling the leading AI to shut down, sandbox, or restrict the growth of any competitors it wishes to fetter."<ref>https://intelligence.org/files/IEM.pdf#page=71</ref> The idea that a seven-day lead can result in a local foom makes me think Eliezer does not require the final missing gear to be a huge conceptual breakthrough.
  
 
==External links==
 
Revision as of 23:54, 6 June 2020

==See also==

==References==
<references/>