Missing gear vs secret sauce

I want to distinguish between the following two framings:

* [[Missing gear for intelligence|missing gear]]/[[one wrong number problem]]/step function/understanding is discontinuous/[https://aiimpacts.org/likelihood-of-discontinuous-progress-around-the-development-of-agi/#Payoff_thresholds payoff thresholds]: "missing gear" doesn't imply that the last piece added is all that significant -- it just says that adding it caused a huge jump in capabilities.
 
* [[secret sauce for intelligence]]/small number of breakthroughs: "small number of breakthroughs" says that the last added piece must have been a significant piece (which is what a breakthrough is).
 
I'm not sure how different these two framings actually are. But when thinking about discontinuities, I've noticed that I am somewhat inconsistent, sometimes conflating the two and sometimes visualizing them as distinct.

"But this conversation did get me thinking about the topic of culturally transmitted software that contributes to human general intelligence. That software can be an *important* gear even if it's an algorithmically shallow part of the overall machinery. Removing a few simple gears that are 2% of a machine's mass can reduce the machine's performance by way more than 2%. Feral children would be the case in point." "But as necessary as it may be to avoid feral children, this kind of shallow soft-software doesn't strike me as something that takes a long time to redevelop, compared to hard-software like the secrets of computational neuroscience." [1]