Does overhang just mean "not a bottleneck"?
Resource overhang and AI takeoff
A key question is whether we are already in a hardware overhang or some other kind of "resource bonanza". Whether we are in an overhang depends on whether early AGI systems will be able to make use of existing resources much more efficiently than current systems can, which suggests that the existence of an overhang depends in part on how few insights are needed to get to AGI. More precisely, there is a feedback loop: a resource overhang amplifies any existing discontinuity, because a system that crosses the relevant capability threshold can immediately draw on the accumulated stock of resources rather than waiting for new ones to be built.
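As a rough illustration of this amplification, here is a minimal toy simulation; all parameter values and functional forms are assumptions chosen for illustration, not figures from any cited source. Hardware stock compounds every year regardless of AI progress, capability grows slowly through research insights, and once capability crosses an assumed "AGI threshold" the system can exploit the entire accumulated hardware stock, so the size of the jump scales with how large the overhang has grown.

```python
# Toy model (illustrative only): a resource overhang amplifying a capability discontinuity.
# All parameters and functional forms are assumptions, not taken from the article or its sources.

def simulate(years=40, hardware_growth=1.5, insight_rate=0.2, agi_threshold=5.0):
    """Return (year, effective_capability) pairs for a toy trajectory.

    Hardware stock compounds every year regardless of AI progress (the overhang).
    Capability grows slowly via insights until it crosses `agi_threshold`; after
    that, the system can exploit the whole accumulated hardware stock, so
    effective capability jumps by a factor proportional to the overhang's size.
    """
    hardware = 1.0      # accumulated compute/data/other resources
    capability = 0.0    # pre-AGI capability, driven by research insights
    crossed = False
    trajectory = []
    for year in range(years):
        hardware *= hardware_growth   # the overhang keeps growing in the background
        capability += insight_rate    # slow, steady research progress
        if not crossed and capability >= agi_threshold:
            crossed = True
        # Once over the threshold, existing resources can be used far more efficiently,
        # so effective capability is amplified by the hardware already lying around.
        effective = capability * hardware if crossed else capability
        trajectory.append((year, effective))
    return trajectory


if __name__ == "__main__":
    for year, cap in simulate():
        print(f"year {year:2d}: effective capability {cap:12.2f}")
```

In this sketch, the later the threshold is crossed (or the faster hardware compounds), the larger the one-time jump when it is crossed, which is the sense in which an overhang amplifies a discontinuity.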
Andy Jones argues that we are already in a resource overhang, because GPT-3 could be scaled up to be much more capable but no company has tried to do so.[1]

References

1. Andy Jones, "Are we in an AI overhang?" https://www.greaterwrong.com/posts/N6vZEnCn6A95Xn39p/are-we-in-an-ai-overhang