Resource overhang
Does overhang just mean "not a bottleneck"?
Resource overhang and AI takeoff
An open question is whether we are already in a hardware overhang or some other "resource bonanza". Whether we are in an overhang depends on whether future early AGI systems could make use of existing resources much more efficiently than current systems, so the overhang question depends in turn on how few insights are needed to get to AGI. More precisely, there is a feedback loop: a resource overhang amplifies any existing discontinuity, because a small algorithmic advance lets the new system suddenly exploit a large pool of previously idle resources.
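A minimal toy sketch of the amplification point, under assumed (not sourced) simplifications: capability is treated as insight times deployable resources, and the numbers for resources in use, idle overhang, and the size of the insight jump are purely hypothetical.

```python
def capability(insight: float, resources: float) -> float:
    """Toy functional form (an assumption): capability = insight * resources."""
    return insight * resources

resources_in_use = 1.0    # resources current systems already exploit (hypothetical units)
idle_overhang = 100.0     # hypothetical pool of idle resources an early AGI could recruit

insight_before, insight_after = 1.0, 2.0   # a modest algorithmic jump (hypothetical)

# Without an overhang, the new insight only uses the resources already in use.
no_overhang_jump = (capability(insight_after, resources_in_use)
                    / capability(insight_before, resources_in_use))

# With an overhang, the same insight immediately recruits the idle pool as well.
overhang_jump = (capability(insight_after, resources_in_use + idle_overhang)
                 / capability(insight_before, resources_in_use))

print(no_overhang_jump)  # 2.0   -> capability jumps only as much as the insight did
print(overhang_jump)     # 202.0 -> the overhang turns the same insight into a far larger jump
```

The point of the toy numbers is only that the relative jump in capability scales with the ratio of total accessible resources to resources already in use, which is why a large overhang magnifies whatever discontinuity the insight alone would have produced.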
Andy Jones argues that we are already in a resource overhang, since GPT-3 could be scaled up into a much more capable system with existing resources, but no company has yet tried to do so.[1] See scaling hypothesis.