Importance of knowing about AI takeoff

'''Importance of knowing about AI takeoff''' is about the "so what?" of knowing which AI takeoff scenario will happen. How will our actions change if we expect a FOOM or a continuous takeoff or some other scenario?
 
* Weight on [[AI prepping]]: In a FOOM scenario it is unlikely that there are any selfish actions that would benefit individuals, so one should shift those resources into either altruistic actions (like helping with alignment) or short-term consumption. In contrast, with a more continuous takeoff, AI prepping becomes relatively more important.
 
* Working to mitigate specific threats: e.g. if one expects AI-powered propaganda to be a significant factor during AI takeoff, then it makes sense to spend time thinking about it now.
 
==See also==
* [[Importance of knowing about AI timelines]]
  
 
[[Category:AI safety]]
 