'''Importance of knowing about AI takeoff''' is about the "so what?" of knowing which [[AI takeoff]] scenario will happen. How will our actions change if we expect a [[FOOM]] or a [[continuous takeoff]] or some other scenario?
 
* Weight on [[AI prepping]]: In a FOOM scenario it is unlikely that there are any selfish actions that benefit individuals, so one should shift those resources into either altruistic actions (like helping with alignment) or short-term consumption. In contrast, with more continuous takeoff, AI prepping becomes relatively more important.
* Working to mitigate specific threats: e.g. if one expects AI-powered propaganda to be deployed during AI takeoff, then it makes sense to spend time thinking about it now.
* Under some scenarios we should expect to see [[warning shots]]/early misalignment, whereas in others we should expect a [[treacherous turn]]. This has consequences for how careful we should be about alignment, and for how complete an "on-paper understanding" of alignment we should have in advance.
TODO: more listed at the bottom of https://sideways-view.com/2018/02/24/takeoff-speeds/
==See also==
* [[Importance of knowing about AI timelines]]
  
 
[[Category:AI safety]]
 

Latest revision as of 02:13, 5 March 2021
