Future planning

* The most decision-relevant questions for me right now (everything else should feed into one of these questions):
** AI safety vs something else? Right now AI safety seems like the best candidate for the biggest/soonest change, but I want to investigate some other things.
** If AI safety, then what technical agenda seems best? This matters for (1) deciding what, if anything, to do technical research on; (2) deciding what technical research to pay attention to or support.
** If AI safety, then what will the end of the world look like? (Basically this is takeoff speeds plus what the most likely failure mode looks like.) This matters for [[AI prepping|prepping]].
** How likely is the end of the world?
** When will AGI come?

Example scenarios:

{| class="wikitable"
! Name of scenario !! Next big thing? !! AI safety technical agenda !! What will the end of the world look like? !! How likely is the world to end? !! When will the world end/reach a singularity/cure aging? !! What I should do
|-
| Eliezerian end of world || AGI || MIRI? || FOOM/AI takeover || Highly likely? || 10-20 years? || Don't have kids. Probably don't even bother with wife search. Focus on AI safety research, either field building or figuring out how to contribute to MIRI-like technical work.
|-
| Eliezerian end of world, but longer timelines || AGI || MIRI? || FOOM/AI takeover || Highly likely? || 40 years ||
|-
| Eliezerian end of world, but even longer timelines || AGI || MIRI? || FOOM/AI takeover || Highly likely? || 60 years ||
|-
| Age of em || WBE || || || || ||
|-
| Paul's end of world || AGI || ? || || || 20 years? || [[AI prepping]]???
|-
| Great Stagnation (world stays basically the same for the next 100 years) || || || || || ||
|-
| Super long AI timelines (AI won't come for 100 years) || || || || || ||
|}
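
The table above is qualitative; a minimal sketch of one way to make it action-guiding is to attach a subjective probability to each scenario and see which recommended action carries the most probability mass. All numbers below are made-up placeholders (not estimates from this page), and the action labels for rows the table leaves blank are hypothetical.

<syntaxhighlight lang="python">
# Hypothetical sketch: weight each scenario from the table by a made-up
# subjective probability and aggregate the probability mass behind each
# candidate action. None of these numbers are estimates from this page.

scenarios = [
    # (scenario name, subjective probability, action in that scenario)
    ("Eliezerian end of world",                        0.15, "AI safety research / field building"),
    ("Eliezerian end of world, but longer timelines",  0.10, "AI safety research / field building"),
    ("Eliezerian end of world, even longer timelines", 0.05, "AI safety research / field building"),
    ("Age of em",                                      0.05, "unclear (placeholder)"),
    ("Paul's end of world",                            0.20, "AI prepping?"),
    ("Great Stagnation",                               0.25, "ordinary life planning (placeholder)"),
    ("Super long AI timelines",                        0.20, "ordinary life planning (placeholder)"),
]

# Sanity check: the scenario probabilities should sum to 1.
assert abs(sum(p for _, p, _ in scenarios) - 1.0) < 1e-9

# Total probability mass behind each action.
weights = {}
for _, p, action in scenarios:
    weights[action] = weights.get(action, 0.0) + p

for action, mass in sorted(weights.items(), key=lambda kv: -kv[1]):
    print(f"{mass:.2f}  {action}")
</syntaxhighlight>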