Pivotal act

From Issawiki
Revision as of 14:41, 26 February 2022

A pivotal act is (putting down my guess here) something that allows a single entity to take over the world, form a singleton, or prevent others from taking over the world, forming a singleton, or achieving AGI.

According to some views of AI takeoff, a pivotal act is an important part of the plan for preventing existential doom from AI.

Examples

  • "Build self-replicating open-air nanosystems and use them (only) to melt all GPUs."[1]
  • Make progress on the full alignment problem (i.e. not restricted to a limited AI system like present-day systems or minimal AGI) faster than humans can[2]

Non-examples

External links

References

  1. ↑ https://www.lesswrong.com/posts/7im8at9PmhbT4JHsW/ngo-and-yudkowsky-on-alignment-difficulty#1_1__Deep_vs__shallow_problem_solving_patterns
  2. ↑ https://www.lesswrong.com/posts/7im8at9PmhbT4JHsW/ngo-and-yudkowsky-on-alignment-difficulty#1_1__Deep_vs__shallow_problem_solving_patterns
