Late 2021 MIRI conversations
https://www.lesswrong.com/s/n945eovrA3oDueqtq
Title | Summary | Things that were interesting to me | Further reading |
---|---|---|---|
Ngo's view on alignment difficulty | Richard Ngo makes his case for why he is more optimistic than Eliezer Yudkowsky that humanity will handle the creation of AGI well. | | |