Simple core of consequentialist reasoning

From Issawiki
Revision as of 00:25, 19 February 2020 by Issa (talk | contribs) (Created page with ""the MIRI intuition that there is a small core of good consequentialist reasoning that is important for AI capabilities and that can be discovered through theoretical research...")

The phrase refers to "the MIRI intuition that there is a small core of good consequentialist reasoning that is important for AI capabilities and that can be discovered through theoretical research." Source: https://agentfoundations.org/item?id=1220

See also Nate Soares's comment: https://agentfoundations.org/item?id=1228