Simple core of consequentialist reasoning
Revision as of 00:25, 19 February 2020
"the MIRI intuition that there is a small core of good consequentialist reasoning that is important for AI capabilities and that can be discovered through theoretical research." https://agentfoundations.org/item?id=1220
and Nate's comment https://agentfoundations.org/item?id=1228