Simple core of consequentialist reasoning

"the MIRI intuition that there is a small core of good consequentialist reasoning that is important for AI capabilities and that can be discovered through theoretical research." https://agentfoundations.org/item?id=1220

Nate's comment at https://agentfoundations.org/item?id=1228 is also relevant.

==See also==

* [[Hardware-driven vs software-driven progress]]