Reference class forecasting on human achievements argument for AI timelines
'''Reference class forecasting on human achievements''' uses various [[reference class]]es like "ambitious STEM technology" or "notable mathematical conjectures" to get an [[outside view]] probability for [[AI timelines]]. The [[Open Philanthropy]] report on semi-informative priors is an example of this type of forecasting.<ref>https://www.openphilanthropy.org/blog/report-semi-informative-priors</ref> This estimate might then be further adjusted according to the inputs going into AGI creation (e.g. the number of researchers and the amount spent on computation).
==References==
Revision as of 01:42, 5 April 2021