
PeterMcCluskey comments on A model of the Machine Intelligence Research Institute - Oxford Prioritisation Project - Effective Altruism Forum

Comment author: PeterMcCluskey 22 May 2017 02:41:16PM 5 points

Can you explain your expected far future population size? It looks like your upper bound is something like 10 orders of magnitude lower than Bostrom's most conservative estimates.

That disagreement makes all the other uncertainty look trivial in comparison.
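A rough numerical sketch of the gap being pointed to here; the 10^13 figure for the model's upper bound is a hypothetical stand-in implied by the "10 orders of magnitude" remark, not a number taken from the project's report:

```python
import math

# Rough scale comparison. Bostrom's most conservative figure is 10^23
# biological humans in the Virgo Supercluster; the model's upper bound
# below is a hypothetical placeholder implied by "10 orders of magnitude
# lower", not a number from the project's actual report.
bostrom_conservative = 1e23   # biological humans (Virgo Supercluster)
model_upper_bound = 1e13      # hypothetical placeholder

gap = math.log10(bostrom_conservative / model_upper_bound)
print(f"Gap: {gap:.0f} orders of magnitude")   # Gap: 10 orders of magnitude
# A factor-of-10^10 disagreement about the population term swamps any
# plausible uncertainty elsewhere in the model.
```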

Comment author: ThomasSittler 23 May 2017 10:55:02AM * 1 point

Do you mean Bostrom's estimate that "the Virgo Supercluster could contain 10^23 biological humans"? This did come up in our conversations. One objection that was raised is that humanity could go extinct, or that, for some other reason, colonisation of the Supercluster could have a very low probability. There was significant disagreement among us, and if I recall correctly we chose the median of our estimates.

Do you think Bostrom is correct here? What probability distribution would you have chosen for the expected far future population size? :)
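For concreteness, a minimal sketch of the aggregation described above (discounting Bostrom's figure by each person's colonisation probability and taking the median); every probability in it is invented purely for illustration:

```python
import statistics

# Hypothetical sketch of the aggregation described above: each person
# discounts Bostrom's 10^23 figure by their own probability that
# colonisation of the Supercluster happens, and the group takes the
# median of the resulting estimates. All probabilities below are invented
# for illustration; none comes from the thread or the report.
bostrom_population = 1e23
colonisation_probabilities = [1e-10, 1e-6, 1e-3, 0.05, 0.2]

individual_estimates = [p * bostrom_population for p in colonisation_probabilities]
median_estimate = statistics.median(individual_estimates)
mean_estimate = statistics.mean(individual_estimates)

print(f"Median: {median_estimate:.3g}")  # 1e+20
print(f"Mean:   {mean_estimate:.3g}")    # ~5e+21
# The median discards most of the weight in the optimistic tail, which is
# exactly where the disagreement with Bostrom-style estimates comes from.
```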

Comment author: PeterMcCluskey 23 May 2017 07:11:03PM 3 points

"colonisation of the Supercluster could have a very low probability."

What do you mean by very low probability? If you mean a one-in-a-million chance, that's not improbable enough to answer Bostrom. If you mean something that would actually answer Bostrom, then please respond to the SlateStarCodex post "Stop Adding Zeroes".

I think Bostrom is on the right track, and that any analysis which follows your approach should use at least a 0.1% chance of more than 10^50 human life-years.
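The arithmetic behind these two claims, as a short sketch; both figures (a 10^-6 chance of 10^23 humans, and a 0.1% chance of 10^50 life-years) come straight from the comments above:

```python
# Expected-value arithmetic behind the two claims above.

# A "one in a million" chance is not improbable enough to answer Bostrom:
# applied to his most conservative figure it still leaves an enormous
# expected population.
expected_population = 1e-6 * 1e23        # 1e+17 expected biological humans

# The suggested floor for this kind of analysis: at least a 0.1% chance of
# more than 10^50 human life-years.
expected_life_years = 1e-3 * 1e50        # 1e+47 expected human life-years

print(f"{expected_population:.0e} expected humans")
print(f"{expected_life_years:.0e} expected life-years")
# Either figure dwarfs any near-term term in the model, so the expected
# value of the far future is driven almost entirely by this tail.
```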