Dr_Manhattan comments on Ask MIRI Anything (AMA) - Effective Altruism Forum


Comment author: Dr_Manhattan 12 October 2016 08:59:46PM 0 points

> I’d guess that humanity as a whole has a fairly low probability of success, with wide error bars.

Just out of curiosity, how would your estimate update if you had enough resources to do anything you deemed necessary, but not enough to affect the current trajectory of the field?

Comment author: So8res 12 October 2016 11:40:44PM 0 points

I'm not sure I understand the hypothetical -- most of the actions that I deem necessary are aimed at affecting the trajectory of the AI field in one way or another.

Comment author: Dr_Manhattan 13 October 2016 01:01:01PM 0 points

OK, that's informative. So the dominant factor is not the ability to get to the finish line faster (which kind of makes sense).