In talking with people about the long-term future, I've found it extremely helpful to give an estimate of the probability that humanity goes extinct by 2100 (or within 100 years). Right now, I say ~10% or 5-19%, and then say something like "it would be really nice if we could get that number below 1%". My estimate is taken from these sources:
- FHI's casual 2008 survey of various x-risks. Taken together, they give a 19% chance by 2100.
- The 2007 Stern Review, a 700-page report on climate change. It uses 0.1% as an upper-bound modeling assumption for annual extinction risk, which compounds to ~9.5% over the next 100 years (by 2107).
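(The conversion from an annual rate to a century-level figure is just survival-probability compounding; a minimal sketch, assuming a constant independent annual risk:)

```python
def cumulative_risk(annual_risk: float, years: int) -> float:
    """Probability of extinction within `years`, given a constant
    annual extinction risk (survival probabilities multiply)."""
    return 1 - (1 - annual_risk) ** years

# Stern's 0.1%/yr assumption over a 100-year horizon:
print(f"{cumulative_risk(0.001, 100):.1%}")  # -> 9.5%
```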
- This July 2018 Vox article by Liv Boeree that references the FHI study and also says "5 to 19 percent chance of complete human extinction by the end of this century". (I'm not sure where the 5% comes from.)
- This 2016 report on GCRs from FHI and the Global Priorities Project. They reference the two sources above and just say it's hard to create a reasonable estimate.
However, I find the evidential basis for these estimates pretty weak. If someone asked me to "back up" my 10% number, the best I'd have is an informal survey circulated at an x-risk conference in 2008. So, a couple of questions:
- Are there other sources that I'm missing?
- Do others also feel like it would be helpful to have an updated/more rigorous estimate here? (Or is it not actually helpful to operate at this level of abstraction? i.e. Should we concentrate just on individual sources of x-risk instead?)
- Is it even possible to create an estimate like this? Or is the range of uncertainty so large that we'd need to give an estimate like 2-59%? (Clearly, this gets more difficult the further out we project. But can't we estimate it for 2050, 2075, or 2100?)
Thanks for your thoughts and help!
Don't forget the Doomsday Argument.
https://arxiv.org/abs/1705.08807 has a question about the probability that the outcome of AI will be "extremely bad."
Where in the Stern report are you looking?
The fixed 0.1% extinction risk is used as a discount rate in the Stern report: it closes the model and yields finite values (instead of infinite benefits) after pure time-preference discounting is excluded on ethical grounds. Unfortunately, the assumption of infinite confidence in a fixed extinction rate gives very different (lower) expected values than a distribution that accounts for the possibility of extinction risk eventually becoming stably low for long periods (the Stern version gives a probability of less than 1 in 20,000 of civilization surviving ...)
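To see why a fixed rate and a distribution over rates diverge so sharply over long horizons, compare a certain 0.1%/yr rate against a toy mixture with the same expected annual rate but uncertainty about which rate holds. (The mixture weights and rates here are illustrative assumptions, not taken from the Stern Review.)

```python
def survival(annual_risk: float, years: int) -> float:
    """Probability of surviving `years` at a constant annual risk."""
    return (1 - annual_risk) ** years

years = 10_000

# Infinite confidence in a fixed 0.1%/yr rate:
fixed = survival(0.001, years)

# Same mean rate, but a 50/50 mixture between 0.2%/yr and 0%/yr;
# the low-rate branch dominates long-run survival.
mixture = 0.5 * survival(0.002, years) + 0.5 * survival(0.0, years)

print(f"{fixed:.2e}")   # ~4.5e-05, i.e. less than 1 in 20,000
print(f"{mixture:.3f}")  # ~0.500
```

Even though both scenarios have the same expected annual risk, the fixed-rate model drives long-run survival probability toward zero, while the mixture leaves it near 50%.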