
In 1939, Einstein wrote to Roosevelt:[1]

It may be possible to set up a nuclear chain reaction in a large mass of uranium…and it is conceivable — though much less certain — that extremely powerful bombs of a new type may thus be constructed.

Just a few years later, these bombs were created. In little more than a decade, enough had been produced that, for the first time in history, a handful of decision-makers could destroy civilisation.

Humanity had entered a new age, where we faced not only existential risks[2] from our natural environment, but also those of our own creation.

In this new age, what should be our biggest priority as a civilisation? Improving technology? Helping the poor? Changing the political system?

Here’s a suggestion that’s not so often discussed: our first priority should be to survive.

So long as civilisation continues to exist, we’ll have the chance to solve all our other problems, and have a far better future. But if we go extinct, that’s it.

Why isn’t this priority more discussed? Here’s one reason: many people don’t yet appreciate the change in situation, and so don’t think our future is at risk.

Social science researcher Spencer Greenberg surveyed Americans on their estimate of the chances of human extinction within 50 years. Many respondents thought the chances were extremely low: over 30% guessed they were under one in ten million.[3]

We used to think the risks were extremely low as well, but when we looked into it, we changed our minds. As we’ll see, researchers who study these issues think the risks are over one thousand times higher, and are probably increasing.

These concerns have started a new movement working to safeguard civilisation, which has been joined by Stephen Hawking, Max Tegmark, and new institutes founded by researchers at Cambridge, MIT, Oxford, and elsewhere.

In the rest of this article, we cover the greatest risks to civilisation, including some that might be bigger than nuclear war and climate change. We then make the case that reducing these risks could be the most important thing you do with your life, and explain exactly what you can do to help. If you would like to use your career to work on these issues, we can also give one-on-one support.

Continue reading on 80,000 Hours' website

This work is licensed under a Creative Commons Attribution 4.0 International License.


  1. "In the course of the last four months it has been made probable — through the work of Joliot in France as well as Fermi and Szilárd in America — that it may become possible to set up a nuclear chain reaction in a large mass of uranium, by which vast amounts of power and large quantities of new radium-like elements would be generated. Now it appears almost certain that this could be achieved in the immediate future. "This new phenomenon would also lead to the construction of bombs, and it is conceivable — though much less certain — that extremely powerful bombs of a new type may thus be constructed. A single bomb of this type, carried by boat and exploded in a port, might very well destroy the whole port together with some of the surrounding territory. However, such bombs might very well prove to be too heavy for transportation by air."

    Einstein–Szilárd letter, Wikipedia, Archived link, retrieved 17 October 2017. ↩︎

  2. Nick Bostrom defines an existential risk as an event that “could cause human extinction or permanently and drastically curtail humanity’s potential”. An existential risk is distinct from a global catastrophic risk (GCR) in its scope: a GCR is catastrophic at a global scale, but retains the possibility of recovery. The phrase “existential threat”, by contrast, is often used simply as an intensifier, to make a threat sound more dire. ↩︎

  3. Greenberg surveyed users of Mechanical Turk, who tend to be aged 20–40 and more educated than average, so the survey doesn’t represent the views of all Americans. See more detail in this video: Social Science as Lens on Effective Charity: results from four new studies – Spencer Greenberg.

    The initial survey found a median estimate of the chance of extinction within 50 years of 1 in 10 million. Greenberg did three replication studies and these gave higher estimates of the chances. The highest found a median of 1 in 100 over 50 years. However, even in this case, 39% of respondents still guessed that the chances were under 1 in 10,000 (about the same as the chance of a 1km asteroid strike). In all cases, over 30% thought the chances were under 1 in 10 million. You can see a summary of all the surveys here.
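
    For intuition, the asteroid comparison above follows from converting an annual impact rate into a multi-decade probability. Here is a minimal sketch, assuming roughly one 1 km asteroid impact per 500,000 years (a commonly cited ballpark rate, not a figure taken from the surveys themselves):

    ```python
    # Convert an assumed annual impact rate into a 50-year probability.
    # The one-per-500,000-years rate is a rough, commonly cited ballpark
    # for 1 km asteroids, not a number from Greenberg's surveys.
    annual_rate = 1 / 500_000

    # P(at least one impact in 50 years) = 1 - (1 - p)^50,
    # which for tiny p is approximately 50 * p.
    p_50_years = 1 - (1 - annual_rate) ** 50

    print(f"about 1 in {round(1 / p_50_years):,}")  # -> about 1 in 10,000
    ```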

    Note that when we asked people about the chances of extinction with no timeframe, the estimates were much higher. One survey gave a median of 75%. This makes sense — humanity will eventually go extinct. This helps to explain the discrepancy with some other surveys. For instance, “Climate Change in the American Mind” (May 2017, archived link) found that the median American thought the chance of extinction from climate change is around 1 in 3. This survey, however, didn’t ask about a specific timeframe. When Greenberg tried to replicate the result with the same question, he found a similar figure. But when Greenberg asked about the chance of extinction from climate change in the next 50 years, the median dropped to only 1%. Many other studies also fail to elicit low-probability answers: people won’t typically answer 0.00001% unless that option is presented explicitly.

    However, as you can see, these types of surveys tend to give very unstable results. Answers seem to vary depending on exactly how the question is asked and on its context. In part, this is because people are very bad at estimating tiny probabilities. This makes it hard to give a narrow estimate of what the population in general thinks, but none of what we’ve discovered refutes the idea that a significant number of people (say, over 25%) think the chances of extinction in the short term are very, very low, and probably lower than the risk of an asteroid strike alone. Moreover, the instability of these estimates doesn’t seem like a reason for confidence that humanity is handling these risks rationally. ↩︎

Comments (3)

I think the approach taken in this post is still good: make the case that extinction risks are too big to ignore and neglected, so that everyone should agree we should invest more in reducing them (whether or not you're into longtermism).

It's similar to the approach taken in the Precipice, though less philosophical and longtermist.

I think it was an impactful post, in that it was 80k's main piece arguing in favour of focusing more on existential risk, during a period when the community seems to have significantly shifted towards focusing on those risks, and when 80k was likely one of the influences on that shift.

If I were writing it again, I'd probably emphasise the risks from asteroids and nuclear war less than I did, since I think they're small. I'd also try to avoid getting into the question of what people think the risk of extinction is by default, since different surveys give such variable data about this.

The data and arguments in The Precipice are more thorough and up to date, so I'd also try to bring this post in line with that.

Given this hasn't been done, these days I normally refer people to our podcast with Toby Ord or The Precipice instead.

Not sure if this is the best place to raise this, but the link to Concrete Problems in AI Safety on 80,000 Hours' site is broken.

I think it's important to be clear about the scale of the situation, and on that we agree. However, when we use surveys as a reference, I believe we need surveys that are differentiated by place, age, sex, and gender, and the question shouldn't be too general; it needs to be more specific. So it's quite relative to talk about a survey of the US as if it spoke for other countries too. It's true that climate change in particular isn't prioritised in this country, but I think we can produce more grounded analyses.
