
I have not researched longtermism deeply. However, what I have found out so far leaves me puzzled and skeptical. As I currently see it, you can divide what longtermism cares about into two categories:

1) Existential risk.

2) Common sense long-term priorities, such as:

  • economic growth
  • environmentalism
  • scientific and technological progress
  • social and moral progress 

Existential risk isn’t a new idea (relative to longtermism) and economic growth, environmentalism, and societal progress aren’t new ideas either. Suppose I already care a lot about low-probability existential catastrophes and I already buy into common sense ideas about sustainability, growth, and progress. Does longtermism have anything new to tell me? 


Longtermism suggests a different focus within existential risks, because it feels very differently about "99% of humanity is destroyed, but the remaining 1% are able to rebuild civilisation" and "100% of humanity is destroyed, civilisation ends", even though from the perspective of people alive today these outcomes are very similar.

I think relative to neartermist intuitions about catastrophic risk, the particular focus on extinction increases the threat from AI and engineered biorisks relative to e.g. climate change and natural pandemics. Basically, total extinction is quite a high bar, and one most easily reached by threats that deliberately pursue it, unlike natural disasters, which don't counter-adapt when some people survive.

Longtermism also supports research into civilisational resilience measures, like bunkers, or research into how or whether civilisation could survive and rebuild after a catastrophe.

Longtermism also lowers the probability bar that an extinction risk has to reach before being worth taking seriously. I think this used to be a bigger part of the reason why people worked on x-risk when typical risk estimates were lower; over time, as risk estimates increased, longtermism became less necessary to justify working on them.
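The "lowers the probability bar" point can be made concrete with a toy expected-value sketch. All numbers below (the present population figure, the Bostrom-style future-lives figure, and the size of the risk reduction) are illustrative assumptions, not estimates from this thread:

```python
# Toy sketch: how the same extinction-risk reduction is valued under
# neartermist vs longtermist assumptions. All figures are illustrative.

def expected_lives_saved(risk_reduction: float, lives_at_stake: float) -> float:
    """Expected lives saved by shaving risk_reduction off extinction probability."""
    return risk_reduction * lives_at_stake

neartermist_stake = 8e9    # roughly the people alive today (assumed)
longtermist_stake = 1e16   # Bostrom-style count of potential future lives (assumed)

risk_reduction = 1e-5      # a 0.001% reduction in extinction probability (assumed)

ev_near = expected_lives_saved(risk_reduction, neartermist_stake)  # ~8e4 lives
ev_long = expected_lives_saved(risk_reduction, longtermist_stake)  # ~1e11 lives
```

On these assumed numbers, the same intervention looks about a million times more valuable to the longtermist, which is the sense in which even very low-probability extinction risks can clear the bar under longtermism while failing to under neartermism.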

because it feels very differently about "99% of humanity is destroyed, but the remaining 1% are able to rebuild civilisation" and "100% of humanity is destroyed, civilisation ends"

Maybe? This depends on what you think about the probability that intelligent life re-evolves on earth (it seems likely to me) and how good you feel about the next intelligent species on earth vs humans.

the particular focus on extinction increases the threat from AI and engineered biorisks

IMO, most x-risk from AI probably doesn't come from literal human extinction but instead AI s... (read more)

Ben Millwood:
Yeah, it seems possible to be a longtermist but not think that human extinction entails the loss of all hope, but extinction still seems more important to the longtermist than to the neartermist. I guess longtermists and neartermists will also feel quite differently about this fate.

This is an interesting point, and I guess it’s important to make, but it doesn’t exactly answer the question I asked in the OP.

In 2013, Nick Bostrom gave a TEDx talk about existential risk, arguing that it's so important because of the 10^umpteen future lives at stake. In the talk, Bostrom referenced even older work by Derek Parfit. (From a quick Google, the Parfit material on existential risk is from his book Reasons and Persons, published in 1984.)

I feel like people in the EA community only started talking about "longtermism" in the la... (read more)

Ben Millwood:
I guess I think of caring about future people as the core of longtermism, so if you're already signed up to that, I would already call you a longtermist? I think most people aren't signed up for that, though.
KevinO:
I agree that if you're already bought into moral consideration for 10^umpteen future people, that's longtermism.

One takeaway, I think, is that these things which already seem good under common sense are much more important in the longtermist view. For example, I think a longtermist would want extinction risk to be much lower than a commonsense view would call for.

Does this apply to things other than existential risk?

KevinO:
Yes. I think the commonsense priorities on your list are even more beneficial in the view of longtermism. Factors like "would this have happened anyway, just a bit later?" may still apply and reduce the impact of any given intervention. Then again, notions like "we can reach more of the universe the sooner we start expanding" could be an argument that sooner is better for economic growth.

Answering this question depends a little on having a sense of what the "non-longtermist status quo" is, but:

  • I think there's more than one popular way of thinking about issues like this,
    • in particular I think it's definitely not universal to take existential risk seriously,
  • I think common-sense and the status quo include some (at least partial) longtermism, e.g. I think popular rhetoric around climate change has often held the assumption that we were taking action primarily with our descendants in mind, rather than ourselves.