Hello Effective Altruism Forum,
I am Seth Baum, and I will be here to answer your questions on 3 March 2015, 7-9 PM US ET (New York time). You can post questions in this thread in the meantime. Here is some more background:
I am Executive Director of the Global Catastrophic Risk Institute (GCRI). I co-founded GCRI in 2011 with Tony Barrett. GCRI is an independent, nonprofit think tank studying major risks to the survival of human civilization. We develop practical, effective ways to reduce the risks.
There is often some confusion among effective altruists about how GCRI uses the term "global catastrophic risk". The bottom line is that we focus on the risk of catastrophes that could cause major permanent harm. This is similar to some uses of the term "existential risk". You can read more about that here.
GCRI just announced major changes to its identity and direction. We are focusing increasingly on in-house research aimed at assessing the best ways of reducing the risks. This is at the heart of our new flagship integrated assessment project, which puts all the gcrs into one study to identify the best risk-reduction opportunities.
If you’d like to stay up to date on GCRI, you can sign up for our monthly email newsletter. You can also support GCRI by donating.
GCRI is not active on social media, but you can follow me on Twitter.
I am excited to have this chance to speak with the online effective altruism community. I was involved in the online utilitarianism community around 2006-2007 via my Felicifia blog. I’m really impressed with how the community has grown. A lot of people have put a lot of work into this. Thanks go in particular to Ryan Carey for setting up today’s AMA and for doing so much more.
There are also a few things I’m hoping to learn from you:
First, I am considering a research project on what motivates people to take on major global issues and/or to act on altruistic principles more generally. I would be interested in any resources you know of about this. It could be research on altruism/global issues in general or research on what motivates people to pursue effective altruism.
Second, I am interested in what you think are major open questions in gcr/xrisk. Are you facing decisions about whether to get involved in gcr/xrisk, or about which actions to take to reduce the risks? For these decisions, is there information that would help you figure out what to do? Your answers here can help inform the directions GCRI pursues in its research. We aspire to help people make better decisions so they can more effectively reduce the risks.
You're welcome!
Well, I regret that GCRI doesn't have the funds to be hiring right now. Also, I can't speak for other think tanks; GCRI runs a fairly unusual operation. But I can say a bit about what we look for in the people we work with.
Some important qualities for working with GCRI include: (1) a general understanding of gcr/xrisk issues, for example from reading research by GCRI, FHI, and our colleagues; (2) deep familiarity with specific important gcrs, including the research literature, expert communities, and practitioner communities; (3) capability with relevant methodologies in quantitative risk analysis, such as risk modeling and expert elicitation; (4) demonstrated ability to publish in academic journals or significant popular media outlets, speak at professional conferences, or otherwise get your ideas heard; (5) ability to work across academic disciplines and professions, and to work with teams of similarly diverse backgrounds.
It depends on what you mean by 'technically inclined'. Could you clarify?
I don't have confident estimates on relative probabilities, but I agree that totalitarianism is important to have on our radar. It's also a very delicate risk to handle, as it points directly to the highest bastions of power. Interestingly, totalitarianism risk resonates well with certain political conservatives who might otherwise dismiss gcr talk as alarmist. At any rate, I would not discourage you from looking into totalitarianism risk further.
First, I commend you for thinking in terms of deconstructed narratives and narratives as tools. I'm curious as to your background. Most people I know who self-identify as 'technically inclined' cannot speak coherently about narrative construction.
This is something I think about a lot. One narrative I use comes from James Martin's book 'The Meaning of the 21st Century'. The title on its own offers a narrative, essentially the same as in Martin Rees's 'Our Final Century'. Within the book, Martin speaks of this era of human civilization as going through a period of turbulence, like in a river with rapids. I don't have the exact quote here but I think he uses the river metaphor. At any rate, the point is that global civilization is going through a turbulent period. If we can successfully navigate the turbulence, we have a great, beautiful future ahead of us. I've used this in a lot of talks with a lot of different audiences and it seems to resonate pretty well.
One common proposal is to stockpile food and other resources, or even to build refuges. This could be very helpful. An especially promising idea from Dave Denkenberger of GCRI and Joshua Pearce of Michigan Tech is to grow food from fossil fuels, trees, and other biomass, so that even if the sun is blocked (as in, e.g., nuclear winter) we can still feed ourselves. See http://www.appropedia.org/Feeding_Everyone_No_Matter_What. These are some technological solutions. It's also important to have social solutions: institutions that respond well to major disturbances, psychological practices, and more. We say a bit about this in http://sethbaum.com/ac/2013_AdaptationRecovery.html and http://gcrinstitute.org/aftermath, but this remains an understudied area of gcr. However, there is a lot of great research on local-scale disaster vulnerability and resilience that can be leveraged for gcr.
It's certainly relevant. I used to think it was not promising due to the extremely high cost of space programs relative to activities on Earth. However, Jacob Haqq-Misra (http://haqqmisra.net) of GCRI and Blue Marble Space made the great point that space programs may be happening anyway for other reasons, in particular political, scientific, and economic reasons. It may be reasonably cost-effective to 'piggyback' gcr reduction into existing space programs. This relates back to an earlier comment I made about the importance of stakeholder engagement.
I took an honors BA, which included a pretty healthy dose of post-structuralist-inflected literary theory, along with math and fine arts. I did a master's in architecture, worked in that field for a time, then as a 'creative technologist', and now I'm very happy as a programmer, trying to learn as much math as I can in my free time.