
SethBaum


I had a great overall experience at the conference. As a speaker, everything went smoothly for me. The organizers were great and I would definitely recommend them for future events. I would also recommend people attend future EAGxVirtual events.

It's important to emphasize the overall value of remote events. Advantages include reduced greenhouse gas emissions (especially from air travel), lower costs, less time commitment, less time away from family, COVID safety, no need for a travel visa (which facilitates geographic diversity), and more. I talk about this in my recent Forum post on climate change.

At EAGxVirtual, the geographic diversity struck me as very good and substantially better than what I recall from in-person EAG events. At one point, I had a great conversation with people from Moscow, Australia, India, and Tanzania, and a student in Costa Rica. It's hard to do that at an in-person conference.

Of note, that conversation was spontaneous, with people just walking up to me in the Gather space. Maybe they thought to approach me because they knew I was speaking on Ukraine & nuclear war. But this is why we have conferences - to bring together people with similar interests and give them a chance to interact.

I also had a few good one-on-ones, mostly facilitated by Swapcard.

Thanks for your thoughts on this. To briefly reply, I would disagree with the idea that renewables, electric cars, etc. are a waste. As much as I might personally like to see more car-free urban design, it is still the case that renewables plus electric cars can substantially reduce emissions. The point about ending fossil fuel subsidies is an important one, albeit with a caveat about political feasibility. To the extent that there is political will to end these subsidies, doing so is clearly a good thing, but that political will can be elusive.

Thank you for your thoughtful comments.

To all: Let me just briefly add that I believe this to be a compelling perspective worth taking seriously. Anyone wishing to contact Morton can find his email address on his page at UNEP here.

A point that I hope comes across in that section and throughout the post is that many decisions about what to do on climate change do not depend on how large the catastrophic risk is. There is a role for analysis of the risk, and I have linked to studies doing that analysis. However, for purposes of this post, my interest is in discussing the details of the constructive actions that can be taken to address the risk rather than getting bogged down in analysis of the risk itself.

A more detailed catastrophic risk analysis could be useful for things like evaluating decisions on how much to prioritize climate change relative to other issues. However, even then the risk analysis would only be one component of the decision analysis, alongside a comparison of the quality of the opportunities to address climate change vs. the opportunities for other issues. Some analysis along those lines is in the section "Climate change warrants massive investment—but not necessarily ours".

That's a good question, thanks. My understanding is that opportunities to address cement issues are more specialized; see e.g. this. It could be a worthy focus for people pursuing a career in climate change or other more extensive involvement, especially people with relevant skill sets. The neglectedness of cement is a point in favor of working on it. (Ditto refrigerants.) However, those who aren't pursuing something like this are unlikely to encounter cement opportunities. I could be wrong about this - I'm not a cement expert myself - though I can say that across the various things I've been involved in, I've never come across cement opportunities. There could be value in creating such opportunities and making them more widely available, but that would itself be a more specialized project. If it's something you're interested in, I would certainly not discourage it.

In contrast, a basic understanding of how the energy supply works is likely to be useful for a wider range of people. For example, energy issues come up with some regularity in local political debates, such as on whether to shut down nuclear power plants (a major issue in Germany for example) and whether to permit certain renewables projects. A basic understanding may be helpful for people in their capacity as citizens.

Thanks for your comments. Some replies:

On renewables, coal, etc. - to me, the bottom line is the value of an "all of the above" approach to reducing emissions. Where there are opportunities to advance renewables or even nuclear, great. Where there are opportunities to reduce energy consumption, also great. The potential for renewables is amazing, but we can't count on them to solve the entire emissions problem in a sufficiently timely fashion.

On water shortages, this is not my expertise. There is a lot of work on climate change & water, but it would not surprise me if it has not focused on more extreme scenarios. I could see a role for this within the scope of careers focused on extreme climate change adaptation - in the careers section, see "Bonus idea—for people who are good at human development and emergency management/preparedness".

Thanks for the question.

Asteroid risk probably has the most cooperation and the most transparent communication. It is notable for its high degree of agreement: all parties around the world agree that it would be bad for Earth to get hit by a large rock, that there should be astronomy to detect nearby asteroids, and that if a large Earthbound asteroid is detected, there should be some sort of mission to deflect it away from Earth. There are some points of disagreement, such as on the use of nuclear explosives for asteroid deflection, but these are more in the details.

Additionally, the conversation about asteroid risk is heavily driven by scientific communities. Scientists have a strong orientation toward transparency, such as publishing research in the open literature, including details on methods. Relatively few aspects of asteroid risk involve the sorts of information that are less transparent, such as classified government information or proprietary business information. There is some, such as regarding nuclear explosives, but it's overall a small portion of the topic. The result is a relatively transparent conversation about asteroid risk.

The question of scalability is harder to answer. A lot of the relevant governance activities are singular or top-down in a way that makes scalability less relevant. For example, it's hard to talk about the scalability of initiatives to deflect asteroids or to make sound nuclear weapon launch decisions, because these are things that only need to be done in a few isolated circumstances.

It's easier to talk about the scalability of initiatives for reducing climate change because there is such a broad, ongoing need to reduce greenhouse gases. For example, a notable recent development in the climate change space is the rapid growth of the market for electric bicycles; this is a technology that is rapidly maturing and can be manufactured at scale. Certain climate change governance concepts can also scale, for example urban design concepts that are initially implemented in a few neighborhoods and then rolled out more widely. Scaling things like this up is often difficult, but it is at least possible in principle.

The best way to answer this question is probably in terms of GCRI's three major areas of activity: research, outreach, and community support, plus the fourth item of organization development.

GCRI's ultimate goal is to reduce global catastrophic risk. Everything we do is oriented toward that end. Our research develops ideas and reduces uncertainty about how best to reduce global catastrophic risk. Our outreach gets those ideas to important decision-makers and helps us understand what research questions decision-makers would benefit from answers to. Our community support advances the overall population of people working on global catastrophic risk, including people who work with us on research and outreach. Our organization development work provides us with the capacity to do all of these things.

Phrased in terms of three problems: (1) We don't know the best ways of reducing global catastrophic risk, and so we are advancing research to understand this better. (2) We are not positioned to take all of the necessary actions to reduce global catastrophic risk on our own, so we are doing outreach to other people who are well positioned to have an impact and we are supporting the overall community of people who are working on the risks. (3) We don't have the capacity to do as much to reduce global catastrophic risk as we could, so we are developing the organization to increase our capacity.

I appreciate that this is all perhaps a bit vague. Because we work across so many topics within global catastrophic risk, it's hard to specify three more specific problems that we face. Some further detail is available at our Summary of 2021-2022 GCRI Accomplishments, Plans, and Fundraising, and in other comments on this AMA.
 

I regret that I don't have a good answer to this question. Global catastrophic risk doesn't have much in the way of statistics, due to the lack of prior global catastrophes. (Which is a good thing!)

There are some statistics on the amount of work being done on global catastrophic risk. For that, I would recommend the paper Accumulating evidence using crowdsourcing and machine learning: A living bibliography about existential risk and global catastrophic risk by Gorm Shackelford and colleagues at CSER. It finds that there is a significant body of work on the topic, in contrast with some prior concerns, such as those comparing the amount of research on global catastrophic risk to the amount of research on dung beetles.

Thanks for the question. I see that the question is specifically on neglected areas of research, not other types of activity, so I will focus my answer on that. I'll also note that my answers to this question map pretty closely to my own research agenda, which may be a bit of a bias, though it's also the case that I try to focus my research on the most important open questions.

For AI, there are a variety of topics in need of more attention, especially (1) the relation between near-term governance initiatives and long-term AI outcomes; (2) detailed concepts for specific, actionable governance initiatives in both public policy and corporate governance; (3) corporate governance in general (see discussion here); (4) the ethics of what an advanced AI should be designed to do; and (5) the implications of military AI for global catastrophic risk. There may also be neglected areas of research on how to design safe AI, though it is less my own expertise and it already gets a relatively large amount of investment.

For asteroids, I would emphasize the human dimensions of the risk. Prior work on asteroid risk has included a lot of contributions from astronomers and from the engineers involved in space missions, but, I think, comparatively little attention from social scientists. The possibility of an asteroid collision causing inadvertent nuclear war is a good example of a topic in need of a wider range of attention.

For climate change, one important line of research is on characterizing climate change as a global catastrophic risk. The recent paper Assessing climate change’s contribution to global catastrophic risk by S. J. Beard and colleagues at CSER provides a good starting point, but more work is needed. There is also a lot of opportunity to apply insights from climate change research to other global catastrophic risks. I've done this before here, here, here, and here. One good topic for new research would be evaluating the geoengineering moral hazards debate in terms of its implications for other risky technologies, including debates over what ideas shouldn't be published in the first place, e.g. Was breaking the taboo on research on climate engineering via albedo modification a moral hazard, or a moral imperative?

For nuclear weapons, I would like to see more on policy measures that are specifically designed to address global catastrophic risk. My winter-safe deterrence paper is one effort in that direction, but more should be done to develop this sort of idea.

For biosecurity, I'm less at the forefront of the literature, so I have fewer specific suggestions, though I would expect that there are good opportunities to draw lessons from COVID-19 for other global catastrophic risks.
