Comment author: Elityre 08 October 2018 02:57:26PM *  6 points [-]

In the short term, senior hires are most likely to come from finding and onboarding people who already have the required skills, experience, credentials and intrinsic motivation to reduce x-risks.

Can you be more specific about what the required skills and experience are?

Skimming the report, you say "All senior hires require exceptionally good judgement and decision-making." Can you be more specific about what that means and how it can be assessed?

Comment author: oliverbramford 09 October 2018 12:17:21PM *  4 points [-]

The required skills and experience of senior hires vary between fields and roles; senior x-risk staff are probably best-placed to specify these requirements in their respective domains of work. You can look at x-risk job ads and recruitment webpages of leading x-risk orgs for some reasonable guidance. (We are developing a set of profiles for prospective high-impact talent, to give a more nuanced picture of who is required.)

"Exceptionally good judgement and decision-making", for senior x-risk talent, I believe requires:

  • a thorough and nuanced understanding of EA concepts and how they apply to the context

  • good pragmatic foresight - an intuitive grasp of the likely and possible implications of one's actions

  • a conscientious risk-aware attitude, with the ability to think clearly and creatively to identify failure modes

Assessing good judgement and decision-making is hard; it's particularly hard to assess the consistency of a person's judgement without knowing or working with them over at least several months. Some methods:

  • Speaking to a person can quickly clarify their level of knowledge of EA concepts and how they apply to the context of their role.

  • Speaking to references could be very helpful, to get a picture of how a person updates their beliefs and actions.

  • Actually working with them (perhaps via a work trial, partnership or consultancy project) is probably the best way to test whether a person is suitable for the role.

  • A critical thinking psychometric test may plausibly be a good preliminary filter, but is perhaps more relevant for junior talent. A low score would be a big red flag, but a high score is far from sufficient to imply overall good judgement and decision-making.

Comment author: MichaelPlant 23 October 2017 03:53:09PM 5 points [-]

Hello. I'll reply here even though my comment covers all three posts. My general worry is that I think you've drawn a distinction without a difference. I can see that 'optimise Earth' and 'do the most good' are different sentences, but I'm still not really sure what the practical differences are supposed to be between them. As a test: what would a 'do the most good-er' and an 'Earth optimiser' disagree about?

On your EA paradigm vs systems change paradigm, I fear you've specified EA in a particular, not very generous, way just to make your point. For instance, why can't an EA say "we should maximise our collective impact"? I agree that EAs should be thinking about co-operation and the full impact of their actions, but EAs tend to be aware of this already. I don't think I know anyone who really wants to maximise their personal impact at the expense of total impact (whatever exactly that means).

Then on this post, you seem to be presenting this as if no EA has ever thought about systemic change. But that's not true. EAs have often thought about systemic changes; they just haven't been (that) convinced by them. If you want to change people's minds on these issues, I would strongly encourage you to write up one of these areas as a cause profile and compare it to existing ones, e.g. how does it score on scale, tractability and neglectedness? I generally much prefer it when people give (at least somewhat) worked-out arguments for why New Cause X is better than Old Cause Y, rather than saying only "ah, but what about New Cause X? Have you considered that?" Advocates of New Cause X tend to be best placed to make the case for their cause anyway, so it makes sense for them to try.

Comment author: oliverbramford 25 October 2017 02:49:55PM 0 points [-]

what would a 'do the most good-er' and an 'Earth optimiser' disagree about?

Great question!

I'm not sure if there is any direct logical incompatibility between a 'do the most good-er' and an 'Earth optimiser'. Rather, I think the Earth optimiser frames the challenge of doing the most good in a particular way that tends to give greater consideration to collective impact and long run indirect effects than is typical in the EA community.

As an Earth optimiser, I am confident that we can substantially improve on our current cause prioritisation methodology, to better account for long run indirect effects and better maximise collective impact. I expect that modelling the Earth as a complex system, defining top-level systemic goals/preferred outcomes, and working backwards to identify the critical next steps to get there would lead many of us to revise what we currently consider to be top priority causes.

I would strongly encourage you to write up one of these areas as a cause profile and compare it to existing ones

When it comes to complex systems change causes, I think a substantial amount of up-front research is typically required to write up a remotely accurate cause profile that can be compared meaningfully with direct-impact causes. Complex systems typically seem highly intractable at first glance, but a systems analysis may highlight a set of neglected interventions which, when pursued together, make systems change fairly tractable.

As a good example, I am currently part of the leadership team working on a political systems change research project (set up under EA Geneva). This is a year-long project with a team of (part-time volunteer) researchers. We will do a detailed literature review, a series of events with policy makers, and a series of expert interviews. We hope that this will be enough to evaluate the tractability of this as a cause area, and to locate its priority in relation to other cause areas.

Comment author: Michael_Wiebe 23 October 2017 04:28:38PM 0 points [-]

For what it's worth, I currently think the solution requires modelling the Earth as a complex system, clarifying top-level metrics to optimise the system for, and a probability weighted theory of change for the system as a whole.

I'd be interested in seeing this. Do you have anything written up?

Comment author: oliverbramford 25 October 2017 01:51:56PM 0 points [-]

Parts 3 and 5 of the article linked below explain this approach in more detail, although my thinking has moved on a bit since writing it.

There's a good chance that these ideas will be refined and written up collaboratively in an applied context as part of GeM Labs' Understanding and Optimising Policy project over the next year. If they are out of scope of this project, I intend to develop them independently and share my progress.

https://docs.google.com/document/d/1DFZ9OAb0g5dtQuZHbAfngwACQkgSpjqrpWWOeMrsq7o/edit?usp=sharing

Comment author: Michael_Wiebe 22 October 2017 06:20:47PM 2 points [-]

In general, a cause needs to score high on each of impact, tractability, and neglectedness to be worthwhile. Getting two out of three is no better than zero out of three. You've listed causes with high impact, but they're generally not tractable. For example, changing the political system is highly intractable.
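A toy numerical sketch of the multiplicative point (all scores below are invented for illustration, not real cause estimates): because the factors multiply, a near-zero score on any one of them sinks the overall priority, which is why two out of three is no better than zero out of three.

```python
# Illustrative ITN-style scoring with made-up numbers: overall priority
# is modelled as the product of the three factors, so one weak factor
# dominates no matter how strong the others are.

def priority(impact: float, tractability: float, neglectedness: float) -> float:
    """Product of the three factors, each on a rough 0-10 scale."""
    return impact * tractability * neglectedness

print(priority(8, 7, 6))    # strong on all three  -> 336
print(priority(9, 0.5, 8))  # high impact but intractable -> 36.0
print(priority(2, 2, 2))    # mediocre on everything -> 8
```

The multiplicative form is what makes the "high on each" requirement bite: the intractable high-impact cause scores far below the cause that is merely decent across the board.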

Overall, I think that EA has already incorporated the key insights from systems change, and there's no need to distinguish it as being separate from EA.

Comment author: oliverbramford 23 October 2017 03:54:28PM *  0 points [-]

System change causes are inherently complex and thus often appear highly intractable initially. However, with detailed systems analysis a set of viable (and perhaps novel) approaches may (sometimes) be identified, which are much more tractable than expected.

For example, the system of animal agriculture and animal product consumption is pretty complex, but ACE have done a great job to identify charities that are working very effectively on different aspects of that system (cultured meat, advocacy to corporates, promoting veganism, etc.).

Analysing a complex system in detail sheds new light on what's broken and why, and can highlight novel and neglected solutions (e.g. cultured meat) that make changing the system far more tractable.


changing the political system is highly intractable

The political system is very complex, but we don't yet know how tractable it is. We are currently researching this at EA Geneva/Geneva Macro Labs. If we find a political systems change strategy that is even moderately tractable, I suspect it would be worth pursuing due to the magnitude of the likely flow-through effects. If we change the political system to better prioritise policies, this would make changing many other important systems (economic, defence, education, etc.) way more tractable.

Comment author: joshjacobson  (EA Profile) 23 October 2017 02:00:23PM 0 points [-]

I don't think there was any reason for this to be split into 3 posts? It'd be better to condense it into one.

Comment author: oliverbramford 23 October 2017 03:20:36PM 0 points [-]

I had written a much longer piece, and had feedback from a number of people that it would be best to split it up.

Comment author: Michael_Wiebe 22 October 2017 05:48:25PM 4 points [-]

I think the marginal vs. total distinction is confused. Maximizing personal impact, while taking into account externalities (as EAs do), will be equivalent to maximizing collective impact.

An Effective Altruist, by focusing on impact at the margin, may ask questions such as: What impact will my next $100 donation make in this charity vs that charity?

It seems you're trying to set up a distinction between EA focusing on small issues, and systems change focusing on big issues. But this is a strawman. Even if an individual makes a $100 donation, the cause they're donating to can still target a systemic issue. In any case, there are now EAs making enormous donations: "What if you were in a position to give away billions of dollars to improve the world? What would you do with it?"

This approach invites sustained collective tolerance of deep uncertainty, in order to make space for new cultural norms to emerge. Linear, black-and-white thinking risks compromising this creative process before desirable novel realities have fully formed in a self-sustaining way.

This is pretty mystical.

Comment author: oliverbramford 23 October 2017 03:14:55PM *  0 points [-]

while taking into account externalities (as EAs do)

I think that the current EA methodology to take into account impact externalities is incomplete. I am not aware of any way to reliably quantify flow-through effects, or to quantify how a particular cause area indirectly affects the impact of other cause areas.

The concept of total impact, if somehow integrated into our cause prioritisation methodology, may help us to account for impact externalities more accurately. I concede that total impact may be too simplistic a concept...

For what it's worth, I currently think the solution requires modelling the Earth as a complex system, clarifying top-level metrics to optimise the system for, and a probability weighted theory of change for the system as a whole.


It seems you're trying to set up a distinction between EA focusing on small issues, and systems change focusing on big issues.

I do not mean to say that EA focuses on small issues and systems change focuses on big issues. Rather, I see EA as having a robust (but incomplete) cause prioritisation methodology, and systems change having a methodology that accounts well for complexity (but neglects cause prioritisation in the context of the system of Earth as a whole).


This is pretty mystical.

On reflection, I think that conducting systems change projects in appropriate phases, with clear expectations for each phase, is a viable way to synthesise EA and systems change approaches and culture. Specifically, a substantial research phase would typically be required to understand the system before one can know which interventions to prioritise.

Comment author: maswiebe 22 October 2017 05:16:14PM 4 points [-]

It's obviously the case that "do the most good" is equivalent to "optimize the Earth". HPMOR readers will remember: "World domination is such an ugly phrase. I prefer to call it world optimisation."

But given that they're equivalent, I don't see that changing the label offers any benefits. For example, the theoretical framework linked to "do the most good" already gives us a way to think about how to choose causes while taking into account inter-cause spillovers (corresponding to 1(iv)).

Comment author: oliverbramford 23 October 2017 11:02:55AM *  0 points [-]

the theoretical framework linked to "do the most good" already gives us a way to think about how to choose causes while taking into account inter-cause spillovers

I think impact 'spill-over' between causes is a good representation of how most EAs mentally model the relationship between causes and impact. However, I see this as an inaccurate representation of what's actually going on, and I suspect it leads to a substantial misallocation of resources.

I suspect that long term flow-through effects typically outweigh the immediate observable impact of working on any given cause (because flow-through effects accumulate indefinitely over time). 'Spill-over' suggests that impact can be neatly attributed to one cause or another, but in the context of complex systems (i.e. the world we live in), impact is often more accurately understood as resulting from many factors, including the interplay of a messy web of causes pursued over many decades.
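A toy model makes the accumulation point concrete (all numbers here are invented for illustration, including the 3% discount rate): even a modest recurring flow-through effect, summed over decades, can dwarf the immediate observable impact.

```python
# Toy model: immediate direct impact vs. a flow-through effect that
# recurs each year, discounted at a constant annual rate.
# All figures are invented for illustration only.

def total_impact(direct, annual_flow_through, years, discount):
    """Direct impact plus the discounted sum of recurring flow-through effects."""
    flow = sum(annual_flow_through * (1 - discount) ** t for t in range(years))
    return direct + flow

# 100 units of direct impact, plus 10 units/year of flow-through effects
# over 50 years at a 3% discount rate: the flow-through sum (roughly 261
# units) is well over double the direct impact on its own.
print(total_impact(100, 10, 50, 0.03))
```

This is only a sketch of the qualitative claim; the real difficulty, as noted above, is that flow-through effects cannot currently be quantified reliably.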

I see 'Earth optimization' as a useful concept to help us develop our cause prioritisation methodology to better account for the inherent complexity of the world we aim to improve and for long run flow-through effects, and thus to help us allocate our resources more effectively as individuals and as a movement.

The three posts under discussion:

5 Types of Systems Change Causes with the Potential for Exceptionally High Impact (post 3/3)

Effective Altruism Paradigm vs Systems Change Paradigm (post 2/3)

Why to Optimize Earth? (post 1/3)