
Survey of EA org leaders about what skills and experience they most need, their staff/donations trade-offs, problem prioritisation, and more.

For the third year running we've surveyed leadership at EA organisations about a range of issues where their views might be relevant to EAs’ career decisions: what are the most pressing talent gaps in professional effective altruism in 2018, and which problems are most effective to work on?

It complements the 2018 EA Survey, which aims to collect information about everyone who says they can, however loosely, be described as an effective altruist.

We asked leaders about:

  • what skills and experience they most need;
  • what skills and experience they think the community as a whole will need in the future;
  • how much in donations they'd be willing to forego for their latest hires;
  • their views on the relative cost-effectiveness of the different EA Funds, and which new funds they'd like to see;
  • how urgent their need for extra donations and staff is;
  • and various other issues.

We also surveyed people who identify as members of the EA community and work directly on problems like animal welfare and poverty, to see how their views on some of these questions would compare.

Here are some of the findings:

  • EA organisation leaders said that people with operations or management experience, and generalist researchers, are what their organisations will need most over the next five years.
  • They said the community as a whole will most need more government and policy experts, people with operations experience, machine learning/AI technical expertise, and skilled managers.
  • Most EA organisations continue to feel more 'talent constrained' than funding constrained, rating themselves as 2.8/4 talent constrained and 1.5/4 funding constrained.
  • Leaders thought the key bottleneck for the community is to convert people from moderate engagement into dedicated people (e.g. those who work at EA orgs, do research in AI safety, biosecurity or economics, or earn to give over $1m). The second biggest bottleneck is increasing the impact of existing dedicated people through, for example, better research, coordination and decision-making.
  • We asked leaders their views on the relative cost-effectiveness of donations to four funds operated by the community. The median view was that the Long-Term Future fund was twice as cost-effective as the EA Community fund, which in turn was 10 times more cost-effective than the Animal Welfare fund, and 20 times as cost-effective as the Global Health and Development fund. Individual views on this question varied very widely, though 18/28 respondents thought the Long-Term Future fund was the most effective.
  • In addition, we asked several community members working directly on animal welfare and global development for their views on the relative cost-effectiveness of donations to these funds. About half of these staff thought the fund in their own cause area was best, and about half thought either the EA Community fund or the Long-Term Future fund was best. The median respondent in that group thought that the Animal Welfare fund was about 33% more cost-effective than the Long-Term Future fund and the EA Community fund (which were rated equally cost-effective), while the Global Health and Development fund was 33% as cost-effective as either of those two (see the short sketch after this list). However, there was also a wide range of views among this group.
  • The organisations surveyed were usually willing to forego over a million dollars in additional donations to get the right person in a senior role 3 years earlier, or several hundred thousand dollars for a junior hire.
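
To make the direct-work group's ratios easier to compare at a glance, here is a minimal sketch that simply restates the median answers from the bullet above on a single scale, with the Long-Term Future fund normalised to 1. The numbers are taken directly from the text, not recomputed from the underlying survey data.

```python
# Illustrative restatement (not survey data): median relative
# cost-effectiveness according to community members working directly
# on animal welfare and global development, normalised so that the
# Long-Term Future fund = 1.

long_term_future = 1.0
ea_community = 1.0 * long_term_future     # rated equally cost-effective
animal_welfare = 1.33 * long_term_future  # ~33% more cost-effective
global_health = 0.33 * long_term_future   # ~33% as cost-effective

for name, value in [
    ("Long-Term Future", long_term_future),
    ("EA Community", ea_community),
    ("Animal Welfare", animal_welfare),
    ("Global Health and Development", global_health),
]:
    print(f"{name:32s} {value:.2f}")
```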

Continue reading for details of the method and results...

Most answers were similar to what we found in 2017, so next year we expect to either ask different questions or interview a smaller number of people in greater depth and see whether their responses change after further reflection.

 

Comments (22)

Comment author: Peter_Hurford  (EA Profile) 11 October 2018 06:35:26PM *  8 points [-]

Continuing on the EA talent paradox (“EA orgs need talent but many EAs can’t get hired at EA orgs”), I’m confused why 80,000 Hours is continuing to bemoan earning to give. I get that if someone could be an FHI superstar or earn to give at $50K/yr they should go join FHI and I get that there are many awesome career paths outside of EA orgs and outside ETG that should be explored. Maybe in the past ETG was too much of an easy auto-default and we want to pressure people to consider more of their options. But ETG is an easy auto-default for a reason and I wouldn’t be surprised if it turned out that ETG is genuinely the highest impact option for >50% of the population of people who are EA enough to, e.g., fill out the EA Survey!

It seems pretty discouraging to EAs to make them feel bad about what is genuinely a really great option. I think we may have overcorrected too strongly against ETG and it may be time to bring it back as a very valid option among the top career paths, rather than “only for people who can donate $1M/yr or more” or “the auto-default for everyone”.

~

Edited to add that it looks like 80K seems to actually promote ETG in the way I recommend - see https://80000hours.org/articles/high-impact-careers/#5-otherwise-earn-to-give - but I don't think this is communicated very clearly outside that section of that article. In general, I get the sense that ETG has become depressing and low-status in EA when it was once high-status, and I'd like to see that trend reversed at least somewhat.

Comment author: 80000_Hours 12 October 2018 06:27:06AM 7 points [-]

Hi Peter,

It sounds like you mostly agree with our take on earning to give in the high impact careers article. That article is fairly new but it will become one of the central pages on the site after a forthcoming re-organisation. Let us know if there are other articles on the site you think are inconsistent with that take - we can take a look and potentially bring them into line.

We agree with you that earning to give can be a genuinely great option and don’t intend to demoralize people who choose that path. As we write in that article, we believe that “any graduate in a high income country can have a significant impact” by earning to give.

That said, we do stand by our recommendation that most people who might be a good fit to eventually enter one of our priority paths should initially pursue one of those paths over earning to give (though while maintaining a back-up option). Those paths have higher upside, so it’s worth testing out your potential, while bearing in mind that they might not work out.

Many of the best options on these paths require substantial career capital, so often this won’t mean starting a direct impact job today. Instead, we think many readers should consider acquiring career capital that can open up these paths, including graduate school in relevant disciplines (e.g. AI/ML, policy, or international relations), entry-level policy jobs (e.g. as a Congressional staffer), or working as an early employee at a startup to gain skills and experience in operations. We hope to release an article discussing our updated views on career capital soon.

Of course, these paths aren’t a good fit for everyone, and we continue to believe that earning to give can be a great option for many.

It’s also worth emphasizing that our advice is, of course, influenced by our views on the highest priority problems. We tried to make that clear in “high impact careers” by including a section on how our recommendations would change if someone is focused on global health or factory farming. In that case, we believe “earning to give, for-profit work and advocacy become much more attractive.”

Comment author: Denise_Melchin 10 October 2018 07:08:37PM *  9 points [-]

Echoing David, I'm somewhat sceptical of the responses to "what skills and experience they think the community as a whole will need in the future". Does the answer refer to high impact opportunities in general in the world, or only to the ones that are mostly located at EA organisations?

I'm also not sure about the relevance to individual EAs' career decisions. I think implying it might be relevant could be outright dangerous if this answer is built on the needs of jobs that are mostly located at EA organisations. From what I understand, EA organisations have had a sharp increase in not only the number, but also the quality, of applications in recent times. That's great! But it's pretty unfortunate for people who took the arguments about 'talent constraints' seriously and focused their efforts on finding a job in the EA Community. They are now finding out that they may have poor prospects, even if they are very talented and competent.

There's no shortage of high impact opportunities outside EA organisations. But the EA Community lacks the knowledge to identify them and the resources to direct its talent there.

There are only a few dozen roles at EA orgs each year, never mind roles that are a good fit for an individual EA's skillset. Even if we only look at the most talented people, there are more capable people than the EA Community is able to allocate among its own organisations. And this will only get worse: the EA Community is growing faster than the number of jobs at EA orgs.

If we don't have the knowledge and connections to allocate all our talent right now, that's unfortunate, but not necessarily a big problem if this is something that is communicated. What is a big problem is to accidentally mislead people into thinking it's best to focus their career efforts mostly on EA orgs, instead of viewing them as a small sliver in a vast option space.

Comment author: 80000_Hours 10 October 2018 07:31:53PM *  4 points [-]

"Does the answer refer to high impact opportunities in general in the world"

That question is intended to look at the highest-impact jobs available in the world as a whole, in contrast with the organisations being surveyed. Given the top response was government and policy experts, I think people interpreted it correctly.

Comment author: Andy_Schultz 12 October 2018 02:04:16PM 4 points [-]

Interesting that the Long Term Future Fund is thought of as the most cost-effective fund, even though the cause area is considered one of the least funding constrained. Sounds like there are still some pretty amazing opportunities for donations in that area!

Comment author: Evan_Gaensbauer 11 October 2018 09:39:09PM 3 points [-]

I've volunteered to submit a comment to the EA Forum from a couple of anonymous observers, which I believe deserves to be engaged with.

The model this survey is based on implicitly creates something of an 'ideal EA,' which is somebody young, quantitative, elite, who has the means and opportunities to go to an elite university, and has the personality to hack very high-pressure jobs. In other words, it paints a picture of EA that is quite exclusive.

Comment author: Peter_Hurford  (EA Profile) 11 October 2018 06:58:41PM *  4 points [-]

The median view was that the Long-Term Future fund was twice as effective as the EA Community fund

This strikes me as an odd statement to make, given that - so far - the two funds have essentially operated as the same fund and have given donations to the exact same organizations with the exact same stated purposes. That being said, I agree it’s reasonable to expect the grantmaking of the funds to diverge under the forthcoming new management and maybe this expectation is what is being priced in here.

Comment author: Denise_Melchin 11 October 2018 09:15:34PM 1 point [-]

I had written the same comment, but then deleted it once I found out that it wasn't quite as true as I thought it was. In Nick's writeup the grants come from different funds according to their purpose. (I had previously thought the most recent round of grants granted money to the exact same organisations.)

Comment author: Peter_Hurford  (EA Profile) 11 October 2018 09:36:26PM 3 points [-]

Ah, I see. There's overlap on 80K and CEA, but the long-term future fund goes to CFAR and MIRI, whereas the EA Community fund goes to Founders Pledge.

Comment author: Michelle_Hutchinson 12 October 2018 08:55:22AM 1 point [-]

I don't know how others answered this question, but personally I wasn't answering based on how good I thought the most recent grants were relative to each other (i.e. I wasn't comparing CfAR/MIRI to Founders Pledge), or in expectation of a changeover in grantmaker. I was thinking about something more like whether I preferred funding over the next 5 years to go to organisations which focused on the far future vs community building, knowing that these might or might not converge. I'd expect a bunch of things to come up over that period that we don't yet know about (in the same way that BERI did a year or so ago).

Comment author: Peter_Hurford  (EA Profile) 10 October 2018 11:47:59PM *  13 points [-]

I’d really like to hear more about other EA orgs’ experiences with hiring staff. I’ve certainly had no problem finding junior staff for Rethink Priorities, Rethink Charity, or Charity Science (Note: Rethink Priorities is part of Rethink Charity but both are entirely separate from Charity Science)… and so far we’ve been lucky enough to have enough strong senior staff applications that we’re still finding ourselves turning down really strong applicants we would otherwise really love to hire.

I personally feel much more funding constrained / management capacity constrained / team culture “don’t grow too quickly” constrained than I feel “I need more talented applicants” constrained. I definitely don’t feel a need to trade away hundreds of thousands or millions of dollars in donations to get a good hire and I’m surprised that 80K/CEA has been flagging this issue for years now. …And experiences like this one suggest to me that I might not be alone in this regard.

So…

1.) Am I just less picky? (possible)

2.) Am I better at attracting the stronger applicants? (doubtful)

3.) Am I mistaken about the quality of our applicants such that they’re actually lower than they appear? (possible but doubtful)

Maybe my differences in cause prioritization (not overwhelmingly prioritizing the long-term future but still giving it a lot of credence) contribute toward getting a different and stronger applicant pool? …But how precise of a cause alignment do you need from hires, especially in ops, as long as people are broadly on board?

I’m confused.

Comment author: 80000_Hours 12 October 2018 07:01:50PM 3 points [-]
Comment author: Ben_Todd 13 October 2018 06:31:51PM 0 points [-]

Personally, I see large differences in the expected impact of potential new hires. I'm surprised you don't, especially at the startup stage, and am not sure what's going on there. I would guess you should be more picky for some of the reasons listed in Rob's post.

I also feel very constrained by management capacity etc. This drives the value of past hires up even further, which is what the survey was about (as also in Rob's post).

Comment author: Peter_Hurford  (EA Profile) 14 October 2018 02:51:22AM *  2 points [-]

I do see large differences in the expected impact of potential new hires, but I see a lot of hires who would be net positive additions (even after accounting for all the various obvious costs enumerated by Rob), and I've unfortunately even had to turn away a few people I think would have been enormously net positive.

We're not constrained by management capacity but we will be soon.

Comment author: Evan_Gaensbauer 11 October 2018 08:55:31PM 0 points [-]

One possibility is that, because the EA organizations you hire for focus on causes which also have a lot of representation in the non-profit sector outside of the EA movement, like global health and animal welfare, it's easier to attract talent which is both very skilled and very dedicated. Since a focus on the far future is more limited to EA and adjacent communities, there is just a smaller talent pool of both extremely skilled and dedicated potential employees to draw from.

Far-future-focused EA orgs could be constantly suffering from this problem of a limited talent pool, to the point they'd be willing to pay hundreds of thousands of dollars to find an extremely talented hire. In AI safety/alignment this wouldn't be weird, as AI researchers can easily take a salary of hundreds of thousands of dollars at companies like OpenAI or Google. But this should only apply to orgs like MIRI or maybe FHI, which are far from the only orgs 80k surveyed.

So the data seem to imply that leaders at EA orgs which already have a dozen staff would pay 20%+ of their budget for the next single marginal hire. It still doesn't make sense, then, that year after year a lot of EA orgs apparently need talent so badly they'll spend money they don't have to get it.

Comment author: Peter_Hurford  (EA Profile) 11 October 2018 09:34:47PM 3 points [-]

there is just a smaller talent pool of both extremely skilled and dedicated potential employees to draw from

We have been screening fairly selectively on having an EA mindset, though, so I'm not sure how much larger our pool is compared to other EA orgs. In fact, you could maybe argue the opposite -- given the prevalence of long-termism among the most involved EAs, it may be harder to convince them to work for us.

So the data seems to imply leaders at EA orgs which already have a dozen staff would pay 20%+ of their budget for the next single marginal hire.

From my vantage point, though, their actions don't seem consistent with this view.

Comment author: Evan_Gaensbauer 11 October 2018 10:01:02PM 0 points [-]

Yeah, I'm still left with more questions than answers.

Comment author: DavidNash 10 October 2018 03:25:06PM *  5 points [-]

Looking at this part -

"We did include more people from organisations focused on long-termism. It’s not clear what the right method is here, as organisations that are bigger and/or have more influence over the community ought to have more representation, but we think there’s room for disagreement with this decision."

I think one potential reason there are more people interested in EA working at LTF organisations is that EA and LTF are both relatively new ideas. Not many people are considering careers in these areas, so it is much easier for a community to found and staff the majority of organisations.

If global development had been ignored until 5 years ago, it's very likely most of the organisations in this area would have been founded by people interested in EA, and they might be overrepresented in surveys like this.

There may be talent gaps in other cause areas (beyond development and animals) that are missed because those areas don't have leaders with EA backgrounds, but that doesn't mean those gaps should be underweighted.

It may be worth having a separate survey trying to get opinions about talent gaps in priority areas, whether or not those areas are led by people involved in EA.

Comment author: Robert_Wiblin 10 October 2018 08:32:10PM *  5 points [-]

Tackling just one part of this:

"It may be worth having a separate survey trying to get opinions considering talent gaps in priority areas whether they are led by people involved in EA or not."

Ultimately our goal going forward is to make sure that we and our readers are highly informed about our priority paths (https://80000hours.org/articles/high-impact-careers/). Six of our ten current priority paths get direct coverage in this survey, while four do not.

I agree that in future we should consider conducting different surveys of other groups - including people who don't identify as part of the EA community - about opportunities they're aware of, in order to make sure we stay abreast of all the areas we recommend, rather than just those we are closest to.

Comment author: Evan_Gaensbauer 11 October 2018 09:28:02PM 0 points [-]

We surveyed managers at organisations in the community to find out their views. These results help to inform our recommendations about the highest impact career paths available.

How much weight does 80,000 Hours give to these survey results relative to the other factors that together inform 80k's career recommendations?

I ask because I'm not sure managers at EA organizations know what their focus area as a whole will need in the near future, and I think 80k might be able to exercise better independent judgement than deferring to the aggregate opinion of EA organization leaders. For example, there was an ops bottleneck in EA that is a lot better now. It seemed like orgs like 80k and CEA spotted this problem and drove operations talent to a variety of EA orgs. But I don't recall the other EA orgs which benefited from this push doing much, independently of one another, to help solve this coordination problem in the first place.

In general, I'm impressed with 80k's more formal research. I imagine there might be pressure for 80k to give more weight to softer impressions like what different EA org managers think the EA movement needs. But I think 80k's career recommendations will remain better if they're built off a harder research methodology.

Comment author: 80000_Hours 12 October 2018 07:47:04PM 2 points [-]

Hi Evan,

Responses to the survey do help to inform our advice, but they're considered only one piece of data alongside all the other research we've done over the years. Our writeup of the survey results definitely shouldn't be read as our all-things-considered view on any issue in particular.

Perhaps we could have made that clearer in the blog post but we hope that our frank discussion of the survey’s weaknesses and our doubts about many of the individual responses gives some sense of the overall weight we put on this particular source.

Comment author: Evan_Gaensbauer 13 October 2018 12:08:54AM 0 points [-]

Oh, no, that all makes sense. I was just raising questions I had about the post as I came across them. But I guess I should've read the whole post first. I haven't finished it yet. Thanks.