Gregory_Lewis comments on Announcing the Effective Altruism Handbook, 2nd edition - Effective Altruism Forum

Comment author: Gregory_Lewis 05 May 2018 01:06:42AM 7 points

It's very easy for any of us to call "EA" as we see it and so make claims about the preferences of the community, but this would be plainly circular. I'd be tempted to defer to the EA Survey instead. AI was the top cause for only 16% of EA Survey respondents. Even among those employed full-time in a non-profit (perhaps a proxy for full-time EAs), it was the top priority of 11.26%, compared to 44.22% for poverty and 6.46% for animal welfare.

As noted in the Facebook discussion, it seems unlikely that full-time non-profit employment is a good proxy for 'full-time EAs' (i.e. those working full time at an EA organisation - E2Gers would be one of a few groups who should also be considered 'full-time EAs' in the broader sense of the term).

For this group, one could stipulate that every group which posts updates to the EA newsletter counts as an EA group (I looked at the last half-dozen or so issues, so any group without an update is excluded, but the omissions are likely minor). Totting up a headcount of staff (I didn't correct for FTE, and excluded advisors, founders, volunteers, freelancers, and interns - all of these decisions could be challenged) and recording the prevailing focus of each org gives something like this:

  • 80,000 Hours (7 people) - Far future
  • ACE (17 people) - Animals
  • CEA (15 people) - Far future
  • CSER (11 people) - Far future
  • CFI (10 people) - Far future (I only included their researchers)
  • FHI (17 people) - Far future
  • FRI (5 people) - Far future
  • GiveWell (20 people) - Global poverty
  • Open Phil (21 people) - Far future (mostly)
  • SI (3 people) - Animals
  • CFAR (11 people) - Far future
  • Rethink Charity (11 people) - Global poverty
  • WASR (3 people) - Animals
  • REG (4 people) - Far future [Edited after Jonas Vollmer kindly corrected me]
  • FLI (6 people) - Far future
  • MIRI (17 people) - Far future
  • TYLCS (11 people) - Global poverty

Totting this up, I get roughly two thirds of people working at orgs which focus on the far future (66%), 22% at global poverty orgs, and 12% at animal orgs. Although it is hard to work out what proportion of the far-future work is specifically on AI, I'm pretty sure it is the majority, so 45% AI wouldn't be wildly off-kilter if we thought the EA handbook should represent the balance of 'full-time' attention.
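To make the arithmetic transparent, here is a minimal Python sketch of the tally; the headcounts and focus labels are copied straight from the list above, and nothing beyond them is assumed:

```python
from collections import Counter

# Headcounts and prevailing focus, copied from the list above.
orgs = {
    "80,000 Hours": (7, "Far future"),
    "ACE": (17, "Animals"),
    "CEA": (15, "Far future"),
    "CSER": (11, "Far future"),
    "CFI": (10, "Far future"),
    "FHI": (17, "Far future"),
    "FRI": (5, "Far future"),
    "GiveWell": (20, "Global poverty"),
    "Open Phil": (21, "Far future"),
    "SI": (3, "Animals"),
    "CFAR": (11, "Far future"),
    "Rethink Charity": (11, "Global poverty"),
    "WASR": (3, "Animals"),
    "REG": (4, "Far future"),
    "FLI": (6, "Far future"),
    "MIRI": (17, "Far future"),
    "TYLCS": (11, "Global poverty"),
}

totals = Counter()
for staff, focus in orgs.values():
    totals[focus] += staff

grand_total = sum(totals.values())  # 189 staff in total
for focus, staff in totals.most_common():
    print(f"{focus}: {staff} ({staff / grand_total:.0%})")
# Far future: 124 (66%)
# Global poverty: 42 (22%)
# Animals: 23 (12%)
```

Rounding to whole percentages reproduces the 66% / 22% / 12% split quoted above.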

I doubt this should be the relevant metric for how to divvy up space in the EA handbook. It also seems unclear whether considerations of representation should play a role in selecting content at all, and if so, which community is the key one to represent proportionately.

Yet I think I'd be surprised if it wasn't the case that, among those working 'in' EA, the majority work on the far future and a plurality work on AI. This also agrees with my impression that those most involved in the EA community skew strongly towards the far-future cause area in general and AI in particular. I think they do so, bluntly, because these people have better access to the balance of reason, which in fact favours these being the most important things to work on.

Comment author: [deleted] 06 May 2018 05:12:13PM 6 points

+1 for doing a quick empirical check and providing your method.

But the EA newsletter is curated by CEA, no? So it also partly reflects CEA's own priorities. You and others have noted in the discussion below that a number of plausibly full-time EA organisations are not included in your list (e.g. BERI, Charity Science, GFI, Sentience Politics).

I'd also question the view that Open Phil is mostly focused on the long-term future. When I look at Open Phil's grant database and count

  • Biosecurity and Pandemic Preparedness
  • Global Catastrophic Risks
  • Potential Risks from Advanced Artificial Intelligence
  • Scientific Research

as long-term-future focused, I get that about 30% of grant money was given to the long-term future.
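For transparency, a minimal Python sketch of that calculation, assuming a CSV export of the grant database; the file name ("grants.csv") and the column names ("Focus Area", "Amount") are assumptions for illustration, not the actual export format:

```python
import csv

# Focus areas treated as long-term-future focused, per the list above.
LTF_AREAS = {
    "Biosecurity and Pandemic Preparedness",
    "Global Catastrophic Risks",
    "Potential Risks from Advanced Artificial Intelligence",
    "Scientific Research",
}

ltf_total = 0.0
grand_total = 0.0
# "grants.csv" and the column names are hypothetical; adjust them to
# match the actual export of the grant database.
with open("grants.csv", newline="") as f:
    for row in csv.DictReader(f):
        amount = float(row["Amount"].replace("$", "").replace(",", ""))
        grand_total += amount
        if row["Focus Area"] in LTF_AREAS:
            ltf_total += amount

print(f"Long-term future share: {ltf_total / grand_total:.0%}")
```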

Comment author: MichaelPlant 05 May 2018 01:01:23PM 7 points

"'full-time EAs' (i.e. those working full time at an EA organisation - E2Gers would be one of a few groups who should also be considered 'full-time EAs' in the broader sense of the term)."

I think this methodology is pretty suspicious. There are more ways to be a full-time EA (FTEA) than working at an EA org, or even earning to give. Suppose someone spends their time working on, say, poverty out of a desire to do the most good, and thus works at a development NGO or for a government. Neither development NGOs nor governments will count as 'EA orgs' on your definition, because they won't be posting updates to the EA newsletter. Why would they? The EA community has very little comparative advantage in solving poverty, so what would be the point of, say, Oxfam or DFID sending update reports to the EA newsletter? It would frankly be bizarre for a government department to update the EA community. We might say "ah, but people who work on poverty aren't really EAs", but that would just beg the question.

Comment author: Jan_Kulveit 05 May 2018 07:04:30AM 3 points

I think that while this headcount is not a good metric for how to allocate space in the EA handbook, it is quite a valuable overview in itself!

Just as a caveat, the numbers should not be directly compared to the numbers from the EA Survey, as the latter also included cause prioritisation, rationality, meta, politics, and more.

(Using such categories, some organisations would end up classified in different boxes.)

Comment author: RandomEA 05 May 2018 12:15:07PM 2 points

I think your list undercounts the number of animal-focused EAs. For example, it excludes Sentience Politics, which provided updates through the EA newsletter in September 2016, January 2017, and July 2017. It also excludes the Good Food Institute, an organization which describes itself as "founded to apply the principles of effective altruism (EA) to change our food system." While GFI does not provide updates through the EA newsletter, its job openings are mentioned in the December 2017, January 2018, and March 2018 newsletters. Additionally, it excludes organizations like the Humane League, which, while not explicitly EA, have been described as having a "largely utilitarian worldview." Though the Humane League does not provide updates through the EA newsletter, its job openings are mentioned in the April 2017, February 2018, and March 2018 newsletters.

Perhaps the argument for excluding GFI and the Humane League (while including direct-work organizations in the long-term-future space) is that relatively few people at direct-work animal organizations identify as EAs (while most people at direct-work long-term-future organizations identify as EAs). If this is the reason, I think it'd be good for someone to provide evidence for it. Also, if the idea behind this method of counting is to look at the revealed preferences of EAs, then I think people earning to give have to be included, especially since earning to give appears to be more useful for farm animal welfare than for long-term-future causes.

(Most of the above also applies to global health organizations.)

Comment author: Gregory_Lewis 06 May 2018 12:43:37PM 1 point

I picked the 'updates' purely in the interests of time (they are easier to skim), because they give some sense of which orgs are considered 'EA orgs' rather than 'orgs doing EA work' (a distinction I accept is imprecise: would a GiveWell top charity 'count'?), and because I (forlornly) hoped that pointing to a method, however brief, would forestall suspicion about cherry-picking.

I meant the quick-and-dirty data gathering to be more an indicative sample than a census. I'd therefore expect a significant margin of error (but not so significant as to change the bottom line). Other relevant candidate groups are also left out: BERI, Charity Science, Founder's Pledge, ?ALLFED. I'd expect there are more.