Comment author: JoshP 08 May 2018 09:56:43PM 0 points [-]

Haven't read all of it, but I believe there's an error in the first line, which says this is the "second of three parts"; I think it means third. Sorry my engagement isn't more interesting :P

Comment author: Peter_Hurford  (EA Profile) 08 May 2018 10:46:33PM *  2 points [-]

Fixed. Thanks!

6

Cost-Effectiveness of Vaccines: Appendices and Endnotes

This essay was jointly written by Peter Hurford and Marcus A. Davis. Note that because of technical length restrictions on the EA Forum, this essay is the third part of three parts: Part 1, Part 2, and Part 3. To see all three parts in one part, you can view...
8

Cost-Effectiveness of Vaccines: Exploring Model Uncertainty and Takeaways

This essay was jointly written by Peter Hurford and Marcus A. Davis. Note that because of technical length restrictions on the EA Forum, this essay is the second part of three parts: Part 1, Part 2, and Part 3. To see all three parts in one part, you can...
7

What is the cost-effectiveness of researching vaccines?

This essay was jointly written by Peter Hurford and Marcus A. Davis. Note that because of technical length restrictions on the EA Forum, this essay is broken up into three parts: Part 1, Part 2, and Part 3. To see all three parts in one part, you can view...
Comment author: Peter_Hurford  (EA Profile) 04 May 2018 01:40:32AM *  19 points [-]

I find it so interesting that people on the EA Facebook page have generally been a lot more critical of the content than people here on the EA Forum -- here it's all just typos and formatting issues.

I'll admit that I was one of the people who saw this here on the EA Forum first and was disappointed, but chose not to say anything out of a desire to not rock the boat. But now that I see others are concerned, I will echo my concerns too and magnify them here -- I don't feel like this handbook represents EA as I understand it.

By page count, AI is 45.7% of the entire causes section. And as Catherine Low pointed out, in both the animal and the global poverty articles (which I didn't count toward that page count), more than half the article was dedicated to why we might not choose this cause area, with much of that space also focused on the far future of humanity. It would be hard for anyone to read this and not take away that the community consensus is that AI risk is clearly the most important thing to focus on.

I feel like I get it. I recognize that CEA and 80K have a right to have strong opinions about cause prioritization. I also recognize that they've worked hard to become such a strong central pillar of EA as they have. I also recognize that a lot of people that CEA and 80K are familiar with agree with them. But now I can't personally help but feel like CEA is using their position of relative strength to essentially dominate the conversation and claim it is the community consensus.

I agree the definition of "EA" here is itself the area of concern. It's very easy for any of us to define "EA" as we see it and naturally make claims about the preferences of the community. But this would be very clearly circular. I'd be tempted to defer to the EA Survey, where AI was the top cause for only 16% of respondents. Even among those employed full-time in a non-profit (maybe a proxy for full-time EAs), it was the top priority of 11.26%, compared to 44.22% for poverty and 6.46% for animal welfare. But naturally I'd be biased toward using these results, and I'm definitely sympathetic to the idea that EA should be considered more narrowly, or that we should weight the opinions of people working on it full-time more heavily. So I'm unsure. Even my opinions here are circular, by my own admission.

But I think if we're going to be claiming in a community space to speak for the community, we should be more thoughtful about whose opinions we're including and excluding. It seems pretty inexpensive to re-weight the handbook to emphasize AI risk just as much without being as clearly jarring about it (e.g., by not dedicating three chapters to it instead of one, or by not slanting the "reasons not to prioritize this cause" sections so clearly toward AI risk).

Based on this, and the general sentiment, I'd echo Scott Weather's comment on the Facebook group that it’s pretty disingenuous to represent CEA’s views as the views of the entire community writ large, however you want to define that. I agree; I would have preferred it be called “CEA’s Guide to Effective Altruism” or something similar.

Comment author: Gregory_Lewis 02 May 2018 06:10:23PM 4 points [-]

Thanks for the even-handed explication of an interesting idea.

I appreciate that the example you gave was meant more as illustration than proposal. I nonetheless wonder whether further examination of the underlying problem might lead to ideas more tightly tailored to the limitations you identify.

You note this set of challenges:

  1. Open Phil targets larger grantees
  2. EA funds/grants have limited evaluation capacity
  3. Peripheral EAs tend to channel funding to more central groups
  4. Core groups may have trouble evaluating people, which is often an important factor in whether to fund projects.

The result is that a good person (but not known to the right people) with a good small idea is nonetheless left out in the cold.

I'm less sure about #2 -- or rather, whether this is the key limitation. Max Dalton wrote on one of the linked FB threads:

In the first round of EA Grants, we were somewhat limited by staff time and funding, but we were also limited by the number of projects we were excited about funding. For instance, time constraints were not the main limiting factor on the percentage of people we interviewed. We are currently hiring for a part-time grants evaluator to help us to run EA Grants this year[...]

FWIW (and non-resiliently), I don't look around and see lots of promising but funding-starved projects. More relevantly, I don't review recent history and find lots of cases of stuff that was rejected by major funders, then supported by more peripheral funders, and is now doing really exciting things.

If I'm wrong about that, then the idea here (in essence, crowd-sourcing evaluation to respected people in the community) could help. Yet it doesn't seem to address #3 or #4.

If most of the money (even from the community) ends up going through the 'core' funnel, then a competitive approach would be advocating for these groups to change their strategy, rather than providing a parallel route and hoping funders will come.

More importantly, if funders generally want to 'find good people', crowd-sourced project evaluation only helps so much. For people more on the periphery of the community, this uncertainty from funders will remain even if the anonymised feedback on the project is very positive.

Per Michael, I'm not sure what this idea offers over (say) posting a 'pitch' on this forum, doing a Kickstarter, etc.

Comment author: Peter_Hurford  (EA Profile) 03 May 2018 04:31:46AM 0 points [-]

FWIW (and non-resiliently), I don't look around and see lots of promising but funding-starved projects.

I'd be curious to see the reject list for EA Grants.

I think EA Grants is a great idea for essentially crowdsourcing projects, but it would be nice to have more transparency around how the funding decisions are made, as well as maybe the opportunity for people with different approaches to see and fund rejected grants.

Comment author: Denise_Melchin 02 May 2018 10:58:34PM 1 point [-]

We do not disagree much, then! The difference seems to come down to what the funding situation actually is, not how it should be.

I see a lot more than a couple of funders per cause area -- why are you not counting all the EtGers? Most projects don’t need access to large funders.

Comment author: Peter_Hurford  (EA Profile) 03 May 2018 04:02:28AM *  0 points [-]

Glad to hear we agree! :)

why are you not counting all the EtGers?

I'm a bit out of the loop, but my assumption is that there are far fewer EtGers these days and that they're not easy to find. I'm unsold on a crowdfunding platform as a solution, but I do think that identifying funders for your project is not an easy task, and there might be an opportunity to improve the diversity and accessibility of this EtG pool.

Comment author: Peter_Hurford  (EA Profile) 02 May 2018 05:34:51PM 4 points [-]

Maybe my view of the landscape is naive, but it appears to me that a lot of spaces these days effectively have just one or two funders who can actually fund a project (e.g., Elie for poverty interventions, Lewis + ACE for nonhuman animal interventions, Nick for AI interventions, and Nick + CEA for community projects, and I imagine these last two groups confer significantly). I don't think we need dozens of funders, but I think the optimal number would be closer to three or four people who think somewhat differently and confer only loosely, rather than one or two.

Comment author: Denise_Melchin 02 May 2018 10:24:25AM 1 point [-]

I don’t think of having a (very) limited pool of funders who judge your project as such a negative thing. As has been pointed out before, evaluating projects is very time-intensive.

You’re also implicitly assuming that there’s little information in rejections from funders. I think if you have been rejected by 3+ funders, and hopefully got a good sense of why, you should seriously reconsider your project.

Otherwise you might fall prey to the unilateralist’s curse: most people think your project is not worth funding, possibly because it has some risk of causing harm (either directly, or indirectly by stopping others from taking up a similar space), but you only need one person who is not dissuaded by that.

Comment author: Peter_Hurford  (EA Profile) 02 May 2018 05:30:10PM 4 points [-]

I don’t think of having a (very) limited pool of funders who judge your project as such a negative thing. As has been pointed out before, evaluating projects is very time-intensive.

I like the reduction of high time costs and the specialization of trade, but a small pool of funders means that (a) if they don't have time for you, your project dies, and (b) if they don't share your theory of change, your project dies.

On (a), it does seem like staff time bottlenecks have prevented a lot of funding from going to a lot of good projects (see EA Funds).

On (b), I admit that there's a fine line between "this person is wrong and their project just shouldn't happen" and "this person has a good idea but it just isn't recognized by the few funders". It does seem to me, however, that the current funding system does have some groupthink around certain policies (e.g., "hits-based giving") that may not universally select every good project and reject every bad project. It would be nice for there to be somewhat more worldview diversification in what can get funded, and I'm seeing a lot of gaps here.

Comment author: Peter_Hurford  (EA Profile) 02 May 2018 12:30:37AM 11 points [-]

I like CEA's work and people a lot, but I envision a world where they're not the only group able to, and trusted to, lead community projects.
