Comment author: byanyothername 11 June 2018 05:12:30PM 2 points

Do you have some quick thoughts on when you expect your coaching to be more valuable to EAs than your average productivity coaching? E.g. what factors are relevant when deciding between paying for coaching with you and doing a Google search to pick a productivity coach with similar/better prices/experience/reviews?

(Although re: prices, given that you've received an EA Grant for this work, we should be careful not to double-count.)

Comment author: byanyothername 11 June 2018 04:59:37PM 1 point

Just want to highlight the bit where you describe how you exceeded your goals (at least, that's my takeaway):

As our Gran Canaria "pilot camp" grew in ambition, we implicitly worked towards the outcomes we expected to see for the “main camp”:

  1. Three or more draft papers have been written that are considered to be promising by the research community.
  2. Three or more researchers who participated in the project would obtain funding or a research role in AI safety/strategy in the year following the camp.

It is too soon to say whether the first goal will be met, although with one paper in preparation and one team having already obtained funding, it is looking plausible. The second goal was already met less than a month after the camp.

Congrats!

Comment author: byanyothername 11 June 2018 04:47:38PM * 1 point

evidence in this study points to an estimate of $310 per pig year saved

Christ, I'd give up pork for 4 years for that price. Any takers? 10% discount if it's in the next 24 hours; I'm pretty cash-strapped at the moment.

Comment author: Peter_Hurford (EA Profile) 08 June 2018 05:18:50AM 0 points

I'm not sure that these are analogous enough to directly compare them in this way.

Every time we do cost-effectiveness analysis we need to make philosophical judgment calls about what we value. I agree that these "$ per thing" measures can be crude and are meant more for illustrative purposes than as a rigorous, binding, rationally compelling comparison. People could feel free to disagree and think that pig years saved are far more important (perhaps due to preference utilitarianism, or thinking the suffering averted is far more intense, etc.).

Despite this, we are faced with a genuine choice here and need some way to navigate that choice, even if we may do that with different values and philosophical backgrounds in mind.

Especially the part where video outperforms VR possibly due to a negative multiplier on VR.

I'm not sure how seriously I would take that proposition -- it appears to be largely guesswork. This study did not find any statistically significant difference in either direction between 360 VR and 2D video, and both Faunalytics and Animal Equality leave open the possibility that novelty effects not captured in this study may still make 360 VR more compelling. Given my assessment that they're roughly equal in cost per person reached, I would not try to make a case for 2D video over 360 VR.

Comment author: byanyothername 11 June 2018 04:38:18PM * 0 points

Despite this, we are faced with a genuine choice here and need some way to navigate that choice, even if we may do that with different values and philosophical backgrounds in mind.

Of course. But we're comparing two such different things here that I wouldn't claim things like ". . . an estimate of $310 per pig year saved . . . which is worse than human-focused interventions even from a species neutral perspective". To me, that's much worse than saying something like, "it costs $300 to provide biweekly CBT for a depressed Kenyan for a month and $50 to provide a daily hot meal for a homeless American for a month, so the former is worse than the latter even from a nationality neutral perspective" - which you wouldn't say.

Comment author: Joey 06 June 2018 05:42:31PM 3 points

I expect ~10 people to attend the camp, although I do not expect 100% of them to start charities (I would guess ~60% would). Of the charities founded, I expect about 50% would be GiveWell incubation/ACE recommended, although that would depend on the year and focus.

Comment author: byanyothername 11 June 2018 03:53:56PM 0 points

Of the charities founded, I expect about 50% would be GiveWell incubation/ACE recommended.

Is there anything more you can say about why you think this? (Feel free to ignore the question - I don't have anything substantial to say about why this seems high to me.)

Comment author: byanyothername 11 June 2018 02:12:52PM 0 points

Thank you for this data point!

I’m glad to see there are three other mental health apps being created by EAs - Mind Ease (Peter Brietbart, stress and anxiety), Inuka (Robin van Dalen, support chat app) and UpLift.us (Eddie Liu, depression)

There's also Moodimodo (Mike P Sinn), which I used for a couple of years after trying several other mood-tracking apps and disliking them. It was only later that I realised the connection.

Comment author: Denise_Melchin 20 May 2018 11:42:00PM * 23 points

Thanks for trying to get a clearer handle on this issue by splitting it up by cause area.

One gripe I have with this debate is the focus on EA orgs. Effective Altruism is, or should be, about doing the most good. Organisations which are explicitly labelled Effective Altruist are only a small part of that. Claiming that EA is now more talent constrained than funding constrained implicitly means that Effective Altruist orgs are more talent than funding constrained.

Whether 'doing the most good' in the world is more talent than funding constrained is much harder to prove, but it is the question that actually matters.

If we focus the debate on EA orgs and our general vision as a movement on orgs that are labelled EA, the EA Community runs the risk of overlooking efforts and opportunities which aren't branded EA.

Of course fixing global poverty takes more than ten people working on the problem. Filling the funding gap for GiveWell-recommended charities won't be enough to fix it either. Using EA-branded framing isn't unique to you - but it can make us lose track of the bigger picture: all the problems that still need to be solved, and all the funding that is still needed to solve them.

If you want to focus on fixing global poverty, the fact that EA focuses on GW-recommended charities doesn't mean EtG is the best approach - how about training to be a development economist instead? The world still needs more than ten additional development economists. (Edit: But it is not obvious to me whether global poverty as a whole is more talent or funding constrained - you'd need to poll leading people who actually work in the field, e.g. leading development economists or development professors.)

Comment author: byanyothername 11 June 2018 01:47:35PM * -1 points

I had a similar reaction.

It was the choice of "Money gap - Large (~$86 million)" in the summary that got me. If you think that Earning To Give to some global poverty charities is on a par with other common EA career choices in terms of marginal impact (i.e. assuming you think "poverty" should be on the table at all for us), it seems immediately odd that the size of this funding gap works out to the equivalent of only ~$0.086 per person for the bottom billion. And in fact the linked post gives a funding gap of something more like $400 million for GiveWell's top charities alone (on top of expected funding from Good Ventures and donors who aren't influenced by GiveWell), with GiveDirectly able to absorb "over 100 million dollars". But it's not so odd if you think that the expected value of donating to GiveWell-recommended charities is several orders of magnitude greater than for the average global poverty charity. I'm aware that heavy-tailed distributions are probably at play here, but I'm very skeptical that GiveWell has found anywhere near the end of that tail (although I think they're the best we have).
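
As a rough back-of-the-envelope check of the arithmetic above (a minimal sketch in Python; the ~$86 million gap, the ~$400 million GiveWell figure, and a "bottom billion" of roughly one billion people are the approximate figures quoted in this comment, nothing more):

  # Back-of-the-envelope check of the per-person figures quoted above.
  # All inputs are the approximate numbers mentioned in the comment.
  money_gap_usd = 86_000_000        # "~$86 million" money gap from the summary
  givewell_gap_usd = 400_000_000    # "~$400 million" gap for GiveWell top charities
  bottom_billion = 1_000_000_000    # rough size of the "bottom billion"

  print(f"${money_gap_usd / bottom_billion:.3f} per person")    # ~$0.086
  print(f"${givewell_gap_usd / bottom_billion:.2f} per person")  # ~$0.40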

Regardless of what the author meant, I think I see this kind of thinking in EA fairly regularly, and it's encouraged by giving the "neglectedness" criterion such prominence, perhaps unduly.

And yes, I also want to thank the author for encouraging people to think and talk about this in a more nuanced way.

Comment author: byanyothername 11 June 2018 12:59:14PM 1 point

The funding is fairly centralized, with Open Phil and the AR Funds being run by the same person (Lewis); together they control nearly 50% of all funding in AR.

If this is true, I just want to take a moment to celebrate that the EA movement has more or less doubled animal rights funding globally. That's awesome!
