Comment author: Evan_Gaensbauer 24 March 2017 09:22:31PM 4 points [-]

Upvoted. Maybe this is just typical of academic style, but when you address Gabriel it seems you're attributing the points of view presented to him personally, whereas when I read the original paper I got the impression he was collating and clarifying existing criticisms of effective altruism so they'd be coherent enough to effect change. One thing about mainstream journalism is that it's usually written for the specific type of audience the editor has in mind, even when the outlet claims to be seeking a general audience, and so it spins things a certain way. While criticisms of EA definitely aren't what I'd call sensationalistic, they're often written in the style of a rhetorical list of likes and dislikes about the EA movement. It's taken for granted that the author's implied position is somehow a better alternative to what EA is currently doing, as if no explanation were needed.

Gabriel fixes this by writing up the criticisms of EA in a way that makes clear what about the movement would need to change to satisfy critics, if we were indeed to agree with them. Really, except for the pieces published in the Boston Review, I feel like most criticisms of EA weren't written for EA at all, but rather as a review of EA for other do-gooders, warning them to stay away from the movement. It's not the job of critics to solve all our problems for us, but for a movement that is at least willing to try to change in the face of criticism, it's frustrating that nobody takes us up on that opportunity, given whatever blind spots we may have, and tries to be constructive.

In response to Open Thread #36
Comment author: Evan_Gaensbauer 17 March 2017 03:26:28AM 3 points [-]

As people age, their lives become more difficult; physically and mentally, they just aren't where they once were. Most effective altruists are younger people, and they may not take into consideration how risky it can be to have no savings cushion in case things change. We can't necessarily count on pension plans to cover us in our old age. We can't assume our health will always be what it is now. A lot of people will face harder times in the future, and encouraging a mindset of assuming one won't face personal hardship, and so need not save money, is reckless.

It's one thing if someone aspires to be wealthy, to retire at age 30 like Mr. Money Mustache, or to live a luxurious retirement. But it's dangerous to create a culture in EA where people might be accused of hypocrisy for saving even enough for retirement to cover their own basic living expenses. It's also dangerous to presume that each of our lives will go so smoothly that we can work until we die, or that we won't get sick. While talking about these things in the abstract may be well and fine, I want to register my conviction that using social influence, i.e., peer pressure, alone to normalize "don't/no need to save for retirement" as practical advice among effective altruists is potentially dangerous.

Comment author: RobBensinger 07 February 2017 10:36:23PM -1 points [-]

Anonymous #2:

I'd prefer it if more people in EA were paid on a contract basis, if more people were paid lower salaries, if there were more mechanisms for the transfer of power in organizations (e.g., a 2- or 3-year term limit for CEOs and a maximum age at entry), and if there were more direct donations. Also: better systems to attract young people. More people in biology. More optimism. More willingness to broadcast arguments against working on animal welfare that have not been refuted.

Comment author: Evan_Gaensbauer 22 February 2017 05:48:12AM 1 point [-]

I originally downvoted this comment because some of the suggestions obviously suck, but some of the points here could be improved upon.

I'd prefer it if more people in EA were paid on a contract basis.

There are a lot of effective altruists whose ideas are just as good as anyone's working at an EA non-profit or a university, but who, due to a variety of circumstances, aren't able to land those jobs. Some effective altruists already run Patreons for their blogs, and I think the material coming out of them is decent, especially as they can lend voices independent of institutions to some EA subjects. They also have the time to cover or criticize certain topics that other effective altruists can't, since those people's effort is taken up by a single research focus.

I'd prefer it if ... more people were paid lower salaries.

Nothing can be done about this criticism if no numbers are given. Criticizing certain individuals for getting paid too much, or certain organizations for paying their staff too much, isn't actionable unless one gets specific. I know EA organizations whose staff, including the founders who decide the budget, essentially get paid minimum wage. On the other hand, GiveWell's cofounders Holden and Elie get paid well into the six figures each year. While I don't much care myself, I've privately chatted with people who perceive this as problematic. Then there may be staff at some EA organizations who appear to others to be paid more than they deserve, especially when their salaries could cover one or more full-time salaries for other individuals perceived to be just as competent. That last statement was full of conditionals, I know, but it's something I'm guessing the anonymous commenter was concerned about.

if there were more mechanisms for the transfer of power in organizations (e.g., a 2- or 3-year term limit for CEOs and a maximum age at entry),

Again, they'd need to be specific about which organization they're talking about. The biggest problem with this comment is that the commenter made broad, vague generalizations which aren't actionable. It's uncomfortable to make specific criticisms of individuals or organizations, yes, but the point of anonymous criticism is to be able to do that, if it's really necessary, with virtual impunity, while bad commentary which amounts to character assassination can easily be written off without a flamewar ensuing or feelings getting as hurt.

Anyway, I too can sympathize with demands for more accountability, governance, and oversight at EA organizations. For example, many effective altruists have been concerned time and again with the influence of major organizations like the Centre for Effective Altruism which, even if it's not their intent, may be perceived to represent and speak for the movement as a whole. This could be a problem. However, while EA need not only be a social movement predicated on and mediated through registered NPOs, it by and large is, and will continue to be in practice, as are most social movements that are at all centralized. Making special asks for governance at these organizations to become more democratic, without posting to the EA Forum directly and without making the suggestions consistent with how NPOs actually operate in a given jurisdiction, will just not result in change. These suggestions really stand out considering they're more specific than anything I've seen anyone call for, as if this were a desperate problem in EA, when at most I've seen similar sentiments expressed as vague concerns on the EA Forum.

and if there were more direct donations.

The EA Forum and other channels like the 'Effective Altruism' Facebook group appear dominated by fundraisers and commentary on and from metacharities because those are literally some of the only appropriate outlets for metacharities to fundraise or publish transparency reports. Indeed, that they're posting material besides fundraisers beyond their own websites is a good sign, as it's the sort of transparency and peer review the movement at large would demand of metacharities. Nonetheless, between this and the constant chatter about metacharities on social media, I can see how the perception arises that most donations are indirect and go to metacharities. However, this may be illusory. The 2015 EA Survey, the latest for which results are available, shows effective altruists overwhelmingly donate to GiveWell's recommended charities. Data isn't available on the amounts of money self-identified effective altruists are moving to each of these charities, so it's possible lots of effective altruists earning to give are making primarily indirect donations. Anecdotally, though, this doesn't seem to be the case. If one wants to make that case, and then mount a criticism based on it, one must substantiate it with evidence.

Comment author: Evan_Gaensbauer 30 January 2017 09:23:42PM 1 point [-]

If effective altruists aren't perfect utilitarians because they're human, and humans can't be perfect utilitarians because they're human, maybe the problem is effective altruists trying to be perfect utilitarians despite their inability to do so, and that's why they make mistakes. What do you think of that?

Comment author: RyanCarey 12 January 2017 09:54:57PM 8 points [-]

I strongly agree with the points Ben Hoffman has been making (mostly in the other threads) about the epistemic problems caused by holding criticism to a higher standard than praise. I also think that we should be fairly mindful that providing public criticism can have a high social cost to the person making the criticism, even though they are providing a public service.

This is completely true.

I personally have a number of criticisms of EA (despite overall being a strong proponent of the movement) that I am fairly unlikely to share publicly, due to the following dynamic: anything I write that wouldn't incur unacceptably high social costs would have to be a highly watered-down version of the original point, and/or involve so much of my time to write carefully that it wouldn't be worthwhile.

There are at least a dozen people for whom this is true.

Comment author: Evan_Gaensbauer 13 January 2017 01:47:15PM 3 points [-]

I feel like this is true for me too. I'd guess I've got more spare time on my hands than you guys, and I don't currently work for any EA charities. It's really hard to make your beliefs pay rent when you're in near mode and constantly worried that if you screw up a criticism you'll lose connections and get ostracized, or that you'll hurt the trajectory of a cause or charity you like by association, because as much as we like to say we're debiased, a lot of the time affective rationalizations sneak into our motivations. We all come from different walks of life, and a lot of us haven't been in communities trying to be as intellectually honest and epistemically virtuous as EA tries to be. It's hard to overcome the habit of keeping our guard up, because everywhere else in life our new ideas are treated utterly uncharitably, worse than anything in EA on a regular day. It's hard to unlearn those patterns. We as a community need to find ways to trust each other more, but that takes a lot of work, and it will take a while.

In the meantime, I don't have a lot to lose by criticizing EA, or at least I can take a hit pretty well. I mean, maybe there are social opportunity costs, things I won't be able to do in the future if I become low-status, but I'm confident I'm the sort of person who can create new opportunities for himself. So I'm not worried about me, and I don't think anyone else should be either. I've never had a cause selection. Honestly, it has felt weird to talk about, but this whole model-uncertainty-between-causes thing people are going for now is something I've implicitly grasped the whole time. I never understood why everyone was so confident in their views on causes when a bunch of this stuff requires figuring out things about consciousness, or the value of future lives, which seem like philosophically and historically mind-boggling puzzles to me.

If you go to my EA Hub profile, you'll notice the biggest donation I've made was $1,000 to GiveWell in 2014 for unrestricted funds. That was because I knew those funds would increase the pool of money for starting the Open Philanthropy Project, and it was matched. You'll also notice I select pretty much every cause as something to consider, as I'm paranoid about myself or EA in general missing out on important information. All I can say about my politics is that I'm a civil libertarian, and otherwise I don't get offended by reading things written by people who earnestly want to improve EA. I hope you'll take my word that I didn't just edit my EA Hub profile now. That's the best badge I've got to show I really try to stay neutral.

If anyone wants to privately and/or anonymously send me their thoughts on an EA organization and what it's doing wrong, no matter what it is, I'll give my honest feedback, and we can have a back-and-forth and hopefully hammer something out to be published. I also don't particularly favour any EA org right now, as I feel like a lot of these organizations are staffed by people who've only been in academia or the software industry, or who are starting non-profits right out of college, and who might just not have the type or diversity of experience to make good plans and models on their own, or the skills for dealing with different types of people and getting things done. I've thought for a while that all these organizations have at different points made mistakes, little or big, which are really hard to talk about in public, and it feels a bit absurd to me that they're never talked about.

Feel free to send me stuff. Please don't send me stuff about interpersonal drama. Treat what you send me like filing a bug report.

Comment author: Evan_Gaensbauer 06 January 2017 12:52:29PM 4 points [-]

If you're looking to donate a small sum to animal charities, you could always look at Animal Charity Evaluators' (ACE) recommendations. Aside from that, 80,000 Hours borrowed some recommendations from Lewis Bollard, Open Phil's Farm Animal Welfare Program Officer; he also recommends donating to ACE. I recall Michael Dickens has said he currently thinks the Good Food Institute (GFI) is the best option for giving, though he wouldn't discourage giving to ACE or Mercy For Animals (MFA) either. You can learn more in his own cause/donation selection write-up. I don't know whether ACE regrants donations it receives directly to its recommended charities, the way GiveWell does for its own.

I've been reading some comments by John Maxwell recently which give me pause about effective altruists donating to political advocacy. John explains that political advocacy is one of the least neglected focus areas, at least in the straightforward form of contributing to political campaigns, and that it's zero-sum in a way which makes getting good information hard. He's also commented that perhaps EA should look more into reforming journalism for the better. I know Peter Hurford has thought about this some as well. Feel free to contact either of them.

Comment author: Evan_Gaensbauer 05 January 2017 04:20:21AM 9 points [-]

One thing that helps ground me in this regard is to think about it in terms of individual lives saved. At the end of the day, if you only donate enough to save one life while others donate enough to save many more, you still saved a life that nobody else would have saved, or else they would have donated more. It doesn't matter how one compares to others; what matters is how many lives you saved that you wouldn't have saved had you made different choices.

Comment author: Evan_Gaensbauer 28 December 2016 10:15:34AM 2 points [-]

The way your fundraising page presents how much money CEA is trying to raise confused me. First of all, it switches between representing amounts in dollars and in pounds. This isn't a big deal, but I thought I'd let you know it's momentarily jarring when the currency of the amount being requested switches so often. I think readers can convert currencies themselves if need be.

Anyway, it says CEA is seeking $3.1 million as the 'Minimum Target' for how much it's trying to raise. But that's the minimum target for CEA to expand beyond its current scope of activity. The budget summary says the amount CEA needs to raise to cover the continuation of its regular suite of activities in 2017 is £1,860,517. As of this writing, that comes out to $2,277,905. It took me a while to figure out that the ~$2.3 million figure was for continuing ongoing operations, and I guessed the remaining ~$800k USD would be for the ambitious expansion of more speculative but successful projects, like marketing, EAGx grants, and EA chapter grants. But I noticed that's already accounted for in the budget summary as well.
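For concreteness, here's the rough arithmetic behind my confusion, as a minimal sketch in Python. The exchange rate is just the approximate GBP-to-USD rate implied by the two figures above, and the variable names are my own; none of this comes from CEA's document itself.

```python
# Back-of-envelope check of the figures quoted above. The exchange rate is the
# approximate GBP -> USD rate implied by those figures (~1.224), not an
# official or current rate.
gbp_ongoing = 1_860_517          # stated 2017 budget for ongoing activities, in GBP
usd_per_gbp = 1.224              # assumed conversion rate
usd_ongoing = gbp_ongoing * usd_per_gbp

usd_minimum_target = 3_100_000   # 'Minimum Target' from the fundraising page

print(f"Ongoing activities:  ~${usd_ongoing:,.0f}")                       # ~$2.28 million
print(f"Remaining to target: ~${usd_minimum_target - usd_ongoing:,.0f}")  # ~$0.8 million
```

That remaining ~$0.8 million is what I initially assumed was for the expansion projects, which the budget summary seems to already include.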

So, pardon me for saying so, but I'm confused as to what CEA's intentions are with the 'Minimum Target' and the 'Growth Target'. I think I'm missing something, or the document doesn't make clear which items in CEA's 2017 budget the funding from these targets, if reached, would be used for. Could you please clarify?

Comment author: Sean_o_h 07 December 2016 01:40:16PM *  3 points [-]

If enough people feel the same as Michael, is there a case for having a forum subsection where e.g. updates/fundraising/recruitment calls for EA orgs could live?

Disadvantages I could see:

  • 'branching' the forum complicates and clutters it at a point where there still isn't a huge amount of volume to support/justify such structures.

  • these updates might get less visibility.

Advantages (in addition to the forum being kept more free for discussion of EA ideas):

  • These updates would then all be clustered in one place, making it easier for a reader to get an overview of orgs without digging through the forum's history.

Comment author: Evan_Gaensbauer 18 December 2016 11:10:20AM 0 points [-]

I could make a links post of all EA orgs' (semi-)annual reviews (if they have one up), and make it its own top-level post.

Comment author: Joey 08 December 2016 07:02:51PM 6 points [-]

Given all the interest in this topic (fairly unrelated to the top post), I wonder if it makes sense to do a separate post/survey on what the ideal posting frequency for EA orgs on the EA Forum would be. I know CS would be very responsive to information on this, and I suspect all the other EA orgs would be as well.

It also seems a bit hard to deal with criticism that falls along somewhat contradictory lines: a) you're not being transparent enough, I want more things like the monthly update, and b) you're too spammy, I want to see fewer things like the monthly update. (I know there is a difference between the number of posts and the information given, but limiting the number of posts does make it harder.)

Comment author: Evan_Gaensbauer 18 December 2016 11:09:01AM 0 points [-]

Who would you suggest run such a survey? Usually, these sorts of things would be run by EA orgs, but in this case I'd be wary of almost any EA org running it since they've got such strong institutional motivations/incentives to interpret or present the data in a biased way.
