There are thin and thick versions of EA[1]. The thinnest version is that EA is simply doing good effectively. The thickest versions look at the beliefs held by the majority of the community and say that these beliefs are what EA is. Critics of EA often take a thick version: they may describe EA as affluent, elitist, utilitarian, dismissive of systemic change, averse to hard-to-evidence interventions, supportive of earning to give, and so on.


It appears to me that there are two very easy, common counters to criticisms of EA:

  1. Bite the bullet and say, ‘Yes, you are right, EA has this feature, but that is actually a good thing’. For example, standing up for the idea that earning to give can be a sensible career route for many altruistic people.
  2. Proffer thin EA and say, ‘No, this is not actually part of EA, as EA is just doing good effectively’. For example, pointing out that nothing core to EA precludes anyone from supporting or donating to systemic change causes.


So, firstly, it could be worth noting when there is no consensus within the EA community about whether to respond to a criticism with approach 1 or approach 2. Criticisms where responses are divided could be potentially useful areas for further research. (For example, is there actually any good evidence that systemic change interventions are a waste of time?)


Secondly, if we want to address criticisms fairly, we should be aware of how easy it is to dismiss them with one of the above responses, and think more about what we can learn from each criticism and why people feel this way about EA. For example, here is my guess at what we can learn from / what the element of truth is in a few EA criticisms.

  • EA is elitist: EAs want to create change quickly. Sometimes this means concentrating EA outreach on global elites / top schools / etc. This may be good for each individual project, but collectively it risks making EA appear unwelcoming and systematically creating a bias towards particular worldviews.
  • EA rejects systemic change: EA thinking does not have much to add to the very difficult question of how to create systemic change. There may be people whose values should lead them to push for systemic changes, but EA biases them against this route by having little to say on the issue whilst professing to be expert on how to do good.
  • EA does not support people who are not utilitarian: Firstly, the bulk of online EA literature is written in a way that appeals to utilitarians but not nearly as much to people with other value sets. Secondly, in a similar manner to donor coordination problems in EA, there may be a problem of utilitarian EAs being unwilling to help people with different values do good effectively, with the result that both sets of people lose out. (This is actually the topic I set out to write on, and I may write more on it.)


On the other hand, I am not sure how useful it is to worry too much about learning something from every little criticism of EA. For example, there are still more useful areas of prioritisation research than the question above on the value of systemic change. (Responding in detail to criticisms may be analogous to caring more about distinguishing the 5th and 6th best charities than the 1st and 2nd best charities.)

 

Summary / TL;DR
• Noting when there is no consensus within the EA community about how to respond to a common criticism could indicate a useful area for further research.
• We (as individuals and as a community) could think more about what we can learn from various criticisms of EA.

 

 

[1] Idea taken from Iason Gabriel, see: http://effective-altruism.com/ea/lb/iason_gabriel_writes_whats_wrong_with_effective/

 

Comments

One approach, as with utilitarianism, would be to ask the person whether they want to analyze the question intuitively or critically.

If critically, then bullet-biting should be accepted.

If intuitively, then it's useful to have intuitions against elitism and in favour of collectivism.

http://www.utilitarian.net/hare/by/1981----.htm