
Epistemic Status: Relatively confident but also willing to take advantage of Cunningham's Law if I'm wrong. May lack deep knowledge of EA efforts in this area.

About me:

I'm a sometime EA looking to dive into the EA world more fully, including potentially shifting my career to work in EA. I currently work in US politics and specialize in online movement building and communications. I lean towards near-termist and global health EA causes, although I think the argument below also has long-termist implications.

The Central Premise 

There is a potentially massive method of doing good out there that's mostly ignored.

This method is at the absolute heart of the very concept of Effective Altruism, and yet is rarely discussed in EA communities or spaces.

We should try harder to influence the average non-EA person's donation.


The Current Charitable Landscape

A few quick facts: The United States donated almost $500 billion in 2021 alone. Without listing every individual country, European charitable donations are also on the scale of hundreds of billions. Overall, yearly charitable donations in rich countries worldwide total in the high hundreds of billions of USD.

Most of this money, from an EA perspective, is spent wildly inefficiently. While it's impossible to break down exactly where each of these dollars goes, a little common sense and some basic statistics paint a discouraging picture. Of this giant pile of money, only 6% is donated internationally, despite donations to poor countries usually having a better per-dollar impact than donations inside a rich country. The largest category of donation is religious organizations; the second largest is education. Three quarters of that educational money goes to existing 4-year colleges and universities, and much of that is the stereotypical worst kind of donation: a huge gift to an elite school that already has billions in endowment.

Beyond the statistics, any casual glance at how normal people donate their money confirms this. People give to local schools, their friend's charity, or generally whatever they feel a connection to. My parents, highly charitable people who gave >10% of their income long before it was an EA idea, have made significant charitable donations to a children's puppetry program. This is the landscape in which the better part of a trillion dollars is being spent.

None of this should be surprising to EAs. The core of Effective Altruism is the argument that when you attempt to do good, you should try to be effective in doing so. The equally core fact about the world that EAs recognize is that historically, most people have not been very good at maximizing how much good they do.  For the vast majority of charitable dollars, that's still true.


The Argument for Impact

I believe Effective Altruism should spend more time trying to shift the behavior of the general public. I believe this area has the potential for large impact, and that it's currently neglected as a way to do good.

Scale - Enormous. I won't spend much time on this point, but changes to how hundreds of billions of charitable dollars are given would obviously be huge in scale.

Tractability - This problem is likely more difficult and less tractable than many other cause areas. It's very difficult to simply spin wide-ranging cultural changes into existence. But it's not impossible, and the enormous size of the pile of money mitigates the low tractability. Using some deliberately low numbers: if an effort had even a 1% chance of success, and success meant shifting only 5% of US charitable dollars, that's still an expected $250 million per year going to more effective causes than before, in perpetuity.
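To spell out that expected-value arithmetic (a rough sketch using the roughly $500 billion US figure above, with the 1% success probability and 5% shift treated as purely illustrative assumptions rather than estimates):

$\$500\text{B} \times 5\% \times 1\% = \$0.25\text{B} = \$250\text{M}$ per year in expectation.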

Neglectedness - It's possible I am incorrect that this is a neglected area; I am more of a part-time EA, only loosely plugged in to EA circles rather than deeply embedded in them.

But from my perspective, EA spends an enormous amount of institutional energy on what I'll call 'elite strategies'. This means reaching smaller groups of targeted individuals and convincing them to do an enormous amount of good.  Sometimes this literally means elites - there are EA events I've attended where 75% of the crowd comes from Oxford, Cambridge, Stanford, Harvard, or Princeton.

You could also describe this as a focus on outliers. Don't just improve your donations, find the absolute most worthy cause. Convince a small group of very bright people to dedicate their lives to it. Have a small number of ultra-wealthy folks fund most of our work. This approach has done an enormous amount of good, and I'm very glad it's being done. But it's not the only way to make change happen, and I don't see much systematic effort put towards swaying the general public's giving habits. 

Again, it's possible that I'm just missing an effort that already exists. But if it does exist, I haven't noticed much of it. Shifting public opinion is hard and usually takes a long time, and it's often much easier and more legible to shift small groups of people.


Ideas for Impact

More general public facing arguments - EA needs more people dedicated to getting off the EA forums, outside EA circles, and making our general case for broad EA values in wider venues. I do think that EA has been doing a slightly better job of this recently, with much of it connected to specific book releases. But I don't think there's yet a conscious, dedicated effort to convince the wider general public of EA values. So much EA outreach and publicity is done through EA-adjacent or EA-friendly channels: going on Tyler Cowen's podcast, posting comments on SlateStarCodex, etc. There should be more emphasis on doing interviews, op-eds, and outreach well outside the traditional EA sphere, in places that have never heard these ideas before, and there should be public communicators who specialize in getting EA in front of new audiences.

Meet people where they are - For most normal people, EA arguments will be somewhat persuasive but won't get them to change everything they do.  If you talk to a random person on the street, they're likely to agree that we should be thoughtful about donations and try to make sure they go to causes that do a lot of good. If you tell them that this means they need to donate to wild animal suffering research or AI safety policy or some other obscure EA thing, they're going to wonder if you're in a cult. 

Without criticizing the merits of those ideas, it's undeniable that they sound bizarre to many people. Rather than asking people to jump into the deep end immediately, general public outreach should keep a laser focus on relatable giving efforts and reasonable asks of a normal person. Don't ask them to abandon their current cause, but suggest splitting their dollars between it and a more effective one. Try to get them giving better, not giving the absolute best. One critical way to do that is...

Fund more charity evaluators - GiveWell is one of my favorite EA organizations. But in the same way that normal people aren't going to flock to AI safety, it's hard to get people to switch from donating to their local favorite charity to malaria bed nets or vitamin A supplements for people ten thousand miles away. Instead, we should have significantly more charity evaluators like Animal Charity Evaluators (ACE).

Even if you think animal charities shouldn't be a focus, the fact remains that lots of people like to donate to charities that help animals. Since that's going to happen anyway, we should steer them away from the charity that helps three very photogenic piglets and toward the charity that causes a policy change helping millions of factory farmed pigs. I love the idea of ACE even though none of my own donations go towards animal welfare causes.

People want to give to religious causes, to education, to cancer/AIDS/other disease research, to things that matter to them. Why isn't there an Education Charity Evaluators for people who care about education? Or a Christian/Jewish/Islamic Charity Evaluators for people who are going to make those donations anyway? A Cancer Charity Evaluators for people who lost a loved one to cancer? People are going to donate in these areas. Why aren't we trying to improve those donations?

More evaluators would help along two dimensions. First, some people would read the recommendations and shift their dollars to more effective charities. Second, charities that have rarely if ever been scrutinized would suddenly be under pressure to think about how to pass a cost/benefit analysis, and may improve themselves under that scrutiny.

Pick a fight - I think this may be one of the more controversial ideas, but I think EA needs to pick more fights. EAs tend towards niceness and generally try to stay out of nasty culture war stuff or tribal political battles.  But they should ignore that instinct in at least some instances and start a couple of very public fights. 

If you want to change society's culture, you're going to ruffle some feathers. And that's not necessarily a bad thing - there are bad incumbents and institutions which need to be replaced. Conflicts with entrenched, bad incumbents are unlikely to be resolved without a struggle, and that struggle is likely to turn at least a little bit tribal, political, and nasty.  That's ok. Publicly feuding with a terrible incumbent is still a good idea.

If I were the czar of EA, I'd choose Ivy League university endowments as my fight. Rich people routinely make massive donations to universities which already have tens of billions of dollars in endowments. It's probably one of the single worst possible donations you can make. I'd fund a group called "Not A Single Dollar More" whose stated goal was to permanently end anyone ever giving to any Ivy League school again. Publicly attack everyone involved, shame the universities, shame the donors, organize protests, the works. This would have several benefits. It would generate an enormous amount of press if done right, with most of that press being favorable. The Ivy League is a symbol of elitism and out-of-touch rich people, which would bias the public towards us. And frankly, the fight is just very easy to win on the merits: rich people shouldn't donate $50M to name a new dorm.

Picking fights means creating enemies, but any movement that aims to change society is going to have enemies at some point (if it gets close to success). I'd rather pick that enemy myself. (Side note: even if you disagree with picking fights, that idea isn't central to the broader argument that we should be trying to shift public values.)


A note on longtermism

I am not a long-termist, but I think this project is still valuable for long-termist goals. I think most EAs are initially hooked by the very simple version of EA. They hear about giving more effectively, maybe read the GiveWell recommendations, get curious to learn more, and fall down the rabbit hole. My sense of the long-termist organizing apparatus is that there's a lot of focused effort on university organizing, career help for people who have heard about EA and want to work in EA, etc. But an underrated mechanism for recruiting more long-termists is simply increasing the size of the initial input. Surveys indicate only 2.6-6.7% of the population has even heard of EA. If we can triple that number, I'd expect proportional increases in the number of people involved with long-termist causes, both from a donation standpoint and a career standpoint.


Personal Note

Pre-emptive thanks to anyone who can help refine these ideas. I'll be at EA Global for the first time starting tomorrow, and would be happy to meet anyone interested in developing these ideas with me and figuring out how EA can develop a real strategy to target the general public. I'm looking to potentially make a career jump from politics into EA, and this is a topic I care deeply about and feel well positioned to make an impact in.

Comments (14)

I would prefer it quite a lot if this post didn't have me read multiple paragraphs (plus a title) that feel kind of clickbaity and don't give me any information besides "this one opportunity that Effective Altruists ignore that's worth billions of dollars". I prefer titles on the EA Forum to be descriptive and distinct, whereas this title could be written about probably hundreds of posts here. 

A better title might be "Why aren't EAs spending more effort on influencing individual donations?" or "We should spend more effort on influencing individual donations".

I appreciate this response because it's symbolic of something I think is important.

EA has a lot of internal norms, like any group. It seems like on the EA Forum one of those is to use more factual, descriptive, neutral titles. But elsewhere, the norm is to be attention getting, provocative, etc. You could fairly call this 'clickbait' if you'd like. Clickbait exists because it works. It is startlingly effective, and not just at cheap engagement that dies quickly. It's effective at prompting deep engagement as well. One quick example: video essayists on YouTube who do incredibly informative deep dives on technical subjects still use clickbait titles, image previews, etc. The big channels literally have consultants that help A/B test which reaction face will get more clicks. It doesn't detract from the quality of their videos or the depth of their communities; it's just part of what you have to do to get people to care.

My experience is more in that world. I'm used to phrasing things with the explicit goal to make people click, draw their eyeballs, cause a stir, etc. By using that kind of title on the EA forum, I've probably committed a minor faux pas.  But it actually does help me illustrate the point that EAs shouldn't be allergic to that kind of thing all the time.

EAs using factual, descriptive, neutral titles on their own forums is an interesting quirk of the community.  But if EAs only ever use factual, descriptive, neutral language in all forums, that's a strategic mistake, and hinders their ability to effectively communicate with the public. This comment is a corollary to the 'Pick a fight' argument - I believe that sometimes EAs need to abandon internal norms in order to win attention for their ideas.

I think this comment misses the point. The crux is not whether clickbait does in fact draw attention – the fact that clickbait works is precisely why we don't want it on the forum. We have a limited amount of attention to spend, and encouraging clickbait means necessarily drawing away attention from less-clickbaity posts.

"But if EAs only ever use factual, descriptive, neutral language in all forums, that's a strategic mistake, and hinders their ability to effectively communicate with the public."

I don't think the purpose of the EA forum is to communicate with the public.

If EA starts to abandon internal norms about factual communication that's bad. It hinders what EA is about. When a GiveWell analyst gets told not to speak about the drawbacks of a certain cause because that might demotivate people to donate to that cause, that's a problem. That kind of behavior should happen less not more. 

Fighting for keeping the core norms intact is important. 

(I was about to send something like this as a private message, any opinions on that vs a comment?) (I did in fact send a similar message very recently, should it have been a comment?)

I think it's not obvious in this case what is better, though I mildly prefer doing it publicly. Sending it privately keeps the conversation less tense and has less risk of making people feel embarrassed, but sending it publicly is better for helping newcomers orient to the culture (99% of people never post or comment, so private norm enforcement is a losing battle, especially if you hope that EA Forum norms expand to the in-person realm).

Re more charity evaluators, there is SoGive, which aims to evaluate a wide variety of popular charities, with the idea of nudging people toward more effective choices (they are EAs). With the similar idea of nudging people toward donating more to more effective charities, there is Giving Multiplier ("Give to both your favorite charity and a super-effective charity recommended by experts. We'll add to your donations.").

I'm obviously biased, but wholeheartedly agree with you that EA should invest even more in effective giving (because of the impact the donations will have and also to get more people interested in EA). In the last couple of weeks I have started to become more optimistic that this will happen though (e.g. Open Phil seems to consider supporting this space directly).

With regards to the "pick a fight" strategy you might want to check out some of the very early GiveWell blog posts (2007 - 2009). They definitely didn't shy away from a fight (just ask Charity Navigator) and I actually think that this was a smart strategy at the time and might still be under some circumstances. 

I’m very interested in this. Some of what we’re doing at bit.ly/eamtt gets at this, as well as some of the ‘research and agenda’ I try to synthesize at “Increasing effective giving (& action): The puzzle, what we (need to) know”. (See also earlier/related work at innovationsinfundraising.org .)

... Whew, end of self-promo.

So in my case, you are preaching to the choir. However, there has been some pushback on this view, which I hope others present in the comments below (if not, I'll try to 'steel-man' it).

By the way, one additional point that I frequently make on the above is that, in addition to the importance of charitable giving itself:

Drivers and barriers to effective giving are also drivers/barriers to effective pro-social personal, professional and political choices.

So promoting effective charitable-giving-related ideas to 'the masses' (and learning how to do so) will also help us learn how to convince people to vote for effective government policies.

In addition to the work I link above, there are obviously other groups and initiatives 'in this space'. You are not alone, even though the thrust of most efforts within 'core EA' seems to have moved to 'making more highly-engaged EAs' and getting the smartest people to pursue longtermist-relevant career paths.

E.g., see the work of (some of which is discussed in the above links) ...

I submitted an entry to Open Phil (not sure if it came through) that was short and sweet. Reprinted here: 

This essay will be short and simple. 

The problem at hand is this: Far too much potential philanthropy isn’t happening at all. 

First, when you look at the list of Giving Pledge signatories (https://givingpledge.org/pledgerlist), all of whom are billionaires who have promised to give away half their wealth, most of them haven’t set up foundations or made any significant effort to actually follow through on their pledge. Indeed, even the Pledgers that have engaged in serious philanthropy have often been able to increase their net worth by more than they gave away (see https://www.marketwatch.com/story/giving-away-money-well-is-very-hard-the-giving-pledge-turns-10-and-its-signers-are-richer-than-ever-2020-08-08).

Second, the Initiative to Accelerate Charitable Giving estimates that over $1 trillion is locked up in private foundations and donor-advised funds, most of which don’t provide any transparent way to apply for funding. Indeed, the major firms that manage DAFs (e.g., Schwab) are highly incentivized by management fees not to facilitate transfers out to actual charities. 

Open Philanthropy and associated efforts in EA have focused on how to optimize the philanthropy that already occurs. But what if a focused public campaign and some policy work on DAFs etc. could vastly increase the amount of philanthropy available to optimize? Getting DAFs and Giving Pledge members to distribute 5% of their assets a year could be worth at least $50 billion a year. 

I believe there is a strong need for mainstream EA influencers.

"Giving What We Can's mission is to make giving effectively and significantly a cultural norm. We mean this quite literally: our goal isn't just to marginally increase the amount of money going to effective charities — we're aiming to make meaningful cultural change."

At Giving What We Can we are trying to bring effective giving and the ideas of effective altruism to a much broader audience through the lens of engaging those who are generally in the top 10% of earners worldwide. 

There are many other organisations and new projects that aim to share these ideas with a wider audience - one example I'm personally excited about is Asterisk, a new magazine shaped by the philosophy of EA. https://asteriskmag.com/

EAF’s ballot initiative doubled Zurich’s development aid
Would love to see more initiatives like this one.

(I haven't read the whole top level post, only skimmed it)
