
"Efficiency" Measures Miss the Point

by Dan Pallotta, originally on HBR Blogs

An e-mail I got from a former employee last week exemplified a dangerous public mythology: “You see, for every dollar a donor gives they have the expectation that it’s used efficiently. After all they have choices, they can give that same dollar to another charity. Donors want their donations to go as far as possible…”

There are two fatal errors here. The first is that high administrative efficiency equals high impact. It doesn't. The second is that the admin-to-program ratio is measuring efficiency. If it isn't measuring impact, it's axiomatic that it isn't measuring efficiency, because the only efficiency that matters is the efficiency associated with impact. Take the frugal breast cancer charity that consistently fails to find a cure for breast cancer. The last word a woman dying of breast cancer would use to describe it would be "efficient." Not if she factors in the value of her life.

As for making donations "go as far as possible," consider two soup kitchens. Soup Kitchen A reports that 90% of every donation goes to the cause. Soup Kitchen B reports 70%. You should donate to A, right? No-brainer. Unless you actually visited the two and found that the so-called more "efficient" Soup Kitchen A serves rancid soup in a dilapidated building with an unpleasant staff and is closed half the time, while Soup Kitchen B is open 24/7, and has a super-friendly staff that serves nutritious soup in a state-of-the-art facility. Now which looks better? The admin-to-program ratio would have failed you completely. It betrays your trust. It's utterly deficient in data about which soup kitchen is better at serving soup. It undermines your compassion and insults your contribution. And yet we praise it as a yardstick of morality and trustworthiness. It's the exact opposite.

We should stop saying charities with low ratios are efficient. Efficient at what? Fundraising? "Inefficient" (as in expedient) fundraising may accelerate problem-solving, making its "inefficiency" efficient in the big picture. Say Jonas Salk spent $50 million to raise $100 million to find a polio vaccine. The admin-to-program ratio would report he had a shameful 50% overhead. But the $100 million he raised wasn't his end result. His end result was a vaccine. Divide the $50 million fundraising expense into the God-only-knows-how-many billions of dollars a polio vaccine is worth, and his overhead ratio at eradicating polio is 0%. A hypothetical competing charity with a 10% fundraising cost that comes up empty on a vaccine has 100% overhead against the goal of a vaccine, because it never found one. But it's labeled the more "efficient" charity. As one of millions who dodged polio because of Salk, I'd have to disagree.
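The arithmetic behind this comparison can be sketched in a few lines. The fundraising figures are the article's hypotheticals, and the $10 billion vaccine value below is an arbitrary stand-in for the "God-only-knows-how-many billions" the text declines to specify:

```python
def overhead_ratio(expense, result_value):
    """Overhead measured against the value of the result actually produced.
    A charity that produces nothing of value has 100% overhead by this measure."""
    return expense / result_value if result_value else 1.0

# Conventional view: fundraising cost divided by funds raised.
salk_conventional = 50_000_000 / 100_000_000           # 0.5 -> "shameful" 50% overhead

# Impact view: the same $50M divided by an assumed $10B vaccine value.
salk_impact = overhead_ratio(50_000_000, 10_000_000_000)  # 0.005 -> effectively 0%

# The "efficient" rival: 10% fundraising cost, but no vaccine ever found.
rival_impact = overhead_ratio(10_000_000, 0)              # 1.0 -> 100% overhead
```

The point the numbers make is that the two measures rank the charities in opposite orders: the conventional ratio flatters the rival, while the impact ratio shows Salk's overhead rounding to zero.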

Let's get unhypothetical. In 1995, Physicians for Human Rights had revenues of approximately $1.3 million. They spent approximately $750,000, or 58 percent of revenues, on programs. Today that organization would fail all of the watchdog standards for "efficiency." It would be ineligible for a BBB Wise Giving Alliance seal of approval. The Nobel Peace Prize committee felt differently. Physicians for Human Rights shared in the 1997 Nobel Peace Prize for its work as a founding member of the International Campaign to Ban Landmines.

Imagine coming out of a shoe store with a brand-new pair of shoes full of holes, and whispering to your friends, "You wouldn't believe how low the overhead was on these shoes." That's exactly what Americans are doing with hundreds of billions of annual charitable donations. We take huge pride in giving to charities with low overhead without knowing a damned thing about whether they're any good at what they do.

The e-mail from my former employee was right in one respect. Donors do have a choice. And they should stop using this hallucinogenic "efficiency" ratio to determine how they make it.

Part of Introduction to Effective Altruism


Comments (1)

Comment author: carneades 24 December 2016 09:38:31AM 1 point

I would argue that this assessment does not go far enough. When charities are analyzed by EA, many factors are overlooked that would disqualify them, such as long-term economic impact, the right of communities to choose their own interventions, and the dependence these programs create. I live in West Africa, and I have seen that the "top-ranked" charity, AMF, does more harm than good: it is the nasty soup kitchen with nice numbers but horrible facilities, and yet EA has failed to recognize that fact. Here's a video which goes into greater depth.