Comment author: Tom_Davidson 07 September 2017 07:38:46PM 4 points [-]

Great podcasts!

Comment author: Tom_Davidson 11 February 2016 12:10:27PM *  0 points [-]

I found Nakul's article very interesting too, but am surprised at what it led you to conclude.

I didn't think the article was challenging the claim that doing paradigmatic EA activities was moral. I thought Nakul was suggesting that doing them wasn't obligatory, and that the consequentialist reasons for doing them could be overridden by an individual's projects, duties and passions. He was pushing against the idea that EA can demand that everyone support them.

It seems like your personal projects would lead you to do EA activities. So I'm surprised you judge EA activities to be less moral than alternatives. Which activities and why?

I would have expected you to conclude something like "Doing EA activities isn't morally required of everyone; for some people it isn't the right thing to do; but for me it absolutely is the right thing to do".

Comment author: Owen_Cotton-Barratt 23 January 2016 12:21:46PM 1 point [-]

But this will tend to neglect the fact that people can make choices which make them richer, possibly at personal cost. If we systematically ignore this, we will probably encourage people too much into careers which they enjoy with low consumption levels. I think it's important to take both degree of sacrifice (because the amount we can do isn't entirely endogenous) and absolute amount achieved (because nor is it entirely exogenous) into account.

Comment author: Tom_Davidson 23 January 2016 08:55:06PM 0 points [-]

Yeah good point.

If people choose a job which they enjoy less, then that's a huge sacrifice and should be applauded.

Comment author: Ben_Kuhn 21 January 2016 07:07:37PM *  3 points [-]

I'm interested to see people phrasing their arguments in terms of distinguishing how much sacrifice people make.

Personally, I'm sympathetic to distinguishing between how much impact people have, but thinking too hard about who sacrifices the most (except inasmuch as it's correlated with the former) seems like it's against the spirit of EA. It's about how much good you do, not how much you give up to do it!

If you're living on $10k and donating $90k, then donating your marginal $10k is WAY more of a sacrifice than if you're living on $90k and donating $10k. But it doesn't do any more good! I have a lot of respect for people who donate/sacrifice up to that margin, but it's the same kind of respect I have for, like, Wim Hof.*

(Of course, a lot of those people are also doing really important/awesome things, and I have EA-respect for them because of that. But the EA-respect isn't because they live on small amounts of money or spend every waking hour thinking about EA. It's what they actually get done!)

*the man who holds the world record for longest time spent immersed in an ice bath.

Comment author: Tom_Davidson 21 January 2016 08:53:10PM 6 points [-]

But EA is about doing the most good that you can.

So anyone who is doing the most good that they could possibly do is being an amazing EA. Someone on £1 million who donates £50K is not doing anywhere near as much good as they could do.

The rich especially should be encouraged to make big sacrifices, as they do have the power to do the most good.

Comment author: Tom_Davidson 27 December 2015 02:12:37AM *  3 points [-]

I agree completely that talking with people about values is the right way to go. Also, I don't think we need to try to convince them to be utilitarians or nearly-utilitarian. Stressing that all people are equal and pointing to the terrible injustice of the current situation is already powerful, and those ideas aren't distinctively utilitarian.

Comment author: Geuss 26 December 2015 02:41:21AM 0 points [-]

I realise the difference between average and total utilitarianism, but in the context of the whole history of moral and political thought the gap between the two is infinitesimal compared to the gap between the utilitarian framework in which the debate operates and alternative systems of thought. There is no a priori reason to think that the efficacy of charitable giving should have any relation whatsoever to utilitarianism. Yet it occupies a huge part of the movement. I think that is regrettable not only because I think utilitarianism hopelessly misguided, but because it stifles the kind of diversity which is necessary to create a genuinely ecumenical movement.

I am still struggling to follow any line of reasoning in the second half of what you have written. Why is that quote the part I want? What is it supposed to be doing? Can you summarise what you are doing in one paragraph of clear language?

Comment author: Tom_Davidson 26 December 2015 04:41:42AM 0 points [-]

There is no a priori reason to think that the efficacy of charitable giving should have any relation whatsoever to utilitarianism. Yet it occupies a huge part of the movement.

I think the argument is that, a priori, utilitarians think we should give effectively. Further, given the facts as they stand (namely that effective donations can do an astronomical amount of good), there are incredibly strong moral reasons for utilitarians to promote effective giving and thus to participate in the EA movement.

I think that [the obsession with utilitarianism] is regretful... because it stifles the kind of diversity which is necessary to create a genuinely ecumenical movement.

I do find discussions like this a little embarrassing, but then again they are interesting to members of the EA community, and this is an inward-facing page. Nonetheless, I do share your fears about it putting outsiders off.

Comment author: MichaelDickens  (EA Profile) 21 December 2015 07:20:24PM 0 points [-]

I would also describe that scenario as "high confidence." My best guess on the actual numbers is more like, the direct effect of a donation to AMF is +1 utilon, and flow-through effects are normally distributed around 100 with standard deviation 500. So it's net positive in expectation but still has a high probability (~42% for the numbers given) of being net negative.
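The ~42% figure above follows directly from the stated assumptions. A quick sketch, using only the numbers given in the comment (direct effect +1 utilon, flow-through effects normally distributed with mean 100 and standard deviation 500 — illustrative guesses, not canonical estimates):

```python
from statistics import NormalDist

# Assumed figures from the comment above:
# direct effect of an AMF donation = +1 utilon,
# flow-through effects ~ Normal(mean=100, sd=500).
direct = 1
flow_mean, flow_sd = 100, 500

# The total effect is then Normal(mean=101, sd=500).
total = NormalDist(mu=direct + flow_mean, sigma=flow_sd)

# Probability the net effect is negative:
p_net_negative = total.cdf(0)
print(f"P(net negative) ≈ {p_net_negative:.2f}")  # ≈ 0.42
```

The expectation is positive (+101), but because the standard deviation dwarfs the mean, the distribution straddles zero, giving the ~42% chance of net harm mentioned in the comment.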

Comment author: Tom_Davidson 26 December 2015 04:15:26AM 0 points [-]

Those seem like really high flow-through effects to me! £2000 saves one life, but you could easily see it doing as much good as saving 600!

How are you arriving at that figure? The argument that "if you value all times equally, the flow-through effects are 99.99...% of the impact" would actually seem to show that they dominate the immediate effects much more than this. (I'm hoping there's a reason why this observation is very misleading.) So what informal argument are you using?

Comment author: Carl_Shulman 21 December 2015 08:48:44PM *  4 points [-]

As I said on facebook, I think this mostly goes away (leaving a rather non-speculative case) if one puts even a little weight on special obligations to people in our generation:

AMF clearly saves lives in the short run. If you give that substantial weight rather than evaluating everything solely from a "view from nowhere" long run perspective where future populations are overwhelmingly important, then it is clear AMF is good. It is an effective way to help poor people today and unlikely to be a comparably exceptional way to make the long run worse. If you were building a portfolio to do well on many worldviews or for moral trade it would be a strong addition. You can avoid worry about the sign of its long run effects by remembering relative magnitude.

Comment author: Tom_Davidson 22 December 2015 07:43:55PM 1 point [-]

This is a nice idea but I worry it won't work.

Even with healthy moral uncertainty, I think we should attach very little weight to moral theories that give future people's utility negligible moral weight. The kinds of reasons that suggest we can attach less weight to future people don't go any way towards suggesting that we can ignore them. To do that, they'd have to show that future people's moral weight was (more than!) inversely proportional to their temporal distance from us. But the reasons given tend to show that we have special obligations to people in our generation, and say nothing about our obligations to people living in the year 3000 AD vs people living in the year 30,000 AD. [Maybe I'm missing an argument here?!] Thus any plausible moral theory will be such that the calculation is dominated by very long-term effects, and long-term effects will dominate our decision-making process.

Comment author: Tom_Davidson 22 December 2015 06:24:46PM 2 points [-]

Great post!

Out of interest, can you give an example of an "instrumentally rational technique that requires irrationality"?

Comment author: Robert_Wiblin 21 December 2015 05:18:25PM 1 point [-]

I think the effects of murdering someone are more robustly bad than the effects of reducing poverty are robustly good (the latter are also probably positive, but less obviously so).

Comment author: Tom_Davidson 21 December 2015 05:21:35PM 2 points [-]

Why? What are the very long term effects of a murder?
