Ben_Todd comments on The marketing gap and a plea for moral inclusivity - Effective Altruism Forum

Comment author: MichaelPlant 10 July 2017 09:23:18AM *  1 point [-]

Hey.

So, I don't mean to be attacking you on these things. I'm responding to what you said in the comments above, and maybe to more of a general impression, perhaps without keeping in mind how 80k do things on their website. You write a bunch of (cool) stuff; I've probably forgotten the details and I don't think it would be useful to go back and engage in a 'you wrote this here' to check.

A few quick things as this has already been a long exchange.

Given I accept I'm basically a moral hipster, I'd understand if you put my views in the 3 rather than the 4 category.

If it's of any interest, I'm happy to suggest how you might update your problem quiz to capture my views and other views in the area.

I wouldn't think the same way about Spanish flu vs mental health. I'm assuming happiness is duration x intensity (#Bentham). What I think you're discounting is the duration of mental illnesses - they are 'full-time' in that they take up your conscious space for much of the day, and they often last a long time. I don't know what the distribution of duration is, but if you have chronic depression (anhedonia), that will make you less happy constantly. In contrast, the experience of having flu might be bad (although it's not clear it's worse, moment per moment, than, say, depression), but it doesn't last very long. A couple of weeks? So we need to account for the fact that a case of Spanish flu has 1/26th of the duration of anhedonia, before we even factor in intensity. More generally, I think we suffer from something like scope insensitivity when we do affective forecasting: we tend to consider the intensity of events rather than their duration. Studies into the 'peak-end' effect show this is exactly how we remember things: our brains only really remember the intensity of events.

One conclusion I reach (on my axiology) is that the things which cause daily misery/happiness are the biggest in terms of scale. This is why I don't think x-risks are the most important thing. I think a totalist should accept this sort of reasoning and bump up the scale of things like mental health, pain and ordinary human unhappiness, even though x-risk will be much bigger in scale on totalism. I accept I haven't offered anything to do with solvability or neglectedness yet.

Comment author: Ben_Todd 10 July 2017 10:28:18PM 1 point [-]

Thanks. Would you consider adding a note to the original post pointing out that 80k already does what you suggest re moral inclusivity? I find that people often don't read the comment threads.

Comment author: MichaelPlant 10 July 2017 11:40:51PM *  1 point [-]

I'll add a note saying you provide a decision tool, but I don't think you do what I suggest (obviously, you don't have to do what I suggest and can think I'm wrong!).

I don't think it's correct to call 80k morally inclusive, because you substantially pick a preferred outcome/theory and then provide the decision tool as a sort of afterthought. By my lights, being morally inclusive is incompatible with picking a preferred theory. You might think moral exclusivity is, all things considered, the right move, but we should at least be clear that's the choice you've made. In the OP I suggest there are advantages to inclusivity over exclusivity, and I'd be interested to hear if/why you disagree.

I'm also not sure if you disagree with me that the scale of suffering for the living from an X-risk disaster is probably quite small, and that the happiness lost to long-term conditions (mental health, chronic pains, ordinary human unhappiness) is of much larger scale than you've allowed. I'm very happy to discuss this with you in person to hear what, if anything, would cause you to change your views on this. It would be a bit of a surprise if every moral view agreed X-risks were the most important thing, and it's also a bit odd if you've left some of the biggest problems (by scale) off the list. I accept I haven't made substantial arguments for all of these in writing, but I'm not sure what evidence you'd consider relevant.

I've also offered to help rejig the decision tool (perhaps after discussing it with you), and that offer still stands. On a personal level, I'd like the decision tool to tell me what I think the most important problems are and to better reflect the philosophical decision process! You may decide this isn't worth your time.

Finally, I think my point about moral uncertainty still stands. If you think it is really important, it should probably feature somewhere. I can't see a mention of it here: https://80000hours.org/career-guide/world-problems/