
Andaro comments on Worldview uncertainty - Effective Altruism Forum




Comment author: Andaro 30 May 2018 02:13:39PM -1 points [-]

I used to think of this in terms of sanity heuristics. For example, if you consider an altruistic act A relative to another possible altruistic act B, check whether A would cause more victimization of innocent sentient entities (e.g. rape, torture, or similar nonconsensual violations), measured by number and/or severity. If so, choose B instead and sanity-check it against C; wash, rinse, repeat until you're at least indifferent between the choices on this dimension. A kind of "do no harm," or rather "don't increase evil," principle. The problem is that this is incompatible with those forms of x-risk reduction that focus on making the world bigger rather than better. And that in turn got me a lot of flak from the EA community.

I wrote "used to think" because I have since given up on altruism altogether. From a purely self-interested POV, there's no reason to apply any such sanity heuristic to my choices. The hard truth is, the disutility of others is not my disutility. That's a straightforward fact of life. So why should I treat it as such? It's really hard to justify on logical grounds.

My axiomatic disagreements with high-status "altruists" who all but trademarked altruism itself, e.g. Elon Musk and the EA community, actually helped me in this reflection process. To see how badly those altruists went wrong (from my perspective) and how much they compromised their own resource base and wellbeing while causing more innocent victims than they prevent opened my eyes and led me back to the most fundamental question, "Why care in the first place?"

I still do instrumental altruism, but it is strictly limited to facilitating the interests of those who facilitate my personal interests. Similarly, I undermine the interests of those who undermine my personal interests.

My only disagreement with EA now is when you harm the overall interests of people in the countries where you apply your lobbying and propaganda efforts. For example, I don't want more costly immigration to my country, or unilateral GHG reductions, or higher consumer prices to save the rainforests or for animal welfare, or tax funds going to scientific projects that don't facilitate my wellbeing or the competitiveness of my allies. I'm willing to penalize all such efforts that make me personally worse off severely.

This is not limited to EA, of course, and you're not the worst offenders. Organized religion is actually at the top of my enemy list for now. They've been working ridiculously hard to harm my personal interests to an almost comical degree, for example through lobbying and propaganda efforts to reduce my personal end-of-life choice set without my consent. As a consequence, I now routinely shift all my indifference curves to harm all their interests and values as much as possible, by all means available, whenever applicable. I don't think they had any idea what they were doing, either. When I discussed this with them, I always had the impression that they thought undermining the personal interests of other people was somehow a free action. Like, if they use moralistic language or refer to the will of their God, this somehow works like a magic spell that will make the retaliation damage go away. Of course, none of that is true.

The only remotely good thing about them from my perspective is their pedophilia. I'd like to see consensual pedophilia normalized, since I would personally welcome this increase in my sexual choice set. I would love to live in a world where it is legal and socially accepted to buy consensually made child porn or pay willing 12-year-olds for blowjobs during the holidays. They could make an honest buck, and everybody would gain more sexual options. So I'm actually not upset about all those pedophilic priests. It's good to see them come out of their closet. Unfortunately, there's never any real distinction made between consensual conduct and child sexual abuse. I can't even tell from the media reports if the kids molested by organized religion had an honest exit option explained to them or if they were threatened, lied to, or otherwise coerced. Overall the pedo bashing has gotten worse rather than better over the last few years, and evil opportunists like Ross Douthat are trying to leverage the "me too" phenomenon to call for bans on consensually made adult (!) porn. Ridiculously stupid enemy action.

Anyway, sorry for the long post; I just wanted to let you know that I apologize for my earlier posts that suggested increasing x-risk as an altruistic strategy. They were motivated by the benevolent concern that lower x-risk will increase the total amount of evil in the universe. Which of course it actually will. Since then, I gave up on benevolence altogether, with the exception of rewarding those who facilitate my personal interests anyway. I don't really care what happens after my death, because there's no good reason for me to care. As long as you don't undermine the rational self-interest of people in the West or my person specifically, I have no more problem with EA. Just don't parasitize us without giving us higher value back, economically or in terms of our rights and liberties; others are already paying a huge price for trying that.