In many ways most EAs are extraordinarily smart, but in one way they are naive. The most well-known EAs have stated that the goal of EA is to minimize suffering. I can't explain this well at all, but I'm certain that is neither the cause nor the effect of altruism as I understand it.
Consider The Giver. Consider a world where everyone is high on opiates all the time. There is no suffering, but no beauty either. Would you disturb it?
Considering this, my immediate reaction is to restate the goal of EA as maximizing the difference between happiness and suffering. This still seems naive. Happiness and suffering are so interwoven that I'm not sure it can be done. The disappointment of being rejected by a girl may help you come to terms with reality. The empty feeling in the pit of your stomach when your fantasy world crumbles motivates you to find something more fulfilling.
It's difficult to say. Maybe one of you can restate it more plainly. This isn't an argument against EA. It's an argument that, while we probably do agree on which actions are altruistic, the criteria used to explain them are overly simplified.
I don't know if there is much to be gained by having criteria to explain altruism, but I am tired of "reducing suffering." I like to think of it more as doing what I can to positively impact the world, and using EA to maximize that positivity where possible. Because altruism isn't always as simple as where to send your money.
You (and the 5 people who agreed) are blowing my mind right now.
Based on the last paragraph, it sounds like you would support a world full of opiate users, provided there was a sustainable supply of opiates.
The first paragraph is what's blowing my mind, though. When I was a baby, I'm pretty sure I would have told you that a room full of toys and sweets would maximize my happiness. I guess you could argue that I'd eventually find out it would not sustain my long-term happiness, but I really do think some amount of suffering ensures happiness in the future. Perhaps this is overly simple, but I'm sure you have fasted at some point (intentionally or not) and greatly appreciated your next meal as a result.
Lastly, you separate knowledge and feelings from suffering, but I'm not sure this can be done. My parents told me not to do X because it would hurt, but I did not learn until I experienced X firsthand.
I'm amazed that so many EAs apparently think this way. I don't want to be mean, but I'm curious: what altruistic actions have you taken in your life? Really looking forward to your reply.
I think your argument is actually two: (1) it is not obvious how to maximize happiness, and some obvious-seeming strategies to maximize happiness will not in fact do so; (2) you shouldn't maximize happiness.
(1) is true; I think most EAs agree with it, most people in general agree with it, and I agree with it, and it's pretty unrelated to (2). It means maximizing happiness might be difficult, but it says nothing about whether it's theoretically the best thing to do.
Relatedly, I think a lot of EAs agree that it is sometimes indeed the case that to maximize happiness, some suffering is necessary along the way.