In many ways, most EAs are extraordinarily smart, but in one respect they are naive. The most well-known EAs have stated that the goal of EA is to minimize suffering. I can't explain this well, but I'm certain that is not the cause or the effect of altruism as I understand it.
Consider The Giver. Consider a world where everyone were high on opiates all the time. There is no suffering, but no beauty either. Would you disturb it?
Considering this, my immediate reaction is to restate the goal of EA as maximizing the difference between happiness and suffering. This still seems naive. Happiness and suffering are so interwoven that I'm not sure the two can be separated. The disappointment of being rejected by a girl may help you come to terms with reality. The empty feeling in the pit of your stomach when your fantasy world crumbles motivates you to find something more fulfilling.
It's difficult to say. Maybe one of you can restate it more plainly. This isn't an argument against EA. It's an argument that, while we probably do agree on which actions are altruistic, the criteria used to explain them are overly simplified.
I don't know if there is much to be gained by having criteria to explain altruism, but I am tired of "reducing suffering." I prefer to think of it as doing what I can to positively impact the world, and using EA to maximize that positive impact where possible. Because altruism isn't always as simple as deciding where to send your money.
I think generalizing from these examples (and from fictional examples in general) is dangerous for a few reasons.
Fiction is not designed to be maximally truth-revealing. Its function is as art and entertainment, to move the audience, persuade them, woo them, etc. Doing this can and often does involve revealing important truths, but doesn't necessarily. Sometimes, fiction is effective because it affirms cultural beliefs/mores especially well (which makes it seem very true and noble). But that means it's often (though certainly not always) a reflection of its time (it's often easy, for example, to see how fiction from the past affirmed now-outdated beliefs about gender and race). So messages in fiction are not always true.
Fiction has a lot of qualities that bias the audience in specific useful ways that don't relate to truth. For example, it's often beautiful, high-status, and designed to play on emotions. That means that relative to a similar non-fictional but true thing, it may seem more convincing, even when the reasoning is equally or less sound. So messages in fiction are especially powerful.
For example, I think The Giver reflects the predominant (but implicit) belief of our time and culture: that intense happiness is necessarily linked to suffering, and that attempts to build utopias generally fail in obvious ways by arbitrarily excluding our most important values. IIRC, the people in The Giver can't love. Love is one of our society's highest values; not loving is a clear sign their world has gone wrong. But the story doesn't explain why love had to be eliminated to create peace; it just establishes a connection in the readers' minds without providing any real evidence.
Consider further that even if it were true that extreme bad is not a necessary cost of extreme good, we would probably still not have much fiction reflecting that truth. This is simply because fiction about everything going exceedingly well for extended periods of time would likely be very boring for the reader (however wonderful for the characters, if they experienced it). People would not read that fiction. Perhaps if you made them do so, they would project their own boredom onto the story and say the story is bad because it bored them. This is a fine policy for picking your entertainment, but a dangerous habit to establish if you're going to be deciding real-world policy on others' behalf.
I agree that it's dangerous to generalize from fictional evidence, BUT I think it's important not to fall into the opposite extreme, which I will now explain.
Some people, usually philosophers or scientists, invent or find a simple, neat collection of principles that seems to more or less capture/explain all of our intuitive judgments about morality. They triumphantly declare "This is what morality is!" and go on to promote it. Then, they realize that there are some edge cases where their principles endorse something intuitively abhorrent, or pr...