In many ways, most EAs are extraordinarily smart, but in one way they are naive. The most well-known EAs have stated that the goal of EA is to minimize suffering. I can't articulate this well, but I'm certain that is neither the cause nor the effect of altruism as I understand it.
Consider The Giver. Consider a world where everyone was high on opiates all the time. There is no suffering or beauty. Would you disturb it?
Considering this, my immediate reaction is to restate the goal of EA as maximizing the difference between happiness and suffering. This still seems naive. Happiness and suffering are so interwoven that I'm not sure this can be done. The disappointment of being rejected by a girl may help you come to terms with reality. The empty feeling in the pit of your stomach when your fantasy world crumbles motivates you to find something more fulfilling.
It's difficult to say. Maybe one of you can restate it more plainly. This isn't an argument against EA. It's an argument that while we probably do agree on which actions are altruistic, the criteria used to explain them are overly simplified.
I don't know if there is much to be gained by having criteria to explain altruism, but I am tired of "reducing suffering." I like to think of it more as doing what I can to positively impact the world, and using EA to maximize that positivity where possible. Because altruism isn't always as simple as where to send your money.
I agree the EA movement is not about minimizing suffering, and its existence does not minimize suffering.
I don't even agree it's about maximizing happiness or well-being, as some pretend, and its existence does not maximize those things.
In fact, it has no coherent goals at all, because people can't agree on any, let alone formalize them. That's why you see all this fluff, like "flourishing" or the "future of humanity". Or, as OP babbles, "positively impact the world". Completely meaningless phrases to avoid the elephant in the room.
The EA movement, like humanity itself, does not have coherent goals, and I'm pretty convinced it's going to cause much more suffering than it prevents. It may cause some additional well-being or pleasure, almost by accident, but not in any efficient or optimized way - that's just not what it does sociologically and psychologically. But hey, movement growth! It's the equivalent of humanity's GDP growth or "progress": meaningless metrics that take on their own status as quasi-religious goals to pledge allegiance to.
It's all very toxic and pretty useless, which is why I support neither the EA movement nor "humanity" itself and would never consider it actual altruism.
Yeah, I bet you have a long marathon of pain prevention behind you, hahaha.
I agree that EA as a whole doesn't have coherent goals (I think many EAs already acknowledge that it's a shared set of tools rather than a shared set of values). But why are you so sure that "it's going to cause much more suffering than it prevents"?