I have started a Discord server for near-term effective altruists. (If you haven’t used Discord before, it’s a pretty standard chat server. Most of its functions are fairly self-explanatory.)
Most of my effective altruist friends focus on the far future. While far-future effective altruists are great, being around them all the time can get pretty alienating. I don’t often argue the merits of bednets versus cash transfers, which means I get intellectually sloppy knowing I won’t be challenged. I’m slow to learn about new developments relevant to near-term effective altruism, such as discoveries in development economics. Many of the conversations I participate in work from assumptions I don’t share, such as the assumption that we have a double-digit chance of going extinct within the next twenty years.
I suspect that many other near-term effective altruists may be in the same boat, and if so I encourage them to come participate. Even if not, I hope this server can be a fun and interesting place to learn more about effective altruism and connect to other effective altruists.
“Near-term” is hard to define. I intend it to be inclusive of all effective altruists whose work and priority cause areas do not focus on the far future, whether they work on global poverty, animal welfare, mental health, politics, meta-charity, or another cause area. I ask that far-future effective altruists and people whose priority cause area is AI risk or s-risks do not participate. This runs on the honor system; I’m not going to be the Near Term EA police. There are lots of people who are edge cases and I ask them to use their best judgment.
The server is intended to be welcoming to new effective altruists, people who aren’t certain whether they want to be effective altruists or not, and people who are not currently in a place where it makes sense for them to donate, volunteer, or change careers. If you’re wondering whether you’re “not EA enough” to participate, you probably are welcome!
All three of those are merely cases of you disagreeing with my claims or my confidence in them. I thought I was being tone-policed, but you are just saying that I am wrong.
The fact that some people are excluded from participating is exactly one of the problems with the server being promoted here. I'm not in favor of anything in EA that does this; if someone ever tries to exclude near-term EAs from their event, give me a ping and I will argue with them too!
Theoretical physicists are not upset by the presence of discussion on experimental physics, and the ones who disbelieve in dark matter are not upset by the presence of discussion from people who do. If lots of posts aren't relevant to you, the right answer is presumably to ignore those posts; I and so many other EAs do it all the time, it's easy.
If you want more content that is relevant to you... that's perfect! Make it! Request it! Ask questions about it! Be the change that you wish to see in the world.
The physics stack exchange doesn't try to exclude engineers, and they didn't make it because they thought that engineers were "alienating"; if they operated on that basis then it would create unnecessary annoyance for everyone. They separate because they are different topics, with different questions that need to be answered, and the skills and education which are relevant to one are very different from those that matter for another. But "near-term Effective Altruism", just like "long-term Effective Altruism", is a poorly specified bundle of positions with no common methodological thread. The common thread within each bundle is not any legitimate underlying presupposition about values or methodology that may form the foundation for further inquiry, it is an ex post facto conclusion that the right cause is something that happens to be short- or long-term. And while some cause conclusions could form a meaningful basis for significant further inquiry (e.g., you selected poverty as a cause, so now you just want to talk about poverty relief), the mere conclusion that the right cause is something that matters in the near or long term does not form any meaningful basis, because there is little in the way of general ideas, tools, resources, or methodologies which matter greatly for one bundle of causes but not the other.
But not only is the original analogy with physics and engineering relevantly incorrect, it's specifically pernicious, because many EAs already implicitly have the misconception that supporting near-term or long-term causes is a matter of philosophical presupposition or overarching methodology; in fact it is probably the greatest confusion that EAs have about EA and therefore it wouldn't be wise to reinforce it.
@kbog: Most of your responses to my reply do not make sense. For example: EA Chicago posts their events on the Facebook page, but I don't live in Chicago... (simple as that)
That completely missed the point. Additionally, the analogy is fine; there is seldom such a thing as a perfect analogy. And it doesn't follow that the analogy is wrong because of these supposedly implicit misconceptions that EAs hold about EA.
So to sum up, you're reading way too much into what I wrote originally. I was answering your question about why your first reply was "harsher than necessary".