
Near-Term Effective Altruism Discord

I have started a Discord server for near-term effective altruists. (If you haven’t used Discord before, it’s a pretty standard chat server. Most of its functions are fairly self-explanatory.)

Most of my effective altruist friends focus on the far future. While far-future effective altruists are great, being around them all the time can get pretty alienating. I don’t often argue the merits of bednets versus cash transfers, which means I get intellectually sloppy knowing I won’t be challenged. I’m slow to learn about new developments relevant to near-term effective altruism, such as discoveries in development economics. Many of the conversations I participate in work from assumptions I don’t share, such as the assumption that we have a double-digit chance of going extinct within the next twenty years.

I suspect that many other near-term effective altruists may be in the same boat, and if so I encourage them to come participate. Even if not, I hope this server can be a fun and interesting place to learn more about effective altruism and connect to other effective altruists.

“Near-term” is hard to define. I intend it to be inclusive of all effective altruists whose work and priority cause areas do not focus on the far future, whether they work on global poverty, animal welfare, mental health, politics, meta-charity, or another cause area. I ask that far-future effective altruists and people whose priority cause area is AI risk or s-risks do not participate. This runs on the honor system; I’m not going to be the Near Term EA police. There are lots of people who are edge cases and I ask them to use their best judgment.

The server is intended to be welcoming to new effective altruists, people who aren’t certain whether they want to be effective altruists or not, and people who are not currently in a place where it makes sense for them to donate, volunteer, or change careers. If you’re wondering whether you’re “not EA enough” to participate, you probably are welcome!

Comments (53)

Comment author: Dunja 10 September 2018 10:02:04AM 15 points

This is a nice idea though I'd like to suggest some adjustments to the welcome message (also in view of kbog's worries discussed above). Currently the message begins with:

"(...) we ask that EAs who currently focus on improving the far future not participate. In particular, if you currently prioritize AI risks or s-risks, we ask you not participate."

I don't think it's a good idea to select participants in a discussion according to what they think or do (it pretty much comes down to an Argumentum ad Hominem fallacy). It would be better to specify what the focus of the discussion is, and to welcome those interested in that topic. So I suggest replacing the above with:

"we ask that the discussion be focused on improving the near future, and that the far-future topics (such as AI risks or s-risks) be left for other venues, unless they are of direct relevance for an ongoing discussion on the topic of near future improvements." (or something along those lines).

Comment author: Julia_Wise (EA Profile) 10 September 2018 02:17:35PM 16 points

I like this suggestion - personally I feel a lot of uncertainty about what to prioritize, and given that a portion of my donations go to near-term work I'd enjoy taking part in discussion about how to best do that, even if I'm also seriously considering whether to prioritize long-term work. But I'd be totally happy to have the topic of that space limited to near-term work.

Comment author: Justis 10 September 2018 03:15:28PM 5 points

+1. I'm in a very similar position - I make donations to near-term orgs, and am hungry for discussion of that kind. But because I sometimes do work for explicitly long-term and x-risk orgs, it's hard for me to be certain if I qualify under current wording.

Comment author: pmelchor (EA Profile) 11 September 2018 07:32:00AM 3 points

I am personally very interested in cause areas like global poverty, so it is great to see more people wanting to discuss the related issues in depth.

Nevertheless, I strongly support the definition of EA as a question (how can we use our resources to help others the most?) and that makes me not want to tag myself as a "[enter category here] EA" (e.g. "near-term EA", "far-future EA"...).

In practical terms, the above leads me to enjoy my views being challenged by people who have come to different conclusions and I tend to favour a "portfolio approach" to doing good, somewhat along the lines of Open Phil's "worldview diversification".

Regarding discussion, there should be great spaces for both the meta topics and the cause-specific ones. Wouldn't it be ideal if we could host all those discussions under the same roof? Maybe this thread can be used as an input for the upcoming EA Forum 2.0. The feature request would be something like "make it easy to host and find worldview-specific discussions".

Comment author: kbog (EA Profile) 10 September 2018 12:08:35AM 7 points

Discord lets you separate servers into different channels for people to talk about different things. There is already an EA Discord, and of course new and near-term EAs are welcome there. I think it would be bad if we split things like this: the more the near-term EAs isolate themselves, the more "alienated" people will feel elsewhere, so it becomes a destructive feedback loop. You're creating the problem that you are trying to solve.

Also, it would reinforce the neglect of mid-term causes, which have always gotten too little attention in EA.

I ask that far-future effective altruists and people whose priority cause area is AI risk or s-risks do not participate.

Yeah, this isn't good policy. It should be pretty clear that this is how groupthink happens, and you're establishing it as a principle. I get that you feel alienated because, what, 60% of people have a different point of view? (perish the thought!) And you want to help with the growth of the movement. But hopefully you can find a better way to do this than creating an actual echo chamber. It's clearly a poor choice as far as epistemology is concerned.

You're also creating the problem you're trying to solve in a different way. Whereas most "near-term EAs" enjoy the broad EA community perfectly well, you're reinforcing an assumption that they can't get along, that they should expect EA to "alienate" them, as they hear about your server. As soon as people are pointed towards a designated safe space, they're going to assume that everything on the outside is unfriendly to them, and that will bias their perceptions going forward.

You are likely to have a lighter version of the problem that Hatreon had with Patreon, Voat with Reddit, etc.: whenever a group of people has a problem with the "mainstream" option and someone tries to create an alternative space, the first people who jump ship to the alternative will be the highly motivated people on the extreme end of the spectrum, who are the most closed-minded and intolerant of the mainstream, and they will set the norms for the community henceforth. Don't get me wrong: it's good to expand EA with new community spaces and be more appealing to new people, and it is always nice to see people put effort into new ideas for EA. But this plan is very flawed, and I strongly recommend that you revise it.

Comment author: Julia_Wise (EA Profile) 10 September 2018 02:18:42AM 16 points

Moderator note: I found this harsher than necessary. I think a few tone changes would have made the whole message feel more constructive.

Comment author: kbog (EA Profile) 10 September 2018 02:25:20AM 2 points

What statements were "harsher than necessary"?

Comment author: Julia_Wise (EA Profile) 10 September 2018 02:18:12PM 6 points

I'll PM you.

Comment author: MichaelPlant 12 September 2018 08:33:47AM 5 points

I don't find your objections here persuasive.

Yeah, this isn't good policy. It should be pretty clear that this is how groupthink happens, and you're establishing it as a principle. I get that you feel alienated because, what, 60% of people have a different point of view?

If you want to talk about how best to X, but you run into people who aren't interested in X, it seems fine to talk to other pro-Xers. It seems fine that FHI gathers people who are sincerely interested in the future of humanity. Is that a filter bubble that ought to be broken up? Do you see them hiring people who strongly disagree with the premise of their institution? Should CEA hire people who think effective altruism, broadly construed, is just a terrible idea?

You're also creating the problem you're trying to solve in a different way. Whereas most "near-term EAs" enjoy the broad EA community perfectly well, you're reinforcing an assumption that they can't get along, that they should expect EA to "alienate" them, as they hear about your server

To be frank, I think this problem already exists. I've literally had someone laugh in my face because they thought my person-affecting sympathies were just idiotic, and someone else say "oh, you're the Michael Plant with the weird views" which I thought was, well, myopic coming from an EA. Civil discourse, take a bow.

Comment author: kbog (EA Profile) 12 September 2018 10:38:59AM 3 points

It seems fine that FHI gathers people who are sincerely interested in the future of humanity. Is that a filter bubble that ought to be broken up?

If so, then every academic center would be a filter bubble. But filter bubbles are about communities, not work departments. There are relevant differences between these two concepts that affect how they should work. Researchers have to have their own work departments to be productive. It's more like having different channels within an EA server. Just making enough space for people to do their thing together.

Do you see them hiring people who strongly disagree with the premise of their institution? Should CEA hire people who think effective altruism, broadly construed, is just a terrible idea?

These institutions don't have premises; they have teloses, and if someone will be the best contributor to the telos then sure, they should be hired, even though it's very unlikely that you will find a critic who will be willing and able to do that. But Near Term EA has a premise: that the best cause is something that helps in the near term.

To be frank, I think this problem already exists. I've literally had someone laugh in my face because they thought my person-affecting sympathies were just idiotic, and someone else say "oh, you're the Michael Plant with the weird views" which I thought was, well, myopic coming from an EA. Civil discourse, take a bow.

That sounds like stuff that wouldn't fly under the moderation here or in the Facebook group. The first comment, at least; the second one maybe gets a warning and downvotes.

Comment author: Elizabeth 10 September 2018 08:45:40PM 2 points

I don’t often argue the merits of bednets versus cash transfers, which means I get intellectually sloppy knowing I won’t be challenged.

OK, but in that case wouldn't it be better to stick around people with opposing points of view?

This seems like a pretty severe misreading to me. Ozy is saying that they want to hone their arguments against people with expertise in a particular field rather than a different field, which is perfectly reasonable.

Comment author: kbog (EA Profile) 10 September 2018 08:51:27PM 2 points

You're right, I did misread it; I thought the comparison was with long-term causes.

In any case you can always start a debate over how to reduce poverty on forums like this. Arguments like this have caught a lot of interest around here. And just because you put all the "near-term EAs" in the same place doesn't mean they'll argue with each other.

Comment author: michaelchen 11 September 2018 02:47:02AM 1 point

For what it's worth, I felt a bit alienated by the other Discord, not because I don't support far-future causes or that it was even discussing the far future, but because I didn't find the conversation interesting. I think this Discord might help me engage more with EAs, because I find the discourse more interesting, and I happen to like the way Thing of Things discusses things. I think it's good to have a variety of groups with different cultures and conversation styles, to appeal to a broader base of people. That said, I do have some reservations about fragmenting EA along ideological lines.

Comment author: adamaero (EA Profile) 10 September 2018 02:57:35AM 1 point

Is the other Discord not publicly viewable? I've never heard of it.

Comment author: kbog (EA Profile) 10 September 2018 03:02:36AM 0 points

It's public. I would share a link, but that would give away my Discord identity; hopefully someone else has it.

Comment author: ozymandias 10 September 2018 02:59:57AM 0 points

I do not intend Near-Term EAs to be participants' only space to talk about effective altruism. People can still participate on the EA forum, the EA Facebook group, local EA groups, Less Wrong, etc. There is not actually any shortage of places where near-term EAs can talk with far-future EAs.

Near-Term EAs has been in open beta for a week or two while I ironed out the kinks. So far, I have not found any issues with people being unusually closed-minded or intolerant of far-future EAs. In fact, we have several participants who identify as cause-agnostic and at least one who works for a far-future organization.

Comment author: kbog (EA Profile) 10 September 2018 04:07:22AM 0 points

I do not intend Near-Term EAs to be participants' only space to talk about effective altruism. People can still participate on the EA forum, the EA Facebook group, local EA groups, Less Wrong, etc. There is not actually any shortage of places where near-term EAs can talk with far-future EAs.

There is not any shortage of places where near-term EAs can talk with near-term EAs; it is the same list (except for maybe LessWrong, which may be bad for the same reasons as this Discord server, but at least they are open to everyone's participation and don't make a brand out of their POV). But if the mere availability of alternative avenues for dissenting opinions were sufficient for avoiding groupthink, then groupthink would not exist. Every messageboard is just a click away from many others, and yet we see people operating in filter bubbles all the same.

Please see my comment reply to adamaero, "near-term EA" is a thesis, not a legitimate way to carve up the movement (the same goes for long-term EA), and it shouldn't be entrenched as a kind of ideology - certainly not as a kind of identity, which is even worse. You are reinforcing a framing that will continue to cause deep problems that will be extremely difficult to undo. Consider focusing on poverty reduction instead, for instance.