
kbog comments on Near-Term Effective Altruism Discord - Effective Altruism Forum


Comments (53)

Comment author: kbog  (EA Profile) 10 September 2018 12:08:35AM *  7 points [-]

Discord lets you separate servers into different channels for people to talk about different things. There is already an EA Discord, and of course new and near-term EAs are welcome there. I think it would be bad to split things like this, because the more near-term EAs isolate themselves, the more "alienated" people will feel elsewhere, producing a destructive feedback loop. You're creating the problem that you are trying to solve.

Also, it would reinforce the neglect of mid-term causes which have always gotten too little attention in EA.

I ask that far-future effective altruists and people whose priority cause area is AI risk or s-risks do not participate.

Yeah, this isn't good policy. It should be pretty clear that this is how groupthink happens, and you're establishing it as a principle. I get that you feel alienated because, what, 60% of people have a different point of view? (perish the thought!) And you want to help with the growth of the movement. But hopefully you can find a better way to do this than creating an actual echo chamber. It's clearly a poor choice as far as epistemology is concerned.

You're also creating the problem you're trying to solve in a different way. Whereas most "near-term EAs" enjoy the broad EA community perfectly well, you're reinforcing an assumption that they can't get along, that they should expect EA to "alienate" them, as they hear about your server. As soon as people are pointed towards a designated safe space, they're going to assume that everything on the outside is unfriendly to them, and that will bias their perceptions going forward.

You are likely to have a lighter version of the problem that Hatreon had with Patreon, Voat with Reddit, and so on: whenever a group of people has a problem with the "mainstream" option and someone tries to create an alternative space, the first people who jump ship to the alternative will be the highly motivated people on the extreme end of the spectrum, the most closed-minded and intolerant of the mainstream, and they will set the norms for the community henceforth. Don't get me wrong: it's good to expand EA with new community spaces and be more appealing to new people, and it is always nice to see people put effort into new ideas for EA. But this plan is very flawed, and I strongly recommend that you revise it.

Comment author: Julia_Wise  (EA Profile) 10 September 2018 02:18:42AM 16 points [-]

Moderator note: I found this harsher than necessary. I think a few tone changes would have made the whole message feel more constructive.

Comment author: kbog  (EA Profile) 10 September 2018 02:25:20AM 2 points [-]

What statements were "harsher than necessary"?

Comment author: Julia_Wise  (EA Profile) 10 September 2018 02:18:12PM 6 points [-]

I'll PM you.

Comment author: MichaelPlant 12 September 2018 08:33:47AM *  5 points [-]

I don't find your objections here persuasive.

Yeah, this isn't good policy. It should be pretty clear that this is how groupthink happens, and you're establishing it as a principle. I get that you feel alienated because, what, 60% of people have a different point of view?

If you want to talk about how best to do X, but you run into people who aren't interested in X, it seems fine to talk to other pro-Xers. It seems fine that FHI gathers people who are sincerely interested in the future of humanity. Is that a filter bubble that ought to be broken up? Do you see them hiring people who strongly disagree with the premise of their institution? Should CEA hire people who think effective altruism, broadly construed, is just a terrible idea?

You're also creating the problem you're trying to solve in a different way. Whereas most "near-term EAs" enjoy the broad EA community perfectly well, you're reinforcing an assumption that they can't get along, that they should expect EA to "alienate" them, as they hear about your server

To be frank, I think this problem already exists. I've literally had someone laugh in my face because they thought my person-affecting sympathies were just idiotic, and someone else say "oh, you're the Michael Plant with the weird views" which I thought was, well, myopic coming from an EA. Civil discourse, take a bow.

Comment author: kbog  (EA Profile) 12 September 2018 10:38:59AM *  3 points [-]

It seems fine that FHI gathers people who are sincerely interested in the future of humanity. Is that a filter bubble that ought to be broken up?

If so, then every academic center would be a filter bubble. But filter bubbles are about communities, not work departments, and there are relevant differences between the two that affect how they should operate. Researchers have to have their own departments to be productive; that's more like having different channels within an EA server, just making enough space for people to do their thing together.

Do you see them hiring people who strongly disagree with the premise of their institution? Should CEA hire people who think effective altruism, broadly construed, is just a terrible idea?

These institutions don't have premises, they have teloses, and if someone would be the best contributor to that telos then sure, they should be hired, even though it's very unlikely that you will find a critic who is willing and able to do that. But Near-Term EA has a premise: that the best cause is something that helps in the near term.

To be frank, I think this problem already exists. I've literally had someone laugh in my face because they thought my person-affecting sympathies were just idiotic, and someone else say "oh, you're the Michael Plant with the weird views" which I thought was, well, myopic coming from an EA. Civil discourse, take a bow.

That sounds like stuff that wouldn't fly under the moderation here or in the Facebook group. The first comment, at least; the second one maybe gets a warning and downvotes.

Comment author: Elizabeth 10 September 2018 08:45:40PM 2 points [-]

I don’t often argue the merits of bednets versus cash transfers, which means I get intellectually sloppy knowing I won’t be challenged.

OK, but in that case wouldn't it be better to stick around people with opposing points of view?

This seems like a pretty severe misreading to me. Ozy is saying that they want to hone their arguments against people with expertise in a particular field rather than a different field, which is perfectly reasonable.

Comment author: kbog  (EA Profile) 10 September 2018 08:51:27PM *  2 points [-]

You're right, I did misread it; I thought the comparison was with arguments against long-term causes.

In any case, you can always start a debate over how to reduce poverty on forums like this one. Arguments like that have caught a lot of interest around here. And just because you put all the "near-term EAs" in the same place doesn't mean they'll argue with each other.

Comment author: michaelchen 11 September 2018 02:47:02AM 1 point [-]

For what it's worth, I felt a bit alienated by the other Discord, not because I don't support far-future causes or because it was even discussing the far future, but because I didn't find the conversation interesting. I think this Discord might help me engage more with EAs, because I find the discourse more interesting, and I happen to like the way Thing of Things discusses things. I think it's good to have a variety of groups with different cultures and conversation styles, to appeal to a broader base of people. That said, I do have some reservations about fragmenting EA along ideological lines.

Comment author: adamaero  (EA Profile) 10 September 2018 02:57:35AM 1 point [-]

Is the other Discord not publicly viewable? I've never heard of it.

Comment author: michaelchen 11 September 2018 02:36:40AM 1 point [-]
Comment author: kbog  (EA Profile) 10 September 2018 03:02:36AM *  0 points [-]

It's public. I would share a link, but that would give away my Discord identity; hopefully someone else has it.

Comment author: ozymandias 10 September 2018 02:59:57AM 0 points [-]

I do not intend Near-Term EAs to be participants' only space to talk about effective altruism. People can still participate on the EA forum, the EA Facebook group, local EA groups, Less Wrong, etc. There is not actually any shortage of places where near-term EAs can talk with far-future EAs.

Near-Term EAs has been in open beta for a week or two while I ironed out the kinks. So far, I have not found any issues with people being unusually closed-minded or intolerant of far-future EAs. In fact, we have several participants who identify as cause-agnostic and at least one who works for a far-future organization.

Comment author: kbog  (EA Profile) 10 September 2018 04:07:22AM *  0 points [-]

I do not intend Near-Term EAs to be participants' only space to talk about effective altruism. People can still participate on the EA forum, the EA Facebook group, local EA groups, Less Wrong, etc. There is not actually any shortage of places where near-term EAs can talk with far-future EAs.

There is not any shortage of places where near-term EAs can talk with near-term EAs either; it is the same list. (Except for maybe LessWrong, which may be bad for the same reasons as this Discord server, but at least they are open to everyone's participation and don't make a brand out of their POV.) But if the mere availability of alternative avenues for dissenting opinions were sufficient to avoid groupthink, then groupthink would not exist. Every messageboard is just a click away from many others, and yet we see people operating in filter bubbles all the same.

Please see my reply to adamaero: "near-term EA" is a thesis, not a legitimate way to carve up the movement (the same goes for "long-term EA"), and it shouldn't be entrenched as a kind of ideology, certainly not as a kind of identity, which is even worse. You are reinforcing a framing that will continue to cause deep problems that will be extremely difficult to undo. Consider focusing on poverty reduction instead, for instance.