
At the risk of starting a messy discussion, I'm curious how folks feel the heavy linking between the EA Forum and LessWrong affects both content on the Forum itself and the EA community more generally.

I know many people identify as part of both spaces (and I myself peruse LW), but I'm wondering if the connection has larger cultural effects, from normalized writing styles to a perhaps disproportionate rationalist representation. Thoughts?


8 Answers

I think EA is at its best when it takes the high epistemic standards of LW and applies them to altruistic goals. I see the divergence growing, and that worries me.

Pato

Can you give me an example of EA using bad epistemic standards and an example of EA using good epistemic standards?

I think EA is at its best when it takes the high epistemic standards of LW and applies them to altruistic goals.

I agree with this.

(I don't know whether the divergence is growing, shrinking, or staying the same.)

Personally, I'm not a fan of LessWrong's thinking style, writing style, or intellectual products. As such, I think EA would be better off with less LW influence in the near to medium term.
However, I'm not familiar enough with EA's intellectual history to judge how useful LW was to it, and I certainly can't predict EA's intellectual future. It seems possible that future exchange would be useful, if only for viewpoint diversity. On balance, though, I'd lean against heavy exchange.

What do you dislike about the LW style? Can you provide more specifics? There's a big range of authors that have been publishing on LW for several years now. 

I can try! But apologies, as this will be vague: there'll be lots of authors this doesn't apply to, and this is my gestalt impression given that I avoid reading much of it. And as I say, I don't know how beneficial LW was to EA's development, so I'm not confident about how future exchange should go.

I tend to be frustrated by the general tendencies towards over-confidence, in-group jargon, and overrating the abilities or insights of their community and influences vs. others (especially expert communities and traditional academic sources). Most references I see to 'epistemics' seem under-specified and unhelpful, and usually a shorthand way to dismiss a non-conforming view. I find it ironic that denigrating others' 'epistemics' is a common LW refrain, given my impression that the epistemic quality of LW itself seems poor.

There's a kind of Gell-Mann amnesia effect I get, where LW discourse on things I know decently well (medical advice, neuroscience, global health) often strikes me as wrong, poorly conceived and argued, and over-confident. I don't have a clear personal view on LW discourse on things I don't know well, like AI, but I have occasionally seen roughly similar takes to mine from some people who know AI well.

There are definitely writers/thinkers I admire from LW, but I usually admire them despite their LW-like tendencies. Losing their input would be a true loss. But for the overall effect on EA, I weakly doubt that these exemplars outweigh the subpar majority.


I'm in favor of a clear separation between the forums. They are made for different audiences, and not everything that is meant for one is meant for the other. As somebody who writes some pieces that are meant for both audiences, the crossposting feature is somewhat convenient for me (but not hugely so; I can just copy and paste). And as a reader, sometimes it's nice to see a post is crossposted so that I can go see the comments on the other forum.

I'd be interested to see how much the easy crossposting has increased the number of crossposts, and if so, what kinds of posts are now more likely to be crossposted. This seems like an analysis the forum team could do, and one that's hard to do anecdotally.

The EA Forum and LessWrong have some of the best technical infrastructure on the internet, and I think the EA Forum derives huge benefit from that. However, it does make me a little uneasy that it's made by the Lightcone team, who are in charge of LessWrong. I like the people on that team, but I expect some decisions that are good for LessWrong but not so good for the EA Forum may end up propagating here by default. This is just a suspicion; I don't have any particular examples.

A lot of separation does exist. LessWrong posts are moderated pretty differently, the commenter audiences are often very different, and the types of posts are mostly different. So the connection, as it stands, isn't currently a huge concern of mine.

Clarifying a couple of points:

  • Crossposting used to be totally unrestricted; it now requires a user to have 100 karma on both LW and the EA Forum (regardless of which side they're crossposting from) to use the feature
  • While historically most development was driven by the LW team, in the last year or so the EA Forum team has hired more engineers and is now larger than the LW team by headcount (and very likely by lines-of-code shipped, or whatever other metric you want to use to analyze "how much stuff did they do").
ThomasW
I'm a bit confused about crossposting: are you saying it was always available? I don't remember seeing any crossposts a year ago, or being able to use the feature. In fact, I used to crosspost a lot of things and specifically remember the first time I saw the crossposting feature. But maybe I just didn't notice it before. Didn't know that about the dev teams; that's useful to know!
RobertM
No, sorry, I meant that at the time the feature was released (a few months ago), it didn't have any karma requirement.

Crossposting has been a huge win in my opinion.

It used to be that you posted on one site and then manually crossposted to the other. This was annoying and somewhat error-prone, and you had to set up the links yourself by editing at least one post after it was published.

Automatic crossposting eliminates that mess, and it adds the nice feature of letting you know if there are comments on the other site (since you don't want to read the same thing twice, but you might want to know about the conversation there).

The feature is also fairly easy to ignore if you don't like the other site, and there aren't a ton of crossposts, so it's not generating a lot of noise.

I guess if you don't like LessWrong it might feel annoying to see LessWrong content on EAF, but honestly you were going to see it anyway; it just would have been less clearly labeled.

As a datum from the LessWrong side as a moderator: when crossposting was first implemented, there were initially a bunch of crossposts that weren't doing well (from a karma perspective) and seemed to be making the site worse. To address this, we added a requirement that to crosspost from EAF to LW, you need 100 karma on LW. I believe the requirement is symmetrical: in order to crosspost an LW post onto EAF, you need 100 EAF karma.

The theory is that a bit of karma shows you probably have some familiarity with the destination site's culture, and probably aren't just crossposting out of a vague desire to maximize your post's engagement. I don't think it's been a problem (in the EAF-to-LW direction) since.

This was a clever solution. I didn't know this was a thing.

While I think some threshold barrier is a good idea, I don't think the UX makes it clear that's what's happening. I've never been able to successfully crosspost, and I just realized this is probably why.

I think the current arms-length community interaction is good, but mostly because I'm scared EAs are going to do something crazy which destroys the movement, and that LessWrongers will then be necessary to start another spinoff movement which fills the altruistic gap. If LessWrong is too close to EA, then EA may take down LessWrong with it.

LessWrongers seem far less liable to play with metaphorical fire than EAs, given less funding, better epistemics, less overall agency, and fewer participants.

Scared as in, like, 10-15% in the next 50 years assuming we don't all die.

Pro starting messy discussions around things that are elephants in rooms (not all of them, but on the margin, and this one seems good!)

My gut feeling is that LessWrong is cringe and the heavy link to the Effective Altruism forum is making the forum cringe.

Trying to explain this feeling I'd say some features I don't like are:

  • Ignoring emotional responses and optics in favour of pure open dialogue. Feels very New Atheist.
  • The long pieces of independent research that are extremely difficult to independently verify and which often defer to other pieces of difficult-to-verify independent research.
  • Heavy use of expected value calculations rather than emphasising the uncertainty and cluelessness around a lot of our numbers.
  • The more-karma-more-votes system that encourages an echo chamber.

I disagree-voted.

I think pure open dialogue is often good for communities. You will find evidence for this if you look at almost any social movement, the FTX fiasco, and immoral mazes.

Most long pieces of independent research that I see are made by Open Phil, and I see far more EAs deferring to Open Phil's opinion on a variety of subjects than LessWrongers. Examples that come to mind for you would be helpful.

It was originally EAs who used such explicit expected value calculations, back in the early GiveWell era, and I don't think I've ever seen an EV calculation done on LessWrong.

I think the more-karma-more-votes system is mostly good, but not perfect. In particular, it seems likely to reduce the impact of posts which are popular outside EA but not particularly relevant to EAs, a problem many subreddits have.

I strong downvoted this because I don't like online discussions that devolve into labeling things as cringe or based. I usually replace such words with low/high status, and EA already has enough of that noise.

ChanaMessinger
I think I like this norm where people say what they voted and why (not always, but on the margin). Not 100% sure it would be better for more people to do this, but I like what you did here.

I think some statements or ideas here might be overly divisive or a little simplistic.

  • Ignoring emotional responses and optics in favour of pure open dialogue. Feels very New Atheist.
  • The long pieces of independent research that are extremely difficult to independently verify and which often defer to other pieces of difficult-to-verify independent research.
  • Heavy use of expected value calculations rather than emphasising the uncertainty and cluelessness around a lot of our numbers.
  • The more-karma-more-votes system that encourages an echo chamber.

For counterpoi... (read more)

Jeff Kaufman
!?

I like that you were open about your gut feeling and about thinking that something is cringe. I generally don't think that's a good reason to do or not do things, but it might track important things, and you fleshed yours out.

2 Comments

Thanks for posting this.

Thank you for having such a whimsical username!
