
kshen

20 karma · Joined

Comments (5)

Awesome post! I often think that EA needs to better incorporate this huge part of human experience into its discourse, and I think it can go beyond simply motivating people.

This essay also touched on many of the themes of the Replacing Guilt essay series, which likewise came out of the EA community.

That was a very interesting essay! I love the distinction between aptness and instrumentality.

However, the closing paragraphs took an odd argumentative turn -- essentially, the author makes a move to say, "I reserve the right to be angry because it is one of the few/last instruments by which I can be productive at all." While I agree with that assessment, it seems to do a disservice to her argument to draw attention back to the instrumentality of anger. The whole strength of her argument was that anger, like aesthetics, beauty, and a sense of justice, has a place in discourse that stands prior to considerations of instrumentality.

Lastly, it is also interesting that this essay expresses some disdain for consequentialism, treating it as oppressive. That is another intricate dynamic that may be pertinent to EA.

I want to echo all the interest in leftist critique (it usually reduces to something about colonialism, racism, or capitalism), but from the perspective that @JulianHazell brought up, i.e. being able to reach a wider audience. At some point, EA needs to get better at representing itself in a nontechnical manner.

Btw, I'm writing from the perspective of someone who doesn't have a job in EA, but who sees a lot of leftist leanings in organizations that I'm a part of. 

My personal experience makes me doubt that a point-by-point rebuttal would change minds or reach a broader audience; it would mostly serve to "reinforce the faith" of people already in EA, which does precious little to get EA values out.

I guess what I'm getting at is not so much a critique of EA as a desire to think critically about how EA can make itself more accessible. Instead of forcing other movements to speak EA's language, EA also needs to learn how to speak other languages, or at least offer more accessible language that others can buy into.

If someone wants to work on this, let me know!

=====

Some other ideas informing me:

I would say that any outsider's objection to EA (including leftists') is fundamentally tied to some of its most counterintuitive aspects:

  • Dispositional differences: EA is so future-focused as to seemingly deny the importance of the present (e.g. EA deprioritizes climate change). And if EA is that focused on the future, it seems all the more removed from the past, which is what a lot of leftist priorities are about.
  • Stylistic differences: EA prioritizes effective and precise action. This echoes @CharlesHe's comment. Contrast this with how liberals turn colonialism, racism, and capitalism into a wrecking ball that subsumes everything; the frameworks lose descriptive power and nuance. It's a blunt use of frameworks that drains the meaning from them, but maybe that's the intent -- systemic change, destruction be damned. This is infuriating to EA because it seems poorly thought out and imprecise. But ironically, one argument I've seen from outside EA is that EA's interventions are so focused on individual trees as to miss the forest. Of course, this is a caricature, and I think there's some middle ground where mutual interest can be found.

Great post! Thanks for sharing, and for the great overview of all the resources out there.

I think part of the fundamental challenge is that trust and good information are expensive, because relationships are expensive. And oftentimes the truth (think nuances, caveats, further considerations) is... psychologically unsatisfying.

Setting up responsible institutions is itself a difficult problem, but no matter what, even the most responsible institutions will fail if they aren't good at building relationships with people at large. This latter work is expensive, tricky, and hard to scale, which may be part of why it's under-addressed by people in the EA community?

A second approach is to focus on psychological understanding and interventions, and on cheap, easy ways to counter native human biases. I think the lateral reading example you gave is a prime illustration of this. The humor of a rumor is a similar example. Reason does not seem to be the most effective or efficient way to fight misinformation... and that may be another sticking point that makes it hard to address.

Hi all! Recently found this community and I'm really impressed with the discourse here!

This is kind of meta and not about EA per se, but from a community-builder's perspective I was wondering how this forum is moderated (self-moderated or otherwise), and how it was built up into such a vibrant space! Are there other forums like this (I know LessWrong runs on a similar-looking community blogging model)? Have there been any moderation challenges?

I read through some of these posts (https://forum.effectivealtruism.org/tag/discussion-norms) but would appreciate any other links people might have.

 

Background: I've been thinking of ways to build community in another setting with more of a "lay audience." While I have a strong affinity for the "academic-ish" and "open-source-ish" standards that this EA forum seems able to maintain, I also recognize that these practices 1) take effort and 2) are probably not the default mode of being for most people. I'm curious how to make these standards simple and concise enough to be grasped, but not over-simplified.