devansh

Comments (28)

>>There is a pool of donors who make their decisions based on the opinions of EA

There is a pool of donors who make their decisions based on their own beliefs and the beliefs of individuals they trust, not "EA." See this post.

I think that paragraph is quite misguided. "Becoming much more risk averse" is a great way to stop doing anything at all, because everything has to pass through eight layers of garbage. On top of this, it's not like "literally becoming the US federal government" and "not having any accounting or governance at all" are your only two options; this creates a sad false dichotomy. SBF was actively and flagrantly ignoring governance, regulation, and accounting. This is not remotely common for EA orgs.

Like, for the last couple of decades we've been witnessing over and over again how established, risk-averse institutions fail because they're unable to compete with new, scrappy, risk-tolerant ones (that is, startups).

"Good governance" and bureaucracy are, while correlated, emphatically not the same thing. EA turning into a movement that fundamentally values these over just doing good in the world as effectively as possible will be a colossal failure, because bureaucracy is a slippery slope and the Thing That Happens when you emulate the practices that have been used for centuries is that you end up not being able to do anything. I'd be very sad if this was our final legacy.

To be clear, this is an account that joined from Twitter to post this comment (link).

>>And I'm nervous about what I perceive as dynamics in some circles where people seem to "show off" how little moderation they accept - how self-sacrificing, "weird," extreme, etc. they're willing to be in the pursuit of EA goals. I think this dynamic is positive at times and fine in moderation, but I do think it risks spiraling into a problem.

There seems to be an important trade-off here: this kind of "showing off" is a valuable signal that the person is aligned with your values, and it's actually pretty useful to know that (especially since current incentive gradients often push people who are not aligned to pay lip service to EA ideas in order to gain money/status/power for themselves).

The balance of how much we should ask or expect of this category of sacrifice seems like something we as a community should put a lot of time into thinking about, especially when we're trying to grow quickly and are unusually willing to provide resources to people.

It seems incredibly important that EA, as a community, maintains extremely high epistemic standards and is a place where we can generally assume that people, while not necessarily sharing our worldviews or beliefs, communicate openly and honestly about the reasons why they're doing things. A primary reason for this is just the scale and difficulty of the things that we're doing.

That's what makes me quite uncomfortable with saying global health and development work is reparation for harms that imperialist countries have caused poor countries! We should work on aid to poor countries because it's effective, because we have a chance to use an amount of money that is relatively small to us to save lives and wildly improve the conditions of people in poor countries, not because aid represents reparations from formerly imperial countries to formerly subjugated ones.

I think many people who identify with social justice and leftist ideologies are worth recruiting and retaining. But I care more about our community having good epistemics in general, about being able to notice when we are doing things correctly and when we are not, and conveying our message honestly seems really important for this. This objection is not "leftists have bad epistemics," which you mentioned at the start of this article: you should increase recruitment and retention, but not lower your own epistemic standards as a communicator to do so.

I think parts of this post are quite good, and I think you should do low-cost things that don't lower your epistemic standards (like using social-justice-focused examples, supporting increased diversity in the movement, and saying things that you actually believe in order to support your arguments in ways that connect with people). But I think that, as it stands, this post needs a clear statement about not lowering epistemic standards when doing outreach in order to be advice that helps overall.

"It's worth noting that the scale of the funding overhang isn't absolute; there are"

Is this a typo?

>>after a tech company singularity, such as if the tech company develops safe AGI

I think this should be "after AGI"?

Ah, I see. I guess I kind of buy this, but I don't think it's nearly as cut-and-dried as you argue. Not sure how much this generalizes, but for me "staying in school" has been an option that conceals approximately as many major sub-options as "leaving school." I'd argue this is approximately true for many people: they have an idea of where they'd want to work or what they'd want to do if they left school, but "staying in school" could mean anything from keeping ~exactly the status quo to transferring to a school in a different country, taking a gap year, etc.
