Allen Bell

Comments

I agree that it’s great that EA values truth-seeking. However, I’m not sure a social community is essential to acting on this value, since it could just as well be embedded at the level of projects and organisations.

For example, consider the scientific method and scientific ways of thinking. Although we can speak of ‘the scientific community’, it’s a community with fairly weak social ties, and most of the work happens at the level of projects and organisations. Yet those (science) projects and organisations usually incorporate scientific thinking heavily.

For an individual’s experience and choices there are usually many ‘communities’ that are relevant at the same time, e.g. their colleagues, school-mates, country of residence, people sharing their language, etc. However, each of these ‘communities’ has a different degree of influence on their experience and choices. What I’m arguing for is increasing the ‘grasp’ of projects and organisations and decreasing the grasp of the wider EA community.

Thanks!

a) I broadly like the idea that “we also have to be willing to back ourselves and not risk crippling our effectiveness by optimising too much on minimising downside in the case where we are wrong”. I would note, though, that downgrading the self-directed investment reduces the need for caution, and so reduces that crippling effect.

j) I think it’s hard to decide how much meta-investment is optimal. You talk about it as if it were a matter of dialling a single parameter (money) up or down, though, which I don’t think is the right way to think about it. The ‘direction’ in which you invest in meta-things also matters a lot. In my ideal world, “Doing Good Better” becomes part of the collective meme-space just like “The Scientific Method” has. However, it’s not perfectly obvious which type of investment would lead to that kind of normalisation and adoption.

h) I’m happy to hear Giving What We Can and GiveWell don’t position themselves in the wider EA framework. I’m not very up to date with how (effectively) they are spreading memes.

c) Running an intervention such as the Criticism and Red Teaming Contest is only effective if people can fundamentally change their minds based on the submissions. (And don’t just enjoy how open-minded we all are for inviting criticism, or only change their minds about non-core topics.)

f) I agree talent is important. However, I think organising as a community might just as well have made us lose out on talent. (The phrase “a local community running events to help them understand why the cause was important” actually gives me some pyramid-scheme vibes, by the way.)

i) I wasn’t talking about poaching. My point was rather that caring about all EA cause areas should not in any way be a condition for, or a desired outcome of, someone caring about one EA cause area.
Re “shifting from one cause area to another could potentially lead to orders of magnitude in the impact someone has”: sure, but I think in EA the cost of switching has also been high. What people consider the most impactful area changes all the time, and skilling up in a new area takes time. If someone works in a useful area, has built up expertise there, and the whole area plays to their comparative advantage, then it would be best if they stayed in that area.

k) Here we disagree. I think that within a project there should be value alignment. However, in my opinion the people within a project do not have to be value-aligned with EA at large.

Re “I'd suggest that having the right culture is a key part of achieving high performance”: I personally think “doing the thing” and engaging with concrete projects are most important.

I also actually feel that “could reduce impact by an order of magnitude from having people pursue the highest impact project that is high status rather than the highest impact project” is currently partly caused by the EA community being so important to people. If someone’s primary community is something like a chess club, a pub or their family, then there are probably loads of ways to gain status in that group that have nothing to do with the content of their job (e.g. getting better at chess, being funny, being reliable and kind). However, if the status that matters most to you is whether other EAs think your work is impactful, then you end up with people wanting to work on the hottest topic rather than doing the most impactful thing based on their comparative advantage.