
Thanks to Vaidehi Agarwalla, Charles Dhillon, Chana Messinger, Miranda Zhang and Bruce Tsai for comments. This doesn't imply they agree with the article. Apologies if I've missed anyone.

Tl;dr

  • Sometimes people talk as if the EA community should make funding decisions. I disagree
  • We are a network of highly engaged individuals, though we could be more closely networked
  • We support one another, though we could be more deliberate about this
  • We could build better tools for synthesising knowledge and checking work done by decision-makers
  • We could take more control over our community narrative and better signal what we are and what we aren’t

It's important to know who you are

Self-conception is important. It helps us make decisions. CEA was nearly called:

  • Rational Altruist Community, RAC
  • Effective Utilitarian Community, EUC
  • Evidence-based Charity Association, ECA
  • Alliance for Rational Compassion, ARC
  • Evidence-based Philanthropy Association, EPA

I guess several of these would have led to quite different outcomes. I think there would be far fewer non-consequentialists in the Effective Utilitarian Community, for instance. With that in mind, what should our conception of EA be? And once we've decided what we want to be, how do we become more like it? Are we:

  • A democratic funding body?
  • A talent pool?
  • Community support?
  • Error checkers and researchers?
  • A rabid fan base?
  • Storytellers?

Are we a democratic body that makes funding decisions?

We aren't this and I don't want us to be. I guess this would lead us away from the highest-risk, highest-EV options. Sometimes I read people who talk as if this is our job and that we should be consulted. This isn't reality, won't be, and ought not to be. 😬

Suggested solutions:

  • Funding pools. I'd love to see more people giving to funding pools and reporting their decisions
  • Community research. I cover this later, but making the decisions and checking them for errors are different things, in my opinion.

Are we a network of talented professionals who work for lower wages than they otherwise would?

Partially. And that made sense when there wasn't much money around. Now, as has been pointed out elsewhere, we should want to pay people the rates which maximise impact (MacAskill makes a roughly analogous argument here that we should maximise impact on retreats).

That said, I think we underrate the network angle. How much gain is created by us being 10% better networked with one another? How can we be intentional [1] about our community having denser connections?

Suggested solutions:

  • Professional matchmaking tools on the forum. An option to more easily meet up with EAs who are like you, or to find people who might solve your problems or whose problems you might solve
  • Go to some EAGxs! It's odd to think that what is a really fun weekend might make the world better, but EAGxs definitely do help me feel comfortable asking those EAs for favours and likely would increase the chance of me working with them in future. If some funding would make the difference, EAGs have generous travel bursaries! I'm gonna say this twice: please don't let your personal finances stop you - just honestly describe your circumstances and let the organisers decide.

Are we a community which supports one another?

Yes. We are working on hard problems with emotional consequences. Some are helping the globally poor and have to live close to the reality of how their leisure spending could change the lives of people they know personally. Others are trying to avoid biodisasters and have to read depressing reports on the current state of regulation. I think we should be watchful of work like this harming us and offer support to one another. 

As a community, we don't always agree. Currently there are live discussions about funding, communications and longtermism. As I will say next, I think error checking here is useful, but there is also value in supporting other parts of the community through change. For example, a small but significant proportion (25%?) of EAs aren't consequentialists and they might struggle with media representations of EA as consequentialism in action. I want to be there for my community members even though I am not personally affected.

Suggested solutions:

  • The forum already serves as an emotional escape valve. Many community or emotive posts get a lot of upvotes. I guess we like to read posts about community and it's easier for us to feel we can say they are true and upvote them, compared to, say, a post on machine learning methods. I think this could become a bit of a problem in the long term, since community discussion is so much more engaging. As a friend put it, "community members are customers of the community but are employees of EA organisations".
  • Official responses. Will's recent post on the funding situation is really well written. But it's also written by Will. Feeling listened to matters. Likewise, EA big dogs talking about their mental health struggles is important.
  • Polis polls on the forum. Here is one that you delightful souls answered about longtermism. I think it’s helpful for us to see how the community feels in a more effective way than just upvotes. Are there consensus positions or key misunderstandings?
  • Quarterly polls. There is space for someone to run a community survey every three months and then report the results.

Are we armchair researchers and error checkers?

Maayybee. Obviously some people are paid to do research, but many EAs I know like to do some research on the side, or check through this or that decision. For me, this is one area where we as a community could do better. While I don't think the community should make decisions, there is a lot of value in allowing the clever individuals in the community to check them. And this does happen - Google Docs are often sent around, but this could be a more open and hence more valuable process. I largely don't think the technology required for an informal community to synthesise and correct huge amounts of information exists yet. But given the potential value, it is worth looking into.

Suggested solutions:

  • Forecasting. I want to see us forecast big EA questions and see who has a track record of being right. In the face of increasing money, a good track record is a great signal
  • An acknowledgement that there are many ways to be sent Google Docs about important decisions. You can work at an org. You can be trusted by the author. Importantly, you can write a good forum post. I was surprised to find that after my forum post on comms, people messaged me with comms queries.
  • The next Google Docs. I wonder if we can do better. Are there better ways of including trusted groups in decisions and getting their opinions on documents?
  • Private forum areas. I don't love this one, but it's worth considering whether some discussions would become more open if there were a private but open-ish space for them to happen
  • Community posts. There is a huge amount on this forum that we will never read. If we could collaboratively summarise it, that would bring some of these great ideas back into our view.

Are we a rabid fan base?

I hope not. When the New York Times tried to write a story about Scott Alexander, the rationalist community went to war. Scott pulled down SSC, many people cancelled subscriptions and a huge amount of talk was created (here is Scott's biased account, which I personally trust). My sense is that some members of the community harassed NYT staffers, though I'm not certain of this. What I do know is that when the community is attacked, individuals should put a high price on attacking back.

There will be (more) articles which attack EA, and in the past these have often been biased or have misrepresented us. As I said in my Twitter tips blog, I think it's better to push positive messages than for individuals to respond to every criticism. Often if criticism is uninteresting, responding just gives it more views than it deserves. I usually want to get back to the work of doing more good.

Suggested solutions:

  • I am confident that if there are articles attacking EA figures or ideas, there are people working on responses. Unless you are particularly close in some way to the person giving criticism, I doubt it's worth your time
  • Response posts. I think gracious point-by-point documents responding to major criticisms are underrated. These are only worth writing if the criticism is very public, but it can be useful to have something to show your friends

Are we storytellers?

Hopefully. I think it's important for a community to be able to retell its own narrative to work out decisions for the future. Or to put it another way, to have a simple model of what it’s trying to achieve. In some ways, that's what this post is about. How do we decide what we are about? Is Will gonna tell us? Or can we also write it ourselves?

I have a healthy respect for elites; EA elites in particular are generally more intelligent, more dedicated and kinder than me. I'm interested in the vision they want to give for this community. But I also think that we can do useful work here as a group.

Suggested solutions:

  • Community posts. I am not brave or knowledgeable enough to write a post on what the community *is*, but I would co-write one with all of you. Can we come to a consensus on what we are about?
  • Community epistemics. What are we weak on as a community? If we could all learn one idea, what would it be? Just as we can educate ourselves as individuals, it might be that there are core ideas that we as an EA community are weak on (personally, it's my knowledge of economics). It might be worth finding these out to provide space for people to write explainers on these topics/models

Conclusion

I want us to see ourselves as a community that supports one another, checks decisions and takes responsibility for community knowledge and vision. Feel free to disagree. Just as we can be agentic as individuals, we can be agentic as a community, seeking tools to make our community more like we want it to be. For me, that means building significant community infrastructure to help us become more like what we want to be.

Post script: My way or the highway

I used to want everyone to like me. These days I think that's actively destructive. I am not for everyone. I can only mould myself to other people's preferences for so long, and then the moment of disconnect is much worse. If I'm honest from the beginning, people can cheaply think "he's not someone I feel similar to" and we can be polite acquaintances.

So it is with EA. EA is not and should not be for everyone. But pretending it is will leave people hurt and waste our emotional energy. Instead we should ask "what do we want to be?" and "how do we signal yes to those who will love this and a polite no to those who will hate it?" I really like Will MacAskill's frame of Judicious Ambition (which honestly could be a church's vision statement). It acknowledges we are a community which weighs the options carefully, but must be willing to take risks. Even the name presents such a choice - if you have no interest in being effective or altruistic, EA probably isn't for you.

I'm (currently) feeling like a pretty big tent kind of guy. My sense is that EA should have a highly engaged core and a broad movement of people giving a little more, a little more effectively. But that doesn't mean we should be for everything or for everyone, and I'd like us to signal what we are and what we aren't.

I reserve the right to edit this post; significant edits and explanations will go here:


 

  1. ^ One for all you Christians.


Comments

Upvoted even though I disagree with important parts, because I think this kind of post is a good idea.

I'm curious about your idea of the relationship between the community and funders/managers. On the one hand, you say (without much explanation) that funding decisions ought not to be, and never will be, made democratically. On the other hand, you think the community should inspect and check decisions by funders.

This leads me to ask: what do you envision should happen, if the community finds funding decisions to be bad, or points to a new appointment being a very bad idea, for example. What's the mechanism of accountability?

Other questions on the same topic:

  • If enlarging and diversifying the community to include more backgrounds and perspectives is important to making the right decisions for humanity's benefit (which I definitely believe) - what would make funders take these views into account?
  • What, in your eyes, gives community members a feeling of belonging and cooperation if, de facto, funding and management decisions don't take them into account in any way?

This leads me to ask: what do you envision should happen, if the community finds funding decisions to be bad, or points to a new appointment being a very bad idea, for example. What's the mechanism of accountability?


What currently happens is that people talk in private and then later post on the forum about it. This seems reasonable.

What, in your eyes, gives community members a feeling of belonging and cooperation if, de facto, funding and management decisions don't take them into account in any way?

I don't think it's true that community opinions aren't taken into account. I think fund managers are affected by community discourse, but they aren't bound to it. I feel belonging and cooperation for all the other reasons I state. I don't need to be [making decisions] to feel involved.

If enlarging and diversifying the community to include more backgrounds and perspectives is important to making the right decisions for humanity's benefit (which I definitely believe) - what would make funders take these views into account?

I'm interested in better decisions and I agree that it seems better to have more diverse groups for error correction. Firstly, those people can be part of the error correction systems I mention. Secondly, they can gain prestige and then work at funding orgs.

Do those answer your questions?

I wish there were comments that didn't score karma, so that I could post each of my subheadings and allow you to vote on them.

I know that often people hate this idea, but:

  • We've never tried it
  • I use Facebook comments as mini polls a lot, and people are much more likely to upvote or downvote a canned response than to write one themselves - it would lead to more engagement

+1 to a polls feature! Add to forum feature thread?

I like those Polis polls you keep posting. Maybe you should now have one to vote on that :)
