
The Centre for Effective Altruism (CEA) is helping to build a community of people acting on altruistic, impartial, truth-seeking principles, by nurturing spaces to discuss EA ideas. This post sets out our plans for the year.

In 2019, we carried out an Executive Director search, which ended when I was confirmed as Executive Director and subsequently brought on Joan Gass as Managing Director. We also focused on improving the execution of our programs (e.g. by more reliably following through on public commitments). 

In 2020, we’ll be figuring out how we want to build on that foundation — i.e. improve our organizational focus and program objectives. We’ll also begin to narrow our focus by investing more in groups and community support, and considering how to spin off our fundraising projects (Giving What We Can and EA Funds).

Summary

Below are our goals for 2020. We’ll review these goals periodically and adjust as circumstances change. We might not achieve all of these goals.

  1. Developing our strategy
    1. Gather feedback and data to test our strategy.
    2. Share more information about our strategy on the EA Forum and our website.
    3. For all projects, draft goals that link to our overall strategy, and use these goals as the basis for our annual review.
  2. Narrowing our scope by considering spinning off EA Funds and Giving What We Can
    1. Make a plan for this area that has been confirmed by the ED, CEA’s trustees, and project staff.
    2. Hire at least one person in this area.
  3. Hiring to support groups, diversity, and public communications
    1. Hire at least one person to increase support for university group organizers.
    2. Hire a staff member, or bring on a consultant, for the community health team.
    3. Gather feedback from members of underrepresented groups on EA messaging.
  4. Improving online discussion
    1. Run at least two online events between April and December 2020.
    2. Achieve a mean score of >8/10 on monthly support calls with organisers of EAGx conferences.
    3. Share a sequence of introductory material on the Forum, and add features that support composing such sequences.
    4. Make it easier to find Forum content via improved tagging and search.
    5. Improve the Forum post editor and reach out to potential authors.
  5. Streamlining internal collaboration and processes
    1. Maintain or increase average team morale scores during the COVID-induced remote work period.
    2. Aggregate data from at least three programs into a central CRM.
    3. Invest our reserves in low-risk, low-rate, diversified portfolios.
    4. Move into a new office in Oxford.
    5. Revamp our hiring and onboarding systems.
    6. Implement a new grant management system.
    7. Align account codes, automate reporting, and share quarterly financial reports with project leads.

I’d welcome feedback on these plans via this form or in the comments, especially if you think there’s something that we’re missing or could be doing better.

1. Developing our strategy

For CEA as a whole

We are developing our answers to some of the following questions:

  • What is CEA’s scope? What things should we do? What should we not do?
    • We hope that clarifying our plans will make it easier for others to coordinate with us — in particular, to take on areas that we’re not focused on.
  • What would indicate that we’re doing a good job overall? What metric(s) should we focus on?
    • We hope that this will help us to focus internally and make tradeoffs.
    • It will also make it easier to communicate with community members and donors about the progress we’re making.
  • What are the top priorities for CEA over the next 1-3 years?
  • Which target audience should we focus on? 

I have been working with an experienced executive advisor, trustees, and staff to develop draft answers to the questions above. 

Some preliminary answers are:

  • What is CEA’s scope? 
    • CEA should focus on nurturing spaces for people to learn about and act on EA principles. These discussion spaces might include events, local groups, or online discussion spaces like the Forum. We want to promote the principles of EA with illustrative applications (e.g. to global health or existential risk), rather than any one particular application (e.g. to global health or existential risk).
    • We don’t think it’s our comparative advantage to do cause-specific community building (e.g. AI-safety discussion groups), promote particular career paths, promote effective giving, or do research.
    • Obviously, we want to continue to support and collaborate with partners who have a different focus (e.g. cause-specific or talent-focused).
  • What would indicate that we’re doing a good job overall?
    • We think it would be harmful to evaluate CEA based on any simple metric.
    • We are developing a metric that accounts for how many people are regularly engaging with and acting on EA principles and how satisfied those people are with EA.
    • We think that this metric should be supplemented by qualitative feedback from community members.
  • What are the top priorities for CEA over the next 1-3 years? 
    • For the 1-year timescale, our best guess is discussed below.
    • We’re still developing longer-term plans, but if we had to guess, we expect that Groups is the program at CEA that is most likely to grow.
  • Which target audience should we focus on? 
    • We think that we are best placed to support existing community members (of all ages) and to recruit students and young professionals. We’re keen to welcome new high-net-worth individuals and mid-career professionals into the EA community, but we think that other projects (e.g. Effective Giving and 80,000 Hours) are better suited to recruiting them.

We will improve our answers to these questions by consulting community members, gathering more data, and testing some hypotheses.

We also aim to share more about our strategy publicly. This may be a long process, so we're not sure whether we’ll have capacity to share more detailed plans in 2020.

For our projects

We’d like to make our program objectives more explicit, test our assumptions about how to reach those objectives, and check for things we might be missing. In particular, we’d like to improve our answers to the following questions:

  • How do each program’s objectives contribute to achieving our org-wide strategy?
  • How do we track/measure whether we're achieving these objectives?
  • Do we have a reasonable plan for how we will achieve these objectives?
  • Are we executing well (e.g. meeting commitments, allocating sufficient capacity to run each program)?

2. Narrowing our scope by considering spinning off EA Funds and Giving What We Can

In 2019, Giving What We Can members logged over $20m in donations to the charities that they believe to be most effective, and 528 people took a 10% lifetime pledge, bringing the year-end total to 4,454 members. EA Funds facilitated grantmaking of $8.5m through the four main funds, as well as $3.4m to other effective charities. 

I think that both of these programs are important for EA because:

  • They direct a significant amount of money to effective charities.
  • They provide an opportunity for individuals to take important, concrete actions based on EA principles.

However, these projects have a fairly different focus from CEA’s other projects (which focus on community engagement rather than charitable donations), and we think that with more focus and staff time they could achieve more.

We'd like to move towards a state where these projects have the latitude and resources to accomplish more, and where CEA can focus on a narrower range of projects. Over the last few months I’ve been working with trustees and staff to plan for the future of these projects, using surveys of users and members to inform our thinking.

We’ll initially search for someone who can lead an independent Giving What We Can. If you know someone who you think might be a good fit for this role (including yourself), you can fill out this form. If we find a leader for Giving What We Can, we’ll help to onboard and advise them, and we will continue to provide operational support to both EA Funds and Giving What We Can for the foreseeable future. Once we’ve completed our hiring round for the Giving What We Can director, we will consider focusing more on plans for hiring an executive for and/or spinning out EA Funds.

3. Expanding our groups and community health support 

Expanding groups support

EA Survey respondents report that local groups and personal connections are some of the most important ways that they hear about EA and get more involved. This suggests that support for groups is important.

CEA’s Groups team currently has two full-time staff (Katie Glass, Harri Besceli) as well as a part-time contractor (Catherine Low). With this capacity, we expect to maintain the team’s current activities, such as funding organisers with the Community Building Grants program, maintaining discussion spaces for organisers on Facebook and Slack, curating resources for groups, and responding to organisers’ requests for advice or funding.

We’ve found that university groups are especially well-positioned to engage with new people interested in EA. We therefore intend to hire a new specialist to develop customised advice and resources for university group organisers. Given the large number of university groups, we’ll start by piloting this with a subset of groups.

More community health work

Our community health team works to reduce risks to the community, and to improve people’s experiences in the community. This may increase effective altruism’s robustness and potential over the long run.

The team currently has three staff (Julia Wise, Sky Mayhew, Nicole Ross) who cover the following areas:

  • Researching what the biggest risks to EA are, and creating proposals for addressing them
  • Responding to community members’ concerns, such as mental health struggles, safety concerns, or interpersonal conflicts
  • Responding to media inquiries and developing clear, accurate public messages about complex EA topics
  • Advising on organizational strategy or responding to community members’ requests for advice on best practices on a variety of topics — for example, how to host safe events, moderate online groups, and build diverse and welcoming groups or programs

We want to have clear, inspiring messages about complex EA topics, and we want the spaces we foster to be welcoming to a diverse range of people. We think that these goals are linked. We’re still figuring out how to achieve these goals, and whether we need new hires/consultants to help us achieve them.

4. Improving online discussion

Exploring online events

Due to COVID-19, we could not hold EA Global: San Francisco in person. Local groups around the world have stopped holding events and retreats, and we’re still not sure how many EAGx conferences will be able to take place “on location” this year. 

We pivoted EA Global: San Francisco to an online format, which resulted in over 3,000 total live views of talks (and several thousand more since), 438 one-on-one meetings, and lots of discussion on Slack and YouTube. Whilst attendees didn’t make as many connections or give as much positive feedback as they do at in-person EA Global conferences, they still reported a positive experience and increased motivation. We’re pleased with the number of meetings and new connections given the lower cost of the event. We are also providing coordination and support for local groups and EAGx organisers as they shift to online events and discussion spaces.

The connections that people have formed in these events help to maintain people’s engagement with the community. We are interested in holding more virtual events in the future, and hope to learn things that we can apply once the current crisis is over.

Developing the EA Forum

The EA Forum is a key resource for people who read about and discuss EA online. This year, we’d like to provide better introductory resources, and increase the number of people reading and writing excellent content. Our plans include: adding introductory material, supporting authors, and helping readers find content that interests them.

Introductory material: We plan to launch a feature which allows users to create collections of posts (e.g. on a particular theme). We will release at least one collection ourselves, curated by our team and aimed at introducing people to effective altruism. We will ask experts and a broad range of community members to give feedback on the collection. In the future, this might replace our EA Handbook or the introduction on effectivealtruism.org.

Supporting authors: We will reach out to a number of EA researchers who rarely or never post on the Forum to offer assistance with editing and formatting, which we hope will lead them to publish more of their work on the platform. For the benefit of all our authors, we will be updating the text editor so that it is much easier to use (with features including native table support and the ability to copy and paste images into a post). We’re also considering ways to give authors better feedback on how people have engaged with their posts.

Helping readers: We will add a tagging system to make it easier to browse posts on a given topic. We’ll also make back-end technical changes to make it easier to find specific posts via Google search.

(Several features mentioned in this update will be ported over from LessWrong; much of the Forum’s code is based on that site, and we work closely with their team on development.)

We will also continue to focus on increasing our core metric and will choose additional projects throughout the year to further that goal.

5. Streamlining internal collaboration and processes

Information systems

Currently, our records of community members are split across different systems for different programs. This year, we aim to pull together this data into one CRM. 

We hope that this will enable staff to more easily access and aggregate information, and make more informed decisions (e.g. about EAG admissions). It will also help us to track community growth and monitor whether people are drifting away or continuing to engage with EA. 

Improving operations

CEA runs operations for 80,000 Hours and the Forethought Foundation, provides operational support to several other EA projects, and routes millions of dollars of donations to effective projects and organisations. Given this, there are significant benefits to reducing risks, saving time, and investing financial reserves appropriately.

To improve our operations, we intend to

  • Move into a larger, refurbished office in Oxford (shared with the Future of Humanity Institute and the Global Priorities Institute).
  • Upgrade our hiring and onboarding systems.
  • Set up software systems to automate grantmaking operations and to track donations to CEA, 80,000 Hours, and the Forethought Foundation.
  • Improve our financial systems (e.g. automate more of our reporting).
  • Invest our reserve capital in low-risk, low-rate, diversified portfolios.

Strengthening a thriving remote work environment

For the past few years, we’ve had several remote staff, so we’ve always had lots of video calls, team retreats, and an active Slack workspace. 

We plan to permanently close our office in Berkeley, with our Berkeley staff moving to remote work. This will save time and money, and establish Oxford as our headquarters. 

Due to COVID-19, all our staff are currently working remotely; we’ve invested further in making sure that staff have ergonomic home working spaces, regular group social calls, time off when necessary, and tailored support from their managers.

Team and culture

I’m proud of how we have developed as a team over the last year. Staff at our last team retreat reported enjoying collaborating with colleagues, learning from them, and seeing them grow.

We're still developing a more explicit account of our cultural values, but some things I'd like us to keep doing are:

  • Raising our standards for execution and making sure we have capacity to execute on any projects we take on
  • Acknowledging and learning from our mistakes (e.g. see our updated mistakes page)
  • Encouraging internal transparency and upward feedback
  • Building a more detailed and specific sense of our goals, how well our projects are furthering those goals, and how we hope the EA community will develop, via discussion
  • Seeking advice and input from stakeholders and feedback from a broad range of community members as we develop project goals

Conclusion

By the end of 2020, I hope we'll have made further progress on our org-wide strategy and have more specific program objectives that help us achieve our org-wide goals. I also hope that we’ve strengthened our groups and community support, and begun to set EA Funds and Giving What We Can up to flourish independent of CEA.

I’d welcome feedback on these plans via this form or in the comments. I particularly look forward to hearing reactions to our initial thoughts on strategy, shared above.

We plan to check in on these goals in a future annual review.

Comments

I'm looking forward to CEA having a great 2020 under hopefully much more stable and certain leadership!

I’d welcome feedback on these plans via this form or in the comments, especially if you think there’s something that we’re missing or could be doing better.

This is weakly held since I don't have any context on what's going on internally with CEA right now.

That said: of the items listed in your summary of goals, it looks like about 80% of them involve inward-facing initiatives (hiring, spinoffs, process improvements, strategy), and 20% (3.3, 4.1-5) involve achieving concrete outcomes that affect things outside of CEA. The report on progress from last year also emphasized internal process improvements rather than external outcomes.

Of course, it makes sense that after a period of rapid leadership churn, it's necessary to devote some time to rebuilding and improving the organization. And if you don't have a strategy yet, I suppose it makes sense to put "develop a strategy" as your top goal and not to have very many other concrete action items.

As a bystander, though, I'll be way more excited to read about whatever you end up deciding your strategy is, than about the management improvements that currently seem to be absorbing the bulk of CEA's focus.

I think this is a really important point, and one I’ve been thinking a lot about over the past month. As you say, I do think that having a strategy is an important starting point, but I don’t want us to get stuck too meta. We’re still developing our strategy, but this quarter we’re planning to focus more on object-level work.  Hopefully we can share more about strategy and object-level work in the future. 

That said, I also think that we’ve made a lot of object-level progress in the last year, and we plan to make more this year, so we might have underemphasized that. You can read more in the (lengthy, sorry!) appendix to our 2019 post, but some highlights are:

  • Responding to 63 community concerns (ranging from minor situations (request for help preparing a workshop about self-care at an event) to major ones (request for help working out what to do about serious sexual harassment in a local group)).
  • Mentoring 50 group organizers, funding 80 projects run by groups, running group organizer retreats and making 30 grants for full-time organizers, with 25 case studies of promising people influenced by the groups we funded.
  • Helping around 1000 EA Global attendees make eight new connections on average, with 350 self-reported minor plan changes and 50 self-reported major plan changes (note these were self-reports, so are nowhere near as vetted as e.g. 80k plan changes).
  • 85% growth over 10 months in our key Forum metric and 34% growth in views of EA Global talks.
  • ~50% growth in donations to EA Funds, and millions in reported donations from GWWC members

Of course, there are lots of improvements we still need to make, but I still feel happy with this progress, and with the progress we made towards more reliably following through on commitments (e.g. addressing some of the problems with EA Grants). 

Adding a little to Max's comment: when I count the number of our staff working on each section, I get roughly half of staff focused on the external-facing goals. And that's on top of the business-as-usual work, which is largely external-facing. I was one of the people pushing for more object-level work this quarter, but the gap between what it was and what I wanted it to be was not as large as a simple count of the number of goals might suggest.[1]


  1. Which, to be clear, you had no way of knowing about, and you explicitly called out that it was weakly-held. ↩︎

I think a very common failure mode for CEA over the past ~5 years has been: CEA declares they are doing X, now no one else wants to or can get funding to do X, but CEA doesn't actually ever do X, so X never gets done.

I think spinning off GWWC and EA Funds is a great step for fixing this failure mode and I'd like to see CEA continue to welcome other groups and people to filling neglected roles in the community.

I’m pretty torn about this. I agree that this was a failure, but going too far in the other direction seems like a lost opportunity. I think my ideal would be something like a very competent and large CEA, or another competent and large organization, spearheading a bunch of new EA initiatives. I think there’s enough potential work to absorb an additional 30-1000 full-time people. I’d prefer small groups to a poorly managed big group, but in general I don’t trust small groups all that much for this kind of work in the long run. Major strategic action requires a lot of coordination, and this is really difficult with many small groups.

I think my take is that the failures mentioned were mostly failures of expectations, rather than bad decisions in the ideal. If CEA could have done all these things well, that would have been the ideal scenario to me. The projects often seemed quite reasonable, it just seemed like CEA didn’t quite have the necessary abilities at those points to deliver on them.

Referencing the comments above: I think “Let’s make sure that our organization runs well before thinking too much about expanding dramatically” is a very legitimate strategy. My guess is that given the circumstances, it’s a very reasonable one as well. But I also have some part of me inside screaming, “How can we get EA infrastructure to grow much faster?”

Perhaps more intense growth, or at least bringing in several strong new product managers, could be more of a plan in 1-2 years or so.

I think a very common failure mode for CEA over the past ~5 years has been: CEA declares they are doing X, now no one else wants to or can get funding to do X, but CEA doesn't actually ever do X, so X never gets done.

I agree with this.  I think we've been making progress both in following through on what we say we'll do and in welcoming others to fill neglected roles, and I'd like to see us continue to make progress, particularly on the latter.

Hi Max, good to read an update on CEA's plans.

Given CEA's central and influential role in the EA community, I would be interested to hear more about your approach to democratic/communal governance of CEA and the EA community. As I understand it, CEA consults plenty with a variety of stakeholders, but mostly anonymously and behind closed doors (correct me if I'm wrong). I see lack of democracy and lack of community support for CEA as substantial risks to the EA community's effectiveness and existence.

Are there plans to make CEA more democratic, including in its strategy-setting?

I agree that it’s important that CEA reliably and verifiably listens to the community.

I think that we have been listening, and we published some of that consultation - for instance in this post and in the appendix to our 2019 review (see for instance the EA Global section).

Over the next few months we plan to send out more surveys to community members about what they like/dislike about the EA community, and as mentioned above, we’re thinking about using community member satisfaction as a major metric for CEA. If it did become a key metric, it’s likely that we would share some of that feedback publicly. 

We don’t currently have plans for a democratic structure, but we’ve talked about introducing some democratic elements (though we probably won’t do that this year). 

Whilst I agree that consultation is vital, I think the benefits of democracy over consultation are unclear. For instance, voters are likely to have spent less time engaging with arguments for different positions, and there is a risk of factionalism. Also, involving more stakeholders reduces the space of feasible options, because few options would command agreement from a wide spread of the community, which makes it harder to pursue more ambitious plans. 

I think you’re right that this would increase community support for CEA’s work and make CEA more accountable. I haven’t thought a lot about the options here, and it may be that there are some mechanisms which avoid the downsides. I’d be interested in suggestions.

Anyway, I definitely think it’s important for CEA to listen to the community and be transparent about our work, and I hope to do more of that in the future.

Thanks for the elaborate reply!

I think there's a lot of open space in between sending out surveys and giving people binding voting power. I'm not a fan of asking people to vote on things they don't know about. However, I have something in mind like "inviting people to contribute to a public conversation and decision-making process". Final decision power would still rest with CEA, but input would be more than one-off, the decision-making would be more transparent, and a wider range of stakeholders would be involved. Obviously, this does not work for all types of decisions - some are too sensitive to discuss publicly. Then again, it may be tempting to classify many decisions as "too sensitive". An organisation's "opening up" should be an incremental process, and I would definitely recommend experimenting with more democratic procedures.

I really appreciate your transparency and your overall focus on improving your ability to meet your commitments and deliver projects to a high standard. I've given feedback as separate comments below so that people can build on the different topics - hopefully it's not too messy!

I had not read through the CEA mistakes page before (linked in your post), and I am very impressed with it. I wanted to note that I'm pleased and kind of touched that the page lists neglect of animal advocacy in the 2015 and 2016 EAGs. I was one of the advocates who was unhappy, and I was not sure whether there was recognition of this, so it was really meaningful to see CEA admit this and detail steps that are taken.

"We’ll also make back-end technical changes to make it easier to find specific posts via Google search."

Hurray!!

On your plan for an independent Giving What We Can: Wasn't GWWC previously independent, before it was incorporated into CEA in 2016? What's changed over the last 5 years to warrant a reversal?

Thanks for your comments! 

>Wasn't GWWC previously independent, before it was incorporated into CEA in 2016?

Essentially, yes. Giving What We Can was founded in 2009. CEA was set up as an umbrella legal entity for GWWC and 80,000 Hours in 2011, but the projects had separate strategies, autonomous leadership etc. In 2016, there was a restructure of CEA such that GWWC and some of the other activities under CEA’s umbrella came together under one CEO (Will MacAskill at that time), whilst 80,000 Hours continued to operate independently. 

>What's changed over the last 5 years to warrant a reversal?

To be honest, I think it’s less that the strategic landscape has changed, and more that the decision 5 years ago hasn’t worked out as well as we hoped. 

(I wasn’t around at the time the decision was made, and I’m not sure if it was the right call in expectation. Michelle (ex GWWC Executive Director) previously shared some thoughts on this on the Forum.)

As discussed here, from 2017 to 2019 CEA did not invest heavily in Giving What We Can. Communications became less frequent and the website lost some features. 

We’ve now addressed the largest of those issues, but the trustees and I think that Giving What We Can is an important project that hasn’t lived up to its (high) potential under the current arrangement (although pledges continue to grow).

Giving What We Can is one of the most successful parts of CEA. Over 4500 members have logged over $125M in donations. Members have pledged to donate $1.5B.  Beyond the money raised, it has helped to introduce lots of people (myself included) to the EA community. This means that we are all keen to invest more in GWWC.

I also think it’s important to narrow CEA’s focus. That focus looks like it’s going to be nurturing spaces for people to discuss and apply EA principles. GWWC is more focused on encouraging a particular activity (pledging to donate to charities). Since it was successfully run as an independent project in the past, trying to spin it out seemed like the right call. I’m leading on this process and trustees are investing a lot of time in it too, and we’ll work very closely with new leadership to test things out and make sure the new arrangement works well.

Hey Max, thanks for all this explanation! One question: Has any thought been given to spinning off GWWC and EA Funds together (as a single organization), given that they share a similar focus, common users, and some degree of web integration?

Yes, we’ve thought about this. We currently think that it’s probably best for them to spin off separately, so that’s the main option under consideration, but we might change our minds (for instance as we learn more about which candidates are available, and what their strategic vision for the projects would be). 

This is a bit of a busy week for me, so if you’d like me to share more about our considerations, upvote this comment, and I’ll check back next week to see if there’s been sufficient interest.

"We want to promote the principles of EA with illustrative applications (e.g. to global health or existential risk), rather than any one particular application (e.g. to global health or existential risk)."

I really appreciate this approach. I think it will promote a community that can work together on a variety of different causes.

Why are you moving to Oxford?

Sorry, that paragraph wasn’t clear. Previously, we had offices in both Oxford and Berkeley. The change is to close the Berkeley office (for reasons discussed above) and keep the Oxford office open. We think it’s useful to be in Oxford because that’s where a lot of our staff are currently based, and because it allows us to keep in touch with other EA orgs (e.g. the Global Priorities Institute) who share our office in Oxford. 

There's also reference to 'moving office' in Oxford. That's because the cluster of EA organisations currently sharing an office in Oxford - CEA, FHI and GPI - have outgrown their current office and are together moving to a bigger one.
