
Introduction

I work on the Longtermist Effective Altruism Community Growth cause area at Open Philanthropy; we aim to grow and empower the set of people trying to help make the long-term future as good as possible. (See this recent post for more about what we do and what’s going on with us as a team.)

About one year ago my coworkers Claire Zabel and Asya Bergal put out a call for proposals for outreach and community-building projects that look good on a longtermist worldview[1]. This post is meant to 1) serve as an update for readers on what’s been happening with this funding program, and 2) serve as a reminder that this funding program exists, and a renewed call for proposals.

In the year since, we’ve received 94 applications and funded many of them, giving out over $5 million overall (see examples of projects we’ve funded, and statistics on application outcomes).

But we still have the capacity to support many more projects, and a wider variety of types of projects, than we currently do. This program is currently limited by the number and quality of applications, not by available funding, and will be for the foreseeable future. See outlines of projects and project areas we’re excited to see applications in.

If you have an idea for work aimed at supporting or growing the set of people who do good work on longtermist projects, we’d love to get an application from you. You can apply here. Only a brief pre-proposal is required for the initial application, and we’re open to making “planning grants” where you can apply for funding for up to 3 months just to develop an idea.

Applications are assessed on a rolling basis. See more details on the application process below.

If you have a question about this funding program, here are three ways of getting it answered:

  • This post is an AMA; feel free to Ask Me Anything in the comments. I’ll be answering questions until August 19th. I’ll aim to get through all questions, but will triage according to vote count and may not get to all of them. Questions about future plans, things we should be doing differently, logistical questions, etc. are all welcome.
  • I’ll also be hosting office hours in the mornings (Pacific Time) on August 18, August 19, and August 22. Sign up here to talk to me for 15 minutes over Zoom. (I’m friendly!)
  • You can contact us at longtermfuture-outreach-rfp@openphilanthropy.org for any questions you think are better suited to email. We reply to every email. If you’re not sure whether your project is a good fit for this funding opportunity, please don’t hesitate to contact us by email.

Some projects we’ve funded through this call for proposals

  • Eon Essay Contest (also see the Forum post) — a contest where students, high-school-aged and above, can win scholarship money by reading Toby Ord’s The Precipice and writing essays about it. Run by Neha Singh.
  • Asterisk — a new quarterly magazine/journal of ideas from in and around Effective Altruism, run by Clara Collier.
  • The Apollo Fellowship — a summer program for young competitive debaters, intended to introduce them to “a wide variety of philosophical and technological issues, including artificial intelligence, long-termism and existential risk, utilitarianism, and more.” Run by Jason Xiao and Sam Huang.
  • Funding for EA Brazil and EA Japan to translate articles, essays and videos about effective altruism (and related ideas), including the Intro Fellowship materials, into Portuguese and Japanese, to widen the reach of these ideas. We worked with Ramiro Peres from EA Brazil, and Luis Costigan from EA Japan, on these grants.

This is just a small sample of the projects we’ve funded, chosen to showcase variety rather than for representativeness.

Statistics on application outcomes

We’ve received 94 applications overall, of which about 24 didn’t seem to be related to effective altruism or longtermism at all.

Of the remaining 70 applications, we:

  • Funded 25.
  • Rejected 28.
  • Referred 15 to the EA Infrastructure Fund or the Long-Term Future Fund, with the applicants’ permission.
  • Are still evaluating one.
  • Started evaluating one, but I believe the applicant withdrew (a colleague handled this application).

Outlines of projects and project areas we’re excited to see applications in

Some work may fall into multiple categories.

  • In-person programs engaging with promising young people high-school-aged and up, e.g. retreats, summer camps, scholarships and fellowships, seminars, conferences, and workshops.
    • See our previous post’s explanation of why we think this kind of work is particularly promising and what we think is involved in doing it well (including potential downsides).
  • AI safety-focused meta work, i.e. aiming specifically at causing more people who are good fits for AI safety research to work on it (e.g. projects like EA Cambridge’s AGI Safety Fundamentals).
  • Cause-specific meta work focused on other longtermist cause areas.
  • Projects aimed at producing more excellent content on EA, longtermism, transformative technology, and similar topics, and getting it seen by many people who might become further interested in these topics. This could include:
    • Blog posts
    • Articles on the web
    • New magazines, webzines, blogs, and media verticals
    • Nonfiction books
    • Podcasts
    • YouTube videos
    • MOOCs
    • Many kinds of fiction (novels, web fiction, video series or TV shows…)
  • Work on advertising and marketing for high-quality content on EA, longtermism, and/or transformative technology.
    • See Digital Marketing is Under-Utilized in EA.
    • For both this and the immediately previous bullet, see our last post’s explanation of why we think this kind of work is particularly promising and what we think is involved in doing it well (including potential downsides).
  • Rationality-and-epistemics-focused community-building work, which could include:
    • Retreats or events based around epistemics and rationality.
    • Creating high-quality content around epistemics and rationality.
  • Work that tries to make EA ideas and discussion opportunities more accessible outside current EA hubs, especially outside the Anglophone West. This includes proposals to translate content into non-English languages.

We’re also interested in receiving applications to our University Group Organizer Fellowship to support university groups that are aimed at these goals, including (but not limited to!):

  • AI safety-focused university groups (e.g. Harvard AI Safety Team).
  • Rationality or epistemics-focused university groups.
  • Groups at universities outside of the Anglophone West.

See Asya and Claire’s recent post for more info about this.

More details on the application process

(This section is similar to the corresponding section in the original post.)

The application form is here. Only a brief pre-proposal (mainly a project description of <750 words) is required at this stage. If we are interested in supporting your project, we will reach out to you and invite you to submit more information.

We encourage submissions from people who are uncertain if they want to found a new project and just want funding to seriously explore an idea. In many cases we’re open to giving a “planning grant,” which is funding for up to 3 months to test and iterate on a project idea, without committing to it. We’re happy to look at multiple pre-proposals from applicants who have several different project ideas.

We may also be able to help some applicants (by introducing them to potential collaborators, giving them feedback about plans and strategy, providing legal assistance, etc.) or be able to help find others who can. We have funded, and continue to be open to, very ambitious proposals for projects that have annual budgets in the millions, including proposals to scale existing projects that are still relatively small.

We intend to reply to all applications within two months. With your permission, we may refer your application to the Effective Altruism Infrastructure Fund or the Long-Term Future Fund.

There is no deadline to apply; applications are assessed on a rolling basis. We’ll leave the form open until we decide that this program isn’t worth running, or that we’ve funded enough work in this space. If that happens, we’ll update this post at least a month before we plan to close the form.

As mentioned above, if you’re wondering if your project is a potential fit, or if you have any other questions, you can contact us at longtermfuture-outreach-rfp@openphilanthropy.org.

Notes


  1. Note this can include outreach or community-building work about EA in general that doesn’t focus on longtermism in particular, or outreach aimed at particular longtermist cause areas. ↩︎


Comments

I appreciate the AMA-style post!

Are you interested in projects that do outreach and community building for mid-career professionals, especially in tech, where some of the projects help these professionals hear about and/or transition into longtermist roles?

(If some of these parts sound like a good fit and some don't, could you tell me which are which? If any of this sounds like a good fit, then I'll share some projects I'm working on or considering working on.)

The answer is yes, I can think of some projects in this general area that sound good to me. I'd encourage you to email me or sign up to talk to me about your ideas and we can go from there. As is always the case, a lot rides on further specifics about the project — i.e. just the bare fact that something is focused on mid-career professionals in tech doesn't give me a lot of info about whether it's something we'd want to fund or not.

Thanks, I only meant to ask if focusing on mid-career tech is already a deal breaker for things you'd be interested in, and I understand it isn't.

 

Here are some ideas:

  • EA tech newsletter
    • Aim to keep people somewhat engaged in EA so that they hear about opportunities that fit them when they're looking for work.
    • Draft
  • Running local groups that publicly and transparently look for impactful tech companies together
    • This is good because:
      • If a tech person goes to a local EA group and asks how to have impact without working remotely, the group will have some answer beyond "read these 100 articles" and "we have lots of ways to analyze but no answers" (I think this is a negative first experience for senior tech people approaching EA groups)
      • It will hopefully spread the word to the tech community about "yeah, some companies are more impactful than others, and it's possible to do the analysis"
    • Longer pitch here (linkpost to the EA tech FB group)
  • Hiring
    • EA hiring agency
      • Maybe specifically for tech, maybe for other important-and-hard-to-hire jobs (Open Phil researchers?)
    • Supporting EA orgs with hiring: I think there are many low-hanging-fruit problems to solve in EA hiring (1, 2, and many more), but this is unrelated to community building.
    • Does improving the 80k job board's conversion count as "outreach", since they have so much traffic?
  • Create a conversation in EA longtermism about "very senior people: working directly is probably more impactful than donating"
    • I can send my draft privately.
  • EAG for tech? (I'm not sold on this yet, but maybe)

 

Also, here's a thread of EA tech community-building ideas, unrelated to outreach or longtermism (join the group here).

 

P.S.

If you'd fund a CFAR or similar workshop in Israel, we have many people from EA + LessWrong Israel who'd be interested, including me. (I estimate this would be less than $10k.)

 

I'll schedule with you anyway; we can use this as an agenda. But I encourage you to brutally filter out the ideas that seem bad, or at least to voice your actual opinion. Helping me prioritize is welcome, and most of these projects are getting very little attention from me.

Cool, looking forward to talking about these.

1) Are you interested in increasing the diversity of the longtermist community? If so, along what lines?

One possibility is to increase the shares of minorities according to US Census Bureau categories: race, sex, age, education, income, etc. Ways of thinking about EA, one's (static or dynamic) comparative advantages, or roles naturally or nurturally taken in a team would be irrelevant. The advantage of this diversification is its (type 1 thinking) acceptance/endorsement in some decision-making environments in EA, such as the Bay Area or London. The disadvantage is that diversity of perspectives may not necessarily be gained (for example, students of different race, sex, and parental income studying at the same school may think alike).

Another possibility is to focus on ways of thinking about EA, one's current comparative advantage (and the one they can uniquely develop), and roles that they currently or prospectively enjoy. In this case, Census-type demographics would be disregarded. The disadvantage is that diversity might not be apparent (for example, affluent white people, predominantly male, who think in very different ways about the long-term future and work well together could constitute the majority of community members). The advantage is that things would get done and different perspectives considered.

These two options can be combined in narrative-actual or actual-narrative ways: Census-type diversity could be an instrument for diversity of thinking, action, and roles, while only the former is narrated publicly. Or, vice versa, people of various ways of thinking, comparative advantages, and preferred roles would be attracted in order to increase Census-type fractions. Is either necessary, or a great way to mitigate the risk of reputational loss? Do you have a strategy for longtermist community growth?

2) Is it possible to apply for a grant without collaborators but with relevant experience in, or a strategy for, finding them?

For example, can one apply if they have previously advertised and interviewed others for a similar EA-related opportunity but have not yet started an advertising process for this application?

Do you award grants or vary their amount conditional on others' interest? For example, is it possible to apply for a range depending on a collaborator's compensation preference or experience? Is it possible to forgo a grant if no qualified candidate is interested?

  1. We're interested in increasing the diversity of the longtermist community along many different axes. It's hard to give a unified 'strategy' at this abstract level, but one thing we've been particularly excited about recently is outreach in non-Western and non-English-speaking countries.

  2. Yes, you can apply for a grant under these circumstances. It's possible that we'll ask you to come back once more aspects of the plan are figured out, but we have no hard rules about that. And yes, it's possible to apply for funding conditional on some event and later return the money/adjust the amount you want downwards if the event doesn't happen.
