The Problem
As noted in two recent discussions, there may be many promising EA projects that are unable to secure sufficient funding. The cause seems to be that there are few funding sources for new projects: the Open Philanthropy Project focuses on larger grantees, EA Grants and EA Funds appear staff limited, peripheral EAs prefer to fund established organizations, and core EAs may have difficulty evaluating the competence of a person, which is an important factor for early stage projects. In this post, I explore one possible solution: a crowdfunding platform for projects that are endorsed by trusted EAs. (Note that a crowdfunding platform was proposed by Linda Linsefors in response to a comment by David Moss on this Facebook post.)
How It Could Work
Below, I attempt to work out the details of how a crowdfunding platform could work. Of course, there are many different ways to set one up, which means you can reject this specific proposal without rejecting the general idea.
Note: I am not an employee of the Centre for Effective Altruism. What appears below is a description of an idea, not an announcement of something that CEA plans to implement.
The Centre for Effective Altruism (CEA) invites people who have been involved in the EA community at a deep level for several years to serve as evaluators for one or more cause areas. (Nobody can apply to be an evaluator, which means there are no explicit rejections.)
Evaluators who accept the invitation for a cause area must agree to rate all submissions within that cause area (to avoid selection bias) and to keep their ratings confidential (to encourage honesty).
The goal is to have a large number of evaluators for each cause area.
Anyone with one year of substantial involvement in EA can submit a project proposal to CEA.
Proposals must include a description of the idea, an estimate of the probability of success, the benefits if the project is successful, any possible harms and the probability of each possible harm, and the amount of funding needed.
CEA anonymizes the proposals and sends them to all evaluators for the relevant cause area.
Each of those evaluators rates the idea from -10 to 10. (The lower end of the scale is -10 to allow evaluators to indicate that they think the project has a negative expected value.)
Evaluators can include feedback encouraging/discouraging the proposer from pursuing the project further and/or providing suggestions for improvement.
Evaluators can also endorse a proposal (if they think it's a good idea) or endorse a person (if they believe the person is highly competent).
The proposal is considered approved if it meets the following criteria:
a. the average rating is above x;
b. there are n people who endorse the idea; and
c. there are m people who endorse the person.
(The process for determining whether a proposal is approved is automated and CEA never sees the rating of any specific evaluator or the identity of the evaluator(s) endorsing a person.)
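As a concrete illustration, the automated approval check described above might look something like the following sketch. The thresholds (standing in for x, n, and m) and all names are hypothetical placeholders, since the proposal deliberately leaves them unspecified:

```python
from dataclasses import dataclass

# Hypothetical thresholds; the proposal leaves x, n, and m open.
MIN_AVG_RATING = 3.0          # "x": required average rating (scale runs -10 to 10)
MIN_IDEA_ENDORSEMENTS = 2     # "n": evaluators endorsing the idea
MIN_PERSON_ENDORSEMENTS = 2   # "m": evaluators endorsing the person

@dataclass
class Evaluation:
    rating: int           # -10 to 10
    endorses_idea: bool
    endorses_person: bool

def is_approved(evaluations: list[Evaluation]) -> bool:
    """Return True only if all three approval criteria are met."""
    if not evaluations:
        return False
    avg = sum(e.rating for e in evaluations) / len(evaluations)
    idea_count = sum(e.endorses_idea for e in evaluations)
    person_count = sum(e.endorses_person for e in evaluations)
    return (avg > MIN_AVG_RATING
            and idea_count >= MIN_IDEA_ENDORSEMENTS
            and person_count >= MIN_PERSON_ENDORSEMENTS)
```

Because the check is a pure function of the anonymized evaluations, it could run without CEA ever seeing any individual evaluator's rating, as the proposal requires.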
Once a proposal is approved, all evaluators in the relevant cause area estimate the probability the project will succeed if undertaken. Those who endorsed the idea provide a brief statement explaining why they did so. (There is no statement for endorsing a person.)
CEA can veto an approved proposal. (This helps prevent the unilateralist's curse and helps manage reputational risk.) However, this power is exercised sparingly, since vetoing all proposals except those supported by CEA would result in this platform becoming EA Grants.*
Proposers are informed of whether their proposal was accepted (meaning it was approved without being vetoed) or rejected (meaning it failed to secure approval or was vetoed) as well as any feedback from the evaluators. Proposers do not see the average rating or the number or identity of people who endorsed the proposer. Additionally, unsuccessful proposers do not see the number or identity of people who endorsed the idea or the reason that the proposal was rejected (i.e. which of the approval criteria it failed to satisfy and whether it was vetoed). CEA periodically releases aggregate statistics.
A rejected proposal can be resubmitted if CEA determines that it's been materially improved.
Those proposals that are accepted appear publicly on a platform alongside the name and statement of evaluators who endorsed the proposal, the average estimated probability of success (endorsers only), the average estimated probability of success (all evaluators), and CEA's estimate of how much money it would take to fully fund the project. The names of evaluators who endorsed the proposer do not appear publicly and are not disclosed to anyone.
The proposer can either choose to only allow unconditional donations or to also allow conditional donations (money that will be returned unless the amount needed to fully fund the project as estimated by CEA is raised within a certain period of time).
Proposers can return all donations if they receive too little to go forward with the project.
Proposers who take the money must post an update every y months for a period of z years.
*Alternatively, this system could be used to evaluate ideas for EA Grants (with projects that are approved but not funded or not fully funded listed on the platform).
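The conditional-donation rule in the steps above is essentially an assurance contract: conditional money is returned unless the full funding target is reached by the deadline. A minimal sketch of the settlement logic, with all names and amounts invented for illustration:

```python
# Settlement rule for a mix of conditional and unconditional donations.
# Conditional donations are refunded if the deadline passes without the
# project reaching its full funding target; everything else is kept.

def settle(donations, target, deadline_reached):
    """Given (amount, is_conditional) pairs, return (kept, refunded) totals."""
    total = sum(amount for amount, _ in donations)
    fully_funded = total >= target
    kept = refunded = 0
    for amount, is_conditional in donations:
        if is_conditional and deadline_reached and not fully_funded:
            refunded += amount  # conditional money goes back to the donor
        else:
            kept += amount
    return kept, refunded
```

For example, with a hypothetical $200 target, a $100 conditional donation plus a $50 unconditional one would mean only the $50 is kept after the deadline, while $150 conditional plus $50 unconditional would mean the full $200 is kept.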
Note: I am not an employee of the Centre for Effective Altruism. What appears above is a description of an idea, not an announcement of something that CEA plans to implement.
Benefits
1. It could increase the number of worthwhile projects funded since:
a. some projects that are currently disfavored by a single person who controls a key funding source would be rated well by a large group of evaluators, which could influence that person;
b. people may have more confidence in projects that are currently recommended by others if they know that those recommendations represent the overall views of the community*;
c. more people would have access to the donation recommendations of those who currently only recommend worthwhile projects to others privately; and
d. there would be a single platform where busy donors could easily find most project ideas alongside relevant information.
*Think of this as allowing people to get additional draws from the distribution of how members of the EA community view the project, which allows them to determine whether those recommending the project are at the median of the distribution of community views or whether they are at the right tail (and also whether and to what extent the left tail goes into the negative).
2. Through the process of launching new projects, EAs would build valuable skills (which is possible even when projects ultimately fail).
3. People may be less likely to unilaterally start a bad project if there is a formal mechanism for an idea to receive a firm rejection from the community. (The feedback could also result in improvement to good ideas but I think that's currently already possible.)
4. Allowing people to donate on the condition that others donate could help avoid the collective action problem that arises when it is only worthwhile to fund a project if enough other money is going towards it (and where your money alone could fund a smaller scale version of the project, meaning that the proposer would not necessarily return it if you donated it unconditionally).
5. It would allow the EA community to learn valuable information such as:
a. the number and quality of project ideas in the community;
b. the probability of a project succeeding (by type, by cause);
c. the average accuracy of predictions (and which people are above average); and
d. why projects fail and what can be done to avoid failure.
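On point (c), one standard way the community could score the evaluators' probability-of-success estimates against outcomes is the Brier score (mean squared error between predicted probability and the 0/1 outcome; lower is better). The track records below are invented purely for illustration:

```python
def brier_score(predictions):
    """Mean squared error between predicted probabilities and outcomes.

    predictions: list of (predicted_probability, outcome) pairs,
    where outcome is 1 if the project succeeded and 0 otherwise.
    """
    return sum((p - outcome) ** 2 for p, outcome in predictions) / len(predictions)

# Hypothetical track records for two evaluators.
well_calibrated = [(0.9, 1), (0.2, 0), (0.7, 1)]
overconfident = [(0.9, 0), (0.9, 1), (0.9, 0)]
```

Here the well-calibrated evaluator scores about 0.05 while the overconfident one scores about 0.54, which is the kind of comparison that would identify above-average predictors over time.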
Potential Costs
1. There are various costs associated with projects being funded that otherwise would not have been funded including:
a. the opportunity cost of the funding, which could have gone elsewhere;*
b. the opportunity cost of the talent of people working on the project, which could have been applied within an existing EA organization;
c. the risk that the project causes harm; and
d. the risk of reputational harm from the project failing or causing harm.
If people tend to be overly optimistic about proposals (and thus overestimate expected benefits), then they will sometimes fund projects despite the above costs being greater than the expected benefits.
*The opportunity cost might be especially high if the money would have gone to EA Funds, the fund manager would also have given it to risky projects with high expected value, and the fund managers are much better at judging which ones are likely to succeed.
2. It could increase the reputational cost for harmful projects (including ones that would have occurred absent the platform) by making it harder to distance the EA community from such projects.
3. Scammers may join the EA community and seek project funding in bad faith. Not only does this cause all of the problems identified above (diverted money, lost time, reputational harm etc.), it could decrease trust within the EA community.
4. There is an opportunity cost to the time spent creating this system, the time spent managing it, the time spent writing proposals, and the time spent evaluating them. Given the availability of alternative ways of announcing projects* and endorsing them, there may be relatively few people who would use this system to propose ideas. If so, then the opportunity cost might be greater than the benefits.
*These may be bad examples of using a public announcement to generate initial funding since many of these seem to have gathered sufficient funding to launch before being announced.
5. The choice of evaluators could cause hurt feelings for those who are excluded. The rejection of a proposal might cause hurt feelings for the proposer and might even cause them to underestimate their abilities in the future.
6. The platform could become the default path (to the point that raising money for new projects through other channels is disfavored), which could
a. entrench the status quo in terms of cause areas and strategies;
b. make it harder for low probability, high magnitude projects to get funding (if evaluators only endorse projects they think are likely to succeed);
c. make it harder to quickly launch a project; and
d. make it harder to launch projects that require some secrecy.
Ultimately, I'm unsure as to whether this would be a good idea. My primary motivation in posting this is to generate more discussion on this topic.
Edit: I heard a round of EA Grants applications had opened for this year, but that appears not to be the case according to the EA Grants website; I was mistaken. I did hear from community members (though not directly from anyone at CEA) that there will be more EA Grants, and I assume applications will open at some point, but CEA hasn't said anywhere when that will be.
It should be noted that EA Grants and EA Funds are different programs with different issues. Last year EA Grants was limited by staff time, but I don't recall anyone directly saying that was the case with EA Funds. There hasn't been another round of EA Grants yet this year, so no data has come out about that. I expect CEA is putting more staff time into it to solve the most obvious flaw with EA Grants last year.
Each of the EA Funds has been performing differently. Last year, when there were infrequent updates about the EA Funds, it turned out CEA was experiencing technical delays in implementing the EA Funds website. Since then, while it's charitably assumed (as I think is fair) that each of the fund managers might be too busy with their day jobs at the Open Philanthropy Project to give as much attention to fund management, neither CEA nor Open Phil has confirmed such speculation. The Funds also vary in their performance: Lewis Bollard has continually made many smaller grants to several smaller projects from the Animal Welfare Fund, in contrast with Nick Beckstead, who has made only one grant from each of the two funds he manages, the Far Future Fund and the EA Community Fund. I contacted CEA, and they let me know they intend to release updates on the Far Future Fund and EA Community Fund (which I assume will include disclosures of grants they've been tabling for the last few months) by July.
One problem with smaller organizations with smaller, less experienced teams is that they don't know how to independently and effectively pitch or raise funds for their project, even when they're good people with good ideas. Compounding this is a sense of dejection among nascent community projects once they've been rejected for grants by the big funders, especially among otherwise qualified EA community members who don't know how to navigate the non-profit sector. This is feedback I've gotten from community members who know of projects which didn't get off the ground; that they faltered quietly might be why they go unnoticed. That stated, I don't think there are a ton of promising but funding-starved projects around.
On the flip side, I've heard some community members say they're overlooked by donors who are earning to give after they've been overlooked by, e.g., EA Grants. The apparent reasoning is that, since individual donors don't have the bandwidth to evaluate projects, they defer to the seemingly expert judgement of CEA; and since CEA didn't fund the project, individual would-be donors conclude a project isn't fit to receive funding from them either. This creates a ludicrous Catch-22 in which projects won't get funding from smaller donors until they have authentic evidence of quality in the form of donations from big donors, which, if the projects got it, would mean they wouldn't need to approach the smaller donors in the first place. This isn't tricky epistemology, or even CEA unwittingly creating perverse incentives. Given that EA Grants said they didn't have the bandwidth to evaluate a lot of potentially valuable projects, for other donors to decline to donate to small projects because those projects didn't receive EA Grants is unsound. It's just lazy reasoning, because smaller donors don't have the bandwidth to properly evaluate projects either.
Ultimately, I think we shouldn't hold single funders like CEA and Open Phil primarily accountable for this state of affairs; the community needs to independently organize to better connect funding with promising projects. This is a problem in demand of a solution, but I think something like a guide on how to post pitches or successfully crowdfund a project would work better than creating a brand-new EA crowdfunding platform. Joey Savoie recently wrote a post about how to write EA Forum posts to get new causes into EA, as a long-time community member who himself has lots of experience writing similar pitches.
Unfortunately, advocating for core funding groups to change their strategy has practical costs which are apparently so high that appeals like this on the EA Forum feel futile. Direct advocacy to change strategy is too simplistic, and long essays on the EA Forum that work through the epistemological differences between individual effective altruists and CEA or Open Phil receive little to no feedback. I think that, from the inside, these organizations focus so narrowly on maximizing their goals that they don't have the time to alter their approach in light of critical feedback from the community, all the while feeling it's important to carry on with the very approaches others in the community are unhappy with. So while I think a crowdfunding platform is not the right solution in this instance, advocating for changes to existing funds seems unpromising as well, and designing other parallel routes for funding is something I'd encourage effective altruists to do.
I haven't seen the launch of the 2018 EA Grants - could you link to it?