Resources spent
- Leverage Research has now existed for over 7.5 years[1]
- Since 2011, it has consumed over 100 person-years of human capital.
- From 2012-16, Leverage Research spent $2.02 million, and the associated Institute for Philosophical Research spent $310k.[2][3]
Outputs
Some of the larger outputs of Leverage Research include:
- Work on Connection Theory: though not the initial creation of the theory itself, which Geoff Anders did prior to founding Leverage Research
- Contributions to productivity of altruists via the application of psychological theories including Connection Theory
- Intellectual contributions to the effective altruism community: including early work on cause prioritisation and risks to the movement.
- Intellectual contributions to the rationality community: including CFAR’s class on goal factoring
- The EA Summits in 2013-14: the EA Summit was a precursor to EA Global and is being revived in 2018
Its website also has seven blog posts.[4]
Recruitment Transparency
- Leverage Research previously organized the Pareto Fellowship in collaboration with another effective altruism organization. According to one attendee, Leverage staff secretly discussed attendees in individual Slack channels, one for each attendee.
- Leverage Research has provided psychology consulting services using Connection Theory, leading it, according to reports from prospective staff and donors, to obtain mind-maps of a substantial fraction of those people.
- The leadership of Leverage Research have, in personal conversation, on multiple occasions overstated their rate of staff growth by more than double.
- Leverage Research sends staff to effective altruism organizations to recruit specific lists of people from the effective altruism community, as is apparent from discussions with and observation of Leverage Research staff at these events.
- Leverage Research has spread negative information about organisations and leaders that would compete for EA talent.
General Transparency
- The website of Leverage Research has been excluded from the Wayback Machine[5]
- Leverage Research has had a strategy of using multiple organizations to tailor conversations to the topics of interest to different donors.
- Leverage Research had longstanding plans to replace Leverage Research with one or more new organizations if the reputational costs of the name Leverage Research ever became too severe. A substantial number of staff of Paradigm Academy were previously staff of Leverage Research.
General Remarks
Readers are encouraged to add additional facts known about Leverage Research in the comments section, especially where these can be supported by citation or direct conversational evidence.
Citations
1. https://www.lesswrong.com/posts/969wcdD3weuCscvoJ/introducing-leverage-research
2. https://projects.propublica.org/nonprofits/organizations/453989386
3. https://projects.propublica.org/nonprofits/organizations/452740006
4. http://leverageresearch.org/blog
5. https://web.archive.org/web/*/http://leverageresearch.org/
The CEA, the very organization you juxtaposed with Leverage and Paradigm in this comment, has in the past been compared to a Ponzi scheme. Effective altruists who otherwise appreciated that criticism thought much of its value was lost in the Ponzi-scheme comparison, and that without it the criticism might have been better received. Additionally, LessWrong and the rationality community, CFAR and MIRI, and all of AI safety have for years been smeared as cults by their detractors. The rationality community isn't perfect, and there is no guarantee that interactions with a self-identified (aspiring) rationality community will go as "rationally" as an individual or small group interacting with it, online or in person, hopes or expects. But the vast majority of effective altruists, even those who are cynical about these organizations or sub-communities within EA, disagree with how these organizations have been treated, for it poisons the well of goodwill in EA for everyone.
In this comment, you stated your past experience with the Pareto Fellowship and Leverage left you feeling humiliated and manipulated. I've also been a vocal critic, in person, throughout the EA community of both Leverage Research and how Geoff Anders has led the organization. But elevating a personal opposition to them into a public exposure of opposition research, in an attempt to tarnish an event they're supporting alongside many other parties in EA, is not something I ever did, or will do. My contacts in EA and I have followed Leverage. I've refrained from making posts like this myself because, digging for context, I found Leverage has changed from any impression I'd gotten of them. That's why at first I was skeptical of attending the EA Summit. But upon reflection, I realized it wasn't supported by the evidence to conclude Leverage is so incapable of change that anything they're associated with should be distrusted.
But what you're trying to do with Leverage Research is no different from what EA's worst critics do: not in an effort to change EA or its members, but to tarnish them. From within or outside EA, to criticize any EA organization in such a fashion is below any acceptable epistemic standard in this movement.
If the post and comments here are stating facts about Leverage Research, and you're reporting impressions, with no ability to remember specific details, that Leverage is like a cult, those are barely facts. The only fact is that some people perceived Leverage to be like a cult in the past, and those are only anecdotes. Without details, they're only hearsay. Combined with the severity of the consequences if this hearsay were borne out, the inability to produce actual facts invalidates the point you're trying to make.
Your comments seem to be way longer than they need to be because you don't trust other users here. Like, if someone comes and says they felt like it was a cult, I'm just going to think "OK, someone felt like it was a cult." I'm not going to assume that they are doing secret blood rituals, I'm not going to assume that it's a proven fact. I don't need all these qualifications about the difference between cultishness and a stereotypical cult, I don't need all these qualifications about the inherent uncertainty of the issue, that stuff is old hat.