Resources spent
- Leverage Research has now existed for over 7.5 years.[1]
- Since 2011, it has consumed over 100 person-years of human capital.
- From 2012 to 2016, Leverage Research spent $2.02 million, and the associated Institute for Philosophical Research spent $310k.[2][3]
Outputs
Some of the larger outputs of Leverage Research include:
- Work on Connection Theory: though this does not include the initial creation of the theory itself, which Geoff Anders did before founding Leverage Research
- Contributions to productivity of altruists via the application of psychological theories including Connection Theory
- Intellectual contributions to the effective altruism community: including early work on cause prioritisation and on risks to the movement
- Intellectual contributions to the rationality community: including CFAR’s class on goal factoring
- The EA Summits in 2013 and 2014: the EA Summit was a precursor to EA Global, and is being revived in 2018
Its website also has seven blog posts.[4]
Recruitment Transparency
- Leverage Research previously organized the Pareto Fellowship in collaboration with another effective altruism organization. According to one attendee, Leverage staff secretly discussed attendees, using a separate Slack channel for each.
- Leverage Research has provided psychology consulting services using Connection Theory. According to reports from prospective staff and donors, this led it to obtain mind-maps of a substantial fraction of those prospective staff and donors.
- In personal conversation, the leadership of Leverage Research have on multiple occasions overstated their rate of staff growth by more than a factor of two.
- Leverage Research sends staff to effective altruism events to recruit specific lists of people from the effective altruism community, as is apparent from discussions with, and observation of, Leverage Research staff at these events.
- Leverage Research has spread negative information about organisations and leaders that would compete for EA talent.
General Transparency
- The website of Leverage Research has been excluded from the Wayback Machine.[5]
- Leverage Research has had a strategy of using multiple organizations to tailor conversations to the topics of interest to different donors.
- Leverage Research had longstanding plans to replace itself with one or more new organizations if the reputational costs of the name Leverage Research ever became too severe. A substantial number of Paradigm Academy staff were previously staff of Leverage Research.
General Remarks
Readers are encouraged to add further facts known about Leverage Research in the comments section, especially where these can be supported by citations or direct conversational evidence.
Citations
1. https://www.lesswrong.com/posts/969wcdD3weuCscvoJ/introducing-leverage-research
2. https://projects.propublica.org/nonprofits/organizations/453989386
3. https://projects.propublica.org/nonprofits/organizations/452740006
4. http://leverageresearch.org/blog
5. https://web.archive.org/web/*/http://leverageresearch.org/
Your comments seem to be way longer than they need to be because you don't trust other users here. Like, if someone comes and says they felt like it was a cult, I'm just going to think "OK, someone felt like it was a cult." I'm not going to assume that they are doing secret blood rituals, I'm not going to assume that it's a proven fact. I don't need all these qualifications about the difference between cultishness and a stereotypical cult, I don't need all these qualifications about the inherent uncertainty of the issue, that stuff is old hat. This is the EA Forum, an internal space where issues are supposed to be worked out calmly; surely here, if anywhere, is a place where frank criticism is okay, and where we can extend the benefit of the doubt. I think you're wasting a lot of time, and implicitly signaling that the issue is more of a drama mine than it should be.
I admit I'm coming from a place of not entirely trusting all other users here. That may be a factor in why my comments are longer in this thread than they need to be. I tend to write more than is necessary in general. For what it's worth, I treat the EA Forum not as an internal space but as how I'd ideally like to see it used: as a primary platform for EA discourse, with a level of activity more akin to the 'Effective Altruism' Facebook group or LessWrong.
I admit I've been wasting time. I've stopped responding directly to the OP because if...