
Resources spent

  • Leverage Research has now existed for over 7.5 years.[1]
  • Since 2011, it has consumed over 100 person-years of human capital.
  • From 2012–16, Leverage Research spent $2.02 million, and the associated Institute for Philosophical Research spent $310k.[2][3]

Outputs

Some of the larger outputs of Leverage Research include:

  • Work on Connection Theory: this does not include the initial creation of the theory itself, which Geoff Anders completed before founding Leverage Research
  • Contributions to the productivity of altruists via the application of psychological theories, including Connection Theory
  • Intellectual contributions to the effective altruism community, including early work on cause prioritisation and risks to the movement
  • Intellectual contributions to the rationality community, including CFAR’s class on goal factoring
  • The EA Summits in 2013–14: the EA Summit was a precursor to EA Global and is being revived in 2018

Its website also has seven blog posts.[4]

Recruitment Transparency

  • Leverage Research previously organized the Pareto Fellowship in collaboration with another effective altruism organization. According to one attendee, Leverage staff secretly discussed attendees in individual Slack channels, one per attendee.
  • Leverage Research has provided psychology consulting services using Connection Theory; based on reports from prospective staff and donors, this has given it mind-maps of a substantial fraction of its prospective staff and donors.
  • The leadership of Leverage Research have on multiple occasions overstated their rate of staff growth by more than double, in personal conversation.
  • Leverage Research sends staff to effective altruism organizations to recruit specific lists of people from the effective altruism community, as is apparent from discussions with and observation of Leverage Research staff at these events.
  • Leverage Research has spread negative information about organisations and leaders that would compete for EA talent.

General Transparency

  • The website of Leverage Research has been excluded from the Wayback Machine.[5]
  • Leverage Research has had a strategy of using multiple organizations to tailor conversations to the topics of interest to different donors.
  • Leverage Research had longstanding plans to replace itself with one or more new organizations if the reputational costs of the name "Leverage Research" ever became too severe. A substantial number of staff of Paradigm Academy were previously staff of Leverage Research.

General Remarks

Readers are encouraged to add additional facts known about Leverage Research in the comments section, especially where these can be supported by citation or direct conversational evidence.

Citations

1. https://www.lesswrong.com/posts/969wcdD3weuCscvoJ/introducing-leverage-research

2. https://projects.propublica.org/nonprofits/organizations/453989386

3. https://projects.propublica.org/nonprofits/organizations/452740006

4. http://leverageresearch.org/blog

5. https://web.archive.org/web/*/http://leverageresearch.org/

Comments (103)
Some comments are truncated due to high volume.

[My views only]

Although few materials remain from the early days of Leverage (I am confident they acted to remove themselves from the Wayback Machine, as other sites link to Wayback versions of their old documents which now 404), there are some interesting remnants:

  • A (non-wayback) website snapshot from 2013
  • A version of Leverage's plan
  • An early Connection Theory paper

I think this material (and the surprising absence of material since) speaks for itself - although I might write more later anyway.

Per other comments, I'm also excited by the plan of greater transparency from Leverage. I'm particularly eager to find out whether they still work on Connection Theory (and what the current theory is), whether they addressed any of the criticism (e.g. 1, 2) levelled at CT years ago, whether the further evidence and argument mentioned as forthcoming in early documents and comment threads will materialise, and generally what research (on CT or anything else) they have done in the last several years, and when this will be made public.

I imagine most people have made up their minds by now, but there is now a first-person account from one of the former employees of Leverage 1.0, with some attendant discussion on LessWrong.

Charles He · 2y
Thank you for pointing this out and increasing awareness of the issues.

Three years later, a similar post with some more details about Leverage's internal management processes, and an update from Leverage here.

Buck · 5y

An article in Splinter News was released a few days ago, showing leaked emails in which Jonah Bennett, a former Leverage employee who is now editor-in-chief of Palladium Magazine (LinkedIn), was involved with a white nationalist email list, where among other things he made anti-Semitic jokes about a Holocaust survivor, said he "always has illuminating conversations with Richard Spencer", and complained about someone being "pro-West before being pro-white/super far-right".

I have 35 mutual friends with this guy on Facebook, mostly EAs. This makes me think that while at Leverage he interacted a reasonable amount with the EA community. (Obviously, I expect my EA mutual friends to react with revulsion to this stuff.)

Bennett denies this connection; he says he was trying to make friends with these white nationalists in order to get information on them and white nationalism. I think it's plausible that this is somewhat true. In particular, I'd not be that surprised if Bennett is not a fan of Hitler, and if he made racist jokes more to fit in. But I'd be pretty surprised if it turned out that he didn't have endorsed, explicitly racist views; this seems ... (read more)

"which makes me think that it's likely that Leverage at least for a while had a whole lot of really racist employees."

"Leverage" seems to have employed at least 60 people at some time or another in different capacities. I've known several (maybe met around 15 or so), and the ones I've interacted with often seemed like pretty typical EAs/rationalists. I got the sense that there may have been few people there interested in the neoreactionary movement, but also got the impression the majority really weren't.

I just want to flag that I really wouldn't want EAs generally to think that "people who worked at Leverage are pretty likely to be racist", because this seems quite untrue and quite damaging. I don't have much information about the complex situation that is Leverage, but I do think that the sum of the people ever employed by them still holds a lot of potential. I'd really not want them to get or feel isolated from the rest of the community.

Buck · 4y

Ok actually I just reread this comment and now I think that the thing you quoted me as saying is way too strong. I am confused by why I wrote that.

Yep, understood, and thanks for clarifying in the above comment. I wasn't thinking you thought many of them were racist, but did think that at least a few readers may have gotten that impression from the piece.

There isn't too much public discussion on this topic and some people have pretty strong feelings on Leverage, so sadly sometimes the wording and details matter more than they probably should.

Buck · 5y

Yeah, I don't think that people who worked at Leverage are pretty likely to be racist.

LarissaHeskethRowe · 5y
Hi Buck, Ozzie and Greg, I thought I’d just add some data from my own experience. For context, I’ve been heavily involved in the EA community, most recently running CEA. After I left CEA, I spent the summer researching what to do next and recently decided to join the Leverage Research team. I’m speaking personally here, not on behalf of Leverage.

I wanted to second Ozzie’s comment. My personal experience at least is that I’ve found the Leverage and Paradigm teams really welcoming. They do employ people with a wide range of political views with the idea that it helps research progress to have a diversity of viewpoints. Sometimes this means looking at difficult topics and I’ve sometimes found it uncomfortable to try and challenge why I hold different viewpoints but I’ve always found that the focus is on understanding ideas and the attitude to the individual people one of deep respect. I’ve found this refreshing.

I wanted to thank Ozzie for posting this in part because I noticed reticence in myself to saying anything because my experience with conversations about Leverage is that they can get weird and personal quite fast. I know people who’ve posted positive things about Leverage on the EA Forum and then been given grief for it on and offline.

For this reason Greg, I can see why Leverage don’t engage much with the EA Forum. You and I know each other fairly well and I respect your views on a lot of topics (I was keen to seek you out for advice this summer). I notably avoided discussing Leverage though because I expected an unpleasant experience and that I had more information on the topic from investigating them myself. This feels like a real shame.

Perhaps I could chat with you (and potentially others) about what you’d like to see written up by Leverage. I’m happy to commit to specific Leverage-related posts if you can help ensure that turns into a genuinely useful discussion. What do you think? :-)

Hello Larissa,

I'd be eager to see anything that speaks to Leverage's past or present research activity: what have they been trying to find out, what have they achieved, and what are they aiming for at the moment (cf).

As you know from our previous conversations re. Leverage, I'm fairly indifferent to 'they're shady!' complaints (I think if people have evidence of significant wrongdoing, they should come forward rather than briefing adversely off the record), but much less so to the concern that Leverage has achieved extraordinarily little for an organisation with multiple full-time staff working for the better part of a decade. Showing something like, "Ah, but see! We've done all these things," or, "Yeah, 2012–16 was a bit of a write-off, but here's the progress we've made since", would hopefully reassure, but in any case be informative for people who would like to have a view on Leverage independent of which rumour mill they happen to end up near.

Other things I'd be interested to hear about are what you are planning to work on at Leverage, and what information you investigated which, I assume, leads to a much more positive impression of Leverage than I take the public evidence to suggest.

Hi Greg,

Thanks for the message and for engaging at the level of what has Leverage achieved and what is it doing. The tone of your reply made me more comfortable in replying and more interested in sharing things about their work so thank you!

Leverage are currently working on a series of posts that are aimed at covering what has been happening at Leverage from its inception in 2011 up until a recent restructure this year. I expect this series to cover what Leverage and associated organisations were working on and what they achieved. This means that I expect Leverage to answer all of your questions in a lot more depth in the future. However, I understand that people have been waiting a long time for us to be more transparent so below I have written out some more informal answers to your questions from my understanding of Leverage to help in the meantime.

Another good way to get a quick overview of the kinds of things Leverage has been working on beyond my notes below is by checking out this survey that we recently sent to workshop participants. It’s designed for people who’ve engaged directly with our content so it won’t be that relevant for people to fill in necessarily but it gives ... (read more)

Leverage are currently working on a series of posts that are aimed at covering what has been happening at Leverage from its inception in 2011 up until a recent restructure this year.

 

Did this series end up being published?

Anthony Repetto · 1y
This aged well... and it reads like what ChatGPT would blurt, if you asked it to "sound like a convincingly respectful and calm cult with no real output." Your 'Anti-Avoidance,' in particular, is deliciously Orwellian. "You're just avoiding the truth, you're just confused..." I was advocating algal and fish farming, including bubbling air into the water and sopping-up the fish poop with crabs and bivalves - back in 2003. Spent a few years trying to tell any marine biologist I could. Fish farming took-off, years later, and recently they realized you should bubble air and catch the poop! I consider that a greater real-world accomplishment than your 'training 60+ people on anti-avoidance of our pseudo-research.' Could you be more specific about Connection Theory, and the experimental design of the research you conducted and pre-registered, to determine that it was correct? I'm sure you'd have to get into some causality-weeds, so those experimental designs are going to be top-notch, right? Or, is it just Geoff writing with the rigor of Freud on a Slack he deleted?

Just wanted to say I super appreciated this writeup.

Thanks Raemon :-) I'm glad it was helpful.

Thanks Larissa - the offer to write up posts from Leverage Research is a generous one. Might it not be a more efficient use of your time, though, to instead answer questions about Leverage already in the public domain, many of which are fairly straightforward?

For example, you mention that Leverage is welcoming to new staff. This sounds positive; at the same time, the way Leverage treated incoming staff is one of the main kinds of fact discussed in the top-level post. Is it still true that: (i) staff still discuss recruitees on individual Slack channels, (ii) mind-mapping is still used during recruitment of staff, (iii) growth rates are overestimated, (iv) specific lists of attendees are recruited from EA events, and (v) negative rumours are still spread about other organizations that might compete for similar talent? To the extent that you are not sure about (i-v), it would be interesting to know whether you raised those concerns with Geoff in the hiring process, before joining the organization.

For other questions raised by the top-level post: (a) are Leverage's outputs truly as they appear? (b) Is its consumption of financial resources and talent, as it appears? (c) Has it truly gone... (read more)

Hi Anonymoose,

I’d like to do two things with my reply here.

  1. First, to try and answer your questions as best I can.
  2. But then second, start to work out how to make future conversations with you about Leverage more productive

1. ANSWERING YOUR QUESTIONS

I’d recommend first reading my recent reply to Greg because this will give you a lot of relevant context and answers some of your questions.

Questions a, b and d: outputs, resources and future impact

Your Questions:

“(a) are Leverage's outputs truly as they appear?”
“(b) Is its consumption of financial resources and talent, as it appears?”
“(d) How will Leverage measure any impact from its ninth year of operation?”

In terms of questions a, b and d, I will note the same thing as I said in my reply to Greg which is that we’re currently working both on a retrospective of the last eight and a half years of Leverage and on updating Leverage’s existing website. I think these posts and updates will then allow individuals to assess for themselves

  1. our past work and outputs
  2. whether it was worth the resources invested
  3. our plans for the future

For now, though, the sections “What did Leverage 1.0 work on?” and “What is Leverage doing now?” in my reply to Greg... (read more)

Milan_Griffes · 5y
Wow, I didn't know about this. Thank you for drawing attention to it.
Buck · 5y
[I wrote an addendum to this comment, but then someone pointed out that it was unclear, so I deleted it]
Geoff_Anders · 5y

I was interviewed by Peter Buckley and Tyler Alterman when I applied for the Pareto fellowship. It was one of the strangest, most uncomfortable experiences I've had over several years of being involved in EA. I'm posting this from notes I took right after the call, so I am confident that I remember this accurately.

The first question asked about what I would do if Peter Singer presented me with a great argument for doing an effective thing that's socially unacceptable. The argument was left as an unspecified black box.

Next, for about 25 minutes, they taught me the technique of "belief reporting". (See some information here and here). They made me try it out live on the call, for example by making me do "sentence completion". This made me feel extremely uncomfortable. It seemed like unscientific, crackpot psychology. It was the sort of thing you'd expect from a New Age group or Scientology.

In the second part of the interview (30 minutes?), I was asked to verbalise what my system one believes will happen in the future of humanity. They asked me to just speak freely without thinking, even if it sounds incoherent. Again it felt extremely cultish. I expected this to l... (read more)

I had an interview with them under the same circumstances and also had the belief reporting trial. (I forget if I had the Peter Singer question.) I can confirm that it was supremely disconcerting.

At the very least, it's insensitive - they were asking for a huge amount of vulnerability and trust in a situation where we both knew I was trying to impress them in a professional context. I sort of understand why that exercise might have seemed like a good idea, but I really hope nobody does this in interviews anymore.

avindroth · 5y
  • Leverage Research spent a further $388k in 2017.
  • At least 11 of 13 Paradigm Academy staff listed on Linkedin are known to have worked for Leverage Research or allied organizations.
  • The coin made by Reserve (one of the successor companies to Leverage Research) has returned -32.7% since its float at the time of writing. In the same time period, Bitcoin returned 24%.
anonymoose · 4y
Reserve has now lost 50.6% of its value since its float, while Bitcoin has returned ~1% over the same time period.

So, after I read this comment I left thinking that Reserve performed exceptionally poorly, but it seems that almost all cryptocurrencies have gone down about the same amount since June 19th (the time of Reserve's launch, from what I can tell). Here are some random currencies that I clicked on on the CoinMarketCap website that you linked. This list is comprehensive in the sense that I report the price change since June 19th for every currency that I looked at:

  • Bitcoin Cash:
    • June 19th price: $416
    • Price now: $244
    • Change: -41.3%
  • XRP:
    • June 19th price: $0.448
    • Price now: $0.25
    • Change: -44.1%
  • Litecoin:
    • June 19th price: $135
    • Price now: $55
    • Change: -59.2%
  • Monero:
    • June 19th price: $100
    • Price now: $58
    • Change: -42%

You are also incorrect that Bitcoin has returned 1% over the same time period. On June 19th, the price of Bitcoin was $9273, and it now is $8027. So while you are correct that Bitcoin went down significantly less than Reserve, it performed drastically better than almost all other cryptocurrencies, and still went down by about 13%.
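As a quick arithmetic check on that last figure (using only the two prices quoted above):

$$\frac{8027 - 9273}{9273} \approx -0.134,$$

i.e. a decline of about 13.4%.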

I don't think Reserve is overall a super great idea, but I think the statistics you cited seem misleading to me, and it seems that Reserve overall is performing similarly to the rest of the non-Bitcoin crypto-market.

after I read this comment I left thinking that Reserve performed exceptionally poorly, but...

Your initial impression was correct. Reserve has entered a terrible market and managed to perform substantially worse than its terrible competitors. Since May 24, when Reserve Rights was priced:

  • the S&P gained 14%,
  • cryptocurrency at large lost 17%,
  • cryptocurrencies excluding Bitcoin lost 33%,
  • while Reserve Rights managed to lose 52.3%.

You are also incorrect that Bitcoin has returned 1% over the same time period.

Reserve Rights was floated on May 24 according to CoinMarketCap, at which time Bitcoin was worth $7800–$7950, and it is now worth the same amount, so the error must be either with you or with CoinMarketCap.

Habryka · 4y
I used Jun 19th, because that was the first date with a market cap available, which seemed like the most reasonable date to start. So that likely explains the discrepancy.
Larks · 4y
I don't know much about it, but isn't Reserve meant to be a stablecoin? If so, any change in value seems significantly worse than for other coins.

I also don't know much about it, but I think Reserve includes a couple of coins. 'Reserve Rights' is not intended to be a stablecoin (I think it is meant to perform some function for the stablecoin system, but I'm ignorant of what it is), whilst 'Reserve', yet to be released, is meant to be stable.

Milan_Griffes · 4y
Huh, do you know what 'Reserve Rights' does / why it exists? Is there a short explainer of it somewhere?

The reason for posting these facts now is that, as of the time of writing, Leverage's successor, Paradigm Academy, is seeking to host the EA Summit in one week. The hope is that these facts will firstly help inform effective altruists on whether they would be well-advised to attend, and secondly on what approach they may want to take if they do attend.

Leverage Research has recruited from the EA community using mind-maps and other psychological techniques, obtaining dozens of years of work, but doing little apparent good. As a result, the author views it as inadvisable for EAs to engage with Leverage Research and its successor, Paradigm Academy. Rather, they should seek the advice of mentors outside of the Leverage orbit before deciding to attend such an event. Based on past events such as the Pareto Fellowship, invitees who ultimately decide to attend would be well-advised to be cautious about recruitment, by keeping in touch with friends and mentors throughout.

I think this would be more useful as part of the main post than as a comment.

Evan_Gaensbauer · 6y
I've provided my explanations for the following in this comment:

  • No evidence has been provided that Paradigm Academy is Leverage's successor. While the OP stated facts about Leverage, all the comments declaring more facts about Leverage Research are merely casting spurious associations between Leverage Research and the EA Summit. Along with the facts, you've smuggled in an assumption amounting to nothing more than a conspiracy theory: that Leverage has rebranded itself as Paradigm Academy and is organizing the 2018 EA Summit for some unclear and ominous reason. In addition to no logical reason or sound evidence being provided for why Leverage's negative reputation in EA should be transferred to the upcoming Summit, my interlocutors have themselves admitted, or revealed, their evidence from personal experience to be weak. I've provided my direct personal experience knowing the parties involved in organizing the EA Summit, and having paid close attention from afar to Leverage's trajectory in and around EA, contrary to the unsubstantiated thesis that the 2018 EA Summit is some opaque machination by Leverage Research.
  • There is no logical connection between the facts about Leverage Research and the purpose of the upcoming EA Summit. Further, the claims presented as facts about the upcoming Summit aren't actually facts. At this point, I'll just point out that the idea Paradigm is somehow, in any sense, Leverage's successor is based on no apparent evidence. So the author's advice doesn't logically follow from the claims made about Leverage Research. What's more, as I demonstrated in my other comments, this event isn't some unilateral attempt by Paradigm Academy to steer EA in some unknown direction.

As one of the primary organizers for the EA community in Vancouver, Canada; the primary organizer for the rationality community in Vancouver; a liaison for local representation of these communities with adjacent communities; and an organizer for many novel efforts to c

See Geoff's reply to me below: Paradigm and Leverage will at some point be separate, but right now they're closely related (both under Geoff etc). I think it's reasonable for people to use Leverage's history and track record in evaluating Paradigm.

Note: I was previously CEO of CEA, but stepped down from that role about 9 months ago.

I've long been confused about the reputation Leverage has in the EA community. After hearing lots of conflicting reports, both extremely positive and negative, I decided to investigate a little myself. As a result, I've had multiple conversations with Geoff, and attended a training weekend run by Paradigm. I can understand why many people get a poor impression, and question the validity of their early-stage research. I think that in the past, Leverage has done a poor job communicating their mission and relationship to the EA movement. I'd like to see Leverage continue to improve transparency, and am pleased with Geoff's comments below.

Despite some initial hesitation, I found the Paradigm training I attended surprisingly useful, perhaps even more so than the CFAR workshop I attended. The workshop was competently run, and content was delivered in a polished fashion. I didn't go in expecting the content to be scientifically rigorous, most self improvement content isn't. It was fun, engaging, and useful enough to justify the time spent.

Paradigm is now running the EA summit. I know Mindy and Peter,... (read more)

I don't think that Leverage, Paradigm or related projects are good use of EA time or money

Found this surprising given the positive valence of the rest of the comment. Could you expand a little on why you don't think Leverage et al. are a good use of time/money?

I think their approach is highly speculative, even if you were to agree with their overall plan. I think Leverage has contributed to EA in the past, and I expect them to continue doing so, but this alone isn't enough to make them a better donation target than orgs like CEA or 80K.

I'm glad they exist, and hope they continue to exist, I just don't think Leverage or Paradigm are the most effective things I could be doing with my money or time. I feel similarly about CFAR. Supporting movement building and long-termism is already meta enough for me.

Interesting. I don't usually conflate "good use" with "most effective use."

Seems like "not a good use" means something like "this project shouldn't be associated with EA."

Whereas "not the most effective use" means something like "this project isn't my best-guess about how to do good, but it's okay to be associated with EA."

Perhaps this is just semantics, but I'm genuinely not sure which sense you intend.

Evan_Gaensbauer · 6y

Evan_Gaensbauer · 6y
As someone whose experience as an outsider to Leverage, who has not done paid work for any EA organizations in the past, is similar to Tara's, I can corroborate her impression. I've not been in the Bay Area or had a volunteer or personal association with any EA organizations located there since 2014. Thus, my own investigation was from afar, following the spread-out info on Leverage available online, including past posts regarding Leverage on LW and the EA Forum, and online conversations with former staff, interns and visitors to Leverage Research. The impression I got from what is probably a very different data-set than Tara's is virtually identical. Thus, I endorse hers as a robust yet fair characterization of Leverage Research.

I've also heard from several CFAR workshop alumni myself that they found the Paradigm training they received more useful than the CFAR workshop they attended. A couple of them also noted their surprise at this impression, given their trepidation knowing Paradigm sprouted from Leverage, what with their past reputation. A confounding factor in these anecdotes would be that the CFAR workshops my friends and acquaintances had attended were from a few years ago, in which time those same people revisiting CFAR, and more recent CFAR workshop alumni, remark how different and superior to their earlier workshops CFAR's more recent ones have been. Nonetheless, the impression I've received is of nearly unanimous positive experiences at Paradigm workshops from attendees who are part of the EA movement, competitive in quality with CFAR workshops, which has years of troubleshooting and experience on Paradigm.

I want to clarify that the CEA has not been alone in movement-building activities; the CEA itself has ongoing associations with the Local Effective Altruism Network (LEAN) and the Effective Altruism Foundation out of the German-speaking EA world on movement-building activities. Paradigm Academy's staff, in seeking to kickstart grassroots movement-building efforts in EA

Thanks for making this post, it was long overdue.

Further facts

  • Connection Theory has been criticized as follows: "It is incomplete and inadequate, has flawed methodology, and conflicts well established science." The key paper has been removed from their websites and the web archive but is still available at the bottom of this post.
  • More of Geoff Anders's early work can be seen at https://systematicphilosophy.com/ and https://philosophicalresearch.wordpress.com/. (I hope they don't take down these websites as well.)
  • Former Leverage staff have launched a stablecoin cryptocurrency called Reserve (formerly "Flamingo"), which was backed by Peter Thiel and Coinbase.
  • In 2012-2014, they ran THINK.
  • The main person at LEAN is closely involved with Paradigm Academy and helps them recruit people.

Recruitment transparency

  • I have spoken with four former interns/staff who pointed out that Leverage Research (and its affiliated organizations) resembles a cult according to the criteria listed here.
  • The EA Summit 2018 website lists LEAN, Charity Science, and Paradigm Academy as "participating organizations," implying they're equally involved. However, Charity Science
... (read more)
Jacy · 6y

Just to add a bit of info: I helped with THINK when I was a college student. It wasn't the most effective strategy (largely, it was founded before we knew people would coalesce so strongly into the EA identity, and we didn't predict that), but Leverage's involvement with it was professional and thoughtful. I didn't get any vibes of cultishness from my time with THINK, though I did find Connection Theory a bit weird and not very useful when I learned about it.

Evan_Gaensbauer · 6y
Do you mind clarifying what you mean by "recruits people"? I.e., do you mean they recruit people to attend the workshops, or to join the organizational staff?

In this comment I laid out the threat to EA as a cohesive community itself when those within it, like the worst detractors of EA and adjacent communities, level blanket accusations that an organization is a cult. Also, that comment was only able to provide mention of a handful of people describing Leverage like a cult, admitting they could not recall any specific details. I already explained that that report doesn't qualify as a fact, nor even an anecdote, but hearsay, especially since further details aren't being provided. I'm disinclined to take seriously more hearsay of a mysterious impression of Leverage as cultish, given the poor faith in which my other interlocutor was acting. Since none of the former interns or staff behind this hearsay of Leverage being like a cult are coming forward to corroborate which features of a cult from the linked Lifehacker article Leverage shares, I'm unconvinced that your or the other reports of Leverage as being like a cult aren't being taken out of context from the individuals you originally heard them from, nor that this post and the comments aren't a deliberate attempt to do nothing but tarnish Leverage.

Paradigm Academy was incubated by Leverage Research, as many organizations in and around EA are by others (e.g., MIRI incubated CFAR; CEA incubated ACE, etc.). As far as I can tell now, like with those other organizations, Paradigm and Leverage should be viewed as two distinct organizations. So that itself is not a fact about Leverage, which I also went over in this comment.

As I stated in that comment as well, there is a double standard at play here. EA Global each year is organized by the CEA. They aren't even the only organization in EA with the letters "EA" in their name, nor are they exclusively considered among EA organizations able to wield the EA brand. And yet

Paradigm Academy was incubated by Leverage Research, as many organizations in and around EA are by others (e.g., MIRI incubated CFAR; CEA incubated ACE, etc.). As far as I can tell now, like with those other organizations, Paradigm and Leverage should be viewed as two distinct organizations.

See Geoff's reply to me above: Paradigm and Leverage will at some point be separate, but right now they're closely related (both under Geoff etc). I don't think viewing them as separate organizations, where learning something about Leverage should not much affect your view of Paradigm, makes sense, at least not yet.

[anonymous] · 6y

CEA incubated EAF

I don't think this is accurate. (Please excuse the lack of engagement with anything else here; I'm just skimming some of it for now but I did notice this.)

[Edit: Unless you meant EA Funds (rather than Effective Altruism Foundation, as I read it)?]

Evan_Gaensbauer · 6y
I meant the EA Foundation, who I was under the impression received incubation from CEA. Since apparently my ambiguous perception of those events might be wrong, I've switched the example of one of CEA's incubees to ACE.
[anonymous] · 6y

That one is accurate.

Also "incubees" is my new favourite word.

throwaway2 · 6y
I could list a number of specific details, but not without violating the preferences of the people who shared their experiences with me, and not without causing even more unnecessary drama. These details wouldn't make for a watertight case that they're a "cult". I deliberately didn't claim that Leverage is a cult. (See also this.) But the details are quite alarming for anyone who strives to have well-calibrated beliefs and an open-minded and welcoming EA community. I do think their cultishness led to unnecessary harm to well-meaning, young people who wanted to do good in the world.
kbog · 6y

There's a big difference between feeling cultlike, as in "weird", "disorienting", "bizarre" etc, and exhibiting the epistemic flaws of a cult, as in having people be afraid to disagree with the thought leader, a disproportionate reverence for a single idea or corpus, the excommunication of dissenters, the application of one idea or corpus to explain everything in the world, instinctively explaining away all possible counterarguments, refusal to look seriously at outside ideas, and so on.

If you could provide any sanitized, abstracted details to indicate that the latter is going on rather than merely the former, then it would go a long way towards indicating that LR is contrary to the goal of well-calibrated beliefs and open-mindedness.

Habryka · 6y
(While LessWrong.com was historically run by MIRI, the new LessWrong is indeed for most intents and purposes an independent organization (while legally under the umbrella of CFAR) and we are currently filing documents to get our own 501c3 registered, and are planning to stick around as an organization for at least another 5 years or so. Since we don't yet have a name that is different from "LessWrong", it's easy to get confused about whether we are an actual independent organization, and I figured I would comment to clarify that.)

Good day all,

Can anyone please provide an example of a tangible output from this 'research organization' of the sort EA generally recognizes and encourages?

Any rationale or consideration as to how association with such opaque groups does anything other than seriously undermine EA's mission statement would also be appreciated.

Kind Regards

Alistair Simmonds

Anthony Repetto · 1y
Alistair, I regret to inform you that after four years of Leverage's Anti-Avoidance Training, the cancer has spread: the EA Community at large is now repeatedly aghast that outsiders are noticing their subtle rug-sweeping of sexual harassment and dismissal of outside critique. In barely a decade, the self-described rats are swum 'round a stinking sh!p. I'm still amazed that, for the last year, as I kept bringing-forth concerns and issues, the EA members each insisted 'no problems here, no, never, we're always so perfect....' Yep. It shows.

About two years have now passed since the post. Main updates:

  • Leverage Research appears to be just four people. They have announced new plans, and released a short introduction to their interests in early stage science, but not any other work. Their history of Leverage Research appears to have stalled at the fourth chapter.
  • Reserve seems to be ten people, about seven of whom were involved with Leverage Research. Reserve Rights is up by about 160% since being floated two years ago.
  • Paradigm Academy is now branding itself as a self-help organisation.

CEA appears as a "participating organisation" of the EA Summit. What does this mean? Does CEA endorse Paradigm Academy?

CEA is not involved in the organizing of the conference, but we support efforts to build the EA community. One of our staff will be speaking at the event.

Evan_Gaensbauer · 6y
As an attendee of the 2018 EA Summit, I've been informed by the staff of Paradigm Academy that neither the whole organization nor Leverage Research initiated this idea. Neither Geoff Anders nor the executive leadership of Leverage Research are the authors of this Summit. I don't know the hierarchy of Paradigm Academy or where Mindy McTeigue or Peter Buckley, the primary organizers of the Summit, fall in it. As far as I can tell, the EA Summit was independently initiated by these staff at Paradigm and other individual effective altruists they connected with. In the run-up to organizing this Summit, the organizations these individual community members are staff at became sponsors of the EA Summit. Thus, the Local Effective Altruism Network, Charity Science, Paradigm Academy and the CEA are all participants at this event, endorsing the goal of the Summit within EA, without those organizations needing to endorse each other.

That's an odd question to ask. Must each EA organization endorse every other involved at EA Global, or any other EA event, prior to its beginning for the community to regard it as "genuinely EA"? As far as I can tell, while Paradigm is obviously physically hosting the event, what it means for the CEA and the other organizations to be participating organizations is just that: officially supporting these efforts at the EA Summit itself. It means no more and no less than what Julia stated in her comment.

Also, I oppose using or pressuring the CEA in a form of triangulation, and casting it by default as the most legitimate representation of the whole EA movement. Nothing I know about the CEA would lead me to believe they condone the type of treatment where someone tries speaking on their behalf in any sense without prior consent. Also, past my own expectations, the EA community recently made clear they don't as a whole give license to the CEA to represent EA as a whole however they want. Nonetheless, to the point of vocally dis

Evan, thank you for these comments here. I just wanted to register, in case it's at all useful, that I find it a bit difficult to understand your posts sometimes. It struck me that shorter and simpler sentences would probably make this easier for me. But I may be totally idiosyncratic here (English isn't my first language), so do ignore this if it doesn't strike you as useful.

Evan_Gaensbauer · 6y

I honestly don't get all this stuff about not publishing your work. Time to brag, boy will I get shit on for this comment, but it's really relevant to the issue here: I never even had a minor in the subject, but when I had a good philosophical argument I got it published in a journal, and it wasn't that hard. Peer reviewed, not predatory, went through three rounds of revisions. Not a prestigious journal by any stretch of the imagination, but it proves that I knew what I was doing, which is good enough. You think that peer review is bullshit, fine: that mea... (read more)

Intellectual contributions to the rationality community: including CFAR’s class on goal factoring

Just a note. I think this might be a bit misleading. Geoff and other members of Leverage Research taught a version of goal factoring at some early CFAR workshops. And Leverage did develop a version of goal factoring inspired by CT. But my understanding is that CFAR staff independently developed goal factoring (starting from an attempt to teach applied consequentialism), and this is an instance of parallel development.

[I work for CFAR, though I had not yet joined the EA or rationality community in those early days. I am reporting what other longstanding CFAR staff told me.]

Some participants of the Pareto Fellowship have told me that Leverage resembles a cult. I can't remember many specifics. One thing is that the main guy (Geoff Anders?) thinks, 100% in earnest, that he's the greatest philosopher who's ever lived.

Evan_Gaensbauer · 6y
1. The CEA, the very organization you juxtaposed with Leverage and Paradigm in this comment, has in the past been compared to a Ponzi scheme. Effective altruists who otherwise appreciated that criticism thought much of the value was lost in comparing it to a Ponzi scheme, and that without it, the criticism may have been better received. Additionally, LessWrong and the rationality community; CFAR and MIRI; and all of AI safety have for years been smeared as a cult by their detractors. The rationality community isn't perfect. There is no guarantee interactions with a self-identified (aspiring) rationality community will "rationally" go however an individual or small group of people interacting with the community, online or in person, hope or expect. But the vast majority of effective altruists, even those who are cynical about these organizations or sub-communities within EA, disagree with how these organizations have been treated, for it poisons the well of good will in EA for everyone.

In this comment, you stated your past experience with the Pareto Fellowship and Leverage left you feeling humiliated and manipulated. I've also been a vocal critic in person throughout the EA community of both Leverage Research and how Geoff Anders has led the organization. But to elevate personal opposition to them into a public exposure of opposition research, in an attempt to tarnish an event they're supporting alongside many other parties in EA, is not something I ever did, or will do.

My contacts in EA and myself have followed Leverage. I've desisted in making posts like this myself, because digging for context I found Leverage has changed from any impression I've gotten of them. And that's why at first I was skeptical of attending the EA Summit. But upon reflection, I realized it wasn't supported by the evidence to conclude Leverage is so incapable of change that anything they're associated with should be distrusted. But what you're trying to do with Leverage Research is no differ

Given there are usernames like "throwaway" and "throwaway2," and knowing the EA Forum and its precursor, LessWrong, I'm confident there is only one account under the username "anonymous," and that all the comments on this post using this account are coming from the same individual.

I'm confused: the comments on Less Wrong you'd see by "person" and "personN" that were the same person happened when importing from Overcoming Bias. That wouldn't be happening here.

They might still be the same person, but I don't think this forum being descended from LessWrong's code tells us things one way or the other.

Evan_Gaensbauer · 6y
Thanks. I wasn't aware of that. I'll redact that part of my comment.
throwaway2 · 6y
I don't feel comfortable sharing the reasons for remaining anonymous in public, but I would be happy to disclose my identity to a trustworthy person to prove that this is my only fake account.
Evan_Gaensbauer · 6y
Upvoted. I'm sorry for the ambiguity of my comment. I meant that the posts here under the usernames "throwaway," "throwaway2," and "anonymous" are each consistently being made by the same three people, respectively. I was just clarifying up front, as I was addressing you, that for others reading it's almost certainly the same anonymous individual making the comments under the same account. I wouldn't expect you to forgo your anonymity.
kbog · 6y
Your comments seem to be way longer than they need to be because you don't trust other users here. Like, if someone comes and says they felt like it was a cult, I'm just going to think "OK, someone felt like it was a cult." I'm not going to assume that they are doing secret blood rituals, I'm not going to assume that it's a proven fact. I don't need all these qualifications about the difference between cultishness and a stereotypical cult, I don't need all these qualifications about the inherent uncertainty of the issue, that stuff is old hat. This is the EA Forum, an internal space where issues are supposed to be worked out calmly; surely here, if anywhere, is a place where frank criticism is okay, and where we can extend the benefit of the doubt. I think you're wasting a lot of time, and implicitly signaling that the issue is more of a drama mine than it should be.
Evan_Gaensbauer · 6y
I admit I'm coming from a place of not entirely trusting all other users here. That may be a factor in why my comments are longer in this thread than they need to be. I tend to write more than is necessary in general. For what it's worth, I treat the EA Forum not as an internal space but how I'd ideally like to see it be used: as a primary platform for EA discourse, on par with a level of activity more akin to the 'Effective Altruism' Facebook group, or LessWrong. I admit I've been wasting time. I've stopped responding directly to the OP because if I'm coming across as implicitly signaling this issue is a drama mine, I should come out and say what I actually believe. I may make a top-level post about it. I haven't decided yet.
BenHoffman · 6y
"Compared to a Ponzi scheme" seems like a pretty unfortunate compression of what I actually wrote. Better would be to say that I claimed that a large share of ventures, including a large subset of EA, and the US government, have substantial structural similarities to Ponzi schemes. Maybe my criticism would have been better received if I'd left out the part that seems to be hard for people to understand; but then it would have been different and less important criticism.
Evan_Gaensbauer · 6y
[epistemic status: meta]

Summary: Reading comments in this thread which are similar to reactions I've seen you or other rationality bloggers receive from effective altruists on critical posts regarding EA, I think there is a pattern to how rationalists may tend to write on important topics that doesn't gel with the typical EA mindset. Consequently, it seems the pragmatic thing for us to do would be to figure out how to alter how we write to get our message across to a broader audience.

Upvoted. I don't know if you've read some of the other comments in this thread. But some of the most upvoted ones are about how I need to change up my writing style. So unfortunate compressions of what I actually write aren't new to me, either. I'm sorry I compressed what you actually wrote. But even an accurate compression of what you actually wrote might make my comments too long for what most users prefer on the EA Forum. If I just linked to your original post, it would be too long for us to read.

I spend more of my time on EA projects. If there were more promising projects coming out of the rationality community, I'd spend more time on them relative to how much time I dedicate to EA projects. But I go where the action is. Socially, I'm as if not more socially involved with the rationality community than I am with EA.

From my inside view, here is how I'd describe the common problem with my writing on the EA Forum: I came here from LessWrong. Relative to LW, I haven't found how or what I write on the EA Forum to be too long. But that's because I'm anchoring off EA discourse looking like SSC 100% of the time. But since the majority of EAs don't self-identify as rationalists, and the movement is so intellectually diverse, the expectation is the EA Forum won't be formatted on any discourse style common to the rationalist diaspora. I've touched upon this issue with Ray Arnold before. Zvi has touched on it too in some of his blog posts about EA. A crude rationalist impression might be t

Hi everyone,

I'd like to start off by apologizing. I realize that it has been hard to understand what Leverage has been doing, and I think that that's my fault. Last month Kerry Vaughan convinced me that I needed a new approach to PR and public engagement, and so I've been thinking about what to write and say about this. My plan, apart from the post here, was to post something over the next month. So I'll give a brief response to the points here and then make a longer main post early next week [UPDATE: see 2nd edit below].

(1) I'm so... (read more)

counting our research as 0 value, and using the movement building impact estimates from LEAN, we come out well on EV compared to an average charity ... I will let readers make their own calculations

Hi Geoff. I gave this a little thought and I am not sure it works. In fact it looks quite plausible that someone's EV (expected value) calculation on Leverage might actually come out as negative (i.e. Leverage would be causing harm to the world).

This is because:

  • Most EA orgs calculate their counterfactual expected value by taking into account what the people in that organisation would be doing otherwise if they were not in that organisation, and then deducting this from their impact; see the sketch after this list. (I believe at least 80K, Charity Science and EA London do this)

  • Given Leverage's tendency to hire ambitious altruistic people and to look for people at EA events, it is plausible that a significant proportion of Leverage staff might well have ended up at other EA organisations.

  • There is a talent gap at other EA organisations (see 80K on this)

  • Leverage does spend some time on movement building but I estimate that this is a tiny proportion of the time, <5%, best guess 3%, (based on having talked to people

... (read more)
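A minimal sketch of the counterfactual adjustment described in the first bullet above (notation mine, not from the comment): for an organisation with staff $i = 1, \dots, n$,

$$\text{EV}_{\text{counterfactual}} = \text{Impact}_{\text{org}} - \sum_{i=1}^{n} \text{Impact}_i^{\text{otherwise}},$$

where $\text{Impact}_i^{\text{otherwise}}$ is what staff member $i$ would have contributed at the organisation they would otherwise have joined. If the first term is small and the staff would have been productive at other EA organisations, the total comes out negative, which is the scenario described above.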

Could you comment specifically on the Wayback Machine exclusion? Thanks!

Hi Geoff,

In reading this I'm confused about the relationship between Paradigm and Leverage. People in this thread (well, mostly Evan) seem to be talking about them as if Leverage incubated Paradigm but the two are now fully separate. My understanding, however, was that the two organizations function more like two branches of a single entity? I don't have a full picture or anything, but I thought you ran both organizations, staff of both mostly live at Leverage, people move freely between the two as needed by projects, and what happens under each organization is more a matter of strategy than separate direction?

By analogy, I had thought the relationship of Leverage to Paradigm was much more like CEA vs GWWC (two brands of the same organization) or even CEA UK vs CEA USA (two organizations acting together as one brand) than CEA vs ACE (one organization that spun off another one, which now operates entirely independently with no overlap of staff etc).

Jeff

Hi Jeff,

Sure, happy to try to clarify. I run both Leverage and Paradigm. Leverage is a non-profit and focuses on research. Paradigm is a for-profit and focuses on training and project incubation. The people in both organizations closely coordinate. My current expectation is that I will eventually hand Leverage off while working to keep the people on both projects working together.

I think this means we’re similar to MIRI/CFAR. They started with a single organization which led to the creation of a new organization. Over time, their organizations came to be under distinct leadership, while still closely coordinating.

To understand Leverage and Paradigm, it’s also important to note that we are much more decentralized than most organizations. We grant members of our teams substantial autonomy in both determining their day-to-day work and with regard to starting new projects.

On residence, new hires typically live at our main building for a few months to give them a place to land and then move out. Currently less than 1/3 of the total staff live on-site.

Thanks for clarifying!

Two takeaways for me:

  • Use of both the "Paradigm" and "Leverage" names isn't a reputational dodge, contra throwaway in the original post. The two groups focus on different work and are in the process of fully dividing.

  • People using what they know about Leverage to inform their views of Paradigm is reasonable given their level of overlap in staff and culture, contra Evan here and here.

My plan, apart from the post here, was to post something over the next month.

Did you end up posting anything on this subject?

Kirsten · 6y
What have you done to promote movement building? I didn't see anything on the post or your website, other than the summit next week.

Leverage:

(1) founded THINK, the first EA student group network

(2) ran the EA Summit 2013, the first large EA conference (video)

(3) ran the EA Summit 2014

(4) ran the EA Retreat 2014, the first weeklong retreat for EA leaders

(5) handed off the EA Summit series to CEA; CEA renamed it EA Global

(6) helped out operationally with EA Global 2015.

Dunja · 6y
Could you please specify which methods of introspection and psychological frameworks you employ to this end, and which evidence you use to ensure these frameworks are based on adequate scientific evidence, obtained by reliable methods?
schwartzman · 6y

Leverage Research has now existed for over 7.5 years.[1] Since 2011, it has consumed over 100 person-years of human capital.

Given that, by their own admission in a comment on the original post, the author is providing these facts so effective altruists can make an informed decision about attending the 2018 EA Summit, with the expectation that these facts can or will discourage EAs from attending, it's unclear how these facts are relevant information.

  • In particular, no calculation or citation is provided for the

... (read more)

Meta:

It might be worthwhile to have some sort of flag or content warning for potentially controversial posts like this.

On the other hand, this could be misused by people who dislike the EA movement, who could use it as a search parameter to find and "signal-boost" content that looks bad when taken out of context.

What are the benefits of this suggestion?

kbog · 6y

This is a romp through meadows of daisies and sunflowers compared to what real Internet drama looks like. It's perfectly healthy for a bunch of people to report on their negative experiences and debate the effectiveness of an organization. It will only look controversial if you frame it as controversial; people will only think it is a big deal if you act like it is a big deal.

Evan_Gaensbauer · 6y
I agree with kbog: while this is unusual discourse for the EA Forum, it is still far above the bar where I think it's practical to be worried about controversy. If someone thinks the content of a post on the EA Forum might trigger some reader(s), I don't see anything wrong with including content warnings on posts. I'm unsure what you mean by "flagging" potentially controversial content.