Leverage Research: reviewing the basic facts

Resources spent

  • Leverage Research has now existed for over 7.5 years [1].
  • Since 2011, it has consumed over 100 person-years of human capital.
  • From 2012-16, Leverage Research spent $2.02 million, and the associated Institute for Philosophical Research spent $310k [2][3].

Outputs

Some of the larger outputs of Leverage Research include:

  • Work on Connection Theory: this does not include the initial creation of the theory itself, which Geoff Anders completed before founding Leverage Research
  • Contributions to the productivity of altruists via the application of psychological theories, including Connection Theory
  • Intellectual contributions to the effective altruism community, including early work on cause prioritisation and risks to the movement
  • Intellectual contributions to the rationality community, including CFAR's class on goal factoring
  • The EA Summits in 2013-14: precursors to EA Global; the Summit series is being revived in 2018

Its website also has seven blog posts [4].

Recruitment Transparency

  • Leverage Research previously organized the Pareto Fellowship in collaboration with another effective altruism organization. According to one attendee, Leverage staff secretly discussed attendees, using a separate Slack channel for each one.
  • Leverage Research has provided psychology consulting services using Connection Theory; based on reports from prospective staff and donors, this has given it mind-maps of a substantial fraction of those groups.
  • In personal conversation, the leadership of Leverage Research has on multiple occasions overstated its rate of staff growth by more than a factor of two.
  • Leverage Research sends staff to effective altruism events to recruit specific lists of people from the effective altruism community, as is apparent from discussions with, and observation of, Leverage Research staff at these events.
  • Leverage Research has spread negative information about organisations and leaders that would compete with it for EA talent.

General Transparency

  • The website of Leverage Research has been excluded from the Wayback Machine [5].
  • Leverage Research has had a strategy of using multiple organizations to tailor conversations to the topics of interest to different donors.
  • Leverage Research has had longstanding plans to replace itself with one or more new organizations if the reputational costs of the name "Leverage Research" ever became too severe. A substantial number of staff of Paradigm Academy were previously staff of Leverage Research.

General Remarks

Readers are encouraged to add additional facts known about Leverage Research in the comments section, especially where these can be supported by citation or direct conversational evidence.

Citations

[1] https://www.lesswrong.com/posts/969wcdD3weuCscvoJ/introducing-leverage-research

[2] https://projects.propublica.org/nonprofits/organizations/453989386

[3] https://projects.propublica.org/nonprofits/organizations/452740006

[4] http://leverageresearch.org/blog

[5] https://web.archive.org/web/*/http://leverageresearch.org/

Comments (63)

Comment author: Gregory_Lewis 05 August 2018 05:47:05PM 21 points [-]

[My views only]

Although few materials remain from the early days of Leverage (I am confident they acted to remove themselves from the Wayback Machine, as other sites link to Wayback versions of their old documents which now 404), there are some interesting remnants:

  • A (non-wayback) website snapshot from 2013
  • A version of Leverage's plan
  • An early Connection Theory paper

I think this material (and the surprising absence of material since) speaks for itself - although I might write more later anyway.

Per other comments, I'm also excited by the plan of greater transparency from Leverage. I'm particularly eager to find out whether they still work on Connection Theory (and what the current theory is), whether they have addressed any of the criticism (e.g. 1, 2) levelled at CT years ago, whether the further evidence and argument mentioned as forthcoming in early documents and comment threads will materialise, and generally what research (on CT or anything else) they have done in the last several years, and when it will be made public.

Comment author: TaraMacAulay 04 August 2018 12:45:51AM 28 points [-]

Note: I was previously CEO of CEA, but stepped down from that role about 9 months ago.

I've long been confused about the reputation Leverage has in the EA community. After hearing lots of conflicting reports, both extremely positive and negative, I decided to investigate a little myself. As a result, I've had multiple conversations with Geoff, and attended a training weekend run by Paradigm. I can understand why many people get a poor impression, and question the validity of their early-stage research. I think that in the past, Leverage has done a poor job communicating their mission and relationship to the EA movement. I'd like to see Leverage continue to improve transparency, and am pleased with Geoff's comments below.

Despite some initial hesitation, I found the Paradigm training I attended surprisingly useful, perhaps even more so than the CFAR workshop I attended. The workshop was competently run, and content was delivered in a polished fashion. I didn't go in expecting the content to be scientifically rigorous; most self-improvement content isn't. It was fun, engaging, and useful enough to justify the time spent.

Paradigm is now running the EA summit. I know Mindy and Peter, some of the key organisers, through their long-standing contributions to EA. They were both involved in running a successful student group, and Peter worked at CEA, helping us to organise EAG 2015. I believe that Mindy and Peter are dedicated EAs, who decided to organise this event because they would really like to see more focus on movement building in the EA community.

I've been wanting to see new and more movement building focused activities in EA. CEA can't do it all alone, and I generally support people in the EA community attempting ambitious movement building projects. Given this, and my positive experience attending an event put on by Paradigm, I decided to provide some funding for the EA Summit personally.

I don't think that Leverage, Paradigm or related projects are a good use of EA time or money, but I do think the level of hostility towards them I've seen in this community is unwarranted, and I'd like to see us do better.

Comment author: Milan_Griffes 04 August 2018 01:10:39AM *  19 points [-]

I don't think that Leverage, Paradigm or related projects are a good use of EA time or money

Found this surprising given the positive valence of the rest of the comment. Could you expand a little on why you don't think Leverage et al. are a good use of time/money?

Comment author: TaraMacAulay 04 August 2018 01:57:49AM 20 points [-]

I think their approach is highly speculative, even if you were to agree with their overall plan. I think Leverage has contributed to EA in the past, and I expect them to continue doing so, but this alone isn't enough to make them a better donation target than orgs like CEA or 80K.

I'm glad they exist, and hope they continue to exist, I just don't think Leverage or Paradigm are the most effective things I could be doing with my money or time. I feel similarly about CFAR. Supporting movement building and long-termism is already meta enough for me.

Comment author: Milan_Griffes 04 August 2018 05:19:30PM 7 points [-]

Interesting. I don't usually conflate "good use" with "most effective use."

Seems like "not a good use" means something like "this project shouldn't be associated with EA."

Whereas "not the most effective use" means something like "this project isn't my best-guess about how to do good, but it's okay to be associated with EA."

Perhaps this is just semantics, but I'm genuinely not sure which sense you intend.

Comment author: Evan_Gaensbauer 04 August 2018 01:35:56AM 3 points [-]

I've long been confused about the reputation Leverage has in the EA community. After hearing lots of conflicting reports, both extremely positive and negative, I decided to investigate a little myself. As a result, I've had multiple conversations with Geoff, and attended a training weekend run by Paradigm. I can understand why many people get a poor impression, and question the validity of their early-stage research. I think that in the past, Leverage has done a poor job communicating their mission and relationship to the EA movement. I'd like to see Leverage continue to improve transparency, and am pleased with Geoff's comments below.

My experience as an outsider to Leverage, who has not done paid work for any EA organizations in the past, is similar to Tara's, and I can corroborate her impression. I've not been in the Bay Area or had a volunteer or personal association with any EA organizations located there since 2014. Thus, my own investigation was from afar, following the spread-out info on Leverage available online, including past posts regarding Leverage on LW and the EA Forum, and online conversations with former staff, interns and visitors to Leverage Research. The impression I got from what is probably a very different data-set than Tara's is virtually identical. Thus, I endorse her comment as a robust yet fair characterization of Leverage Research.

Despite some initial hesitation, I found the Paradigm training I attended surprisingly useful, perhaps even more so than the CFAR workshop I attended. The workshop was competently run, and content was delivered in a polished fashion. I didn't go in expecting the content to be scientifically rigorous; most self-improvement content isn't. It was fun, engaging, and useful enough to justify the time spent.

I've also heard from several CFAR workshop alumni that they found the Paradigm training they received more useful than the CFAR workshop they attended. A couple of them also noted their surprise at this, given their trepidation about Paradigm having sprouted from Leverage, what with its past reputation. A confounding factor in these anecdotes is that the CFAR workshops my friends and acquaintances attended were from a few years ago; both those same people revisiting CFAR and more recent CFAR workshop alumni remark on how different and superior CFAR's recent workshops have been compared to its earlier ones. Nonetheless, the impression I've received from attendees who are part of the EA movement is of nearly unanimously positive experiences at Paradigm workshops, competitive in quality with CFAR's, even though CFAR has years more troubleshooting and experience than Paradigm.

I've been wanting to see new and more movement building focused activities in EA. CEA can't do it all alone, and I generally support people in the EA community attempting ambitious movement building projects. Given this, and my positive experience attending an event put on by Paradigm, I decided to provide some funding for the EA Summit personally.

I want to clarify that the CEA has not been alone in movement-building activities: the CEA itself has ongoing associations on such activities with the Local Effective Altruism Network (LEAN) and with the Effective Altruism Foundation out of the German-speaking EA world. Paradigm Academy's staff, in seeking to kickstart grassroots movement-building efforts in EA, are aware of this, as LEAN is a participating organization in the Summit as well. Additionally, while Charity Science (CS) has typically streamlined its focus toward direct global poverty interventions, its initial incubation and association with Rethink Charity and LEAN, as well as its recent foray into cause-neutral effective charity incubation, could arguably qualify it as focused on EA movement-building as well.

This is my conjecture based on where it seems CS is headed. I haven't asked them, and I recommend anyone curious ask CS themselves if they identify movement-building as part of their current activities in EA. I bring this up as relevant because CS is also officially participating in the EA Summit.

Also, Tara, thanks for providing funding for this event :)

Comment author: throwaway2 03 August 2018 05:52:30PM *  20 points [-]

Thanks for making this post, it was long overdue.

Further facts

  • Connection Theory has been criticized as follows: "It is incomplete and inadequate, has flawed methodology, and conflicts [with] well established science." The key paper has been removed from their websites and the web archive, but is still available at the bottom of this post.
  • More of Geoff Anders's early work can be seen at https://systematicphilosophy.com/ and https://philosophicalresearch.wordpress.com/. (I hope they don't take down these websites as well.)
  • Former Leverage staff have launched a stablecoin cryptocurrency called Reserve (formerly "Flamingo"), which was backed by Peter Thiel and Coinbase.
  • In 2012-2014, they ran THINK.
  • The main person at LEAN is closely involved with Paradigm Academy and helps them recruit people.

Recruitment transparency

  • I have spoken with four former interns/staff who pointed out that Leverage Research (and its affiliated organizations) resembles a cult according to the criteria listed here.
  • The EA Summit 2018 website lists LEAN, Charity Science, and Paradigm Academy as "participating organizations," implying they're equally involved. However, Charity Science is merely giving a talk there. In private conversation, at least one potential attendee was told that Charity Science was more heavily involved. (Edit: This issue seems to be fixed now.)
  • (low confidence) I've heard through the grapevine that the EA Summit 2018 wasn't coordinated with other EA organizations except for LEAN and Charity Science.

Overall, I am under the impression that a majority of EAs think that Leverage is quite culty and ineffective. Leverage staff usually respond by claiming that their unpublished research is valuable, but the insiders mentioned above seemed to disagree.

If someone has strong counterevidence to this skeptical view of Leverage, I would be very interested and open to changing my mind.

Comment author: Jacy_Reese 04 August 2018 07:22:29AM *  21 points [-]

Just to add a bit of info: I helped with THINK when I was a college student. It wasn't the most effective strategy (largely, it was founded before we knew people would coalesce so strongly into the EA identity, and we didn't predict that), but Leverage's involvement with it was professional and thoughtful. I didn't get any vibes of cultishness from my time with THINK, though I did find Connection Theory a bit weird and not very useful when I learned about it.

Comment author: Evan_Gaensbauer 03 August 2018 10:38:38PM *  8 points [-]

helps them recruit people

Do you mind clarifying what you mean by "helps them recruit people"? I.e., do you mean they recruit people to attend the workshops, or to join the organizational staff?

I have spoken with four former interns/staff who pointed out that Leverage Research (and its affiliated organizations) resembles a cult according to the criteria listed here.

In this comment I laid out the threat to EA as a cohesive community when those within it, like the worst detractors of EA and adjacent communities, level blanket accusations that an organization is a cult. Also, that comment was only able to mention a handful of people describing Leverage as like a cult, while admitting they could not recall any specific details. I already explained that such a report doesn't qualify as a fact, nor even an anecdote, but as hearsay, especially since further details aren't being provided.

I'm disinclined to take seriously more hearsay of a mysterious impression of Leverage as cultish, given the poor faith in which my other interlocutor was acting. Since none of the former interns or staff behind this hearsay of Leverage being like a cult are coming forward to corroborate which features of a cult from the linked Lifehacker article Leverage shares, I'm unconvinced that your report, or the others, isn't being taken out of context from the individuals you originally heard it from, nor that this post and its comments aren't a deliberate attempt to do nothing but tarnish Leverage.

The EA Summit 2018 website lists LEAN, Charity Science, and Paradigm Academy as "participating organizations," implying they're equally involved. However, Charity Science is merely giving a talk there. In private conversation, at least one potential attendee was told that Charity Science was more heavily involved. (Edit: This issue seems to be fixed now.)

Paradigm Academy was incubated by Leverage Research, as many organizations in and around EA are by others (e.g., MIRI incubated CFAR; CEA incubated ACE, etc.). As far as I can tell now, like with those other organizations, Paradigm and Leverage should be viewed as two distinct organizations. So that itself is not a fact about Leverage, which I also went over in this comment.

As I stated in that comment as well, there is a double standard at play here. EA Global each year is organized by the CEA. They aren't even the only organization in EA with the letters "EA" in their name, nor are they exclusively considered able, among EA organizations, to wield the EA brand. And yet despite all this, nobody objects on priors to the CEA as a single organization branding these events each year. Nor should we. Of course, none of this is necessary to invalidate the point you're trying to make. Julia Wise, as the Community Liaison for the CEA, has already clarified that the CEA itself supports the Summit.

So the EA Summit has already been legitimized by multiple EA organizations as a genuine EA event, including the one which is seen as the default legitimate representation for the whole movement.

(low confidence) I've heard through the grapevine that the EA Summit 2018 wasn't coordinated with other EA organizations except for LEAN and Charity Science.

As above, the fact that the EA Summit wasn't coordinated by more than one organization means nothing. There are already EA retreat- and conference-like events organized by local university groups and national foundations all over the world, which have gone well, such as the Czech EA Retreat in 2017. So the idea that EA should be so centralized that only registered non-profits with a given caliber of prestige in the EA movement, or those they approve, can organize events the community views as legitimate is unfounded. Not even the CEA wants that much centralization. Nobody does. So whatever point you're trying to prove about the EA Summit using facts about Leverage Research is still invalid.

For what it's worth, while no other organizations are officially participating, here are some effective altruists who will be speaking at the EA Summit, and the organizations they're associated with. At EA Global, this would be sufficient to warrant identifying those organizations as in spirit welcome and included; the same standard should apply to the EA Summit.

  • Ben Pace, Ray Arnold and Oliver Habryka: LessWrong isn't an organization, but it's played a formative role in EA, and with LW's new codebase being the kernel for the next version of the EA Forum, Ben and Oliver, as admins and architects of the new LW, are as important representatives of this online community as any in EA's history.

  • Rob Mather is the ED of AMF. AMF isn't typically regarded as an "EA organization" because it isn't a metacharity directly dependent on the EA movement. But for GiveWell's top-recommended charity since EA began, which continues to receive more donations from effective altruists than any other, not to be given consideration would be senseless.

  • Sarah Spikes runs the Berkeley REACH.

  • Holly Morgan is a staffer for the EA London organization.

In reviewing these speakers, and seeing so many from LEAN and Rethink Charity, with Kerry Vaughan being a director for individual outreach at CEA, I see what the EA Summit is trying to do. They're trying to have speakers at the event who can rally local EA group organizers from around the world to more coordinated action and spirited projects. Which is exactly what the organizers of the EA Summit have been saying the whole time. This is also why I was invited to attend the EA Summit: as an organizer of rationality and EA projects in Vancouver, Canada, trying to develop a system for organizing local groups to do direct work that could scale both here and in cities everywhere; and as a very involved volunteer online community organizer in EA. It's also why one of the event organizers consulted with me, before they announced the EA Summit, on how they thought it should be presented to the EA community.

This isn't counterevidence to skepticism of Leverage. It is evidence against the thesis being advanced alongside these facts about Leverage Research: that the EA Summit is nothing but a launchpad for Leverage's rebranding within the EA community as "Paradigm Academy." No logical evidence has been presented that the tenuous links between Leverage and the organization of the 2018 EA Summit entail that the negative reputation Leverage has acquired over the years should be transferred onto the upcoming Summit.

Comment author: hollymorgan 04 August 2018 01:48:26AM *  11 points [-]

CEA incubated EAF

I don't think this is accurate. (Please excuse the lack of engagement with anything else here; I'm just skimming some of it for now but I did notice this.)

[Edit: Unless you meant EA Funds (rather than Effective Altruism Foundation, as I read it)?]

Comment author: Evan_Gaensbauer 04 August 2018 02:13:45AM 0 points [-]

I meant the EA Foundation, which I was under the impression received incubation from CEA. Since apparently my ambiguous perception of those events might be wrong, I've switched the example of one of CEA's incubees to ACE.

Comment author: hollymorgan 04 August 2018 03:42:42AM 13 points [-]

That one is accurate.

Also "incubees" is my new favourite word.

Comment author: Jeff_Kaufman 06 August 2018 12:29:29PM *  9 points [-]

Paradigm Academy was incubated by Leverage Research, as many organizations in and around EA are by others (e.g., MIRI incubated CFAR; CEA incubated ACE, etc.). As far as I can tell now, like with those other organizations, Paradigm and Leverage should be viewed as two distinct organizations.

See Geoff's reply to me above: Paradigm and Leverage will at some point be separate, but right now they're closely related (both under Geoff etc). I don't think viewing them as separate organizations, where learning something about Leverage should not much affect your view of Paradigm, makes sense, at least not yet.

Comment author: Habryka 05 August 2018 06:48:43PM *  7 points [-]

(While LessWrong.com was historically run by MIRI, the new LessWrong is indeed for most intents and purposes an independent organization (while legally under the umbrella of CFAR) and we are currently filing documents to get our own 501c3 registered, and are planning to stick around as an organization for at least another 5 years or so. Since we don't yet have a name that is different from "LessWrong", it's easy to get confused about whether we are an actual independent organization, and I figured I would comment to clarify that.)

Comment author: throwaway2 04 August 2018 09:57:58AM 7 points [-]

Also, that comment was only able to mention a handful of people describing Leverage as like a cult, while admitting they could not recall any specific details.

I could list a number of specific details, but not without violating the preferences of the people who shared their experiences with me, and not without causing even more unnecessary drama.

These details wouldn't make for a watertight case that they're a "cult". I deliberately didn't claim that Leverage is a cult. (See also this.) But the details are quite alarming for anyone who strives to have well-calibrated beliefs and an open-minded and welcoming EA community. I do think their cultishness led to unnecessary harm to well-meaning, young people who wanted to do good in the world.

Comment author: kbog  (EA Profile) 05 August 2018 08:09:27AM *  10 points [-]

There's a big difference between feeling cultlike, as in "weird", "disorienting", "bizarre" etc, and exhibiting the epistemic flaws of a cult, as in having people be afraid to disagree with the thought leader, a disproportionate reverence for a single idea or corpus, the excommunication of dissenters, the application of one idea or corpus to explain everything in the world, instinctively explaining away all possible counterarguments, refusal to look seriously at outside ideas, and so on.

If you could provide any sanitized, abstracted details to indicate that the latter is going on rather than merely the former, then it would go a long way towards indicating that LR is contrary to the goal of well-calibrated beliefs and open-mindedness.

Comment author: anonymous 03 August 2018 06:41:57PM *  28 points [-]

I was interviewed by Peter Buckley and Tyler Alterman when I applied for the Pareto fellowship. It was one of the strangest, most uncomfortable experiences I've had over several years of being involved in EA. I'm posting this from notes I took right after the call, so I am confident that I remember this accurately.

The first question asked about what I would do if Peter Singer presented me with a great argument for doing an effective thing that's socially unacceptable. The argument was left as an unspecified black box.

Next, for about 25 minutes, they taught me the technique of "belief reporting". (See some information here and here). They made me try it out live on the call, for example by making me do "sentence completion". This made me feel extremely uncomfortable. It seemed like unscientific, crackpot psychology. It was the sort of thing you'd expect from a New Age group or Scientology.

In the second part of the interview (30 minutes?), I was asked to verbalise what my system one believes will happen in the future of humanity. They asked me to just speak freely without thinking, even if it sounds incoherent. Again it felt extremely cultish. I expected this to last max 5 minutes and to form the basis for a subsequent discussion. But they let me ramble on for what felt like an eternity, and there were zero follow up questions. The interview ended immediately.

The experience left me feeling humiliated and manipulated.

Comment author: gray 06 August 2018 07:04:35PM 18 points [-]

I had an interview with them under the same circumstances and also had the belief reporting trial. (I forget if I had the Peter Singer question.) I can confirm that it was supremely disconcerting.

At the very least, it's insensitive - they were asking for a huge amount of vulnerability and trust in a situation where we both knew I was trying to impress them in a professional context. I sort of understand why that exercise might have seemed like a good idea, but I really hope nobody does this in interviews anymore.

Comment author: throwaway 03 August 2018 05:44:29AM *  25 points [-]

The reason for posting these facts now is that as of the time of writing, Leverage's successor, the Paradigm Academy is seeking to host the EA Summit in one week. The hope is that these facts would firstly help to inform effective altruists on the matter of whether they would be well-advised to attend, and secondly, what approach they may want to take if they do attend.

Leverage Research has recruited from the EA community using mind-maps and other psychological techniques, obtaining dozens of years of work, but doing little apparent good. As a result, the author views it as inadvisable for EAs to engage with Leverage Research and its successor, Paradigm Academy. Rather, they should seek the advice of mentors outside of the Leverage orbit before deciding to attend such an event. Based on past events such as the Pareto Fellowship, invitees who ultimately decide to attend would be well-advised to be cautious about recruitment, by keeping in touch with friends and mentors throughout.

Comment author: Khorton 03 August 2018 11:12:51PM 13 points [-]

I think this would be more useful as part of the main post than as a comment.

Comment author: Evan_Gaensbauer 03 August 2018 11:20:48PM *  0 points [-]

The reason for posting these facts now is that as of the time of writing, Leverage's successor, the Paradigm Academy is seeking to host the EA Summit in one week. The hope is that these facts would firstly help to inform effective altruists on the matter of whether they would be well-advised to attend, and secondly, what approach they may want to take if they do attend.

I've provided my explanations for the following in this comment:

  • No evidence has been provided that Paradigm Academy is Leverage's successor. While the OP stated facts about Leverage, all the comments declaring more facts about Leverage Research are merely casting spurious associations between Leverage Research and the EA Summit. Along with the facts, you've smuggled in an assumption amounting to nothing more than a conspiracy theory: that Leverage has rebranded itself as Paradigm Academy and is organizing the 2018 EA Summit for some unclear and ominous reason. In addition to no logical reason or sound evidence being provided for why Leverage's negative reputation in EA should be transferred to the upcoming Summit, my interlocutors have admitted themselves, or revealed, that their evidence from personal experience is weak. I've provided my direct personal experience knowing the parties involved in organizing the EA Summit, and having paid close attention from afar to Leverage's trajectory in and around EA, contrary to the unsubstantiated thesis that the 2018 EA Summit is some opaque machination by Leverage Research.

  • There is no logical connection between the facts about Leverage Research and the purpose of the upcoming EA Summit. Further, the claims presented as facts about the upcoming Summit aren't actually facts.

Leverage Research has recruited from the EA community using mind-maps and other psychological techniques, obtaining dozens of years of work, but doing little apparent good. As a result, the author views it as inadvisable for EAs to engage with Leverage Research and its successor, Paradigm Academy.

At this point, I'll just point out that the idea that Paradigm is in any sense Leverage's successor is based on no apparent evidence. So the author's advice doesn't logically follow from the claims made about Leverage Research. What's more, as I demonstrated in my other comments, this event isn't some unilateral attempt by Paradigm Academy to steer EA in some unknown direction.

Rather, they should seek the advice of mentors outside of the Leverage orbit before deciding to attend such an event.

As one of the primary organizers for the EA community in Vancouver, Canada; the primary organizer for the rationality community in Vancouver; a liaison for local representation of these communities with adjacent communities; and an organizer of many novel efforts to coordinate effective altruists, including the EA Newsletter, I don't know if I'd describe myself as a "mentor." But I know others who see me that way, and it wouldn't be unfair of me to say that, both online and geographically (on the west coast, in Vancouver, and in Canada), I am someone who creates more opportunities for many individuals to connect to EA.

Also, if it wasn't clear, I'm well outside the Leverage orbit. If someone wants to accuse me of being a hack for Leverage, I can make some effort to prove I'm not part of their orbit (though I'd like to state that I would still see that as unnecessarily poor faith in this conversation). Anyway, as an outsider and veteran EA community organizer, I'm willing to provide earnest and individuated answers to questions about why I'm going to the 2018 EA Summit, or why, and what kind of, other effective altruists should also attend. I am not speaking for anyone but myself. I'm willing to do this in-thread as replies to this comment, or, if others would prefer, on social media or in another EA Forum post. Because I don't have much time, and I'd like to answer such questions transparently, I will only answer questions publicly asked of me.

Based on past events such as the Pareto Fellowship, invitees who ultimately decide to attend would be well-advised to be cautious about recruitment, by keeping in touch with friends and mentors throughout.

Contrary to what the author of this post and comment stated, it doesn't follow that this event will be anything like the Pareto Fellowship, as there aren't any facts linking Leverage Research's past track record as an organization to the 2018 EA Summit.

For what it's worth to anyone, I intend to attend the 2018 EA Summit, and I offer as a friend my support and contact regarding any concerns other attendees may have.

Comment author: Jeff_Kaufman 06 August 2018 12:12:00PM *  10 points [-]

See Geoff's reply to me below: Paradigm and Leverage will at some point be separate, but right now they're closely related (both under Geoff etc). I think it's reasonable for people to use Leverage's history and track record in evaluating Paradigm.

Comment author: anonymous 03 August 2018 06:12:31PM 14 points [-]

CEA appears as a "participating organisation" of the EA Summit. What does this mean? Does CEA endorse paradigm academy?

Comment author: Julia_Wise  (EA Profile) 03 August 2018 08:39:20PM *  29 points [-]

CEA is not involved in the organizing of the conference, but we support efforts to build the EA community. One of our staff will be speaking at the event.

Comment author: Evan_Gaensbauer 03 August 2018 08:44:48PM *  12 points [-]

As an attendee of the 2018 EA Summit, I've been informed by the staff of Paradigm Academy that neither the organization as a whole nor Leverage Research initiated this idea. Neither Geoff Anders nor the executive leadership of Leverage Research are the authors of this Summit. I don't know the hierarchy of Paradigm Academy, or where Mindy McTeigue or Peter Buckley, the primary organizers of the Summit, fall in it. As far as I can tell, the EA Summit was independently initiated by these staff at Paradigm and other individual effective altruists they connected with. In the run-up to organizing this Summit, the organizations these individual community members are staff at became sponsors of the EA Summit.

Thus, the Local Effective Altruism Network, Charity Science, Paradigm Academy and the CEA are all participants at this event, endorsing the goal of the Summit within EA, without those organizations needing to endorse each other. That's an odd question to ask. Must each EA organization endorse every other organization involved at EA Global, or any other EA event, prior to its beginning for the community to regard it as "genuinely EA"?

As far as I can tell, while Paradigm is obviously physically hosting the event, what it means for the CEA and the other organizations to be participating organizations is just that: officially supporting these efforts at the EA Summit itself. It means no more and no less, for any of these organizations, than what Julia stated in her comment.

Also, I oppose using or pressuring the CEA as a form of triangulation, casting it by default as the most legitimate representation of the whole EA movement. Nothing I know about the CEA would lead me to believe they condone the type of treatment where someone tries speaking on their behalf in any sense without prior consent. Also, past my own expectations, the EA community recently made clear they don't as a whole give the CEA license to represent EA however it wants. Nonetheless, even while vocally disagreeing with what I saw as a needless pile-on against Nick Beckstead and the CEA in that thread, I've made an effort to maintain an ongoing and mutually respectful conversation.

Comment author: ole_koksvik 04 August 2018 04:50:49AM 36 points [-]

Evan, thank you for these comments here. I just wanted to register, in case it's at all useful, that I find it a bit difficult to understand your posts sometimes. It struck me that shorter and simpler sentences would probably make this easier for me. But I may be totally ideosyncratic here (English isn't my first language), so do ignore this if it doesn't strike you as useful.

Comment author: Evan_Gaensbauer 04 August 2018 05:46:42AM *  -1 points [-]

Thanks. This is useful feedback :)

Yeah, to be fair, I was writing these comments in rapid succession, based on information unique to me, to quickly prevent the mischaracterization of the EA Summit next week. I am both attending the EA Summit, and I am significantly personally invested in it as representing efforts in EA I'd like to see greatly advanced. I also have EA projects I've been working on that I intend to talk about at the Summit. (In spite of acknowledging my own motives here, I still made all my previous comments with as much fidelity as I could muster.)

All this made me write these comments hastily enough that I wrote in long sentences. Mentally, when writing quickly, that's how I condense as much information into as few clauses as possible while making arguments. You're not the first person to tell me that writing shorter and simpler sentences would make my writing easier to read. In general, when I'm making public comments without a time crunch, these days I'm making more of a conscious effort to be comprehensible :)

But I may be totally ideosyncratic here (English isn't my first language), so do ignore this if it doesn't strike you as useful.

This is useful feedback, but English not being your first language is a factor too, because that isn't how "idiosyncratic" is spelled. :P

I also would not expect effective altruists not fluent in English to be able to follow a lot of what I write (or a lot of posts written in EA, for that matter). Because discourse in EA is continually complicated and conducted exclusively in English, I often forget to write for a readership which largely doesn't speak English as a first language. I'll keep this more in mind for how I write my posts in the future.

Comment author: Milan_Griffes 04 August 2018 05:01:53PM 16 points [-]

when I'm making public comments without a time crunch

My hunch is even when there's a time crunch, fewer words will be bigger bang for buck :-)

Comment author: Khorton 04 August 2018 05:13:43PM 13 points [-]

Seconded. As a time-saving measure, I skip any comments longer than three paragraphs unless the first couple of sentences makes their importance very clear. Unfortunately, that means I miss most of Evan's posts. :(

Comment author: Evan_Gaensbauer 04 August 2018 11:38:59PM 1 point [-]

Would it help if I included a summary of my posts at the top of them?

Often I write for a specific audience, which is more limited and exclusive. I don't think there is anything necessarily wrong with taking this approach to discourse in EA. Top-level posts on the EA Forum are made specific to a single cause, written in an academic style for a niche audience. I've mentally generalized this to how I write about anything on the internet.

It turns out not writing in a more inclusive way is harming the impact of my messages more than I thought. I'll make more effort to change this. Thanks for the feedback.

Comment author: Raemon 05 August 2018 08:41:16PM *  14 points [-]

FYI, I a) struggle to read most of your posts (and I seem like I'm supposed to be in the target audience)

b) the technique I myself use is "write the post the way I'd naturally write it (i.e. long and meandering), and then write a tldr of the post summarizing it with a few bullet points... and then realize that the tldr was all I actually needed to say in the first place."

Comment author: Khorton 05 August 2018 11:46:34AM 5 points [-]

Yes, an early summary would help. It doesn't have to be very formal; just a clear statement of your argument in the first paragraph.

If you're going to argue multiple things, you could use different comments.

Comment author: Evan_Gaensbauer 04 August 2018 11:21:25PM 3 points [-]

Of course. What I was trying to explain is that when there is a time crunch, I've habituated myself to use more words. Obviously it's a habit worth changing. Thanks for the feedback :)

Comment author: ole_koksvik 09 August 2018 11:55:03PM 1 point [-]

Yes, the old adage: "I don't have time to write short texts."

Comment author: nbouscal 04 August 2018 05:08:07PM 28 points [-]

I’m unconvinced that ole_koksvik’s fluency in English has anything to do with it. Fluent English speakers misspell words like “idiosyncratic” regularly, and I and other fluent English speakers also find your posts difficult to follow. I generally end up skimming them, because the ratio of content to unnecessary verbosity is really low. If your goal is to get your evidence and views onto the table as quickly as possible, consider that your current strategy isn’t getting them out there at all for some portion of your audience, and that a short delay for editing could significantly expand your reach.

Comment author: Evan_Gaensbauer 04 August 2018 11:47:16PM 3 points [-]

Yeah, that has become abundantly clear to me from how many upvotes these comments have received. I've received feedback on this before, but never with such a strong signal. Sometimes I have different goals with my public writing, so it's not always my intention for how I write to be maximally accessible to everyone. I usually know who reads my posts, and why they appreciate them, as I receive a lot of positive feedback as well. It's evident I've over-generalized that approach in this thread, to the point it's hurting the impact of spreading my message. So I completely agree. Thanks for the feedback :)

Comment author: Geoff_Anders 03 August 2018 04:15:03PM *  23 points [-]

Hi everyone,

I'd like to start off by apologizing. I realize that it has been hard to understand what Leverage has been doing, and I think that that's my fault. Last month Kerry Vaughan convinced me that I needed a new approach to PR and public engagement, and so I've been thinking about what to write and say about this. My plan, apart from the post here, was to post something over the next month. So I'll give a brief response to the points here and then make a longer main post early next week.

(1) I'm sorry for the lack of transparency and public engagement. We did a lot more of this in 2011-2012, but did not really succeed in getting people to understand us. After that, I decided we should just focus on our research. I think people expect more public engagement, even very early in the research process, and that I did not understand this.

(2) We do not consider ourselves an EA organization. We do not solicit funds from individual EAs. Instead, we are EA-friendly, in that (a) we employ many EAs, (b) we let people run EA projects, and (c) we contribute to EA causes, especially EA movement building. As noted in the post, we ran the EA Summit 2013 and EA Summit 2014. These were the precursors to the EA Global conferences. For a sense of what these were like, see the EA Summit 2013 video. We also ran the EA Retreat 2014 and helped out operationally with EA Global 2015. We also founded THINK, the first EA movement group network.

(3) We are probably not the correct donation choice for most EAs. We care about reason, evidence, and impact, but we are much higher variance than most EAs would like. We believe there is evidence that we are more effective than regular charities due to our contributions to movement building. These can be thought of as "impact offsets". (See (6) for more on the EV calculation.)

(4) We are also probably not the correct employment choice for most EAs. We are looking for people with particular skills and characteristics (e.g., ambition, dedication to reason and evidence, self-improvement). These make CFAR our largest EA competitor for talent, though in actual fact we have not ended up competing substantially with them. In general if people are excited about CEA or 80k or Charity Science or GiveWell or OPP, then we typically also expect that they are better served by working there.

(5) Despite (3) and (4), we are of course very interested in meeting EAs who would be good potential donors or recruits. We definitely recruit at EA events, though again we think that most EAs would be better served by working elsewhere.

(6) To do a full EV calculation on Leverage, it is important to take into account the counterfactual cost of employees who would work on other EA projects. We think that taking this into account, counting our research as 0 value, and using the movement building impact estimates from LEAN, we come out well on EV compared to an average charity. This is because of our contribution to EA movement building and because EA movement building is so valuable. (Rather than give a specific Fermi estimate, I will let readers make their own calculations.) Of course, under these assumptions donating to movement building alone is higher EV than donating to Leverage. Donors should only consider us if they assign greater than 0 value to our research.

I hope that that clarifies to some degree Leverage's relation to the EA movement. I'll respond to the specific points above later today.

As for the EA Summit 2018, we agree that everyone should talk with people they know before attending. This is true of any multi-day event. Time is valuable, and it's a good idea to get evidence of the value prior to attending.

(Leverage will not be officially presenting any content at the EA Summit 2018, so people who would like to learn more should contact us here. My own talk will be about how to plan ambitious projects.)

EDIT: I said in my earlier comment that I would write again this evening. I’ll just add a few things to my original post.

— Many of the things listed in the original post are simply good practice. Workshops should track participants to ensure the quality of their experience and that they are receiving value. CFAR also does this. Organizations engaged in recruitment should seek to proactively identify qualified candidates. I’ve spoken to the leaders of multiple organizations who do this.

— Part of what we do is help people to understand themselves better via introspection and psychological frameworks. Many people find this both interesting and useful. All of the mind mapping we did was with the full knowledge and consent of the person, at their request, typically with them watching and error-checking as we went. (I say “did” because we stopped making full mind maps in 2015.) This is just a normal part of showing people what we do. It makes sense for prospective recruits and donors to seek an in-depth look at our tools prior to becoming more involved. We also have strict privacy rules and do not share personal information from charting sessions without explicit permission from the person. This is true for everyone we work with, including prospective recruits and donors.

Comment author: Jeff_Kaufman 04 August 2018 02:34:54PM 30 points [-]

Hi Geoff,

In reading this I'm confused about the relationship between Paradigm and Leverage. People in this thread (well, mostly Evan) seem to be talking about them as if Leverage incubated Paradigm but the two are now fully separate. My understanding, however, was that the two organizations function more like two branches of a single entity? I don't have a full picture or anything, but I thought you ran both organizations, staff of both mostly live at Leverage, people move freely between the two as needed by projects, and what happens under each organization is more a matter of strategy than separate direction?

By analogy, I had thought the relationship of Leverage to Paradigm was much more like CEA vs GWWC (two brands of the same organization) or even CEA UK vs CEA USA (two organizations acting together as one brand) than CEA vs ACE (one organization that spun off another one, which now operates entirely independently with no overlap of staff etc).

Jeff

Comment author: Geoff_Anders 05 August 2018 03:52:50AM 13 points [-]

Hi Jeff,

Sure, happy to try to clarify. I run both Leverage and Paradigm. Leverage is a non-profit and focuses on research. Paradigm is a for-profit and focuses on training and project incubation. The people in both organizations closely coordinate. My current expectation is that I will eventually hand Leverage off while working to keep the people on both projects working together.

I think this means we’re similar to MIRI/CFAR. They started with a single organization which led to the creation of a new organization. Over time, their organizations came to be under distinct leadership, while still closely coordinating.

To understand Leverage and Paradigm, it’s also important to note that we are much more decentralized than most organizations. We grant members of our teams substantial autonomy in both determining their day-to-day work and with regard to starting new projects.

On residence, new hires typically live at our main building for a few months to give them a place to land and then move out. Currently less than 1/3 of the total staff live on-site.

Comment author: Jeff_Kaufman 06 August 2018 12:27:08PM 21 points [-]

Thanks for clarifying!

Two takeaways for me:

  • Use of both the "Paradigm" and "Leverage" names isn't a reputational dodge, contra throwaway in the original post. The two groups focus on different work and are in the process of fully dividing.

  • People using what they know about Leverage to inform their views of Paradigm is reasonable given their level of overlap in staff and culture, contra Evan here and here.

Comment author: weeatquince  (EA Profile) 05 August 2018 09:57:53PM *  23 points [-]

counting our research as 0 value, and using the movement building impact estimates from LEAN, we come out well on EV compared to an average charity ... I will let readers make their own calculations

Hi Geoff. I gave this a little thought and I am not sure it works. In fact, it looks quite plausible that someone's EV (expected value) calculation on Leverage might actually come out as negative (i.e., Leverage would be causing harm to the world).

This is because:

  • Most EA orgs calculate their counterfactual expected value by taking into account what the people in that organisation would be doing otherwise if they were not in that organisation and then deduct this from their impact. (I believe at least 80K, Charity Science and EA London do this)

  • Given Leverage's tendency to hire ambitious altruistic people and to look for people at EA events it is plausible that a significant proportion of Leverage staff might well have ended up at other EA organisations.

  • There is a talent gap at other EA organisations (see 80K on this)

  • Leverage does spend some time on movement building, but I estimate that this is a tiny proportion of its time (<5%, best guess 3%, based on having talked to people at Leverage and also on comparing your achievements to date with the apparent 100 person-years figure)

  • Therefore, if the proportion of staff who could be expected to have found jobs in other EA organisations is above 3% (which seems reasonable), then Leverage is actually displacing EAs from productive action, so the total EV of Leverage is negative

Of course this is all assuming the value of your research is 0. This is the assumption you set out in your post. Obviously in practice I don’t think the value of your research is 0 and as such I think it is possible that the total EV of Leverage is positive*. I think more transparency would help here. Given that almost no research is available I do think it would be reasonable for someone who is not at Leverage to give your research an EV of close to 0 and therefore conclude that Leverage is causing harm.
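To make the arithmetic here concrete, below is a minimal Fermi sketch of the inequality this argument turns on. Every number in it is an illustrative assumption rather than a real estimate: the 100 person-years figure comes from the post, the fractions are placeholders, and a person-year of movement building is valued the same as a person-year of displaced work for simplicity.

```python
# Minimal Fermi sketch of the counterfactual EV argument above.
# All numbers are illustrative assumptions, not real estimates.

person_years = 100       # total person-years consumed since 2011 (from the post)
value_per_py = 1.0       # value of one person-year at an alternative EA org (normalized)

research_value = 0.0     # Geoff's stated assumption: count the research as 0
movement_frac = 0.03     # share of time spent on movement building (best guess above)
displaced_frac = 0.05    # share of staff who would otherwise work at EA orgs (assumption)

ev = (research_value
      + movement_frac * person_years * value_per_py    # movement-building impact
      - displaced_frac * person_years * value_per_py)  # displaced counterfactual work

print(f"Estimated EV: {ev:+.1f} person-year-equivalents")
# Prints -2.0 here; the sign is negative whenever displaced_frac exceeds
# movement_frac, regardless of how value_per_py is normalized.
```

On these placeholder numbers the estimate comes out at -2 person-year-equivalents, which is just the bullet-point argument restated: with the research valued at 0, the sign of the EV depends only on whether the displaced fraction exceeds the movement-building fraction.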

I hope this helps and maybe explains why Leverage gets a bad rep. I am excited to see more transparency and a new approach to public engagement. Keep on fighting for a better world!

*sentence edited to better match views

Comment author: Paul_Crowley 04 August 2018 10:35:43PM 14 points [-]

Could you comment specifically on the Wayback Machine exclusion? Thanks!

Comment author: Khorton 03 August 2018 11:15:21PM 5 points [-]

What have you done to promote movement building? I didn't see anything on the post or your website, other than the summit next week.

Comment author: Geoff_Anders 04 August 2018 06:12:46AM *  17 points [-]

Leverage:

(1) founded THINK, the first EA student group network

(2) ran the EA Summit 2013, the first large EA conference (video)

(3) ran the EA Summit 2014

(4) ran the EA Retreat 2014, the first weeklong retreat for EA leaders

(5) handed off the EA Summit series to CEA; CEA renamed it EA Global

(6) helped out operationally with EA Global 2015.

Comment author: Dunja 04 August 2018 02:23:52PM 3 points [-]

Part of what we do is help people to understand themselves better via introspection and psychological frameworks.

Could you please specify which methods of introspection and which psychological frameworks you employ to this end, and what evidence you use to ensure these frameworks rest on adequate scientific evidence, obtained by reliable methods?

Comment author: kbog  (EA Profile) 05 August 2018 05:41:26AM *  6 points [-]

I honestly don't get all this stuff about not publishing your work. Time to brag, boy will I get shit on for this comment, but it's really relevant to the issue here: I never even had a minor in the subject, but when I had a good philosophical argument I got it published in a journal, and it wasn't that hard. Peer reviewed, not predatory, went through three rounds of revisions. Not a prestigious journal by any stretch of the imagination, but it proves that I knew what I was doing, which is good enough. You think that peer review is bullshit, fine: that means it's not that hard. With your supposedly superior understanding of academic incentives and meta-science and all that stuff, I'm sure you can dress up something so that it tickles the reviewers in the right way. Not wanting to mess with it most of the time is understandable, but you can still do us the courtesy of at least getting one or two things through the gauntlet so that we aren't left scratching our heads in confusion about whether we're looking at Kripke or Timecube or something in between. MIRI did it so you can too. Plus, it sounds like lots of this research is being kept hidden from public view entirely, which I just can't fathom.

The movement building sounds like good stuff, however; I'm happy to see that.

Comment author: anonymous 03 August 2018 06:55:26PM *  3 points [-]

Some participants of the Pareto fellowship have told me that Leverage resembles a cult. I can't remember many specifics. One thing is that the main guy (Geoff Anders?) thinks, 100% in earnest, that he's the greatest philosopher who's ever lived.

Comment author: Evan_Gaensbauer 03 August 2018 10:03:28PM *  4 points [-]
  1. The CEA, the very organization you juxtaposed with Leverage and Paradigm in this comment, has in the past been compared to a Ponzi scheme. Effective altruists who otherwise appreciated that criticism thought much of its value was lost in the comparison to a Ponzi scheme, and that without it, the criticism might have been better received. Additionally, LessWrong and the rationality community; CFAR and MIRI; and all of AI safety have for years been smeared as cults by their detractors. The rationality community isn't perfect. There is no guarantee interactions with a self-identified (aspiring) rationality community will "rationally" go however an individual or small group of people interacting with the community, online or in person, hopes or expects. But the vast majority of effective altruists, even those who are cynical about these organizations or sub-communities within EA, disagree with how these organizations have been treated, for it poisons the well of good will in EA for everyone. In this comment, you stated your past experience with the Pareto Fellowship and Leverage left you feeling humiliated and manipulated. I've also been a vocal critic in person throughout the EA community of both Leverage Research and how Geoff Anders has led the organization. But elevating personal opposition to them into a public airing of opposition research, in an attempt to tarnish an event they're supporting alongside many other parties in EA, is not something I ever did, or will do. My contacts in EA and I have followed Leverage. I've desisted from making posts like this myself, because in digging for context I found Leverage has changed from the impressions I'd gotten of them. And that's why at first I was skeptical of attending the EA Summit. But upon reflection, I realized the evidence didn't support concluding that Leverage is so incapable of change that anything they're associated with should be distrusted. But what you're trying to do to Leverage Research is no different from what EA's worst critics do, not in an effort to change EA or its members, but to tarnish them. From within or outside of EA, to criticize any EA organization in such a fashion is below any acceptable epistemic standard in this movement.

  2. If the post and comments here are stating facts about Leverage Research, and you're reporting impressions that Leverage is like a cult with no ability to remember specific details, those are barely facts. The only fact is that some people perceived Leverage to be like a cult in the past, and that is only anecdote. Without details, it's only hearsay. Combined with the severity of the consequences if this hearsay were borne out, the inability to produce actual facts invalidates the point you're trying to make.

Comment author: Jeff_Kaufman 04 August 2018 12:48:16PM *  10 points [-]

Given there are usernames like "throwaway" and "throwaway2," and knowing the EA Forum, and its precursor, LessWrong, I'm confident there is only one account under the username "anonymous," and that all the comments on this post using this account are coming from the same individual.

I'm confused: the cases on Less Wrong where comments by "person" and "personN" were the same person happened when importing from Overcoming Bias. That wouldn't be happening here.

They might still be the same person, but I don't think this forum being descended from LessWrong's code tells us anything one way or the other.

Comment author: Evan_Gaensbauer 05 August 2018 12:24:15AM 1 point [-]

Thanks. I wasn't aware of that. I'll redact that part of my comment.

Comment author: throwaway2 04 August 2018 09:37:47AM 8 points [-]

Given there are usernames like "throwaway" and "throwaway2," and knowing the EA Forum, and its precursor, LessWrong, I'm confident there is only one account under the username "anonymous," and that all the comments on this post using this account are coming from the same individual.

I don't feel comfortable sharing the reasons for remaining anonymous in public, but I would be happy to disclose my identity to a trustworthy person to prove that this is my only fake account.

Comment author: Evan_Gaensbauer 04 August 2018 11:15:20PM 1 point [-]

Upvoted. I'm sorry for the ambiguity of my comment. I meant that the posts here under the usernames "throwaway," "throwaway2," and "anonymous" are each consistently being made by the same three people, respectively. I was just clarifying up front, as I addressed you, so that others reading would know it's almost certainly the same anonymous individual making the comments under that account. I wouldn't expect you to forgo your anonymity.

Comment author: kbog  (EA Profile) 05 August 2018 09:02:18AM *  4 points [-]

Your comments seem to be way longer than they need to be because you don't trust other users here. Like, if someone comes and says they felt like it was a cult, I'm just going to think "OK, someone felt like it was a cult." I'm not going to assume that they are doing secret blood rituals, I'm not going to assume that it's a proven fact. I don't need all these qualifications about the difference between cultishness and a stereotypical cult, I don't need all these qualifications about the inherent uncertainty of the issue, that stuff is old hat. This is the EA Forum, an internal space where issues are supposed to be worked out calmly; surely here, if anywhere, is a place where frank criticism is okay, and where we can extend the benefit of the doubt. I think you're wasting a lot of time, and implicitly signaling that the issue is more of a drama mine than it should be.

Comment author: Evan_Gaensbauer 05 August 2018 06:56:07PM 1 point [-]

I admit I'm coming from a place of not entirely trusting all other users here. That may be a factor in why my comments in this thread are longer than they need to be. I tend to write more than is necessary in general. For what it's worth, I treat the EA Forum not as an internal space but as how I'd ideally like to see it used: as a primary platform for EA discourse, with a level of activity more akin to the 'Effective Altruism' Facebook group or LessWrong.

I admit I've been wasting time. I've stopped responding directly to the OP because if I'm coming across as implicitly signaling this issue is a drama mine, I should come out and say what I actually believe. I may make a top-level post about it. I haven't decided yet.

Comment author: BenHoffman 05 August 2018 01:14:25AM 4 points [-]

"Compared to a Ponzi scheme" seems like a pretty unfortunate compression of what I actually wrote. Better would be to say that I claimed that a large share of ventures, including a large subset of EA, and the US government, have substantial structural similarities to Ponzi schemes.

Maybe my criticism would have been better received if I'd left out the part that seems to be hard for people to understand; but then it would have been different and less important criticism.

Comment author: Evan_Gaensbauer 05 August 2018 06:49:39PM *  0 points [-]

[epistemic status: meta]

Summary: The comments in this thread resemble reactions I've seen you and other rationality bloggers receive from effective altruists on critical posts regarding EA. I think there is a pattern in how rationalists tend to write on important topics that doesn't gel with the typical EA mindset. Consequently, the pragmatic thing for us to do would be to figure out how to alter how we write so our message gets across to a broader audience.

"Compared to a Ponzi scheme" seems like a pretty unfortunate compression of what I actually wrote. Better would be to say that I claimed that a large share of ventures, including a large subset of EA, and the US government, have substantial structural similarities to Ponzi schemes.

Upvoted.

I don't know if you've read some of the other comments in this thread, but some of the most upvoted ones are about how I need to change my writing style. So unfortunate compressions of what I actually write aren't new to me either. I'm sorry I compressed what you actually wrote. But even an accurate compression of what you wrote might make my comments too long for what most users prefer on the EA Forum, and if I just linked to your original post, it would be too long for most to read.

I spend more of my time on EA projects. If there were more promising projects coming out of the rationality community, I'd spend more time on them relative to EA projects; I go where the action is. Socially, I'm as involved with the rationality community as with EA, if not more so.

From my inside view, here is how I'd describe the common problem with my writing on the EA Forum: I came here from LessWrong. Relative to LW, I haven't found what I write on the EA Forum to be too long, but that's because I'm anchoring on EA discourse looking like SSC 100% of the time. Since the majority of EAs don't self-identify as rationalists, and the movement is so intellectually diverse, the expectation is that the EA Forum won't be formatted around any discourse style common to the rationalist diaspora.

I've touched on this issue with Ray Arnold before, and Zvi has touched on it too in some of his blog posts about EA. A crude rationalist impression might be that the problem with discourse on the EA Forum is that it isn't LW. In terms of genres of creative non-fiction writing, the EA Forum is less tolerant of diversity than LW. That's fine. Thinking about this consequentially, I think rationalists who want their message heard more by EA don't need to learn to write better, but to write differently.

Comment author: Evan_Gaensbauer 04 August 2018 06:39:07AM *  0 points [-]

Leverage Research has now existed for over 7.5 years1 Since 2011, it has consumed over 100 person-years of human capital.

By their own admission in a comment on the original post, the author is providing these facts so effective altruists can make an informed decision about attending the 2018 EA Summit, with the expectation that these facts can or will discourage EAs from attending. Given that, it's unclear how these facts are relevant information.

  • In particular, no calculation or citation is provided for the estimate that Leverage has consumed over 100 person-years of human capital. Numbers from nowhere aren't facts, so this isn't even a fact.

  • Regardless, no context or reference is given for why these numbers matter, e.g., a contrast between Leverage and what popular EA organizations have accomplished over similar timeframes or person-years of human capital consumed.

From 2012-16, Leverage Research spent $2.02 million, and the associated Institute for Philosophical Research spent $310k.23

As comments from myself; Tara MacAulay, former CEO of the CEA; and Geoff Anders, executive director of Leverage, have made clear, Leverage:

  • has never solicited, and does not intend to solicit, donations from individuals in the EA community at large.

  • has in the past identified as part of the EA movement, and was formative to the movement in its earlier years, but now identifies as distinct from EA, while still respecting EA and collaborating with EA organizations where their goals overlap with Leverage's.

  • does not present itself as effective or impactful using the evaluation criteria most typical of EA, and shouldn’t be evaluated on those grounds, as has been corroborated by EA organizations which have collaborated with Leverage in the past.

Based on this, the ~$2 million Leverage spent from 2012-16 shouldn't, as a lump sum, be regarded as having been spent under an EA framework or on EA grounds, nor used as a means to discourage individual effective altruists from forming independent associations with Leverage distinct from EA as a community. Both EA and Leverage confirm that Leverage was in the past, but for the present and last few years should not be, thought of as an EA organization. Thus, arguing Leverage is deceiving the EA movement on the grounds that they stake a claim on EA without being effective is invalid, because Leverage does no such thing.

Leverage Research previous organized the Pareto Fellowship in collaboration with another effective altruism organization. According to one attendee, Leverage staff were secretly discussing attendees using an individual Slack channel for each.

While this, like the facts in the above section, is a fact, I fail to see how it's notable with regard to Leverage's recruitment transparency. I've also in the past criticized double standards regarding transparency in the EA movement, holding that organizations in EA should not form secret fora to the exclusion of others, because necessary privacy among and between EA organizations can be ensured with things like private email, Slack channels, etc. What's more, every EA organization I or others I've talked to have volunteered with has something like a Slack channel. When digital communications internal to an organization are necessary to its operation, it has become standard practice to use something like an internal mailing list or Slack channel exclusive to staff. That the Pareto Fellowship or Leverage Research would have Slack channels for evaluating potential fellows for recruitment on an individual basis may be unusual among EA organizations, but it's not unheard of in how competent organizations operate. Also, it has no bearing on whether Leverage appeals to transparency while being opaque in a way other organizations associated with EA aren't.

Also, since you’re seeking as much transparency about Leverage as possible, I expect your presentation will be transparent in kind. Thus, would you mind identifying the EA organization in question which was part of the collaboration with Leverage and the Pareto Fellowship you’re referring to?

Leverage Research sends staff to effective altruism organizations to recruit specific lists of people from the effective altruism community, as is apparent from discussions with and observation of Leverage Research staff at these events.

As with the last statement, this may be unusual among EA organizations, but it dates to Leverage's past identification as an EA organization, which they no longer claim. There is nothing about this which is inherently a counter-effective organizational or community practice inside or outside of the EA movement, nor does it have direct relevance to transparency, nor to the author's goal with this post.

Leverage Research has spread negative information about organisations and leaders that would compete for EA talent.

Who?

Leverage Research has had a strategy of using multiple organizations to tailor conversations to the topics of interest to different donors.

As with other statements, I don't understand how transparently exposing this practice is, as a fact, meant to back the author's goal with this post, nor how it should move readers' impression of Leverage in any sense.

Leverage Research had longstanding plans to replace Leverage Research with one or more new organizations if the reputational costs of the name Leverage Research ever become too severe.

Given that a number of the claims presented as facts in this post and in the comments from the same author aren't in fact facts, and that so many of the stated facts are presented without context or relevance to the author's goal, I'd like to see this claim substantiated by any evidence whatsoever. Otherwise, I won't find it credible enough to believe.

In short, regarding the assorted facts, the author of this post (by their own admission in a comment response) is trying to prove something, and I can't perceive how these facts and other claims advance that goal. So my question to the author is: what is your point?

Comment author: ZachWeems 05 August 2018 12:01:52AM 0 points [-]

Meta:

It might be worthwhile to have some sort of flag or content warning for potentially controversial posts like this.

On the other hand, this could be misused by people who dislike the EA movement, who could use it as a search parameter to find and "signal-boost" content that looks bad when taken out of context.

Comment author: Denise_Melchin 05 August 2018 08:47:01AM 8 points [-]

What are the benefits of this suggestion?

Comment author: kbog  (EA Profile) 05 August 2018 08:40:59AM *  7 points [-]

This is a romp through meadows of daisies and sunflowers compared to what real Internet drama looks like. It's perfectly healthy for a bunch of people to report on their negative experiences and debate the effectiveness of an organization. It will only look controversial if you frame it as controversial; people will only think it is a big deal if you act like it is a big deal.

Comment author: Evan_Gaensbauer 05 August 2018 07:01:29PM 1 point [-]

I agree with kbog: while this is unusual discourse for the EA Forum, it's still far above the bar at which I think it's practical to worry about controversy. If someone thinks the content of a post on the EA Forum might trigger some readers, I don't see anything wrong with including content warnings on posts. I'm unsure what you mean by "flagging" potentially controversial content.