Comment author: MichaelDickens 11 September 2018 02:52:34AM 4 points

Another feature that could help people find old posts is to display a few random old posts on a sidebar. For example, on any of Jeff Kaufman's blog posts, five old posts are displayed in the sidebar. I've found lots of interesting old posts on Jeff's blog via this feature.

Comment author: Jeff_Kaufman 19 September 2018 05:58:15PM 0 points

a few random old posts on a sidebar

In my case I just have a list of posts I thought were good and want more people to see, but in a forum with voting you could show highly upvoted older posts.

Comment author: John_Maxwell_IV 27 August 2018 12:29:34AM * 2 points

(I'm assuming that the counterfactual here is someone who wants to do unpaid direct work full time, has some funds available that could be used to either support themselves or could be donated to something high impact, and could either live in SF or Blackpool.)

If you have a high income, though, you can pay other people to do them: for example, instead of cooking you could buy frozen food, buy restaurant food, or hire a cook.

These options don't go away if you move to Blackpool. But your rent does get a lot cheaper.

It seems like maybe there are two questions here which are more or less orthogonal: the value of hiring a very talented full-time manager for your group house (someone who is passing up a job that pays $75K+ in order to be manager), and the value of moving to Blackpool. I think the value of having a very talented full-time manager for your group house is not about reducing expenses, it's about creating a house culture that serves to multiply the impact of all the residents. If that's not possible then it probably makes less sense to hire a manager whose opportunity cost is high.

Comment author: Jeff_Kaufman 27 August 2018 05:16:46PM 1 point

I'm assuming that the counterfactual here is someone who wants to do unpaid direct work full time, has some funds available that could be used to either support themselves or could be donated to something high impact, and could either live in SF or Blackpool.

Is this the counterfactual for the hotel manager, or for a resident? I'm only trying to address the hotel manager role here, but I wouldn't expect the counterfactual for a hotel manager to be unpaid direct work.

I think the value of having a very talented full-time manager for your group house is not about reducing expenses, it's about creating a house culture that serves to multiply the impact of all the residents

This makes a lot of sense to me, but reading the Hotel Manager section, the impression I get is that a hotel manager would be too busy to do much in that direction: there's no discussion of their role in setting culture, but a lot about operations work.

Comment author: John_Maxwell_IV 21 August 2018 09:30:15PM 2 points

Whether you live in a hotel or not, there are certain chores that need to be done for your life to run smoothly: grocery shopping, cooking, laundry, etc. These chores don't go away if you live in an expensive housing market or make a high income. But if you live with roommates, it's possible to coordinate with your roommates to achieve economies of scale in these tasks. Right now at the EA hotel, we take turns cooking for the entire hotel (currently ~6 people), with each of us cooking one night per week. This creates economies of scale because cooking for 6 people is much less than 6 times as hard as cooking for one person. I expect that these economies of scale effects will become even more valuable as the number of people in the hotel grows.
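[A minimal sketch of this economies-of-scale reasoning, assuming a simple model in which cooking has a fixed overhead (planning, heating, cleanup) plus a small marginal cost per diner; all numbers are illustrative placeholders, not measurements from the hotel:]

```python
# Toy model: total effort = fixed overhead + marginal effort per diner.
# All numbers are illustrative placeholders.
FIXED_OVERHEAD_MIN = 40.0     # minutes of planning, heating, cleanup
MARGINAL_MIN_PER_DINER = 5.0  # extra minutes per additional diner

for diners in (1, 6, 12):
    total = FIXED_OVERHEAD_MIN + MARGINAL_MIN_PER_DINER * diners
    print(f"n={diners}: {total / diners:.1f} min per person")
# n=1: 45.0 min per person
# n=6: 11.7 min per person
# n=12: 8.3 min per person
```

[Under this model per-person effort falls quickly at first and then flattens; a purely linear marginal cost also ignores the coordination overhead and drop-off past 6-10 people that Jeff raises below.]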

Comment author: Jeff_Kaufman 24 August 2018 06:14:25PM * 1 point

These chores don't go away if you live in an expensive housing market or make a high income.

If you have a high income, though, you can pay other people to do them: for example, instead of cooking you could buy frozen food, buy restaurant food, or hire a cook.

I expect that these economies of scale effects will become even more valuable as the number of people in the hotel grows.

My experience with cooking is that above about 6-10 people the economies of scale drop off a lot. I really like living in a house with enough adults that I can cook about once a week, but as the number of people (and combinations of dietary restrictions) grows you get beyond what one person can cook easily.

Overall, though, it sounds like you're more arguing for "group houses are great" (which I agree on) and not "taking the hotel manager job has high counterfactual impact" (which I think is much more important?)

Comment author: PeterSinger 13 May 2017 11:47:33PM 8 points

These are good points and I'm suitably chastened for not being sufficiently thorough in checking Toby Ord's claims.

I'm pleased to see that GiveWell is again investigating treating blindness: http://blog.givewell.org/2017/05/11/update-on-our-views-on-cataract-surgery/. In this very recent post, they say: "We believe there is evidence that cataract surgeries substantially improve vision. Very roughly, we estimate that the cost-effectiveness of cataract surgery is ~$1,000 per severe visual impairment reversed.[1]"

The footnote reads: "This estimate is on the higher end of the range we calculated, because it assumes additional costs due to demand generation activities, or identifying patients who would not otherwise have known about surgery. We use this figure because we expect that GiveWell is more likely to recommend an organization that can demonstrate, through its demand generation activities, that it is causing additional surgeries to happen. The $1,000 figure also reflects our sense that cost-effectiveness in general tends to worsen (become more expensive) as we spend more time building our model of any intervention. Finally, it is a round figure that communicates our uncertainty about this estimate overall."

But it's reasonable to say that until they complete this investigation, which will be years rather than months, it may be better to avoid using the example of preventing or curing blindness. So the options seem to be either not using the example of blindness at all, or using this rough figure of $1,000, with suitable disclaimers. It still leads to 40 cases of severe visual impairment reversed vs. 1 case of providing a blind person with a guide dog.

Comment author: Jeff_Kaufman 20 August 2018 12:58:04PM 1 point

It looks like GiveWell put that project on hold in January 2018: https://www.givewell.org/charities/IDinsight/partnership-with-idinsight/cataract-surgery-project

Comment author: Ro-bot-tens 07 August 2018 10:21:49PM 2 points

And a reminder that a 30” door has 29” of entry clearance if the door is taken off the hinges (because of the stops on the frame). If the door is opened at 90 degrees, a 30” door might barely allow 27” of clearance. The challenge might draw some attention to the event, so use the type of skills described here to make everybody comfortable.

Comment author: Jeff_Kaufman 09 August 2018 12:44:42AM 0 points

Good point! I just measured some standard cheap new construction doors and found:

  • You lose 3/8" on each side to the jamb.

  • The door open to 90° loses you 1 5/8" on top of the jamb.

So a 30" door has a clear opening of 27 5/8" (or 29 1/4" with the door off).
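[A minimal sketch of that arithmetic, using the measurements above; exact dimensions vary by door and jamb:]

```python
# Clear-opening arithmetic for a hinged door, in inches.
# Measurements are the ones reported above; yours may differ.
from fractions import Fraction

door_width = Fraction(30)             # nominal slab width
jamb_loss_each_side = Fraction(3, 8)  # lost to the jamb/stops, per side
door_at_90_loss = Fraction(13, 8)     # 1 5/8": measured loss with the door open to 90 degrees

clearance_door_off = door_width - 2 * jamb_loss_each_side   # 29 1/4"
clearance_door_open = clearance_door_off - door_at_90_loss  # 27 5/8"

print(clearance_door_off, clearance_door_open)  # 117/4 221/8
```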

Comment author: Evan_Gaensbauer 03 August 2018 10:38:38PM * 8 points

helps them recruit people

Do you mind clarifying what you mean by "recruit people"? I.e., do you mean they recruit people to attend the workshops, or to join the organizational staff?

I have spoken with four former interns/staff who pointed out that Leverage Research (and its affiliated organizations) resembles a cult according to the criteria listed here.

In this comment I laid out the threat posed to EA as a cohesive community when those within it, like the worst detractors of EA and adjacent communities, level blanket accusations that an organization is a cult. Also, that comment could only cite a handful of people describing Leverage as cult-like, while admitting they could not recall any specific details. I already explained that such a report doesn't qualify as a fact, nor even an anecdote, but as hearsay, especially since further details aren't being provided.

I'm disinclined to take seriously more hearsay of a mysterious impression of Leverage as cultish, given the poor faith in which my other interlocutor was acting. Since none of the former interns or staff behind this hearsay of Leverage being cult-like are coming forward to corroborate which features of a cult from the linked Lifehacker article Leverage shares, I'm unconvinced that your report and the others of Leverage being cult-like aren't being taken out of context from the individuals you originally heard them from, nor that this post and the comments aren't a deliberate attempt to do nothing but tarnish Leverage.

The EA Summit 2018 website lists LEAN, Charity Science, and Paradigm Academy as "participating organizations," implying they're equally involved. However, Charity Science is merely giving a talk there. In private conversation, at least one potential attendee was told that Charity Science was more heavily involved. (Edit: This issue seems to be fixed now.)

Paradigm Academy was incubated by Leverage Research, as many organizations in and around EA are by others (e.g., MIRI incubated CFAR; CEA incubated ACE, etc.). As far as I can tell now, like with those other organizations, Paradigm and Leverage should be viewed as two distinct organizations. So that itself is not a fact about Leverage, which I also went over in this comment.

As I stated in that comment as well, there is a double standard at play here. EA Global each year is organized by the CEA. They aren't even the only organization in EA with the letters "EA" in their name, nor are they exclusively considered among EA organizations able to wield the EA brand. And yet despite all this, nobody objects on priors to the CEA as a single organization branding these events each year. As we shouldn't. Of course, none of this is necessary to invalidate the point you're trying to make. Julia Wise, as the Community Liaison for the CEA, has already clarified that the CEA itself supports the Summit.

So the EA Summit has already been legitimized by multiple EA organizations as a genuine EA event, including the one which is seen as the default legitimate representation for the whole movement.

(low confidence) I've heard through the grapevine that the EA Summit 2018 wasn't coordinated with other EA organizations except for LEAN and Charity Science.

As above, the fact that the EA Summit wasn't coordinated by more than one organization means nothing. There are already EA retreat- and conference-like events organized by local university groups and national foundations all over the world, which have gone well, such as the Czech EA Retreat in 2017. So the idea that EA should be so centralized that only registered non-profits with a given caliber of prestige in the EA movement, or those they approve of, can organize events viewed as legitimate by the community is unfounded. Not even the CEA wants that degree of centralization. Nobody does. So whatever point you're trying to prove about the EA Summit using facts about Leverage Research is still invalid.

For what it's worth, while no other organizations are officially participating, here are some effective altruists who will be speaking at the EA Summit, and the organizations they're associated with. At EA Global, this would be sufficient to warrant identifying those organizations as in spirit welcome and included. So the same standard should apply to the EA Summit.

  • Ben Pace, Ray Arnold and Oliver Habryka: LessWrong isn't an organization, but it's played a formative role in EA, and with LW's new codebase being the kernel of the next version of the EA Forum, Ben and Oliver as admins and architects of the new LW are as important representatives of this online community as any in EA's history.

  • Rob Mather is the ED of the AMF. AMF isn't typically regarded as an "EA organization" because they're not a metacharity dependent directly on the EA movement. But for GiveWell's top-recommended charity since EA began, which continues to receive more donations from effective altruists than any other, not to be given consideration would be senseless.

  • Sarah Spikes runs the Berkeley REACH.

  • Holly Morgan is a staffer for the EA London organization.

In reviewing these speakers, and seeing so many from LEAN and Rethink Charity, with Kerry Vaughan being a director for individual outreach at CEA, I see what the EA Summit is trying to do. They're trying to use speakers at the event to rally local EA group organizers from around the world to more coordinated action and spirited projects. Which is exactly what the organizers of the EA Summit have been saying the whole time. This is also why I was invited to attend the EA Summit: as an organizer for rationality and EA projects in Vancouver, Canada, trying to develop a system for organizing local groups to do direct work that can scale both here and in cities everywhere; and as a very involved volunteer online community organizer in EA. It's also why one of the event organizers consulted with me, before the EA Summit was announced, on how they thought it should be presented to the EA community.

This isn't counterevidence against skepticism of Leverage. This is evidence against the thesis that the EA Summit is nothing but a launchpad for Leverage's rebranding within the EA community as "Paradigm Academy," the thesis being advanced alongside these facts about Leverage Research. No logical evidence has been presented that the tenuous links between Leverage and the organization of the 2018 EA Summit entail that the negative reputation Leverage has acquired over the years should be transferred onto the upcoming Summit.

Comment author: Jeff_Kaufman 06 August 2018 12:29:29PM * 10 points

Paradigm Academy was incubated by Leverage Research, as many organizations in and around EA are by others (e.g., MIRI incubated CFAR; CEA incubated ACE, etc.). As far as I can tell now, like with those other organizations, Paradigm and Leverage should be viewed as two distinct organizations.

See Geoff's reply to me above: Paradigm and Leverage will at some point be separate, but right now they're closely related (both under Geoff, etc.). I don't think viewing them as separate organizations, where learning something about Leverage should not much affect your view of Paradigm, makes sense, at least not yet.

Comment author: Geoff_Anders 05 August 2018 03:52:50AM 13 points

Hi Jeff,

Sure, happy to try to clarify. I run both Leverage and Paradigm. Leverage is a non-profit and focuses on research. Paradigm is a for-profit and focuses on training and project incubation. The people in both organizations closely coordinate. My current expectation is that I will eventually hand Leverage off while working to keep the people on both projects working together.

I think this means we’re similar to MIRI/CFAR. They started with a single organization which led to the creation of a new organization. Over time, their organizations came to be under distinct leadership, while still closely coordinating.

To understand Leverage and Paradigm, it’s also important to note that we are much more decentralized than most organizations. We grant members of our teams substantial autonomy in both determining their day-to-day work and with regard to starting new projects.

On residence, new hires typically live at our main building for a few months to give them a place to land and then move out. Currently less than 1/3 of the total staff live on-site.

Comment author: Jeff_Kaufman 06 August 2018 12:27:08PM 21 points

Thanks for clarifying!

Two takeaways for me:

  • Use of both the "Paradigm" and "Leverage" names isn't a reputational dodge, contra throwaway in the original post. The two groups focus on different work and are in the process of fully dividing.

  • People using what they know about Leverage to inform their views of Paradigm is reasonable given their level of overlap in staff and culture, contra Evan here and here.

Comment author: Evan_Gaensbauer 03 August 2018 11:20:48PM * -1 points

The reason for posting these facts now is that as of the time of writing, Leverage's successor, the Paradigm Academy is seeking to host the EA Summit in one week. The hope is that these facts would firstly help to inform effective altruists on the matter of whether they would be well-advised to attend, and secondly, what approach they may want to take if they do attend.

I've provided my explanations for the following in this comment:

  • No evidence has been provided that Paradigm Academy is Leverage's successor. While the OP stated facts about Leverage, all the comments declaring more facts about Leverage Research are merely casting spurious associations between Leverage Research and the EA Summit. Along with the facts, you've smuggled in an assumption amounting to nothing more than a conspiracy theory: that Leverage is rebranding itself as Paradigm Academy and organizing the 2018 EA Summit for some unclear and ominous reason. In addition to no logical reason or sound evidence being provided for why Leverage's negative reputation in EA should be transferred to the upcoming Summit, my interlocutors have admitted, or revealed, that their evidence from personal experience is weak. I've provided my direct personal experience knowing the parties involved in organizing the EA Summit, and having paid close attention from afar to Leverage's trajectory in and around EA, contrary to the unsubstantiated thesis that the 2018 EA Summit is some opaque machination by Leverage Research.

  • There is no logical connection between the facts about Leverage Research and the purpose of the upcoming EA Summit. Further, the claims presented as facts about the upcoming Summit aren't actually facts.

Leverage Research has recruited from the EA community using mind-maps and other psychological techniques, obtaining dozens of years of work, but doing little apparent good. As a result, the author views it as inadvisable for EAs to engage with Leverage Research and its successor, Paradigm Academy.

At this point, I'll just point out that the idea that Paradigm is necessarily, in any sense, Leverage's successor is based on no apparent evidence. So the author's advice doesn't logically follow from the claims made about Leverage Research. What's more, as I demonstrated in my other comments, this event isn't some unilateral attempt by Paradigm Academy to steer EA in some unknown direction.

Rather, they should seek the advice of mentors outside of the Leverage orbit before deciding to attend such an event.

As one of the primary organizers for the EA community in Vancouver, Canada; the primary organizer for the rationality community in Vancouver; a liaison for local representation of these communities with adjacent communities; and an organizer for many novel efforts to coordinate effective altruists, including the EA Newsletter, I don't know if I'd describe myself as a "mentor." But I know others who see me that way, and it wouldn't be unfair of me to say that, both digitally and geographically (on the west coast, in Vancouver, and in Canada), I am someone who creates more opportunities for many individuals to connect to EA.

Also, if it wasn't clear, I'm well outside the Leverage orbit. If someone wants to accuse me of being a hack for Leverage, I can make some effort to prove I'm not part of their orbit (though I'd like to state that I would still see that as unnecessarily poor faith in this conversation). Anyway, as an outsider and veteran EA community organizer, I'm willing to provide earnest and individuated answers to questions about why I'm going to the 2018 EA Summit, or why and what kind of other effective altruists should also attend. I am not speaking for anyone but myself. I'm willing to do this in-thread as replies to this comment; or, if others would prefer, on social media or in another EA Forum post. Because I don't have much time, and I'd like to answer such questions transparently, I will only answer questions publicly asked of me.

Based on past events such as the Pareto Fellowship, invitees who ultimately decide to attend would be well-advised to be cautious about recruitment, by keeping in touch with friends and mentors throughout.

Contrary to what the author stated in this post and comment, it doesn't follow that this event will be anything like the Pareto Fellowship, as there aren't any facts linking Leverage Research's past track record as an organization to the 2018 EA Summit.

For what it's worth to anyone, I intend to attend the 2018 EA Summit, and I offer as a friend my support and contact regarding any concerns other attendees may have.

Comment author: Jeff_Kaufman 06 August 2018 12:12:00PM * 12 points

See Geoff's reply to me below: Paradigm and Leverage will at some point be separate, but right now they're closely related (both under Geoff, etc.). I think it's reasonable for people to use Leverage's history and track record in evaluating Paradigm.

Comment author: Geoff_Anders 03 August 2018 04:15:03PM * 24 points

Hi everyone,

I'd like to start off by apologizing. I realize that it has been hard to understand what Leverage has been doing, and I think that that's my fault. Last month Kerry Vaughan convinced me that I needed a new approach to PR and public engagement, and so I've been thinking about what to write and say about this. My plan, apart from the post here, was to post something over the next month. So I'll give a brief response to the points here and then make a longer main post early next week.

(1) I'm sorry for the lack of transparency and public engagement. We did a lot more of this in 2011-2012, but did not really succeed in getting people to understand us. After that, I decided we should just focus on our research. I think people expect more public engagement, even very early in the research process, and that I did not understand this.

(2) We do not consider ourselves an EA organization. We do not solicit funds from individual EAs. Instead, we are EA-friendly, in that (a) we employ many EAs, (b) we let people run EA projects, and (c) we contribute to EA causes, especially EA movement building. As noted in the post, we ran the EA Summit 2013 and EA Summit 2014. These were the precursors to the EA Global conferences. For a sense of what these were like, see the EA Summit 2013 video. We also ran the EA Retreat 2014 and helped out operationally with EA Global 2015. We also founded THINK, the first EA movement group network.

(3) We are probably not the correct donation choice for most EAs. We care about reason, evidence, and impact, but we are much higher variance than most EAs would like. We believe there is evidence that we are more effective than regular charities due to our contributions to movement building. These can be thought of as "impact offsets". (See (6) for more on the EV calculation.)

(4) We are also probably not the correct employment choice for most EAs. We are looking for people with particular skills and characteristics (e.g., ambition, dedication to reason and evidence, self-improvement). These make CFAR our largest EA competitor for talent, though in actual fact we have not ended up competing substantially with them. In general if people are excited about CEA or 80k or Charity Science or GiveWell or OPP, then we typically also expect that they are better served by working there.

(5) Despite (3) and (4), we are of course very interested in meeting EAs who would be good potential donors or recruits. We definitely recruit at EA events, though again we think that most EAs would be better served by working elsewhere.

(6) To do a full EV calculation on Leverage, it is important to take into account the counterfactual cost of employees who would work on other EA projects. We think that taking this into account, counting our research as 0 value, and using the movement building impact estimates from LEAN, we come out well on EV compared to an average charity. This is because of our contribution to EA movement building and because EA movement building is so valuable. (Rather than give a specific Fermi estimate, I will let readers make their own calculations.) Of course, under these assumptions donating to movement building alone is higher EV than donating to Leverage. Donors should only consider us if they assign greater than 0 value to our research.
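[Geoff declines to give a specific Fermi estimate, so the following is purely a hypothetical template for the kind of calculation described in (6). Every number below is a made-up placeholder, not an estimate from Leverage, LEAN, or anyone else:]

```python
# Hypothetical EV template for the calculation described above.
# Every number is a placeholder; plug in your own estimates.
staff_years = 50.0                   # placeholder: total staff-years invested to date
value_per_counterfactual_year = 1.0  # value of a staff-year at an average EA org (normalized)
movement_building_value = 75.0       # placeholder: credited value of movement building
research_value = 0.0                 # counted as 0, per the stated assumption

ev_vs_average_charity = (
    movement_building_value
    + research_value
    - staff_years * value_per_counterfactual_year
)
print(ev_vs_average_charity)  # positive iff movement-building value exceeds counterfactual cost
```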

I hope that that clarifies to some degree Leverage's relation to the EA movement. I'll respond to the specific points above later today.

As for the EA Summit 2018, we agree that everyone should talk with people they know before attending. This is true of any multi-day event. Time is valuable, and it's a good idea to get evidence of the value prior to attending.

(Leverage will not be officially presenting any content at the EA Summit 2018, so people who would like to learn more should contact us here. My own talk will be about how to plan ambitious projects.)

EDIT: I said in my earlier comment that I would write again this evening. I’ll just add a few things to my original post.

— Many of the things listed in the original post are simply good practice. Workshops should track participants to ensure the quality of their experience and that they are receiving value. CFAR also does this. Organizations engaged in recruitment should seek to proactively identify qualified candidates. I’ve spoken to the leaders of multiple organizations who do this.

— Part of what we do is help people to understand themselves better via introspection and psychological frameworks. Many people find this both interesting and useful. All of the mind mapping we did was with the full knowledge and consent of the person, at their request, typically with them watching and error-checking as we went. (I say “did” because we stopped making full mind maps in 2015.) This is just a normal part of showing people what we do. It makes sense for prospective recruits and donors to seek an in-depth look at our tools prior to becoming more involved. We also have strict privacy rules and do not share personal information from charting sessions without explicit permission from the person. This is true for everyone we work with, including prospective recruits and donors.

Comment author: Jeff_Kaufman 04 August 2018 02:34:54PM 30 points

Hi Geoff,

In reading this I'm confused about the relationship between Paradigm and Leverage. People in this thread (well, mostly Evan) seem to be talking about them as if Leverage incubated Paradigm but the two are now fully separate. My understanding, however, was that the two organizations function more like two branches of a single entity? I don't have a full picture or anything, but I thought you ran both organizations, staff of both mostly live at Leverage, people move freely between the two as needed by projects, and what happens under each organization is more a matter of strategy than separate direction?

By analogy, I had thought the relationship of Leverage to Paradigm was much more like CEA vs GWWC (two brands of the same organization) or even CEA UK vs CEA USA (two organizations acting together as one brand) than CEA vs ACE (one organization that spun off another one, which now operates entirely independently with no overlap of staff, etc.).

Jeff

Comment author: Evan_Gaensbauer 03 August 2018 10:03:28PM * 4 points
  1. The CEA, the very organization you juxtaposed with Leverage and Paradigm in this comment, has in the past been compared to a Ponzi scheme. Effective altruists who otherwise appreciated that criticism thought much of its value was lost in the comparison to a Ponzi scheme, and that without it, the criticism might have been better received. Additionally, LessWrong and the rationality community; CFAR and MIRI; and all of AI safety have for years been smeared as a cult by their detractors. The rationality community isn't perfect. There is no guarantee interactions with a self-identified (aspiring) rationality community will "rationally" go however an individual or small group of people interacting with the community, online or in person, hopes or expects. But the vast majority of effective altruists, even those who are cynical about these organizations or sub-communities within EA, disagree with how these organizations have been treated, for it poisons the well of good will in EA for everyone. In this comment, you stated your past experience with the Pareto Fellowship and Leverage left you feeling humiliated and manipulated. I've also been a vocal critic in person throughout the EA community of both Leverage Research and how Geoff Anders has led the organization. But to elevate personal opposition to them into a public exposure of opposition research, in an attempt to tarnish an event they're supporting alongside many other parties in EA, is not something I ever did, or will do. My contacts in EA and I have followed Leverage. I've desisted from making posts like this myself, because in digging for context I found Leverage has changed from any impression I've gotten of them. And that's why at first I was skeptical of attending the EA Summit. But upon reflection, I realized the evidence didn't support concluding that Leverage is so incapable of change that anything they're associated with should be distrusted. But what you're trying to do to Leverage Research is no different from what EA's worst critics do, not in an effort to change EA or its members, but to tarnish them. From within or outside of EA, to criticize any EA organization in such a fashion is below any acceptable epistemic standard in this movement.

  2. If the post and comments here are stating facts about Leverage Research, and you're reporting impressions that Leverage is like a cult with no ability to remember specific details, those are barely facts. The only fact is that some people perceived Leverage to be like a cult in the past, and those are only anecdotes. And without details, they're only hearsay. Combined with the severity of the consequences if this hearsay were borne out, the inability to produce actual facts invalidates the point you're trying to make.

Comment author: Jeff_Kaufman 04 August 2018 12:48:16PM * 10 points

Given there are usernames like "throwaway" and "throwaway2," and knowing the EA Forum, and its precursor, LessWrong, I'm confident there is only be one account under the username "anonymous," and that all the comments on this post using this account are coming from the same individual.

I'm confused: the comments on Less Wrong you'd see by "person" and "personN" that were the same person happened when importing from Overcoming Bias. That wouldn't be happening here.

They might still be the same person, but I don't think this forum being descended from LessWrong's code tells us things one way or the other.
