Comment author: lukeprog 16 October 2018 04:51:37AM 0 points [-]
Comment author: SammyF 15 October 2018 10:10:28PM 0 points [-]

Thanks! Kelsey flagged this and I was able to fix it. I really appreciate you letting me know, Jai :)

Comment author: jai 15 October 2018 08:16:55PM 1 point [-]

Hello and welcome Sammy! Excited about Future Perfect and looking forward to what Vox does with it. It looks like your comment may have gotten cut off, to the detriment of anyone who wishes to stay informed.

Comment author: SammyF 15 October 2018 06:32:51PM *  7 points [-]

Thanks for sharing Kelsey! I'm the Engagement Manager Kelsey mentioned. I'd love to hear any questions or comments you have here. Also, feel free to shoot us a message at FuturePerfect@Vox.com.

If you're interested in staying informed, I highly recommend subscribing to our newsletter here.

Comment author: turchin 15 October 2018 01:28:11PM 0 points [-]

Thanks - just saw this comment now. I didn't really miss the idea, but decided not to include it here.

Comment author: Sean_o_h 15 October 2018 11:56:13AM 0 points [-]

I can say anecdotally that at different points, access to excellent executive assistants (or people effectively functioning in such roles) has been hugely helpful for me and other folks in xrisk leadership positions I've known.

Comment author: Khorton 14 October 2018 11:15:53PM *  2 points [-]

Personally, I still think it would be very useful to find more talented people and for more people to consider applying to these roles (Ben)

It needs to be high enough to offset the burnt value from people's investments in those application processes. (Denise)

In theory, there's no real conflict between these two statements. Doubling the number of people who would consider applying for a post shouldn't impose a major cost on those potential applicants. At the same time, we could take steps to make it clearer to potential applicants what exactly the hiring criteria are, so we're not wasting people's time.

In fact, I think it would probably be ideal if we increased the number of people who consider applying for each EA job but decreased the number actually applying!

Comment author: Denise_Melchin 14 October 2018 08:47:29PM 1 point [-]

Oh I agree people will often learn useful things during application processes. I just think the opportunity cost can be very high, especially when processes take months and people have to wait to find out whether they got into their top options. I also think those costs are especially high for the top applicants - they have to invest the most and might learn the most useful things, but they also lose the most due to higher opportunity costs.

And as you said, people who get filtered out early lose less time and other resources on application processes. But they might still feel negatively about it, especially given the messaging. Maybe their equally rejected friends feel just as bad, which in the future could dissuade other friends who might be potential top hires from even trying.

Comment author: Peter_Hurford  (EA Profile) 14 October 2018 08:41:41PM 1 point [-]

You made it to five karma.

Comment author: levitation 14 October 2018 08:06:25PM *  0 points [-]

Testing the EA Forum's timezone via comment timestamp. Current UTC time = 20:05.

Result: it looks like the forum is using UTC, since the comment timestamp matches the current UTC time.
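
A minimal sketch of the same check in Python - the timestamp string and parse format below are illustrative assumptions, not anything taken from the forum's source:

```python
from datetime import datetime, timezone

# Hypothetical timestamp as rendered on a freshly posted comment.
comment_timestamp = "14 October 2018 08:06:25PM"
posted_at = datetime.strptime(comment_timestamp, "%d %B %Y %I:%M:%S%p")

# Current UTC time, made naive so it can be compared with the parsed timestamp.
now_utc = datetime.now(timezone.utc).replace(tzinfo=None)

# If the forum renders timestamps in UTC, the offset should be close to zero.
offset_hours = abs((now_utc - posted_at).total_seconds()) / 3600
print(f"Offset from UTC: {offset_hours:.2f} hours")
```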

Comment author: jonleighton  (EA Profile) 14 October 2018 07:27:24PM *  0 points [-]

I would urge them to contact me to have a conversation about the specific ways we would use funding in support of local efforts. Contact and donation info are on the OPIS website: http://www.preventsuffering.org/

Comment author: jonleighton  (EA Profile) 14 October 2018 07:14:57PM 1 point [-]

Yes, sure. For patients in the end stages of terminal diseases such as cancer or AIDS who are in severe pain, dependence is clearly not an issue. For others, short-term treatment with opioids has been shown in studies to lead to dependence in only a small fraction of cases. And for those with chronic pain, dependence on medication is arguably much less of a concern than for them to suffer.

The opioid crisis in the US and the irrational response by the authorities to drastically limit opioid prescriptions have been devastating to chronic pain patients, often suddenly deprived of a medication they have used stably for years that allowed them to function. It has also created the false impression that prescribing opioids to patients in need is a significant cause of drug deaths. Although overprescribing of pain medications in the past likely contributed to overuse and dependence, most of the overdoses today are due to street heroin and illegally imported fentanyl, a powerful drug which is also used to lace heroin.

There are also means to limit the risk of prescribed morphine getting into the wrong hands, such as distributing it in diluted oral form, which is of much less use to those with drug dependence - this has been done successfully in Uganda. So although opioids need to be managed carefully, with precautions taken so that only those who need them can obtain them, there is ample evidence for how morphine can be provided to patients in need, leaving no reasonable grounds for opiophobia.

Comment author: SiebeRozendal 14 October 2018 06:14:53PM 3 points [-]

I just want to note that not every rejected application has been burnt value for me; most have actually been positive, especially in terms of things learned. In the ones where I got far, it resulted in more rather than less motivation. In the cases where I had to do work-related tasks (write a research proposal, or execute a sample of typical research), I learned a lot.

On the other hand, increasing the applicants:hired ratio would mostly increase the proportion of people not getting far in the application process, which is where the positive factors are weakest and the negative ones strongest.

Comment author: Annica 14 October 2018 01:37:53PM *  2 points [-]

Hi Kerry! Thanks for the clear description of the application process. One question: what time do applications close today? I was assuming midnight, since no time is specified, but it just occurred to me that we might be located in different time zones (I am based in Sweden, i.e. Central European Time) and that midnight therefore could mean different things to us :) Thanks in advance, Annica

Comment author: vollmer  (EA Profile) 14 October 2018 10:38:03AM 0 points [-]

What would be the next steps funders could take if they would like to support this type of work?

Comment author: RyanCarey 14 October 2018 08:59:42AM *  1 point [-]

we just need to bear in mind that these roles require a very unusual skill-set, so people should always have a good back-up plan

Hmm, if EA work is valuable but the selection bar excludes most EAs, that could actually mean some/many of the following:

  • many people should just have a different plan A
  • we need to get much better at selecting the best talent
  • we need to recruit much more selectively
  • EAs should have stronger backup plans

Comment author: Gregory_Lewis 14 October 2018 08:47:21AM *  18 points [-]

My hunch is (as implied elsewhere) that 'talent-constraint', with 'talent' not further specified, is apt to mislead. My impression for longtermist orgs (I understand from Peter and others this may apply less to orgs without this as the predominant focus) is that there are two broad classes of roles, which imperfectly line up with 'senior' versus 'junior'.

The 'senior' class probably does fit (commonsensically understood) 'talent-constraint', in that orgs or the wider ecosystem want to take everyone who clears a given bar. Yet these bars are high even when conditioned on the already able cohort of (longtermist/)EAs. It might be things like 'ready to run a research group', 'can manage operations for an org (cf. Tara's and Tanya's podcasts)', or 'subject matter expertise/ability/track record'.

One common feature is that these people add little further load on current (limited) management capacity, either because they are managing others or are already 'up to speed' to contribute themselves without extensive training or supervision. (Aside: I suspect this is an under-emphasised bonus of 'value-aligned operations staff' - their tacit knowledge of the community/mission/wider ecosystem may permit looser management than bringing on able professionals 'from outside'.) From the perspective of the archetypal 'pluripotent EA' a few years out from undergrad, these are skills which are hard to develop and harder to demonstrate.

More 'junior' roles are those where the criteria are broader (at least in terms of legible ones: 'what it takes' to be a good generalist researcher may be similarly rare to 'what it takes' to be a good technical AI safety researcher, but more can easily 'rule themselves out' of the latter than the former), where 'upskilling' is a major objective, or where there's expectation of extensive 'hands-on' management.

There might be similarly convex returns to getting a slightly better top candidate (e.g. 'excellent versus very good' might be 3x rather than 1.3x). Regardless, there will not be enough positions for all the talented candidates available: even if someone at an org decided to spend their time only managing and training junior staff (and haste considerations might lead them to spending more of their time doing work themselves than investing in the 'next generation'), they can't take on a dozen at a time.

I think confusing these two broad classes is an easy way of burning a lot of good people (cf. Denise's remarks). Alice, a 23-year-old management consultant, might reason from current messaging, "EA jobs are much better for the world than management consultancy, and they're after good people - I seem to fit the bill, so I should switch career into this." She might then forsake her promising early career for an unedifying and unsuccessful period as 'EA perennial applicant', ending up worse off than she was at the start. EA has a vocational quality to it - it is key that it does not become a siren song.

There seem a few ways to do this better, as alluded to in prior discussions here and elsewhere:

0) If I'm right, it'd be worth communicating the 'person spec' for cases where (common-sense) talent constraint applies, and where we really would absorb basically as many as we could get (e.g. "We want philosophers to contribute to GPR, and we're after people who either already have a publication record in this area, or have signals of 'superstar' ability even conditioned on philosophy academia. If this is you, please get in touch.").

1) Concurrently, it'd be worth publicising typical applicants:place ratios or similar measures of competition for hiring rounds in more junior roles, to allow applicants to be better calibrated/emphasise the importance of plan B. (e.g. "We have early-career roles for people thinking of working as GPR researchers, which serve the purpose of talent identification and development. We generally look for XYZ. Applications for these are extremely competitive (~12:1). Other good first steps for people who want to work in this field are these"). {MIRI's research fellows page does a lot of this well}.

2) It would be good for there to be further work aimed at avoiding 'EA unemployment', as I would guess growth in strong candidates for EA roles will outstrip intra-EA opportunities. Some possibilities:

2.1) There are some areas I'd want to add to the longtermist portfolio which might be broadened into useful niches for people with comparative advantage in them (macrohistory, productivity coaching and nearby versions, EA-relevant bits of psychology, etc.). I don't think these are 'easier' than the existing 'hot' areas, but they are hard in different ways, and so broaden opportunities.

2.2) Another option would be 'pre-caching human capital' into areas which are plausible candidates for becoming important as time goes on. I imagine something like international relations turning out to be crucial (or, contrariwise, relatively unimportant), but rather than waiting for this to be figured out, it seems better for people to coordinate and invest themselves across the portfolio of plausible candidates. (Easier said than done from the first-person perspective, as such a strategy potentially involves making an uncertain bet with many years of one's career, and if it turns out to be a bust ex post, the good ex ante EV may not be complete consolation).

2.3) There seem to be a lot of stakeholders it would be good for EAs to join for the second-order benefits, even if their direct work is of limited direct relevance (e.g. having more EAs in tech companies looks promising to me, even if they aren't doing AI safety). (Again, not easy from the first-person perspective).

2.4) A lot of skills for more 'senior' roles can and have been attained outside of the EA community. Grad school is often a good idea for researchers, and professional/management aptitude is often a transferable skill. So some of the options above can be seen as a holding-pattern/bet hedging approach: they hopefully make one a stronger applicant for such roles, but in the meanwhile one is doing useful things (and also potentially earning to give, although I think this should be a minor consideration for longtermist EAs given the field is increasingly flush with cash).

If the framing is changed to something like, "These positions are very valuable, but very competitive - it is definitely worth you applying (as you in expectation increase the quality of the appointed candidate, and the returns to a slightly better candidate are very high), but don't bet the farm (or quit the day job) on your application - and if you don't get in, here are things you could do to slant your career to have a bigger impact", I'd hope the burn risk falls dramatically: in many fields there are lots of competitive, oversubscribed positions which don't impose huge costs on unsuccessful applicants.

In response to Open Thread #41
Comment author: benjamin-pence 14 October 2018 06:28:52AM 5 points [-]

Title: Shamelessly asking for karma

Hello! My name is Benjamin Pence. I am a multi-year RSS lurker, first time poster. Can the lovely people of the community please give me enough karma to post? I swear I'm not a robot. Probably. I can do CAPTCHAs after all.

Comment author: Peter_Hurford  (EA Profile) 14 October 2018 02:51:22AM *  2 points [-]

I do see large differences in expected impact between potential new hires, but I see a lot of potential hires who would be net positive additions (even after accounting for all the various obvious costs enumerated by Rob), and I have unfortunately even had to turn away a few people I think would have been enormously net positive.

We're not constrained by management capacity but we will be soon.

Comment author: Denise_Melchin 13 October 2018 10:37:03PM 10 points [-]

Personally, I still think it would be very useful to find more talented people and for more people to consider applying to these roles; we just need to bear in mind that these roles require a very unusual skill-set, so people should always have a good back-up plan.

I'm curious what your model of the % value increase in the top hire is when you, say, double current hiring pools. It needs to be high enough to offset the burnt value from people's investments in those application processes. This is not only expensive for individual applicants in the moment, but also carries the long term risk of demotivating people - and thereby having a counterfactually smaller hiring pool in future years.

EA seems to be already at the point where lots of applicants are frustrated and might value drift, thereby dropping out of the hiring pool. I am not keen on making this situation worse. It might cause permanent harm.

Do you agree there's a trade-off here? If so, I'm not sure whether our disagreement comes from different assessments of value increases in the top hire or burnt value in the hiring pool.
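
As a toy illustration of the trade-off being discussed (my sketch, not Denise's model): if we assume, purely for illustration, that candidate quality is i.i.d. standard normal, the question becomes how much the expected quality of the best applicant rises when the pool doubles.

```python
import numpy as np

rng = np.random.default_rng(0)

def expected_best(pool_size, trials=20_000):
    """Expected quality of the single best candidate in a pool,
    under the (hypothetical) assumption of i.i.d. standard-normal quality."""
    samples = rng.standard_normal((trials, pool_size))
    return samples.max(axis=1).mean()

for n in (50, 100, 200):
    print(f"pool of {n}: best candidate ~ {expected_best(n):.2f} sd above the mean")

# Under this toy model, doubling the pool (100 -> 200) raises the expected best
# candidate by only a couple of tenths of a standard deviation - the marginal
# gain that would have to offset the extra applicants' costs.
```

Under that toy assumption the gain from doubling is modest, so the disagreement turns on how steeply value rises with candidate quality versus how much value the extra applicants burn.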
