Comment author: Dunja 19 June 2018 12:09:17AM *  0 points

Oh, I agree that for many ideas to be attractive, they have to appear promising. I wouldn't reduce the measure of the pursuit-worthiness of scientific hypotheses to the evidence of their success, though: this measure is rather a matter of prospective values, which have to do with a feasible methodology (how many research paths do we have despite current problems and anomalies?). But indeed, sometimes research may proceed simply as groping in the dark, in spite of all the good methodological proposals (as may have been the case, e.g., in the research on protein synthesis in the mid-20th century).

However, my point was simply the question: does such an investment in future proposals outweigh the investment in other topics, so that it should be funded from an EA budget rather than from existing public funds? Again: I very much encourage such camps. Just not at the expense of cash meant for effectively reducing suffering (given that these projects are highly risky and that they are already heavily funded by, say, OpenPhil).

Comment author: Greg_Colbourn 19 June 2018 12:20:00AM 0 points

My point (and remmelt's) was that public funds would be harder and more time- (and resource-) consuming to get.

There is currently a gap at the low end (OpenPhil is too big to spend time on funding such small projects).

And Good Ventures/OpenPhil already fill a lot of the gap in funding programs with track records of effectively reducing suffering.

Comment author: Dunja 18 June 2018 11:14:40PM *  0 points

But this is not about whether academia is on the same page or not; it's about the importance of pushing the results via academic channels, because otherwise they won't be recognized by anyone (policy makers especially). Moreover, what I mention above are funding institutions that finance individual projects, assessed in terms of their significance and feasibility. If there is a decent methodology for addressing the given objectives, then even if the issue is controversial, that doesn't mean the project won't be financed. Alternatively, if you actually know of decent project applications that have been rejected, let's see those and examine whether there is indeed a bias in the field. Finally, why do you think that academia is averse to risky projects?! Take, for instance, ERC schemes: they are intentionally designed for high-risk/high-gain project proposals that are transformative and groundbreaking in character.

Comment author: Greg_Colbourn 18 June 2018 11:51:27PM 0 points

There is an analogy with speculative investing here, I think: for something to be widely regarded as worth investing in (i.e. research funded by mainstream academia), it has to already have evidence of success (e.g. Bitcoin now). By that point it is no longer new and highly promising in terms of expected value (like Bitcoin was in, say, 2011); i.e. it is necessarily the case that all things very high in (relative) expected value are outside the mainstream.

AGI alignment is gaining more credibility, but it still doesn't seem to be widely accepted in mainstream academia.

Anyway, I think we are probably on a bit of a tangent from what AISC is trying to achieve - namely, helping new researchers level up (and get a foot in the door of academic research).

Comment author: Dunja 17 June 2018 05:41:14PM *  0 points

I very much understand your hope concerning the AI talent and the promising value of this camp. However, I'd also like to see an objective assessment of the effectiveness (as in effective altruism) of such research attempts. To provide one, you would have to show that such research has a comparatively higher chance of producing something outstanding than the existing academic research does. Of course, that needs to be done in view of empirical evidence, which I very much hope you can provide. Otherwise, I don't know what sense of "effective" is still present in the meaning of "effective altruism".

Again: I think these kinds of research camps are great as such, i.e. in view of overall epistemic values. They are as valuable as, say, a logic camp, or a camp on agent-based models. However, I would never argue that a camp on agent-based models should be financed by EA funds unless I had empirically grounded reasons to think that such research can contribute to effective charity and the prevention of possible dangers better than the existing academic research can.

As for the talent search, you seem to assume that academic institutions cannot uncover such talents. I don't know where you get this evidence from, but PhD grants across the EU, for instance, are geared precisely towards such talents. Why would talented individuals not apply for those? And where do you get the idea that the topic of AI safety won't be funded by, say, the Belgian FWO or the German DFG? Again, you would need to provide empirical reasons to think that such a systematic bias against projects on these topics exists.

Finally, if the EA community wants to fund reliable project initiators for the topic of AI safety, why not make an open call for experts in the field to apply with project proposals and form teams that can immediately execute these projects within the existing academic institutions? Where is this fear of academia coming from? Why would a camp like this be more streamlined than an expert proposal, in which the PI of the given project employs the junior researchers and systematically guides them in the research? In all other areas of EA this is precisely how we wish to proceed (think of medical research).

Comment author: Greg_Colbourn 18 June 2018 11:09:49PM 0 points

For more on the thinking behind streamlined non-mainstream funding, see https://www.openphilanthropy.org/blog/hits-based-giving

I don't think academia is yet on the same page as EA with regard to AI Safety, but hopefully it will be soon (with credibility coming from the likes of Stuart Russell and Max Tegmark).

Comment author: Denkenberger 18 June 2018 04:15:13AM 0 points

This says 20% of EAs are vegan or vegetarian, so I would guess less than 10% are vegan. Granted, the hard-core EAs you are attracting may be more likely to be vegan, and you are lowering the barrier if someone else is reading labels and is hopefully a good cook. But I still think you are really limiting your pool by having all meals vegan. I understand you want to be frugal, and vegan from scratch is cheaper, but animal-product substitutes are generally more expensive than animal products.

Comment author: Greg_Colbourn 18 June 2018 02:14:29PM 1 point

I've not yet had anyone say it's a dealbreaker (and of course people are allowed to buy meat from takeaways - or microwaveable burgers etc. - with their spending money if they are really craving it...). Whilst frugality comes into it, the main reason for the all-vegan catering is ethics.

Also, I'd put money on the 2018 survey coming out with higher numbers for veg*anism :)

Comment author: Khorton 11 June 2018 05:17:40PM *  3 points

"In many ways this won’t be a typical hotel (non-profit, longer term stays, self-service breakfast and lunch, simplified dinner menu, weekly linen/towel changes, EA evening events etc), so I’m not sure how much prior hotel experience is relevant. Really anyone who is a reasonably skilled generalist, passionate about the project, and friendly should be able to do it."

I think this is where we disagree. It's taken me years to develop the (rather basic) domestic skills I have. I think it would be quite a challenge for someone like me, who can competently manage a household, to competently manage a hotel with 17 people. For example, when I organized EA London's weekend retreat and oversaw the housing, cooking and cleaning for 25 people, it was really hard and I made some significant mistakes.

This worries me because a large majority of the EAs I meet in London are worse at cooking/cleaning/household management than I am. If I'm not currently capable of the task, and most EAs are less capable than I am, then I wonder who CAN do the job.

There are a couple of things I might be wrong about: maybe people are better at domestic tasks outside of London, or maybe there are one or two exceptional candidates (and that's really all it takes!). But based on my experience, I really don't think "anyone who is a reasonably skilled generalist, passionate about the project, and friendly should be able to do it" - or at least, not to a high standard, not right away.

Comment author: Greg_Colbourn 12 June 2018 01:36:48PM 0 points

when I organized EA London's weekend retreat and oversaw the housing, cooking and cleaning for 25 people, it was really hard and I made some significant mistakes.

Would be interested to hear more details about this (fine to PM).

Also, it's unlikely to be 17 guests all at once to start with; things are ramping up gradually so far (we have a few people booked in over the next few weeks), so the learning curve should be relatively gentle.

Comment author: Liam_Donovan 10 June 2018 05:59:29AM *  3 points

From my perspective, the manager should:

1. Not (necessarily) be an EA
2. Be paid more (even if this trades off against capacity, etc.)
3. Not also be a community mentor

One of the biggest possible failure modes for this project seems to be hiring a not-excellent manager; even a small increase in competence could make the difference between the project failing and succeeding. Thus, the #1 consideration ought to be "how to maximize the manager's expected skill". Unfortunately, the combination of undesirable location, only hiring EAs, and the low salary seems to restrict the talent pool enormously. My (perhaps totally wrong) impression is that some of these decisions are made on the basis of a vague idea of how things ought to be, rather than a conscious attempt to maximize success.

Brief arguments/responses:

  • Not only are EAs disproportionately unlikely to have operations skills (as 80K points out), but I suspect that the particular role of hotel manager requires even fewer of the skills we tend to have (such as a flair for optimization), and even more of the skills we tend not to have (consistency, hotel-related metis). I'm unsure of this, but it's an important question to evaluate.

  • The manager will only be in at the ground floor of a new organization if it doesn't fail. I think failure is more likely than expansion, but it's reasonable to be risk-averse considering this is the first project of its kind in EA (diminishing marginal benefit). Consequently, optimizing for initial success seems more important than optimizing for future expansion.

  • The best feasible EA candidate is likely to have less external validation of managerial capability than a similarly qualified external candidate, who might be a hotel manager already! Thus, it'll be harder to actually identify the strong EA candidates, even if they exist.

  • The manager will get free room/board and live in low-CoL Blackpool, but I think this is outweighed by the necessity of moving to an undesirable location, and by not being able to choose where you stay/eat. On net, I expect you'd need to offer a higher salary to attract the same level of talent as in, say, Oxford (though with more variance depending on how people perceive Blackpool).

  • You might be able to hire an existing hotel manager in Blackpool, which would reduce the risk of turnover and guarantee a reasonable level of competence. This would obviously require separating the hotel manager and the community mentor, but I'm almost certain that doing so would maximize the chances of success either way (division of labor!). I'm also not sure what exactly the cost is: the community mentor could just be an extroverted guest working on a particularly flexible project.

  • Presumably many committed and outgoing EAs (i.e. the people you'd want as managers) are already able to live with/near other EAs; moving to Blackpool would just take away their ability to choose who to live with.

Of course, there could already be exceptional candidates expressing interest, but I don't understand why the default isn't hiring a non-EA with direct experience.

Comment author: Greg_Colbourn 11 June 2018 01:40:02PM 0 points

vague idea of how things ought to be, rather than a conscious attempt to maximize success.

I would say it’s a bit more than vague ;) I think it’s important to have someone who really understands and shares the goals of the project. Someone who doesn’t get EA is not likely to care about it much beyond seeing it as a means to get paid. It would then be largely up to part-time volunteers (the other Trustees) to direct the project and keep it aligned with EA. This scenario seems more likely to lead to stagnation/failure to me.

less of the skills we tend to have (such as a flair for optimization)

I think a flair for optimisation is needed in any kind of ops role. The more you optimise, the greater your capacity (/free time).

and even more of the skills we tend not to have (consistency, hotel-related metis)

Conscientiousness would be required. But there are a fair number of EAs with that trait, right?

optimizing for initial success seems more important than optimizing for future expansion.

In practice I think these are mostly the same thing. The more initial success there is, the more likely expansion is. The point I was making is that the manager will have a large stake in the course the project takes, so a lot will depend on what they make of it (hence it should be seen as an exciting opportunity. I mean, yeah, there will be some amount of “boring” (mindfulness-promoting?) tasks - but it could be so much more fun than “Hotel Manager in Blackpool” initially sounds).

less external validation of managerial capability than a similarly qualified external candidate, who might be a hotel manager already!

In many ways this won’t be a typical hotel (non-profit, longer term stays, self-service breakfast and lunch, simplified dinner menu, weekly linen/towel changes, EA evening events etc), so I’m not sure how much prior hotel experience is relevant. Really anyone who is a reasonably skilled generalist, passionate about the project, and friendly should be able to do it.

I expect you'd need to offer a higher salary to attract the same level of talent

Salary is open to negotiation (have amended ad).

require separating the hotel manager and the community mentor

I think that once everything is set up, the day-to-day management of the hotel itself won’t require full-time hours. I would prefer to have one full-time employee rather than two part-time employees, but as I’ve said previously, I am open to splitting the role.

division of labor

As mentioned above, part of optimisation can be outsourcing tasks you are less good at (or don’t like doing), e.g. hiring someone else to do the cooking or laundry (depending on how much you value your time/money).

Comment author: Khorton 07 June 2018 09:58:12PM 2 points

Several of the reasons listed in that article don't matter for the hotel because the hotel manager will be the only full time member of staff. For example, the hotel manager won't be likely to switch into other roles/be promoted at the same organization and won't need to communicate with other staff about EA-specific things. Additionally, the article suggests that being involved in the EA community is a benefit, but not the only thing to consider when hiring. That sounds about right to me.

I would seriously consider splitting up the hotel manager role and the community mentory person. It's hard enough to find an awesome cook who can do 17 people's laundry, keep everything clean, pay all the bills, and keep everything legal. Requiring them to be one of a couple thousand EAs IN THE WORLD sounds really hard.

Comment author: Greg_Colbourn 08 June 2018 11:19:54AM 0 points

the hotel manager won't be likely to switch into other roles/be promoted at the same organization

But they would be in at the ground level of a new organisation that could potentially grow (if the model is franchised, or expands to supporting digital nomads). It should be seen as an exciting opportunity to co-create and mould an institution.

and won't need to communicate with other staff about EA-specific things.

But they will need to communicate with lots of EA guests about EA-specific things.

splitting up the hotel manager role and the community mentory person.

I'm open to doing this as a plan B. A good manager should be able to optimise/outsource the tasks they find tedious, though.

Comment author: vollmer 06 June 2018 04:56:20PM *  9 points

First, big kudos for your strong commitment to put your personal funding into this, and for the guts and drive to actually make it happen!

That said, my overall feelings about the project are mixed, mainly for the following reasons (which you also partly discuss in your post):

It seems plausible that most EAs who do valuable work won't be able to benefit from this. If they're students, they'll most likely be studying at a university outside Blackpool and might not be able to do so remotely. If they're launching a great new project, they'll very likely be able to get funding from an EA donor, and there will be major benefits from being in a big city or existing hub such as Oxford, London, or the Bay (so donors should be enthusiastic about covering the living costs of these places). While it's really impressive how low the rent at the hotel will be, rent is rarely a major contributor to a project's funding constraints (at least outside the SF Bay Area).

Instead, the hotel could become a hub for everyone who doesn't study at a university or work on a project that EA donors find worth funding, i.e. the hotel would mainly support work that the EA community as a whole would view as lower-quality. I'm not saying I'm confident this will happen, but I think the chance is non-trivial without the leadership and presence of highly experienced EAs (who work there as e.g. hotel managers / trustees).

Furthermore, people have repeatedly brought up the argument that the first "bad" EA project in each area can do more harm than an additional "good" EA project can do good, especially if you consider tail risks, and I think this is more likely to be true than not. E.g. the first political protest for AI regulation might in expectation do more harm than a thoughtful AI policy project could prevent. This provides a reason for EAs to be risk-averse. (Specifically, I tentatively disagree with your claims that "we’re probably at the point where there are more false negatives than false positives, so more chances can be taken on people at the low end", and that we should invest "a small amount".) Related: Spencer Greenberg's idea that plenty of startups cause harm.

The fact that this post got way more upvotes than other projects that are similarly exciting in my view (such as Charity Entrepreneurship) also makes me think that the enthusiasm for this project may be partly driven by social reasons (it feels great to have a community hotel hub with likeminded people) as opposed to people's impact assessments. But maybe there's something I'm overlooking, e.g. maybe this post was just shared much more on social media.

What happens if you concentrate a group of EAs who wouldn't get much funding from the broader community in one place and help them work together? I don't know. It could be very positive or very negative. Or it might just not lead to much at all. Overall, I think it may not be worth the downside risks.

Comment author: Greg_Colbourn 07 June 2018 05:49:15PM 1 point

maybe this post was just shared much more on social media.

I see Facebook and Twitter share buttons at the bottom of the post (but only when I load the page on my phone). They currently have the numbers 174 and 18 next to them. Seems like an excessive number of Facebook shares!? Surely that can’t be right? (I’ve only seen - and been tagged in - one, in any case. Clicking on the numbers provides no info as to where the shares went, if indeed they are shares. OK, actually, clicking on them brings up a share window, but also ups the counter! So maybe that explains why the numbers are so high (i.e. people wanting to see where all these shares are going, but only adding to the false counter)).

Comment author: Greg_Colbourn 07 June 2018 05:48:58PM *  1 point

If they're students, they'll most likely be studying at a university outside Blackpool and might not be able to do so remotely.

Regarding studying, it would mainly be suitable for those doing so independently online (it’s possible to take many world-class courses on EdX and Coursera for free). But it could also be of use to university students outside of term time (say, to do extra classes online, or an independent research project, over the summer).

they'll very likely be able to get funding from an EA donor

As John Maxwell says, I don’t think we are there yet with current seed funding options.

the hotel would mainly support work that the EA community as a whole would view as lower-quality

This might indeed be so, but given the much lower costs it’s possible that the quality-adjusted-work-per-£-spent rate could still be equal to - or higher than - the community average.

.. without the leadership and presence of highly experienced EAs (who work there as e.g. hotel managers / trustees).

I think it’s important to have experienced EAs in these positions for this reason.

Regarding “bad” EA projects, only one comes to mind, and it doesn’t seem to have caused much lasting damage. In the OP, I say that the “dynamics of status and prestige in the non-profit world seem to be geared toward being averse to risk-of-failure to a much greater extent than in the for-profit world (see e.g. the high rate of failure for VC funded start-ups). Perhaps we need to close this gap, considering that the bottom line results of EA activity are often considered in terms of expected utility.” Are PR concerns a solid justification for this discrepancy between EA and VC? Or do Spencer Greenberg’s concerns about start-ups mean that EA is right in this regard and it’s VC that is wrong (even in terms of their approach to maximising monetary value)?

the enthusiasm for this project may be partly driven by social reasons

There’s nothing wrong with this, as long as people participating at the hotel for largely social reasons pay their own way (and don’t disrupt others’ work).

Comment author: Arepo 07 June 2018 12:20:09PM 2 points

Is there any particular reason why the role needs to be filled by an EA? I think we as a community are too focused on hiring internally in general, and in this case almost no engagement with the ideas of EA seems like it would be necessary - they just need to be good at running a hotel (and be OK with working around a bunch of oddballs).

Comment author: Greg_Colbourn 07 June 2018 04:15:37PM *  1 point

I think 80k make a good case for why it's important to have EAs in ops roles here.
