Comment author: John_Maxwell_IV 09 June 2018 12:00:47AM *  3 points [-]

[Disclosure: I'm planning to move to Blackpool before the end of this month.]

only few people with worthwhile projects can move to Blackpool to benefit from it, the funding is not flexible

If you're working on a project full-time, there's a good chance you're not location-constrained.

the project has some time lag

I'm not sure what you're referring to.

Over 3 months passed between the EA grants announcement and disbursement. Does that count as "time lag"?

The disadvantages you cite don't seem compelling to me when weighed against the advantages cited in this post: dramatically lower costs, a supportive EA community, etc. Yes, it's not a great fit for every project--but if you're offered a bargain on supporting one project, it seems silly to turn it down just because you weren't offered a bargain on supporting some other project.

I think maybe our core disagreement is that you believe the bottom 95% of EA projects are risky and we should discourage people from funding them. Does that sound like an accurate summary of your beliefs?

I've written some about why I want to see more discussion of downside risk (and I'm compiling notes for several longer posts--maybe if I were living in Blackpool I'd have enough time to write them). However, the position that we should discourage funding the bottom 95% of projects seems really extreme to me, and it also seems like a really crude way to address downside risk.

Even if there is some downside risk from any given project, the expected value is probably positive, solely based on the fact that some EA thinks it's a good idea. Value of information is another consideration in favor of taking action, especially doing research (I'm guessing most people who move to Blackpool will want to do research of some sort).

As for a good way to decrease downside risk, I would like to see a lot more people do what Joey does in this post and ask people to provide the best arguments they can against their project (or maybe ask a question on Metaculus).

The issue with downside risk is not that the world is currently on a wonderful trajectory that must not be disturbed. Rather, any given disturbance is liable to have effects that are hard to predict--but if we spent more time thinking and did more research, maybe we could get a bit better at predicting these effects. Loss aversion will cause us to overweight the possibility of a downside, but if we're trying to maximize expected value then we should weight losses and gains of the same size equally. I'm willing to believe highly experienced EAs are better at thinking about downside risk, but I don't think the advantage is overwhelming, and I suspect being a highly experienced EA can create its own set of blind spots. I am definitely skeptical that CEA can reliably pick the 33 highest-impact projects from a list of 722. Even experienced investors miss winners, and CEA is an inexperienced charitable "investor".

Your example "the first political protest for AI regulation might in expectation do more harm than a thoughtful AI policy project could prevent" is one I agree with, and maybe it illustrates a more general principle that public relations efforts should be done in coordination with major EA organizations. Perhaps it makes sense to think about downside risk differently depending on the kind of work someone is interested in doing.

BTW, if the opinion of experienced EAs is considered reliable, maybe it's worth noting that 80k advocated something like this a few years ago, and ideas like it have been floating around EA for a long time.

Comment author: Liam_Donovan 10 June 2018 06:23:40AM 3 points [-]

I suspect Greg/the manager would not be able to filter projects particularly well based on personal interviews; since the point of the hotel is basically 'hits-based giving', I think a blanket ban on irreversible projects is more useful (and would satisfy most of the concerns in the Facebook comment vollmer linked).

Comment author: Greg_Colbourn 07 June 2018 05:48:58PM *  1 point [-]

If they're students, they'll most likely be studying at a university outside Blackpool and might not be able to do so remotely.

Regarding studying, the hotel would mainly be suitable for those doing so independently online (it’s possible to take many world-class courses on EdX and Coursera for free). But it could also be of use to university students outside of term time (say, to do extra classes online, or an independent research project, over the summer).

they'll very likely be able to get funding from an EA donor

As John Maxwell says, I don’t think we are there yet with current seed funding options.

the hotel would mainly support work that the EA community as a whole would view as lower-quality

This might indeed be so, but given the much lower costs it’s possible that the quality-adjusted-work-per-£-spent rate could still be equal to - or higher than - the community average.

.. without the leadership and presence of highly experienced EAs (who work there as e.g. hotel managers / trustees).

I think it’s important to have experienced EAs in these positions for this reason.

Regarding “bad” EA projects, only one comes to mind, and it doesn’t seem to have caused much lasting damage. In the OP, I say that the “dynamics of status and prestige in the non-profit world seem to be geared toward being averse to risk-of-failure to a much greater extent than in the for-profit world (see e.g. the high rate of failure for VC funded start-ups). Perhaps we need to close this gap, considering that the bottom line results of EA activity are often considered in terms of expected utility.” Are PR concerns a solid justification for this discrepancy between EA and VC? Or do Spencer Greenberg’s concerns about start-ups mean that EA is right in this regard and it’s VC that is wrong (even in terms of their approach to maximising monetary value)?

the enthusiasm for this project may be partly driven by social reasons

There’s nothing wrong with this, as long as people participating at the hotel for largely social reasons pay their own way (and don’t disrupt others’ work).

Comment author: Liam_Donovan 10 June 2018 06:13:10AM *  1 point [-]

Following on vollmer's point, it might be reasonable to have a blanket rule against policy/PR/political/etc work -- anything that is irreversible and difficult to evaluate. "Not being able to get funding from other sources" is definitely a negative signal, so it seems worthwhile to restrict guests to projects whose worst possible outcome is unproductively diverting resources.

On the other hand, I really can't imagine what harm research projects could do; I guess the worst-case scenario is someone so persuasive they can convince lots of EAs of their ideas, but so bad at research that their ideas are all wrong, which doesn't seem very likely. (Why not 'malicious & persuasive people'? The community can probably identify those more easily by the subjects they write about.)

Furthermore, guests' ability to engage in negative-EV projects will be constrained by the low stipend and terrible location (if I wanted to engage in Irish republican activism, living at the EA hotel wouldn't help very much). I think the largest danger to be alert for is reputation risk, especially from bad popularizations of EA, since this is easier to do remotely (one example is Intentional Insights, the only negative-EV EA project I know of).

Comment author: Greg_Colbourn 08 June 2018 11:19:54AM 0 points [-]

the hotel manager won't be likely to switch into other roles/be promoted at the same organization

But they would be in on the ground floor of a new organisation that could potentially grow (if the model is franchised, or expands to supporting digital nomads). It should be seen as an exciting opportunity to co-create and mould an institution.

and won't need to communicate with other staff about EA-specific things.

But they will need to communicate with lots of EA guests about EA-specific things.

splitting up the hotel manager role and the community mentor person.

I'm open to doing this as a plan B. A good manager should be able to optimise/outsource the tasks they find tedious though.

Comment author: Liam_Donovan 10 June 2018 05:59:29AM *  3 points [-]

From my perspective, the manager should:

  1. Not (necessarily) be an EA
  2. Be paid more (even if this trades off against capacity, etc.)
  3. Not also be a community mentor

One of the biggest possible failure modes for this project seems to be hiring a not-excellent manager; even a small increase in competence could make the difference between the project failing and succeeding. Thus, the #1 consideration ought to be "how to maximize the manager's expected skill". Unfortunately, the combination of undesirable location, only hiring EAs, and the low salary seems to restrict the talent pool enormously. My (perhaps totally wrong) impression is that some of these decisions are made on the basis of a vague idea of how things ought to be, rather than a conscious attempt to maximize success.

Brief arguments/responses:

  • Not only are EAs disproportionately unlikely to have operations skills (as 80K points out), but I suspect that the particular role of hotel manager draws even less on the skills we tend to have (such as a flair for optimization), and even more on the skills we tend not to have (consistency, hotel-related metis). I'm unsure of this, but it's an important question to evaluate.

  • The manager will only be in on the ground floor of a new organization if it doesn't fail. I think failure is more likely than expansion, and it's reasonable to be risk-averse here, considering this is the first project of its kind in EA (diminishing marginal benefit). Consequently, optimizing for initial success seems more important than optimizing for future expansion.

  • The best feasible EA candidate is likely to have less external validation of managerial capability than a similarly qualified external candidate, who might be a hotel manager already! Thus, it'll be harder to actually identify the strong EA candidates, even if they exist.

  • The manager will get free room/board and live in low-CoL Blackpool, but I think this is outweighed by the necessity of moving to an undesirable location and not being able to choose where you stay/eat. On net, I expect you'd need to offer a higher salary to attract the same level of talent as in, say, Oxford (though with more variance depending on how people perceive Blackpool).

  • You might be able to hire an existing hotel manager in Blackpool, which would reduce the risk of turnover and guarantee a reasonable level of competence. This would obviously require separating the hotel manager and the community mentor, but I'm almost certain that doing so would maximize the chances of success either way (division of labor!). I'm also not sure what exactly the cost is: the community mentor could just be an extroverted guest working on a particularly flexible project.

  • Presumably many committed and outgoing EAs (i.e. the people you'd want as managers) are already able to live with/near other EAs; moving to Blackpool would just take away their ability to choose who to live with.

Of course, there could already be exceptional candidates expressing interest, but I don't understand why the default isn't hiring a non-EA with direct experience.

Comment author: MichaelPlant 06 June 2018 05:51:33PM 0 points [-]

A potential spanner: how would you restrict this to EAs? Is that legal? I doubt you can refuse service to people on the basis of what would be considered an irrelevant characteristic. Analogy: could you have a hotel only for people of a certain race or sex?

Comment author: Liam_Donovan 09 June 2018 03:43:26AM 0 points [-]

It's pretty simple: just get EAs to move in and don't advertise vacancies the rest of the time. That might sound sketchy, but I think it's essentially what the old owners did -- they let friends/long-time guests stay but didn't rent out the rest of the rooms. It might not fly in, like, Tahiti, but Blackpool has an enormous glut of accommodation. The impression I got from Greg is that lots of hotel owners there are already restricting occupancy to friends/family; a de-facto restriction to EAs shouldn't be a major problem, especially since (at least in the US) non-EAs are not a protected class.

Furthermore, if some random person really wants to stay there at inflated rates despite the complete lack of advertising, that would be a net benefit for the hotel, as Greg mentions in his post.

Comment author: Larks 19 January 2018 03:47:55AM 0 points [-]

I think we are much more organised than most, and hence more able to learn from our mistakes.

Comment author: Liam_Donovan 22 January 2018 04:25:46AM 2 points [-]

But would this view have predicted we'd only get 13% matched, well below the EA consensus prediction?

Comment author: AviN 14 January 2018 02:12:26AM 2 points [-]

Yes, if we try again in 2018, I think we can avoid some of the learning curve and improve efficiency. I'd also hope we can use what we learned to get more than 13% matched.

Comment author: Liam_Donovan 14 January 2018 10:34:58AM 3 points [-]

Hopefully... Since it's a zero-sum game, though, I'm not necessarily convinced that we can improve efficiency and learn from our mistakes more than other groups. In fact, I'd expect the % matched to go down next year, as the % of the matching funds directed by the EA community was far larger than the % of total annual donations made by EAs (and so we're likely to revert to the mean).

Comment author: jserv 01 December 2017 05:44:42PM *  1 point [-]

Thanks for sharing this. As a previous volunteer, I understand where you're coming from completely. Unfortunately, the scene you described in the woman's house is one that occurs even in the United Kingdom. The conversation you had with the site visitor is quite moving; if you remember anything more specific about her answers, I'd be interested to read them.

I have been doing some research on volunteer programmes, especially those that take volunteers abroad and the 'voluntourism' industry. Like Liam, I'm wondering if there is scope for EA to compile a list of the more effective volunteer organisations.

From what I can tell, the key difference seems to be in whether the charity is searching specifically for volunteers with skills that are not locally available.

I am considering taking a voluntary placement with VSO in 2018, one that I have selected for its emphasis on skills and anti-poverty goals. Any other recommendations or comments would be very welcome.

Comment author: Liam_Donovan 01 December 2017 08:23:37PM 1 point [-]

Maybe JPAL-IPA field research qualifies in some sense?

Comment author: turchin 29 November 2017 03:52:30PM 0 points [-]

There are types of arguments which don't depend on my motivation, like "deals" and "questions".

For example, if I say "I will sell you 10 paperclips if you will not kill me", then my motivation is evidence that I will stick to my side of the deal.

Comment author: Liam_Donovan 01 December 2017 01:59:34PM 0 points [-]

This doesn't make sense either: for example, your questions could be selected in a biased manner to manipulate the AI, and you could be disingenuous when making deals. Generally, it seems like good epistemic practice to discount arguments of any form, including questions, when the person making them is existentially biased towards one side of the discussion.

Comment author: Lila 27 November 2017 05:36:50PM 0 points [-]

Is the AI supposed to read this explanation? It seems like it tips your hand.

Comment author: Liam_Donovan 01 December 2017 01:41:58PM *  1 point [-]

Wouldn't this be an issue with or without an explanation? It seems like an AI could reasonably infer, from other actions taken by humans in general or by Alexey in particular, that they are highly motivated to argue against being exterminated. IDK if I'm missing something obvious -- I don't know much about AI safety.

Comment author: Liam_Donovan 25 November 2017 03:08:57PM 0 points [-]

Are there, in fact, any such trips organized by EA charities?
