
vollmer comments on EA Hotel with free accommodation and board for two years - Effective Altruism Forum


Comment author: vollmer 06 June 2018 04:56:20PM *  9 points [-]

First, big kudos for your strong commitment to putting your personal funding into this, and for the guts and drive to actually make it happen!

That said, my overall feelings about the project are mixed, mainly for the following reasons (which you also partly discuss in your post):

It seems plausible that most EAs who do valuable work won't be able to benefit from this. If they're students, they'll most likely be studying at a university outside Blackpool and might not be able to do so remotely. If they're launching a new great project, they'll very likely be able to get funding from an EA donor, and there will be major benefits from being in a big city or existing hub such as Oxford, London, or the Bay (so donors should be enthusiastic about covering the living costs of these places). While it's really impressive how low the rent at the hotel will be, rent cost is rarely a major reason for a project's funding constraints (at least outside the SF Bay Area).

Instead, the hotel could become a hub for everyone who doesn't study at a university or work on a project that EA donors find worth funding, i.e. the hotel would mainly support work that the EA community as a whole would view as lower-quality. I'm not saying I'm confident this will happen, but I think the chance is non-trivial without the leadership and presence of highly experienced EAs (who work there as e.g. hotel managers / trustees).

Furthermore, people have repeatedly brought up the argument that the first "bad" EA project in each area can do more harm than an additional "good" EA project, especially if you consider tail risks, and I think this is more likely to be true than not. E.g. the first political protest for AI regulation might in expectation do more harm than a thoughtful AI policy project could prevent. This provides a reason for EAs to be risk-averse. (Specifically, I tentatively disagree with your claims that "we’re probably at the point where there are more false negatives than false positives, so more chances can be taken on people at the low end", and that we should invest "a small amount".) Related: Spencer Greenberg's idea that plenty of startups cause harm.

The fact that this post got way more upvotes than other projects that are similarly exciting in my view (such as Charity Entrepreneurship) also makes me think that the enthusiasm for this project may be partly driven by social reasons (it feels great to have a community hotel hub with likeminded people) as opposed to people's impact assessments. But maybe there's something I'm overlooking, e.g. maybe this post was just shared much more on social media.

What happens if you concentrate a group of EAs who wouldn't get much funding from the broader community in one place and help them work together? I don't know. It could be very positive or very negative. Or it might not lead to much at all. Overall, I think it may not be worth the downside risks.

Comment author: John_Maxwell_IV 07 June 2018 09:19:36AM 7 points [-]

If they're launching a new great project, they'll very likely be able to get funding from an EA donor

EA Grants rejected 95% of the applications they got.

Comment author: vollmer 08 June 2018 04:23:41PM *  2 points [-]

Sure, but an EA hotel seems like a weird way to address this inefficiency: only a few people with worthwhile projects can move to Blackpool to benefit from it, the funding is not flexible, it's hard to target well, the project has some time lag, etc. The most reasonable approach to fixing this is simply to give more money to some of the projects that didn't get funded.

Maybe CEA will accept 20-30% of EA Grants applications in the next round, or other donors will jump in to fill the gaps. (I'd expect that a lot of the grants applications (maybe half) might have been submitted by people not really familiar with EA, and some of the others weren't worth funding.)

Comment author: John_Maxwell_IV 09 June 2018 12:00:47AM *  3 points [-]

[Disclosure: I'm planning to move to Blackpool before the end of this month.]

only few people with worthwhile projects can move to Blackpool to benefit from it, the funding is not flexible

If you're working on a project full-time, there's a good chance you're not location-constrained.

the project has some time lag

I'm not sure what you're referring to.

Over 3 months passed between the EA grants announcement and disbursement. Does that count as "time lag"?

The disadvantages you cite don't seem compelling to me alongside the advantages cited in this post: dramatically lower costs, supportive EA community, etc. Yes, it's not a great fit for every project--but if you're offered a bargain on supporting one project, it seems silly to avoid taking it just because you weren't offered a bargain on supporting some other project.

I think maybe our core disagreement is that you believe the bottom 95% of EA projects are risky and we should discourage people from funding them. Does that sound like an accurate summary of your beliefs?

I've written some about why I want to see more discussion of downside risk (and I'm compiling notes for several longer posts--maybe if I was living in Blackpool I'd have enough time to write them). However, the position that we should discourage funding the bottom 95% of projects seems really extreme to me, and it also seems like a really crude way to address downside risk.

Even if there is some downside risk from any given project, the expected value is probably positive, solely based on the fact that some EA thinks it's a good idea. Value of information is another consideration in favor of taking action, especially doing research (I'm guessing most people who move to Blackpool will want to do research of some sort).

As for a good way to decrease downside risk, I would like to see a lot more people do what Joey does in this post and ask people to provide the best arguments they can against their project (or maybe ask a question on Metaculus).

The issue with downside risk is not that the world is currently on a wonderful trajectory that must not be disturbed. Rather, any given disturbance is liable to have effects that are hard to predict--but if we spent more time thinking and did more research, maybe we could get a bit better at predicting these effects. Loss aversion will cause us to overweight the possibility of a downside, but if we're trying to maximize expected value then we should weight losses and gains of the same size equally. I'm willing to believe highly experienced EAs are better at thinking about downside risk, but I don't think the advantage is overwhelming, and I suspect being a highly experienced EA can create its own set of blind spots. I am definitely skeptical that CEA can reliably pick the 33 highest-impact projects from a list of 722. Even experienced investors miss winners, and CEA is an inexperienced charitable "investor".

Your example "the first political protest for AI regulation might in expectation do more harm than a thoughtful AI policy project could prevent" is one I agree with, and maybe it illustrates a more general principle that public relations efforts should be done in coordination with major EA organizations. Perhaps it makes sense to think about downside risk differently depending on the kind of work someone is interested in doing.

BTW, if the opinion of experienced EAs is considered reliable, maybe it's worth noting that 80k advocated something like this a few years ago, and ideas like it have been floating around EA for a long time.

Comment author: Liam_Donovan 10 June 2018 06:23:40AM 3 points [-]

I suspect Greg/the manager would not be able to filter projects particularly well based on personal interviews; since the point of the hotel is basically 'hits-based giving', I think a blanket ban on irreversible projects is more useful (and would satisfy most of the concerns in the fb comment vollmer linked)

Comment author: MichaelPlant 06 June 2018 05:49:40PM 1 point [-]

Furthermore, people have repeatedly brought up the argument that the first "bad" EA project in each area can do more harm than an additional "good" EA project, especially if you consider tail risks, and I think this is more likely to be true than not. E.g. the first political protest for AI regulation might in expectation do more harm than a thoughtful AI policy project could prevent. This provides a reason for EAs to be risk-averse. (Specifically, I tentatively disagree with your claims that "we’re probably at the point where there are more false negatives than false positives, so more chances can be taken on people at the low end", and that we should invest "a small amount".) Related: Spencer Greenberg's idea that plenty of startups cause harm.

I thought this was pretty vague and abstract. You should say why you expect this particular project to suck!

It seems plausible that most EAs who do valuable work won't be able to benefit from this. If they're students, they'll most likely be studying at a university outside Blackpool and might not be able to do so remotely

I also wonder what the target market is. EAs doing remote work? EAs who need really cheap accommodation for a certain period of time?

Comment author: vollmer 06 June 2018 09:16:44PM 0 points [-]

I thought this was pretty vague and abstract. You should say why you expect this particular project to suck!

I wasn't making a point about this particular project, but about all the projects this particular project would help.

Comment author: toonalfrink 18 June 2018 01:53:40PM 1 point [-]

Hi Vollmer, appreciate your criticism. Upvoted for that.

While it's really impressive how low the rent at the hotel will be, rent cost is rarely a major reason for a project's funding constraints

Do you realise that the figure cited (3-4k a year) isn't rent cost? It's total living cost. At least in my case that's a quarter of what I'm running on, and I'm pretty cheap. For others the difference might be much larger.

For example, a project might have a genuinely high-impact idea that doesn't depend on location. Instead of receiving $150k from CEA to run for half a year in the Bay with 3 people, they could receive $50k and run for 3 years in Blackpool with 6 people. CEA could then fund 3 times as many projects, and its impact would effectively stretch 6 × 2 × 3 = 36 times further (6× the duration, 2× the people, 3× as many projects). Coming from that perspective, avoiding the world's most expensive cities is just non-negotiable, at least for projects (coding, research, etc.) that wouldn't get an even stronger multiplier from being on location. And this isn't just projection: I know at least one project that is most likely moving their team to the EA hotel.
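The arithmetic above can be sketched out explicitly. This is a rough sanity check using the hypothetical figures from the comment ($150k Bay grant vs $50k Blackpool grant), not real CEA grant data:

```python
# Hypothetical scenarios from the comment above (not real grant figures).
def person_years(team_size, years):
    """Person-years of full-time work a grant buys."""
    return team_size * years

bay_person_years = person_years(team_size=3, years=0.5)       # 1.5
blackpool_person_years = person_years(team_size=6, years=3)   # 18.0

# Person-years of work bought per dollar, Blackpool relative to the Bay:
# (18 / $50k) divided by (1.5 / $150k)
multiplier = (blackpool_person_years * 150) / (bay_person_years * 50)
print(multiplier)  # 36.0, i.e. 6x duration * 2x people * 3x as many grants
```

The 36× figure compounds three independent ratios, which is why it looks surprisingly large even though each individual ratio is modest.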

Instead, the hotel could become a hub for everyone who doesn't study at a university or work on a project that EA donors find worth funding, i.e. the hotel would mainly support work that the EA community as a whole would view as lower-quality.

I'm pretty sure EA donors find many projects net-positive even if they don't find them worth funding. For the same reason that I'd buy a car if I could afford one. Does that mean I find cars lower-quality than my bicycle? Nope.

Imo it's a very simple equation. EAs need money to live, so they trade (waste) a major slice of their time on ineffective endeavours in exchange for money. We can take away those needs for <10% of the cost, effectively making a large number of people go from part-time to full-time EA. Assuming that the distribution of EA effectiveness isn't too steeply unequal (i.e. there are still effective EAs out there), this intervention is the most effective I've seen thus far.

Comment author: Greg_Colbourn 07 June 2018 05:49:15PM 1 point [-]

maybe this post was just shared much more on social media.

I see Facebook and Twitter share buttons at the bottom of the post (but only when I load the page on my phone). They currently show the numbers 174 and 18 next to them. That seems like an excessive number of Facebook shares, surely it can't be right? (I've only seen, and been tagged in, one share, in any case.) Clicking on the numbers provides no info as to where the shares went; it just brings up a share window, and also increments the counter. So that may explain a lot about why the numbers are so high: people wanting to see where all these shares went were only adding to the false count.

Comment author: Greg_Colbourn 07 June 2018 05:48:58PM *  1 point [-]

If they're students, they'll most likely be studying at a university outside Blackpool and might not be able to do so remotely.

Regarding studying, it would mainly be suitable for those doing so independently online (it's possible to take many world-class courses on EdX and Coursera for free). But it could also be of use to university students outside of term time (say, to do extra classes online, or an independent research project, over the summer).

they'll very likely be able to get funding from an EA donor

As John Maxwell says, I don’t think we are there yet with current seed funding options.

the hotel would mainly support work that the EA community as a whole would view as lower-quality

This might indeed be so, but given the much lower costs it’s possible that the quality-adjusted-work-per-£-spent rate could still be equal to - or higher than - the community average.

.. without the leadership and presence of highly experienced EAs (who work there as e.g. hotel managers / trustees).

I think it’s important to have experienced EAs in these positions for this reason.

Regarding “bad” EA projects, only one comes to mind, and it doesn’t seem to have caused much lasting damage. In the OP, I say that the “dynamics of status and prestige in the non-profit world seem to be geared toward being averse to risk-of-failure to a much greater extent than in the for-profit world (see e.g. the high rate of failure for VC funded start-ups). Perhaps we need to close this gap, considering that the bottom line results of EA activity are often considered in terms of expected utility.” Are PR concerns a solid justification for this discrepancy between EA and VC? Or do Spencer Greenberg’s concerns about start-ups mean that EA is right in this regard and it’s VC that is wrong (even in terms of their approach to maximising monetary value)?

the enthusiasm for this project may be partly driven by social reasons

There’s nothing wrong with this, as long as people participating at the hotel for largely social reasons pay their own way (and don’t disrupt others’ work).

Comment author: vollmer 08 June 2018 04:30:16PM *  4 points [-]

Regarding “bad” EA projects, only one comes to mind, and it doesn’t seem to have caused much lasting damage. In the OP, I say that the “dynamics of status and prestige in the non-profit world seem to be geared toward being averse to risk-of-failure to a much greater extent than in the for-profit world (see e.g. the high rate of failure for VC funded start-ups). Perhaps we need to close this gap, considering that the bottom line results of EA activity are often considered in terms of expected utility.” Are PR concerns a solid justification for this discrepancy between EA and VC? Or do Spencer Greenberg’s concerns about start-ups mean that EA is right in this regard and it’s VC that is wrong (even in terms of their approach to maximising monetary value)?

Just wanted to flag that I disagree with this for a number of reasons. E.g. I think some of EAF's sub-projects probably had negative impact, and I'm skeptical that these plus InIn were the only ones. I might write an EA forum post about how EA projects can have negative impacts at some point but it's not my current priority. See also this facebook comment for some of the ideas.

Regarding your last point, VCs are maximizing their own profit, but Spencer talks about externalities.

Comment author: Liam_Donovan 10 June 2018 06:13:10AM *  1 point [-]

Following on from vollmer's point, it might be reasonable to have a blanket rule against policy/PR/political/etc. work: anything that is irreversible and difficult to evaluate. "Not being able to get funding from other sources" is definitely a negative signal, so it seems worthwhile to restrict guests to projects whose worst possible outcome is unproductively diverting resources.

On the other hand, I really can't imagine what harm research projects could do. I guess the worst-case scenario is someone so persuasive they can convince lots of EAs of their ideas, but so bad at research that their ideas are all wrong, which doesn't seem very likely. (Why not "malicious & persuasive people"? The community can probably identify those more easily by the subjects they write about.)

Furthermore, guests' ability to engage in negative-EV projects will be constrained by the low stipend and terrible location (if I wanted to engage in Irish republican activism, living at the EA hotel wouldn't help very much). I think the largest danger to be alert for is reputation risk, especially from bad popularizations of EA, since this is easier to do remotely (one example is Intentional Insights, the only negative-EV EA project I know of)

Comment author: MichaelPlant 11 June 2018 12:04:48PM 2 points [-]

irreversible and difficult to evaluate

This basically applies to everything as a matter of degree, so it looks impossible to put in a blanket rule. Suppose I raise £10 and send it to AMF. That's irreversible. Is it difficult to evaluate? Depends what you mean by 'difficult' and what the comparison class is.

Comment author: vollmer 10 June 2018 04:26:08PM 1 point [-]

I agree research projects are more robustly positive. Information hazards are one main way in which they could do a significant amount of harm.