John_Maxwell_IV comments on EA Hotel with free accommodation and board for two years - Effective Altruism Forum

Comment author: John_Maxwell_IV 07 June 2018 09:19:36AM 7 points

If they're launching a new great project, they'll very likely be able to get funding from an EA donor

EA Grants rejected 95% of the applications they got.

Comment author: vollmer 08 June 2018 04:23:41PM 2 points

Sure, but an EA hotel seems like a weird way to address this inefficiency: only a few people with worthwhile projects can move to Blackpool to benefit from it, the funding is not flexible, it's hard to target well, the project has some time lag, etc. The most reasonable approach to fixing this is simply to give more money to some of the projects that didn't get funded.

Maybe CEA will accept 20-30% of EA Grants applications in the next round, or other donors will jump in to fill the gaps. (I'd expect that a lot of the grants applications, maybe half, were submitted by people not really familiar with EA, and that some of the others weren't worth funding.)

Comment author: John_Maxwell_IV 09 June 2018 12:00:47AM 3 points

[Disclosure: I'm planning to move to Blackpool before the end of this month.]

only a few people with worthwhile projects can move to Blackpool to benefit from it, the funding is not flexible

If you're working on a project full-time, there's a good chance you're not location-constrained.

the project has some time lag

I'm not sure what you're referring to.

Over 3 months passed between the EA grants announcement and disbursement. Does that count as "time lag"?

The disadvantages you cite don't seem compelling to me alongside the advantages cited in this post: dramatically lower costs, a supportive EA community, etc. Yes, it's not a great fit for every project, but if you're offered a bargain on supporting one project, it seems silly to refuse it just because you weren't offered a bargain on supporting some other project.

I think maybe our core disagreement is that you believe the bottom 95% of EA projects are risky and we should discourage people from funding them. Does that sound like an accurate summary of your beliefs?

I've written some about why I want to see more discussion of downside risk (and I'm compiling notes for several longer posts; maybe if I were living in Blackpool I'd have enough time to write them). However, the position that we should discourage funding the bottom 95% of projects seems really extreme to me, and it also seems like a really crude way to address downside risk.

Even if there is some downside risk from any given project, the expected value is probably positive, solely based on the fact that some EA thinks it's a good idea. Value of information is another consideration in favor of taking action, especially doing research (I'm guessing most people who move to Blackpool will want to do research of some sort).

As for a good way to decrease downside risk, I would like to see a lot more people do what Joey does in this post and ask people to provide the best arguments they can against their project (or maybe ask a question on Metaculus).

The issue with downside risk is not that the world is currently on a wonderful trajectory that must not be disturbed. Rather, any given disturbance is liable to have effects that are hard to predict, but if we spent more time thinking and did more research, maybe we could get a bit better at predicting these effects. Loss aversion will cause us to overweight the possibility of a downside, but if we're trying to maximize expected value then we should weight losses and gains of the same size equally. I'm willing to believe highly experienced EAs are better at thinking about downside risk, but I don't think the advantage is overwhelming, and I suspect being a highly experienced EA can create its own set of blind spots. I am definitely skeptical that CEA can reliably pick the 33 highest-impact projects from a list of 722. Even experienced investors miss winners, and CEA is an inexperienced charitable "investor".

Your example "the first political protest for AI regulation might in expectation do more harm than a thoughtful AI policy project could prevent" is one I agree with, and maybe it illustrates a more general principle that public relations efforts should be done in coordination with major EA organizations. Perhaps it makes sense to think about downside risk differently depending on the kind of work someone is interested in doing.

BTW, if the opinion of experienced EAs is considered reliable, maybe it's worth noting that 80k advocated something like this a few years ago, and ideas like it have been floating around EA for a long time.

Comment author: Liam_Donovan 10 June 2018 06:23:40AM 3 points

I suspect Greg (or whoever manages the hotel) would not be able to filter projects particularly well based on personal interviews. Since the point of the hotel is basically "hits-based giving", I think a blanket ban on irreversible projects is more useful (and would satisfy most of the concerns in the Facebook comment vollmer linked).