In response to Open Thread #40
Comment author: vollmer 09 July 2018 06:50:16AM *  2 points [-]

Side note: I'd encourage commenters to put a title at the top of their comments (maybe this can be done in the OP).

Comment author: remmelt  (EA Profile) 04 July 2018 03:56:43PM *  2 points [-]

Hmm, I can’t think of a clear alternative to ‘V2ADC’ yet. Perhaps ‘decision chain’?

Comment author: vollmer 04 July 2018 05:06:10PM 2 points [-]

Yeah, that sounds great. Decision chain (abbreviated "DC").

Comment author: vollmer 04 July 2018 02:13:17PM 2 points [-]

I like the model a lot, thanks for posting!

One input: I think it could be useful to find a term for it that's easier to memorize (and has a shorter abbreviation).

Comment author: John_Maxwell_IV 20 June 2018 07:51:42AM *  10 points [-]

It looks like there is another abandoned wiki here, which was moved to here (but the latter link appears to be overrun with spam). Less than 2 years ago, angelinahli started what looks like a really solid attempt to do something kinda similar to what Marcus & Peter appear to be doing here.

I wish there was some way to solve the meta-level problem of projects like this getting forgotten. If you zoom out a bit, this category appears quite crowded. It seems that attempts to prevent things from getting lost in the sands of time may also get lost in the sands of time!

Maybe it's not an issue of fragmentation so much as lack of discoverability. Perhaps what's needed is to persuade people running the EA forum/EA Facebook Group to link to the directory resource from the forum intro/FB group description. If it's known that a wiki has readers, maybe it will be easier to attract writers. Article view counts might be helpful? (Chicken-and-egg problem: For readers, you need writers, but for writers, you need readers.)

Zooming out, I suspect part of the issue is it feels scummy to promote your own work, so good work often ends up gathering dust--even if the purpose of that work was to help prevent other work from gathering dust! I think people should get over their aversion to promoting their own work some, and also try to promote the work of others more often. Then maybe we can have a wiki which reaches "meme status".

Comment author: vollmer 04 July 2018 01:36:26PM 1 point [-]

+1 for merging/coordinating these various platforms more.

The EA concepts platform is also somewhat similar. It could be important to think about how these projects relate to one another.

Comment author: toonalfrink 18 June 2018 01:53:40PM 3 points [-]

Hi Vollmer, appreciate your criticism. Upvoted for that.

While it's really impressive how low the rent at the hotel will be, rent cost is rarely a major reason for a project's funding constraints

Do you realise that the figure cited (3-4k a year) isn't rent cost? It's total living cost. At least in my case, that's a quarter of what I'm currently running on, and I'm pretty cheap. For others the difference might be much larger.

For example, a project might have a genuinely high-impact idea that doesn't depend on location. Instead of receiving $150k from CEA to run for half a year in the Bay with 3 people, they could receive $50k and run for 3 years in Blackpool with 6 people. CEA could then fund 3 times as many projects, and its impact would effectively stretch 3 × 6 × 2 = 36 times further. Coming from that perspective, staying in the world's most expensive cities is simply not justifiable, at least for projects (coding, research, etc.) that wouldn't get an even stronger multiplier from being on location. And this isn't just speculation: I know at least one project that is most likely moving their team to the EA hotel.
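The arithmetic behind the 36× figure can be sketched as follows (the dollar amounts and headcounts come from the comment above; the decomposition into three ratios is my reading of it):

```python
# Figures from the comment: $150k for 0.5 years with 3 people in the Bay,
# versus $50k for 3 years with 6 people in Blackpool.
bay_cost, bay_years, bay_people = 150_000, 0.5, 3
hotel_cost, hotel_years, hotel_people = 50_000, 3, 6

cost_ratio = bay_cost / hotel_cost        # 3x cheaper
duration_ratio = hotel_years / bay_years  # 6x longer runway
people_ratio = hotel_people / bay_people  # 2x as many people

# Person-years of work per dollar stretches by the product of the ratios.
stretch = cost_ratio * duration_ratio * people_ratio
print(stretch)  # 36.0
```

Equivalently: the Bay option buys 1.5 person-years for $150k, the Blackpool option 18 person-years for $50k, and the ratio of person-years per dollar between the two is 36.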

Instead, the hotel could become a hub for everyone who doesn't study at a university or work on a project that EA donors find worth funding, i.e. the hotel would mainly support work that the EA community as a whole would view as lower-quality.

I'm pretty sure EA donors find many projects net-positive even if they don't find them worth funding. For the same reason, I'd buy a car if I could afford one. Does that mean I find cars lower-quality than my bicycle? Nope.

Imo it's a very simple equation. EAs need money to live, so they trade (waste) a major slice of their resources on ineffective endeavours in exchange for money. We can take away those needs for <10% of the cost, effectively making a large number of people go from part-time to full-time EA. Assuming the distribution of EA effectiveness isn't too steeply unequal (i.e. there are still effective EAs out there), this intervention is the most effective I've seen thus far.

Comment author: vollmer 25 June 2018 07:43:21AM *  1 point [-]

Do you realise that the figure cited (3-4k a year) isn't rent cost? It's total living cost. At least in my case, that's a quarter of what I'm currently running on, and I'm pretty cheap. For others the difference might be much larger.

Yes, I do. But in times when talent is a bigger constraint than funding, I'd rather create $100k worth of impact at a financial cost of $25k than $50k at a cost of $4k. Often, interacting in person with specific people in specific places (often in major hubs) will enable you to increase your impact substantially. This isn't true for everyone, and not always, but it will often be the case, even for coding/research projects. E.g. it's commonly accepted wisdom that for-profit (coding) startups can increase their value substantially by moving to the Bay, and individual programmers can increase their salaries by more than the higher living cost by moving there. Similar things might apply to EA projects in Oxford / London / Berkeley / San Francisco.

So the potential benefits of the EA hotel might be somewhat limited, and there might also be some costs / harms (as I mentioned in the other comments).

Comment author: Liam_Donovan 10 June 2018 06:13:10AM *  3 points [-]

Following on vollmer's point, it might be reasonable to have a blanket rule against policy/PR/political/etc work -- anything that is irreversible and difficult to evaluate. "Not being able to get funding from other sources" is definitely a negative signal, so it seems worthwhile to restrict guests to projects whose worst possible outcome is unproductively diverting resources.

On the other hand, I really can't imagine what harm research projects could do; I guess the worst-case scenario is someone so persuasive they can convince lots of EAs of their ideas, but so bad at research that their ideas are all wrong, which doesn't seem very likely. (Why not 'malicious & persuasive people'? The community can probably identify those more easily by the subjects they write about.)

Furthermore, guests' ability to engage in negative-EV projects will be constrained by the low stipend and out-of-the-way location (if I wanted to engage in Irish republican activism, living at the EA hotel wouldn't help very much). I think the largest danger to be alert for is reputational risk, especially from bad popularizations of EA, since this is easier to do remotely (one example is Intentional Insights, the only negative-EV EA project I know of).

Comment author: vollmer 10 June 2018 04:26:08PM 2 points [-]

I agree research projects are more robustly positive. Information hazards are one main way in which they could do a significant amount of harm.

Comment author: Greg_Colbourn 07 June 2018 05:48:58PM *  1 point [-]

If they're students, they'll most likely be studying at a university outside Blackpool and might not be able to do so remotely.

Regarding studying, it would mainly be suitable for those doing so independently online (it’s possible to take many world class courses on EdX and Coursera for free). But could also be of use to university students outside of term time (say to do extra classes online, or an independent research project, over the summer).

they'll very likely be able to get funding from an EA donor

As John Maxwell says, I don’t think we are there yet with current seed funding options.

the hotel would mainly support work that the EA community as a whole would view as lower-quality

This might indeed be so, but given the much lower costs it's possible that the quality-adjusted-work-per-£-spent rate could still be equal to, or higher than, the community average.
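As a toy illustration of this quality-adjusted-work-per-£ comparison (all numbers below are invented for illustration, not taken from the post):

```python
# Hypothetical comparison: work of community-average quality at typical hub
# costs, versus lower-quality work at the hotel's living costs.
def work_per_pound(quality, annual_cost):
    """Quality-adjusted work produced per pound spent."""
    return quality / annual_cost

hub = work_per_pound(quality=1.0, annual_cost=25_000)   # average quality, hub costs
hotel = work_per_pound(quality=0.3, annual_cost=4_000)  # 30% of average quality, hotel costs

# Even at 30% of average quality, the ~6x cost saving can leave the hotel
# ahead on a per-pound basis: 0.3/4000 = 7.5e-5 vs 1.0/25000 = 4e-5.
print(hotel > hub)  # True
```

The point of the sketch is just that "lower-quality on average" and "worse value per pound" come apart once costs differ by a large enough factor.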

.. without the leadership and presence of highly experienced EAs (who work there as e.g. hotel managers / trustees).

I think it’s important to have experienced EAs in these positions for this reason.

Regarding “bad” EA projects, only one comes to mind, and it doesn’t seem to have caused much lasting damage. In the OP, I say that the “dynamics of status and prestige in the non-profit world seem to be geared toward being averse to risk-of-failure to a much greater extent than in the for-profit world (see e.g. the high rate of failure for VC-funded start-ups). Perhaps we need to close this gap, considering that the bottom-line results of EA activity are often considered in terms of expected utility.” Are PR concerns a solid justification for this discrepancy between EA and VC? Or do Spencer Greenberg’s concerns about start-ups mean that EA is right in this regard and it’s VC that is wrong (even in terms of their approach to maximising monetary value)?

the enthusiasm for this project may be partly driven by social reasons

There’s nothing wrong with this, as long as people participating at the hotel for largely social reasons pay their own way (and don’t disrupt others’ work).

Comment author: vollmer 08 June 2018 04:30:16PM *  4 points [-]

Regarding “bad” EA projects, only one comes to mind, and it doesn’t seem to have caused much lasting damage. In the OP, I say that the “dynamics of status and prestige in the non-profit world seem to be geared toward being averse to risk-of-failure to a much greater extent than in the for-profit world (see e.g. the high rate of failure for VC-funded start-ups). Perhaps we need to close this gap, considering that the bottom-line results of EA activity are often considered in terms of expected utility.” Are PR concerns a solid justification for this discrepancy between EA and VC? Or do Spencer Greenberg’s concerns about start-ups mean that EA is right in this regard and it’s VC that is wrong (even in terms of their approach to maximising monetary value)?

Just wanted to flag that I disagree with this for a number of reasons. E.g. I think some of EAF's sub-projects probably had negative impact, and I'm skeptical that these plus InIn were the only ones. I might write an EA forum post about how EA projects can have negative impacts at some point but it's not my current priority. See also this facebook comment for some of the ideas.

Regarding your last point, VCs are maximizing their own profit, but Spencer talks about externalities.

Comment author: John_Maxwell_IV 07 June 2018 09:19:36AM 8 points [-]

If they're launching a new great project, they'll very likely be able to get funding from an EA donor

EA Grants rejected 95% of the applications they got.

Comment author: vollmer 08 June 2018 04:23:41PM *  2 points [-]

Sure, but an EA hotel seems like a weird way to address this inefficiency: only a few people with worthwhile projects can move to Blackpool to benefit from it, the funding is not flexible, it's hard to target this well, the project has some time lag, etc. The most reasonable approach to fixing this is simply to give more money to some of the projects that didn't get funded.

Maybe CEA will accept 20-30% of EA Grants applications in the next round, or other donors will jump in to fill the gaps. (I'd expect that a lot of the grants applications (maybe half) might have been submitted by people not really familiar with EA, and some of the others weren't worth funding.)

Comment author: MichaelPlant 06 June 2018 05:49:40PM 0 points [-]

Furthermore, people have repeatedly brought up the argument that the first "bad" EA project in each area can do more harm than an additional "good" EA project, especially if you consider tail risks, and I think this is more likely to be true than not. E.g. the first political protest for AI regulation might in expectation do more harm than a thoughtful AI policy project could prevent. This provides a reason for EAs to be risk-averse. (Specifically, I tentatively disagree with your claims that "we’re probably at the point where there are more false negatives than false positives, so more chances can be taken on people at the low end", and that we should invest "a small amount".) Related: Spencer Greenberg's idea that plenty of startups cause harm.

I thought this was pretty vague and abstract. You should say why you expect this particular project to suck!

It seems plausible that most EAs who do valuable work won't be able to benefit from this. If they're students, they'll most likely be studying at a university outside Blackpool and might not be able to do so remotely

I also wonder what the target market is. EAs doing remote work? EAs who need really cheap accommodation for a certain period?

Comment author: vollmer 06 June 2018 09:16:44PM 3 points [-]

I thought this was pretty vague and abstract. You should say why you expect this particular project to suck!

I wasn't making a point about this particular project, but about all the projects this particular project would help.

Comment author: vollmer 06 June 2018 04:56:20PM *  11 points [-]

First, big kudos for your strong commitment to put your personal funding into this, and for the guts and drive to actually make it happen!

That said, my overall feelings about the project are mixed, mainly for the following reasons (which you also partly discuss in your post):

It seems plausible that most EAs who do valuable work won't be able to benefit from this. If they're students, they'll most likely be studying at a university outside Blackpool and might not be able to do so remotely. If they're launching a new great project, they'll very likely be able to get funding from an EA donor, and there will be major benefits from being in a big city or existing hub such as Oxford, London, or the Bay (so donors should be enthusiastic about covering the living costs of these places). While it's really impressive how low the rent at the hotel will be, rent cost is rarely a major reason for a project's funding constraints (at least outside the SF Bay Area).

Instead, the hotel could become a hub for everyone who doesn't study at a university or work on a project that EA donors find worth funding, i.e. the hotel would mainly support work that the EA community as a whole would view as lower-quality. I'm not saying I'm confident this will happen, but I think the chance is non-trivial without the leadership and presence of highly experienced EAs (who work there as e.g. hotel managers / trustees).

Furthermore, people have repeatedly brought up the argument that the first "bad" EA project in each area can do more harm than an additional "good" EA project, especially if you consider tail risks, and I think this is more likely to be true than not. E.g. the first political protest for AI regulation might in expectation do more harm than a thoughtful AI policy project could prevent. This provides a reason for EAs to be risk-averse. (Specifically, I tentatively disagree with your claims that "we’re probably at the point where there are more false negatives than false positives, so more chances can be taken on people at the low end", and that we should invest "a small amount".) Related: Spencer Greenberg's idea that plenty of startups cause harm.

The fact that this post got way more upvotes than other projects that are similarly exciting in my view (such as Charity Entrepreneurship) also makes me think that the enthusiasm for this project may be partly driven by social reasons (it feels great to have a community hotel hub with likeminded people) as opposed to people's impact assessments. But maybe there's something I'm overlooking, e.g. maybe this post was just shared much more on social media.

What happens if you concentrate a group of EAs who wouldn't get much funding from the broader community in one place and help them work together? I don't know. It could be very positive or very negative. Or it just couldn't lead to much at all. Overall, I think it may not be worth the downside risks.
