In response to Open Thread #40
Comment author: RandomEA 09 July 2018 12:34:23AM *  5 points

Requesting Help for a Compilation of Top EA Facebook Posts

In December 2015, Claire Zabel posted links to all posts in the EA Facebook group with 50 or more likes or comments. I think it's time for a similar post. From what I understand, the most liked and most commented-on posts can be found using the "My groups dashboard" feature on Facebook. Unfortunately, I do not have a Facebook account, so I am posting in this thread to request that someone who does have one post the most liked and most commented-on posts as a reply to this comment. I can then go through each of them and extract the key information (see the format below) so people can see whether there are any they want to read without clicking through every single one. I would then post this information as its own forum post. Alternatively, you could do the extracting yourself and post the result as your own forum post.

Format

  • Author: Initials are used to prevent future employers from easily associating the post with the author (unless the person is a prominent EA who is likely to remain in EA, in which case the full name is used).
  • Year: This can give people context as various ideas have become more or less accepted over time.
  • Text: If the full text is too long, an excerpt is chosen that encapsulates the post.
  • URL: This allows people to read the post for themselves.
  • Link Title: This helps people decide whether to click on the link.
  • Link Author: This is included when the identity of the author is relevant (generally only when the author is an EA).
  • Link URL: This allows people to go directly to the link without having to go to the post first.

You can see examples of this formatting below.

Posts with the Most Likes as of December 2015 (based on Claire Zabel's comment)

1)

2)

3)

4)

5)

Posts with the Most Comments as of December 2015 (based on Claire Zabel's comment)

1) Unable to access

2)

3)

4)

5)

In response to comment by RandomEA on Open Thread #40
Comment author: MichaelPlant 09 July 2018 09:39:58AM 1 point

It seems you need the Grytics tool to do this; I can't work out how to do it in Facebook itself. I would also be interested to see this.

Comment author: Flodorner 25 June 2018 07:25:24AM 2 points

"to prove this argument I would have to present general information which may be regarded as having informational hazard"

Is there any way to assess the credibility of statements like this (or whether this is actually an argument worth considering in a given specific context)? It seems like you could use this as a general-purpose argument for almost anything.

Comment author: MichaelPlant 25 June 2018 09:50:20AM *  1 point

"to prove this argument I would have to present general information which may be regarded as having informational hazard"

I agree statements of this kind are very annoying, whether or not they're true.

Comment author: saulius 12 June 2018 02:14:41PM *  1 point

1) >"The numbers on how useful things are seem quite low to me..."

On the scale, 1 was "Useless" and 10 was "Life-transforming". But just before asking for feedback, I changed the slides and attached these monetary meanings to the ratings of the events:

"3 - £100, 5 - £1,000, 8 - £10,000, 10 - £100,000 (e. g. career change)"

I explained it to people as well. This was... not smart. Because of this, some respondents gave low scores to all the events. E.g. someone said that the weekend was "Far more valuable (10-30x the counterfactual)" but did not give any event a rating higher than 4. Others ignored the monetary anchors and gave high ratings to all events.

That's why I weighted and normalised the ratings. If someone said that the weekend was "Vastly more valuable (>30x counterfactual)", I multiplied all their ratings by a constant so that their highest rating would be 10. If they rated the weekend as "Far more valuable (10-30x the counterfactual)", I multiplied all their ratings so that the highest rating would be 9. 8 for "Much more valuable", 7 for "Somewhat more valuable", and 6 for "About as valuable".
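A minimal Python sketch of that rescaling, using the cap values from the paragraph above (the function name and example numbers are hypothetical, not from the original analysis):

    # Map the overall weekend rating to the cap for that respondent's
    # highest per-event rating (values taken from the comment above).
    CAPS = {
        "Vastly more valuable (>30x counterfactual)": 10,
        "Far more valuable (10-30x the counterfactual)": 9,
        "Much more valuable": 8,
        "Somewhat more valuable": 7,
        "About as valuable": 6,
    }

    def normalise(event_ratings, weekend_rating):
        """Rescale one respondent's event ratings so that their highest
        rating equals the cap implied by their overall weekend rating."""
        factor = CAPS[weekend_rating] / max(event_ratings)
        return [r * factor for r in event_ratings]

    # E.g. a respondent who rated the weekend "Far more valuable" but gave
    # no event more than a 4: all their ratings are scaled so the max is 9.
    print(normalise([2, 3, 4], "Far more valuable (10-30x the counterfactual)"))
    # [4.5, 6.75, 9.0]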

Comment author: MichaelPlant 12 June 2018 03:03:06PM 0 points

Yeah, I thought the ends of the scales might have been more extreme than we'd normally use. It's probably quite hard to get people to sensibly answer unfamiliar, tricky questions.

Comment author: MichaelPlant 12 June 2018 01:43:12PM 4 points

Thanks for writing this up. Three questions:

The numbers on how useful things are seem quite low to me. What did you write as the ends of the scale? I'm thinking in terms of net promoter scores, where anything below a 9 or a 10 is considered neutral or bad (a quick sketch of that calculation follows these questions).

Can you explain Hamming circles? I couldn't find out how they work even after a quick Google search.

Did you ask people if there was anything they wanted to do on the weekend but didn't do? I'd be curious to see if people came up with anything.
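For reference, a minimal sketch of the net promoter score convention mentioned in the first question, assuming 0-10 ratings (the example ratings are hypothetical):

    def nps(scores):
        """Net promoter score: % promoters (9-10) minus % detractors (0-6);
        7s and 8s count as passives and contribute nothing."""
        promoters = sum(s >= 9 for s in scores)
        detractors = sum(s <= 6 for s in scores)
        return 100 * (promoters - detractors) / len(scores)

    # Hypothetical ratings averaging ~7.3 still produce a negative score,
    # which is why anything below a 9 or 10 reads as neutral or bad.
    print(nps([8, 7, 9, 6, 6, 8]))  # ≈ -16.7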

Comment author: Liam_Donovan 10 June 2018 06:13:10AM *  3 points

Following on from vollmer's point, it might be reasonable to have a blanket rule against policy/PR/political/etc. work -- anything that is irreversible and difficult to evaluate. "Not being able to get funding from other sources" is definitely a negative signal, so it seems worthwhile to restrict guests to projects whose worst possible outcome is unproductively diverting resources.

On the other hand, I really can't imagine what harm research projects could do; I guess the worst-case scenario is someone so persuasive that they convince lots of EAs of their ideas but so bad at research that their ideas are all wrong, which doesn't seem very likely. (Why not 'malicious & persuasive people'? The community can probably identify those more easily by the subjects they write about.)

Furthermore, guests' ability to engage in negative-EV projects will be constrained by the low stipend and terrible location (if I wanted to engage in Irish republican activism, living at the EA hotel wouldn't help very much). I think the largest danger to be alert for is reputational risk, especially from bad popularizations of EA, since this is easier to do remotely (one example is Intentional Insights, the only negative-EV EA project I know of).

Comment author: MichaelPlant 11 June 2018 12:04:48PM 2 points

irreversible and difficult to evaluate

This basically applies to everything as a matter of degree, so it looks impossible to turn it into a blanket rule. Suppose I raise £10 and send it to AMF. That's irreversible. Is it difficult to evaluate? That depends on what you mean by 'difficult' and what the comparison class is.

Comment author: Joey 06 June 2018 05:42:31PM 3 points

I expect ~10 people to attend the camp, although I do not expect 100% of them to start charities (I would guess ~60% would). Of the charities founded, I expect about 50% to become GiveWell-incubated/ACE-recommended, although it would depend on the year and focus.

Comment author: MichaelPlant 06 June 2018 07:09:00PM 1 point

I expect ~10 people to attend the camp, although I do not expect 100% of them to start charities (I would guess ~60% would)

So you mean you expect 6 different charities to start, or that 6 people will be involved in starting a charity, possibly the same one(s)?

Comment author: MichaelPlant 06 June 2018 05:51:33PM 0 points

A potential spanner: how would you restrict this to EAs? Is that legal? I doubt you can refuse service to people on the basis of what would be considered an irrelevant characteristic. Analogy: could you have a hotel only for people of a certain race or sex?

Comment author: vollmer 06 June 2018 04:56:20PM *  11 points

First, big kudos for your strong commitment to putting your personal funding into this, and for the guts and drive to actually make it happen!

That said, my overall feelings about the project are mixed, mainly for the following reasons (which you also partly discuss in your post):

It seems plausible that most EAs who do valuable work won't be able to benefit from this. If they're students, they'll most likely be studying at a university outside Blackpool and might not be able to do so remotely. If they're launching a great new project, they'll very likely be able to get funding from an EA donor, and there will be major benefits from being in a big city or existing hub such as Oxford, London, or the Bay (so donors should be enthusiastic about covering the living costs of those places). While it's really impressive how low the rent at the hotel will be, rent is rarely a major driver of a project's funding constraints (at least outside the SF Bay Area).

Instead, the hotel could become a hub for everyone who doesn't study at a university or work on a project that EA donors find worth funding, i.e. the hotel would mainly support work that the EA community as a whole would view as lower-quality. I'm not saying I'm confident this will happen, but I think the chance is non-trivial without the leadership and presence of highly experienced EAs (who work there as e.g. hotel managers / trustees).

Furthermore, people have repeatedly brought up the argument that the first "bad" EA project in each area can do more harm than an additional "good" EA project can do good, especially if you consider tail risks, and I think this is more likely to be true than not. E.g. the first political protest for AI regulation might in expectation do more harm than a thoughtful AI policy project could prevent. This provides a reason for EAs to be risk-averse. (Specifically, I tentatively disagree with your claims that "we’re probably at the point where there are more false negatives than false positives, so more chances can be taken on people at the low end", and that we should invest "a small amount".) Related: Spencer Greenberg's idea that plenty of startups cause harm.

The fact that this post got way more upvotes than posts about projects that are similarly exciting in my view (such as Charity Entrepreneurship) also makes me think that the enthusiasm for this project may be partly driven by social reasons (it feels great to have a community hotel hub with like-minded people) as opposed to people's impact assessments. But maybe there's something I'm overlooking, e.g. maybe this post was just shared much more on social media.

What happens if you concentrate a group of EAs who wouldn't get much funding from the broader community in one place and help them work together? I don't know. It could be very positive or very negative. Or it might not lead to much at all. Overall, I think it may not be worth the downside risks.

Comment author: MichaelPlant 06 June 2018 05:49:40PM 0 points

Furthermore, people have repeatedly brought up the argument that the first "bad" EA project in each area can do more harm than an additional "good" EA project can do good, especially if you consider tail risks, and I think this is more likely to be true than not. E.g. the first political protest for AI regulation might in expectation do more harm than a thoughtful AI policy project could prevent. This provides a reason for EAs to be risk-averse. (Specifically, I tentatively disagree with your claims that "we’re probably at the point where there are more false negatives than false positives, so more chances can be taken on people at the low end", and that we should invest "a small amount".) Related: Spencer Greenberg's idea that plenty of startups cause harm.

I thought this was pretty vague and abstract. You should say why you expect this particular project to suck!

It seems plausible that most EAs who do valuable work won't be able to benefit from this. If they're students, they'll most likely be studying at a university outside Blackpool and might not be able to do so remotely

I also wonder what the target market is. EAs doing remote work? EAs needing really cheap accommodation for a certain period?

Comment author: Greg_Colbourn 05 June 2018 11:05:44PM 6 points

Maybe I should stress more the fact that the Hotel Manager will get to hang out with loads of cool EAs and make them happy (the number of cool EAs, and their happiness, being somewhat correlated with how good a job they do as Manager). £20k is not bad for Blackpool. And given they also get free accommodation and board, they should have quite a bit left over to save/donate.

Comment author: MichaelPlant 06 June 2018 05:47:11PM 2 points

I sympathise with Gregory (Lewis') point about it not being an attractive role for an EA. It might work better if billed as a short-duration role, possibly for someone who wants to develop operational experience before moving on to another EA org.

Comment author: MichaelPlant 04 June 2018 09:59:47PM 9 points

Yeah, this is really cool; good work on this. £45 a month... It's just crazy that you can buy a 17-room hotel in Blackpool for 1/3 of the price of a 2-bed flat in London.
