Comment author: rohinmshah 30 October 2017 12:45:33AM 1 point

As one data point, I did not have this association with "impressions" vs. "beliefs", even though I do in fact distinguish between these two kinds of credences and often report both (usually with a long clunky explanation since I don't know of good terminology for it).

Comment author: vipulnaik 30 October 2017 01:00:23AM 6 points

The comments on "Naming Beliefs" by Robin Hanson (2008) appear to be where the consensus around the impressions/beliefs distinction began to form (the commenters include such movers and shakers as Eliezer Yudkowsky and Anna Salamon).

Also, "impression track records" by Katja (September 2017) is a recent blog post that circulated in the rationalist community and revived the terminology.

Comment author: Ben_Kuhn 22 August 2015 01:24:04AM 2 points

Empirically, discussions of diversity here do seem to be doing a lot better than the ones on the FB group. (I'm thinking particularly of this thread and AGB's post from a while ago.)

Comment author: vipulnaik 29 October 2017 04:26:09PM 3 points

Do you still think so, in light of the heated discussion in the comments at http://effective-altruism.com/ea/1g3/why_how_to_make_progress_on_diversity_inclusion/ ?

Comment author: DonyChristie 28 October 2017 01:42:18AM * 4 points

That is awesome and exciting!

What made you decide to go down this path? What decision-making procedure was used? How would you advise other people to determine whether they are a good fit for charity entrepreneurship?

How do you plan on overcoming the lack of expertise? How does the reference class of nonprofit startups founded by non-experts compare to the reference class of nonprofit startups founded by experts?

"""fortify hEAlth"""

Is this the actual name? I personally think it's cute, but it might be confusing to those not familiar with the acronym.

I think what you're doing could be very high-impact compared to the counterfactual; indeed, it may be outright heroic. ^_^

Comment author: vipulnaik 28 October 2017 03:35:17AM 6 points

Against Malaria Foundation was started by a guy who had some business and marketing experience but no global health chops. It is now a GiveWell top charity:

https://issarice.com/against-malaria-foundation

https://timelines.issarice.com/wiki/Timeline_of_Against_Malaria_Foundation

Disclosure: I funded the creation of the latter page, which inspired the creation of the former.

Comment author: MichaelPlant 27 October 2017 10:29:09AM 2 points

I'm really not sure why my comment was so heavily downvoted without explanation. I'm assuming people think discussion of inclusion issues is a terrible idea. Assuming that is what I've been downvoted for, that makes me feel disappointed in the online EA community and increases my belief this is a problem.

"""I tried to avoid things that have already been discussed heavily and publicly in the community"""

I think this may be part of the problem in this context. Some EAs seem to take the attitude (I'm exaggerating a bit for effect) that if there was a post on the internet about it once, it's been discussed. This itself is pretty unwelcoming and exclusive, and it penalises people who haven't been in the community for multiple years or haven't spent many hours reading around internet posts. My subjective view is that this topic is under-discussed relative to how much I feel it should be discussed.

Comment author: vipulnaik 27 October 2017 02:02:08PM 7 points

I'm not sure why you brought up the downvoting in your reply to my reply to your comment, rather than replying directly to the downvoted comment. To be clear, though, I did not downvote the comment, ask others to downvote the comment, or hear from others saying they had downvoted the comment.

Also, I could (and should) have been clearer that I was focusing only on points that I didn't see covered in the post, rather than providing an exhaustive list of points. I generally try to comment with marginal value-add rather than reiterating things already mentioned in the post, which I think is sound, but for others who don't know I'm doing that, it can be misleading. Thank you for making me notice that.

Also:

"""I think this may be part of the problem in this context. Some EAs seem to take the attitude (I'm exaggerating a bit for effect) that if there was a post on the internet about it once, it's been discussed."""

In my case, I was basing it on stuff explicitly and directly mentioned in the post on which I am commenting, and in a prominently linked post. This isn't "there was a post on the internet about it once"; this is more like "it is mentioned right here, in this post". So I don't think my comment is an example of the problem you highlight.

Speaking to the general problem you describe, I think it is a reasonable concern. I don't generally endorse expecting people to have intricate knowledge of years' worth of community material. People who cite previous discussions should generally try to link to them as specifically as possible, so that others can easily know what they're talking about without needing a full map of past discussions.

But imo it's also bad to bring up points as if they are brand new when they have already been discussed, especially when others in the discussion have already explicitly linked to past discussions of those points.

Comment author: MichaelPlant 27 October 2017 01:54:20AM 0 points

"""I find it interesting that most of the examples given in the article conform to mainstream, politically correct opinion about who is and isn't overrepresented"""

I think about this a different way. I find it weird, given there's so much mainstream discussion of inclusion, that it doesn't seem to have penetrated into EA. That makes EA the odd one out. Hence it might be good to identify the generic blind spots, even if we haven't yet homed in on EA-specific ones.

I think your approach of looking for over-represented groups is interesting and promising. What I find surprising is that you didn't zero in on the most obvious one, which is that EA is really heavily weighted towards philosophers and maths-y types, such as software engineers.

Comment author: vipulnaik 27 October 2017 02:16:59AM * 3 points

I tried to avoid things that have already been discussed heavily and publicly in the community, and I think the math/philosopher angle is one that is often mentioned in the context of EA not being diverse enough. The post itself notes:

"""people who are both that and young, white, cis-male, upper middle class, from men-dominated fields, technology-focused, status-driven, with a propensity for chest-beating, overconfidence, narrow-picture thinking/micro-optimization, and discomfort with emotions."""

This is also mentioned in the post by Alexander Gordon-Brown that Kelly links to: http://effective-altruism.com/ea/ek/ea_diversity_unpacking_pandoras_box/

"""EA is heavy on mathematicians, programmers, economists and philosophers. Those groups can get a lot done, but they can't get everything done. If we want to grow, I think we could do with more PR types. Because we're largely web-based, people who understand how to make things visually appealing also seem valuable. My personal experience in London is that we would love more organisers, though I can imagine this varying by location."""

Comment author: MichaelPlant 26 October 2017 07:06:14PM * 2 points

I take your point that skews can happen, but it seems a bit suspicious to me that desire to be effective and altruistic should be so heavily skewed towards white dudes.

Edit: I previously said "straight white dudes" but removed the "straight". See below.

Comment author: vipulnaik 27 October 2017 12:29:34AM * 7 points

"I take your point that skews can happen, but it seems a bit suspicious to me that desire to be effective and altruistic should be so heavily skewed towards straight, white dudes."

(1) Where did "straight" come into this picture? The author says that EAs are well-represented on sexual diversity (and maybe even overrepresented on some fairly atypical sexual orientations), and my comment (and the data I used) had nothing to say about sexual orientation.

(2) """it seems a bit suspicious to me that desire to be effective and altruistic should be so heavily skewed towards straight, white dudes"""

I didn't say that desire to be effective and altruistic is heavily skewed toward men. I just said that membership in a specific community, or readership of a specific website, and things like that, can have significant gender skews, and that is not atypical. The audience for a specific community, like the effective altruist community, can be far smaller than the set of people with a desire to be effective and altruistic.

For instance, if a fashion website has a 90% female audience (a not atypical number), that is not a claim that the "desire to look good" is that heavily skewed toward women. It means that the specific things that website caters to, the way it has marketed itself, etc., have resulted in it getting a female audience. Men could also desire to look good, albeit in ways very different from those catered to by that fashion website (or, more broadly, by the majority of present-day fashion websites).

Comment author: vipulnaik 27 October 2017 12:16:19AM * 15 points

I find it interesting that most of the examples given in the article conform to mainstream, politically correct opinion about who is and isn't overrepresented. A pretty similar article could be written about, e.g., math graduate students, with almost the exact same list of overrepresented and underrepresented groups. In that sense it doesn't seem to get to the core of what unique blind spots or expansion problems EA might have.

An alternative perspective would be to look at minorities, subgroups, and geographical patterns that are heavily overrepresented among EAs relative to the world population, or even, say, the US population; this could help triangulate blind spots in EA or ways that make it difficult for EA to connect with broader populations. A few things stand out.

Of the items below, I know at least (1) and (2) have put people off or been major points of concern.

(1) Heavy clustering in the San Francisco Bay Area and a few other population centers, excluding large numbers of people from being able to participate in EA while feeling a meaningful sense of in-person community. It doesn't help that the San Francisco Bay Area is one of the most notoriously expensive places in the world, and is also located in a country (the United States) that is hard for most people to enter and live in.

(2) Overrepresentation of "poly" sexual orientations and behaviors relative to larger populations -- so that even those who aren't poly have trouble getting along in EA if they don't like rubbing shoulders with poly folks.

(3) Large proportion of people of Jewish descent. I don't think there's any problem with this, but some people might argue that this makes the ethics of EA heavily influenced by traditional Jewish ethical approaches, to the exclusion of other religious and ethical traditions. [This isn't just a reflection of the greater success of people of Jewish descent; I think EAs are overrepresented among Jews even after controlling for education and income.]

(4) Overrepresentation of vegetarians and vegans. I'm not complaining, but others might argue that this reduces EAs' ability to connect with the culinary habits and food-related traditions of a lot of cultures.

Comment author: vipulnaik 26 October 2017 02:42:40PM 18 points

You report EA as being 70% male. How unusual is that as a skew? One comparison point, for which data is abundant, is the readerships of websites that are open to read (no entry criteria, no member fees). Looking at the distribution of such websites, 70% seems to be at the relatively low end of skews. For instance, Politico and The Hill, politics news sites, see 70-75% male audiences (https://www.quantcast.com/politico.com#demographicsCard and https://www.quantcast.com/thehill.com#demographicsCard), whereas nbc.com, a mainstream TV, entertainment, and celebrity site, sees a 70% female audience: https://www.quantcast.com/nbc.com#demographicsCard

(I'm not trying to pick anything too extreme; I'm picking things pretty close to the middle. A lot of topics have far more extreme skews, like programming, hardcore gaming, and fashion; see https://www.wikihow.com/Understand-Your-Website-Audience-Profile#Understanding_the_gender_composition_and_index_of_your_website_sub for more details on how the gender skew of websites differs by topic.)

Based on this and similar data I've seen, a 70% skew in either gender direction feels pretty unremarkable to me in the context of today's broader society and the domain-specific skews that are common across both mainstream and niche domains. I expect something similar to be true for race/ethnicity based on Quantcast and similar data, but I'm not as familiar with those numbers or their reliability.

Comment author: vipulnaik 24 October 2017 03:31:09PM * 4 points

Related: Top posts on LessWrong 1.0: http://lesswrong.com/lw/owa/lesswrong_analytics_february_2009_to_january_2017/

Mirror of the same post on LW 2.0 (but still top posts of LW 1.0): https://www.lesserwrong.com/posts/SWNn53RryQgTzT7NQ/lesswrong-analytics-february-2009-to-january-2017

Disclosure: I sponsored work on this post.

Comment author: ClaireZabel 03 October 2017 05:49:37AM 1 point

"""An Open Phil staff member made a rough guess that it takes them 13-75 hours per grant distributed. Their average grant size is quite a bit larger, so it seems reasonable to assume it would take them about 25 hours to distribute a pot the size of EA Grants."""

My experience making grants at Open Phil suggests it would take us substantially more than 25 hours to evaluate the number of grant applications you received, decide which ones to fund, and disburse the money (counting grant investigator, logistics, and communications staff time). I haven't found that time spent scales completely linearly with grant size, though it generally scales up somewhat. So while it seems about right that most grants take 13-75 hours, I don't think it's true that grants that are only a small fraction of the size of most OP grants would take an equally small fraction of that amount of time.

Comment author: vipulnaik 04 October 2017 04:22:41PM 1 point
