Comment author: Dunja 02 August 2018 10:17:10AM *  0 points [-]

Hi Evan, here's my response to your comments (including another post of yours from above). By the way, that's a nice example of industry-compatible research; I agree that such cases can indeed fall under what EAs wish to fund, as long as they are assessed as effective and efficient. I think this is an important debate, so let me challenge some of your points.

Your arguments seem to be based on the assumption that EAs can work on EA-related topics more effectively and efficiently than academics not explicitly affiliated with EA (but please correct me if I've misunderstood you!), and I think this is a prevalent assumption across this forum (at least when it comes to the topic of AI risks & safety). While I agree that being an EA can contribute to one's motivation for the given research topic, I don't see any rationale for the claim that EAs are more qualified to do scientific research relevant to EA than those who aren't explicitly EAs. That would mean that, say, Christians are a priori more qualified to do research that advances Christian values. I think this is a non sequitur.

Whether a certain group of people can conduct a given project effectively and efficiently shouldn't primarily depend on their ethical and political mindset (though this may play a motivating role, as I've mentioned above), but on the methodological prospects of the given project, on its programmatic character, and on the capacity of the given scientific group to make an impact. I don't see why EAs, as such, would satisfy these criteria any more than an expert in the given domain would, when placed within the framework of the given project. It is important to keep in mind that we are not talking here about the political activity of spreading EA ideas, but about scientific research, which has to be conducted with the necessary rigor in order to make an impact in the scientific community and beyond (otherwise nobody will care about the output of the given researchers). These are the kinds of criteria I wish were present in the assessment of the given grants, rather than who is or isn't an EA.

Second, prioritizing a certain type of group in the given domain of research increases the danger of confirmation bias. This is why feminist epistemologists have argued for diversity across the scientific community (rather than for the claim that only feminists should do feminist-compatible scientific research).

Finally, if there is a worry that academic projects focus too much on other issues, the call for funding can always be formulated so that it specifies the desired topics. In this way, academic project proposals can be formulated with EA goals in mind.

Comment author: Evan_Gaensbauer 06 August 2018 06:24:11AM -1 points [-]

Your arguments seem to be based on the assumption that EAs can work on EA-related topics more effectively and efficiently than academics not explicitly affiliated with EA (but please correct me if I've misunderstood you!), and I think this is a prevalent assumption across this forum (at least when it comes to the topic of AI risks & safety). While I agree that being an EA can contribute to one's motivation for the given research topic, I don't see any rationale for the claim that EAs are more qualified to do scientific research relevant to EA than those who aren't explicitly EAs. That would mean that, say, Christians are a priori more qualified to do research that advances Christian values. I think this is a non sequitur.

I think it's a common perception in EA that effective altruists can often do this work as efficiently and effectively as academics not explicitly affiliated with EA. EAs also often think academics can do some, if not most, EA work better than a random non-academic EA could. AI safety draws heavily from, and is more populated by, the rationality community, which on average is more ambivalent towards academia than EA is. It's my personal opinion that EA may often have a comparative advantage in doing the research in-house, for a number of reasons.

One is practical: academics would often have to divide their time between EA-relevant research and teaching duties. EA tends to focus on unsexy research topics, so academics may be likelier to get grants by focusing on research that isn't relevant to EA. Depending on the field, the politics of research can distort the epistemology of academia so that it won't serve EA's purposes. These are constraints that effective altruists working full-time at NPOs funded by other effective altruists don't face, which allows them to dedicate all their attention to their organization's mission.

Personally, my confidence in EA's ability to make progress on research and other projects across a wide variety of goals is bolstered by original research in multiple cause areas being lauded by academics as some of the best work on the subject they've seen. Of course, these are NPOs focused on addressing neglected problems in global poverty, animal advocacy campaigns, and other niche areas. Some of the biggest successes in EA have come from close collaborations with academia, and I think most EAs would encourage more cooperation between academia and EA. I've pushed in the past for EA making more grants to academics doing sympathetic research. Attracting talent with an academic research background to EA can be difficult. Overall, I agree with you that EA's current approach doesn't make sense.

I think you've got a lot of good points, and I'd encourage you to make a post out of some of the comments made here. I think one reason your posts might be poorly received is that some causes in EA, especially AI safety/alignment, have received a lot of poor criticism in the past merely for trying to do formal research outside of academia. I could review a post before you put it on the EA Forum and suggest edits so it would be better received. Either way, I think EA integrating more with academia is a great idea.

Comment author: ea247 03 August 2018 08:27:12PM 8 points [-]

I think EA Forum karma isn't the best measure, because a lot of the people who are particularly engaged in EA don't spend much time on the forum and instead focus on things more action-relevant for their org. The EA Forum will be biased towards people more interested in research and community-related things as opposed to direct action. For example, New Incentives is a very EA-aligned org working on direct poverty, but its staff spend most of their time doing cash transfers in Nigeria instead of posting on the forum.

To build on your idea though, I think forming some sort of index of involvement would get away from any one particular thing biasing the results. I think including karma in the index makes sense, along with length of involvement, hours per week involved in EA, percent donated, etc.

Comment author: Evan_Gaensbauer 06 August 2018 04:10:52AM 0 points [-]

I'm working on a project to scale up volunteer work opportunities with all kinds of EA organizations. Part of what I want to do is develop a system for EA organizations to delegate tasks to volunteers, including writing blog posts. This could help EA orgs like New Incentives get more of their content onto the EA Forum, such as research summaries and progress updates. Do you think orgs would find this valuable?

Comment author: ZachWeems 05 August 2018 12:01:52AM 0 points [-]

Meta:

It might be worthwhile to have some sort of flag or content warning for potentially controversial posts like this.

On the other hand, this could be misused by people who dislike the EA movement, who could use it as a search parameter to find and "signal-boost" content that looks bad when taken out of context.

Comment author: Evan_Gaensbauer 05 August 2018 07:01:29PM 1 point [-]

I agree with kbog: while this is unusual discourse for the EA Forum, it's still far above the bar where I think it's practical to be worried about controversy. If someone thinks the content of a post on the EA Forum might trigger some readers, I don't see anything wrong with including content warnings on posts. I'm unsure what you mean by "flagging" potentially controversial content.

Comment author: kbog  (EA Profile) 05 August 2018 09:02:18AM *  4 points [-]

Your comments seem to be way longer than they need to be because you don't trust other users here. Like, if someone comes and says they felt like it was a cult, I'm just going to think "OK, someone felt like it was a cult." I'm not going to assume that they are doing secret blood rituals, I'm not going to assume that it's a proven fact. I don't need all these qualifications about the difference between cultishness and a stereotypical cult, I don't need all these qualifications about the inherent uncertainty of the issue, that stuff is old hat. This is the EA Forum, an internal space where issues are supposed to be worked out calmly; surely here, if anywhere, is a place where frank criticism is okay, and where we can extend the benefit of the doubt. I think you're wasting a lot of time, and implicitly signaling that the issue is more of a drama mine than it should be.

Comment author: Evan_Gaensbauer 05 August 2018 06:56:07PM 1 point [-]

I admit I'm coming from a place of not entirely trusting all the other users here. That may be a factor in why my comments are longer in this thread than they need to be, though I tend to write more than is necessary in general. For what it's worth, I treat the EA Forum not as an internal space but as how I'd ideally like to see it used: as a primary platform for EA discourse, with a level of activity more akin to the 'Effective Altruism' Facebook group or LessWrong.

I admit I've been wasting time. I've stopped responding directly to the OP because, if I'm coming across as implicitly signaling this issue is a drama mine, I should come out and say what I actually believe. I may make a top-level post about it; I haven't decided yet.

Comment author: BenHoffman 05 August 2018 01:14:25AM 4 points [-]

"Compared to a Ponzi scheme" seems like a pretty unfortunate compression of what I actually wrote. Better would be to say that I claimed that a large share of ventures, including a large subset of EA, and the US government, have substantial structural similarities to Ponzi schemes.

Maybe my criticism would have been better received if I'd left out the part that seems to be hard for people to understand; but then it would have been different and less important criticism.

Comment author: Evan_Gaensbauer 05 August 2018 06:49:39PM *  0 points [-]

[epistemic status: meta]

Summary: Reading comments in this thread that resemble reactions I've seen you and other rationality bloggers receive from effective altruists on critical posts about EA, I think there is a pattern to how rationalists tend to write on important topics that doesn't gel with the typical EA mindset. Consequently, it seems the pragmatic thing for us to do would be to figure out how to alter how we write to get our message across to a broader audience.

"Compared to a Ponzi scheme" seems like a pretty unfortunate compression of what I actually wrote. Better would be to say that I claimed that a large share of ventures, including a large subset of EA, and the US government, have substantial structural similarities to Ponzi schemes.

Upvoted.

I don't know if you've read some of the other comments in this thread, but some of the most upvoted ones are about how I need to change my writing style. So unfortunate compressions of what I actually write aren't new to me, either. I'm sorry I compressed what you actually wrote. But even an accurate compression of what you wrote might make my comments too long for what most users prefer on the EA Forum, and if I just linked to your original post, it would be too long for us to read.

I spend more of my time on EA projects. If there were more promising projects coming out of the rationality community, I'd spend more time on them relative to how much time I dedicate to EA projects; I go where the action is. Socially, I'm as involved with the rationality community as I am with EA, if not more so.

From my inside view, here is how I'd describe the common problem with my writing on the EA Forum: I came here from LessWrong, and relative to LW, I haven't found what or how I write on the EA Forum to be too long. But that's because I'm anchoring on EA discourse looking like SSC 100% of the time. Since the majority of EAs don't self-identify as rationalists, and the movement is so intellectually diverse, the expectation is that the EA Forum won't be built around any discourse style common to the rationalist diaspora.

I've touched on this issue with Ray Arnold before, and Zvi has touched on it too in some of his blog posts about EA. A crude rationalist impression might be that the problem with discourse on the EA Forum is that it isn't LW. In terms of genres of creative non-fiction writing, the EA Forum is less tolerant of diversity than LW. That's fine. Thinking about this consequentially, rationalists who want EA to hear their message don't need to learn to write better, but to write differently.

Comment author: Jeff_Kaufman 04 August 2018 12:48:16PM *  10 points [-]

Given there are usernames like "throwaway" and "throwaway2," and knowing the EA Forum and its precursor, LessWrong, I'm confident there is only one account under the username "anonymous," and that all the comments on this post using this account are coming from the same individual.

I'm confused: the comments on Less Wrong you'd see by "person" and "personN" that were the same person happened when importing from Overcoming Bias. That wouldn't be happening here.

They might still be the same person, but I don't think this forum being descended from LessWrong's code tells us things one way or the other.

Comment author: Evan_Gaensbauer 05 August 2018 12:24:15AM 1 point [-]

Thanks. I wasn't aware of that. I'll redact that part of my comment.

Comment author: nbouscal 04 August 2018 05:08:07PM 28 points [-]

I’m unconvinced that ole_koksvik’s fluency in English has anything to do with it. Fluent English speakers misspell words like “idiosyncratic” regularly, and I and other fluent English speakers also find your posts difficult to follow. I generally end up skimming them, because the ratio of content to unnecessary verbosity is really low. If your goal is to get your evidence and views onto the table as quickly as possible, consider that your current strategy isn’t getting them out there at all for some portion of your audience, and that a short delay for editing could significantly expand your reach.

Comment author: Evan_Gaensbauer 04 August 2018 11:47:16PM 3 points [-]

Yeah, that has become abundantly clear to me from how many upvotes these comments have received. I've received feedback on this before, but never with such a strong signal. Sometimes I have different goals with my public writing at different times, so it's not always my intention for my writing to be maximally accessible to everyone. I usually know who reads my posts and why they appreciate them, as I receive a lot of positive feedback as well. It's evident I've over-generalized that in this thread, to the point that it's hurting the overall impact of spreading my message. So I completely agree. Thanks for the feedback :)

Comment author: Khorton 04 August 2018 05:13:43PM 13 points [-]

Seconded. As a time-saving measure, I skip any comment longer than three paragraphs unless the first couple of sentences make its importance very clear. Unfortunately, that means I miss most of Evan's posts. :(

Comment author: Evan_Gaensbauer 04 August 2018 11:38:59PM 1 point [-]

Would it help if I included a summary of my posts at the top of them?

Often I write for a specific audience, which is more limited and exclusive. I don't think there is anything necessarily wrong with taking this approach to discourse in EA: top-level posts on the EA Forum are often specific to a single cause, written in an academic style for a niche audience. I've mentally generalized this to how I write about anything on the internet.

It turns out not writing in a more inclusive way is harming the impact of my messages more than I thought. I'll make more effort to change this. Thanks for the feedback.

Comment author: Milan_Griffes 04 August 2018 05:01:53PM 16 points [-]

when I'm making public comments without a time crunch

My hunch is that even when there's a time crunch, fewer words will give a bigger bang for the buck :-)

Comment author: Evan_Gaensbauer 04 August 2018 11:21:25PM 3 points [-]

Of course. What I was trying to explain is that, when there is a time crunch, I've habituated myself to use more words. Obviously it's a habit worth changing. Thanks for the feedback :)

Comment author: throwaway2 04 August 2018 09:37:47AM 8 points [-]

Given there are usernames like "throwaway" and "throwaway2," and knowing the EA Forum and its precursor, LessWrong, I'm confident there is only one account under the username "anonymous," and that all the comments on this post using this account are coming from the same individual.

I don't feel comfortable sharing the reasons for remaining anonymous in public, but I would be happy to disclose my identity to a trustworthy person to prove that this is my only fake account.

Comment author: Evan_Gaensbauer 04 August 2018 11:15:20PM 1 point [-]

Upvoted. I'm sorry for the ambiguity of my comment. I meant that the posts here under the usernames "throwaway," "throwaway2," and "anonymous" are each consistently being made by the same three people, respectively. Since I was addressing you, I was just clarifying up front for others reading that it's almost certainly the same anonymous individual making the comments under that account. I wouldn't expect you to forgo your anonymity.
