atucker comments on Building Cooperative Epistemology (Response to "EA has a Lying Problem", among other things) - Effective Altruism Forum

Comment author: kbog, 11 January 2017 10:25:39PM

I think that the main point here isn't that the strategy of building power and then doing good never works, so much as that someone claiming this is their plan isn't actually strong evidence that they're going to follow through,

True. But if we already know each other and trust each other's intentions, then it's different. Most of us have already undertaken extremely costly activities, without clear personal gain, as altruists.

and that it encourages you to be slightly evil more than you have to be.

Maybe, but this is common folk wisdom; you should demand more applicable psychological evidence instead of assuming that it's actually true to a significant degree, especially among the atypical subset of the population that is core to EA. Plus, it can be defeated or mitigated, just like other biases and flaws in people's thinking.

Comment author: atucker, 12 January 2017 03:12:29AM

But if we already know each other and trust each other's intentions, then it's different. Most of us have already undertaken extremely costly activities, without clear personal gain, as altruists.

That signals altruism, not effectiveness. My main concern is that the EA movement won't be able to maintain the epistemic standards necessary to discover and execute on unusually effective ways of doing good, not primarily that people won't donate at all. In this light, concerns about core metrics of the EA movement are very relevant. I think the main risk is compromising standards to grow faster, rather than people turning out to have been "evil" all along, and I think that growth at the expense of rigor is mostly bad.

Being at all intellectually dishonest is much worse for an intellectual movement's prospects than it is for normal groups.

instead of assuming that it's actually true to a significant degree

The OP cites particular cases where she thinks this accusation is true. I'm not worried that this might happen in the future; I'm worried that it already happens.

Plus, it can be defeated/mitigated, just like other kinds of biases and flaws in people's thinking.

I agree, but I think the more promising ways of dealing with these issues involve sending credible signals of addressing them, rather than just saying that they should be solvable.

Comment author: kbog, 12 January 2017 03:45:39AM

I think the main risk is compromising standards to grow faster rather than people turning out to have been "evil" all along, and I think that growth at the expense of rigor is mostly bad.

Okay, so there's some optimal balance to be struck (there are always ways to be more rigorous and less growth-oriented, up to a very unreasonable extreme), and we're trying to find the right point, so we can err on either side if we're not careful. I agree that dishonesty is very bad. But I'm a bit worried that if we treat every error on one side like a large controversy, we'll miss the occasions where we err on the other side and then go a little too far, because we get really strong, socially damning feedback for one kind of error and none for the other.

The OP cites particular cases where she thinks this accusation is true. I'm not worried that this might happen in the future; I'm worried that it already happens.

To be perfectly blunt and honest, it's a blog post with some anecdotes. That's fine for suggesting there's a problem worth investigating, but not for drawing conclusions about particular causal mechanisms. We don't know how these people's motivations changed (maybe they had the exact same plans before coming into their positions; maybe they become fairer and more careful the more experience and power they gain).

Anyway, the reason I said that was just to defend the idea that obtaining power can be good overall, not to claim that no such problems are associated with it.